While summer tends to be a time of rest for many in education, at AVLI it is our busiest period. This isn’t a lament. Rather, it’s a simple recognition that our member schools find value in what we do, and, thus, we are busy. One of the June tasks this year was the post-COVID return of our Summer Teaching Institute, a four-day intensive professional development experience. It is a cherished time not only because of the excellent work and leadership our AVLI team provides, but because of the small cohort of educators that gathers to learn, share, and work. I always marvel at the quality and commitment of those in attendance. By the end of the experience, all involved are equal parts energized and exhausted, which, frankly, is quite satisfying.
This year’s conference title was Mission True: The Power of Purpose in Negotiating Artificial Intelligence. Here we explored both the potential and the pitfalls of artificial intelligence in the fulfillment of our mission and service to our students. More specifically, the shared struggle of participants was discerning when and how to utilize such technologies, and, conversely, when and how to guard against their use. One of the prevailing themes we unpacked was the idea of non-learning: those times when students are confronted with a valuable learning opportunity and choose to forgo it, instead employing a more expedient option such as plagiarizing or relying on artificial intelligence. I’ll speak more to non-learning momentarily, but allow me a short digression.
The week prior to the conference, I read a New York Times article entitled How to Turn Your Chatbot Into a Life Coach, the first line of which claims that “a chatbot can be an effective motivator.” Shortly after the conference, I watched an ABC News special on artificial intelligence in which Geoffrey Hinton (the Godfather of AI) stated, “Everybody will have their own digital assistant that is much smarter than them and does what they want.” The first of these statements makes me a bit sad. Can motivation really be inspired by a machine? Regarding the second statement, I will concede the likelihood that everyone will have an uber-intelligent digital assistant; however, I am more skeptical that we humans will be in charge of them. The history of the development of modern media technologies does not support such an assertion.
Why do I bring up these two examples? As someone who works in the digital learning space, I am an advocate of technology’s capacity to distribute, democratize, and advance knowledge. Artificial intelligence has a meaningful role to play in education and many other important pursuits. Too often, however, we fail to recognize when the admirable goals we are pursuing with a particular course of action are producing unintended side effects.
Regarding recent technology advancements in education, most of our collective admirable efforts have focused on making knowledge acquisition easier and, thus, more effective. We’ve digitized texts, reengineered traditional learning aids such as flashcards, created on-demand video tutorials and simulations, produced educational games, and developed AI tracking systems to monitor students’ progress and redirect them to other learning resources when they struggle or get off track. On many levels this has been successful in the democratization of learning content. More people have more access to quality learning resources and this is great news!
Why, then, haven’t learning outcomes improved? Even prior to COVID, U.S. test scores in math, reading, and science had largely flatlined, and COVID has made things worse. My fear is that many of the technologies mentioned above have made education more transactional, with the emphasis on each student following a bouncing ball toward competence. I want to emphasize here that clear objectives and competencies are invaluable to the educational process. It’s important to recognize, however, that many competencies focus on the same types of things computers are already good at. This isn’t to suggest that people no longer need to master many of them, but it’s reasonable to expect students to question their relevance.
This brings me back to the act of non-learning. We humans will consistently choose ease and convenience over challenge and work. Thus, in the absence of a compelling need, rationale, or desire, we will almost always choose expediency over learning. Take, for example, the skill of map reading. Since the advent of GPS there is little incentive to undertake the task of developing this skill. Whether that is a good thing can be debated, but so long as a person’s smartphone holds a charge and maintains cell service, they are never completely lost. Thus, when it comes to map reading, most people actively choose non-learning.
This is the AI challenge in education. Twenty-five years ago the non-learning route of cheating often wasn’t too much easier than the real work of learning. Now it is, and the perceived consequences of NOT learning aren’t all that great since the path to “the answer” is often just a few clicks away.
This was the backdrop entering our June conference. Understandably, early on there was a degree of wagon-circling, and, had the participants persisted, there would have been little our facilitators could do to prevent this from remaining the focus in spite of a fuller agenda. Instead, what transpired was a beautiful reframing of the traditional teacher-student narrative, discussions regarding the objectives and competencies of importance, and a recognition of the life-giving opportunities that lie ahead. Certainly there will be challenges and messiness, and this too was discussed. However, what became clear is that our teachers are up to the task, and, by extension, so are our Catholic schools.
My biggest takeaway from the conference was a reminder that, even with all of the technological efforts to make knowledge acquisition more accessible, true learning necessarily remains hard. It forces us to confront new realities, to look for complexities in that which seems simple, and to discern simplicity in the complex. Too often we choose a path of non-learning. Modern society makes it easy to numbly acquiesce to the changing tides and prevailing winds. Personally, it’s time I seek some discomfort, and perhaps this AI disruption is just the push I needed.
CONTRIBUTOR: Jeff Hausman, AVLI President
Vol. 6, Issue 1