More wicked AI …

By now it’s old news that it’s possible for students (and everyone else in HE) to use GenAI to do many academic tasks. Universities are putting incredible time and effort into developing policies and practices to respond to this at a time when many staff (at least in the UK) are overstretched and angry. This work on education with AI can be wonderfully creative and meaningful but it can also involve building on the systems of distrust that have arisen in recent years and profoundly harm learning relationships (Turnitin and proctoring of online exams anyone …).

As if things weren’t dystopian enough, along comes agentic artificial intelligence. AgenticAI can operate with relative autonomy across complex systems. Cue some students giving their virtual learning environment logins to AgenticAI and asking it to find and deal with tasks and activities that were designed to support students’ learning and development. Then just wait for all of the promises about how AgenticAI will make all sorts of tasks more automated in higher education so that institutions can reduce staff costs further. After all we’ve had *fantastic* experiences up till now dealing with chatbots, institutional technological ‘solutions’ and online journal submission systems, right?

It would be lovely to ask and answer the relatively easy question: “Can we design curricula, learning experiences and assessments to respond well to GenAI and AgenticAI without resorting to more closed-book exams and distrust?” The answer to that one is a resounding YES. We just need to remember that transformative learning needs close relationships and dialogues between teachers and learners. We need co-creation of authentic assessments at programme level that mean something to students in the context of their wider lives and imagined futures. We need to build curricula knowing that developing a rich understanding of good academic work is a long-term, complex conversation working towards more mutually shared understanding. We need to make all our students feel they truly matter and are welcome as they are in our classrooms. And we need to guide our students in how to engage critically and ethically with AI.

Unfortunately though, this one is a doozy of a wicked problem because we actually need to try to do all of the above in a context that looks something like this:

  1. Cultures across the Global North that are dominated by hegemonic narratives about seeking personal advantage as isolated individuals in competitive systems. This one trashes the planet and gives everyone all sorts of incentives and encouragement to cheat and cut corners.
  2. Large class sizes, classes with students with wide variation in their backgrounds and preparation for HE, and exhausted and overworked staff.
  3. The oddly persistent belief across much of higher education in the Global North that staff can offer meaningful learning in systems that default incessantly to simplistic metrics, treat people like interchangeable ‘resources’ and only notice work that fits within specific narrow understandings of what needs to be done.
  4. Students who have grown up as learners in systems that treat them with distrust and that act as if grades and the competitive advantage they bring are the main thing that matter and the main indicator of their worth.
  5. Extreme inequalities that mean that very few of even the ‘full-time’ students can afford to devote much of their time and energy to their studies.
  6. Students who struggle to imagine liveable long-term futures right now – and who can blame them?
  7. The development of these AI technologies and the responses to them being in the hands of for-profit corporations (or, even more scarily, criminal gangs that could blackmail students) rather than being led for the public good.

So can we respond well to AI in the current context of higher education in the Global North? No, I don’t think so. Many wonderful colleagues will try, and they will make it work for a time in pockets, especially where class sizes are smaller and people still care enough to work over and above. As things stand, I foresee a trend towards more distrust, more closed-book in-person exams, and more students missing out on what should be a wonderful and transformative learning experience in higher education. So we need to change the system. This needs to involve better accountability from developers of AI and rich dialogue between developers and educators. AI could be a positive part of education if we get it right. I think our institutions, especially the powerful ones, need to fight hard and urgently to imagine and work towards better futures. Do you agree?

With thanks to Sian Bayne for suggesting some helpful edits.

Photo by Paul Hanaoka on Unsplash
