I cannot claim great powers of foresight, much though I would wish to.
I first encountered artificial intelligence (AI) in the late 1970s, at the very end of the first “AI winter” of low funding and widespread scepticism about the field’s prospects for success. That winter was initiated in the UK by the Lighthill Report of 1973, which gave a very negative assessment of AI’s ability to live up to the early hype and prompted the government to cut most of its public funding.
I was interested in the field but not more than that. The programming tools, specifically Lisp (and later Prolog), did get my attention, though. I could see the relevance of knowledge-based approaches and so-called expert systems. So, eventually, I became an “Alvey baby”, funded to pursue postdoctoral research as part of the UK government’s Alvey programme to deliver an AI-led fifth generation of computing technologies. This gave rise to a longstanding concern with logic and symbolic reasoning that shaped a good part of my later work in software engineering.
But I was certainly never very engaged with the philosophical and other debates that swirled around AI, as they, of course, do now. Although I appreciate abstraction, I have little appetite for speculation. This is probably a personal shortcoming but one that I am unlikely to be able to shed at this stage.
I recall my first use of GPT. I was profoundly shocked at the behaviour of the system – at what it could do. Indeed, despite the fact that I “knew” how it worked in some reasonable detail, I could not comprehend it. I did not understand what sheer scale (amplified by some neat engineering) would yield. I had to repeat to myself that I was not seeing search but, rather, the results of a statistical process giving rise to predictions at the level of words and text fragments. I still do. This shock is important to recognise and to hold onto. We have crossed a frontier.
The impacts of technology and the ways in which innovations are applied have been much studied. For advanced technologies, the translation from lab to broader uptake generally takes an extended period. Although experienced by users or consumers as rapid and disruptive, technology shifts are often, when viewed at a distance and in context, relatively slow. There are, obviously, inflection points and network effects that come into play, but generally applications emerge incrementally and there are lengthy gaps between the early adopters, early majority, late majority and laggards.
This is not what is happening with AI. Take-up is progressing with extraordinary rapidity; productivity opportunities and applications are proliferating; and the leverage that can be secured from integration with existing platforms and data resources is evidently very large, with more applications emerging almost daily. So much is this the case that within enterprises, where individuals are impatient with even this accelerating pace of deployment, AI use for routine tasks has become commonplace. It is impossible to say with any precision where this might lead, not least because the models themselves continue to develop at remarkable speed.
So much for reflection. These changes certainly mean substantial, large-scale transformation for higher education. We now have the capability to give our students highly personalised educational experiences, individual feedback and sophisticated analysis and problem-solving assistance. Institutionally, we can streamline our business processes to be more responsive and efficient. And our research will benefit from novel forms of scholarly exchange and exploitation of knowledge.
This is not speculation; this is a straightforward account of what is happening. Given also the global expansion of higher education and the associated cost pressures, reflected in the UK’s current financial sustainability crunch, using AI to deliver scaled pedagogy is simply our only available play.
I am not generally a “hyper” of technology but in this case I am content to be mistaken for one. There is no stand-back option: failing to engage with AI is straightforwardly to neglect the mission of higher education.
Sir Anthony Finkelstein is president of City St George’s, University of London. This is an edited version of one of a collection of essays published today by London Higher.