Those of us in university management roles often bear the scars of failed or disastrous IT transformation projects. Automated timetabling systems that have crashed and left students stranded are one example. Payroll systems that have not paid out are another.
Managers have also been frustrated by slow returns on multitudes of overly bespoke, non-scalable “technology-enhanced learning” projects; exciting Moocs that delivered few returns on investment; and the somewhat disappointing reversion to in-person teaching after the great Covid Teams and Zoom experiments.
Nevertheless, we long ago abandoned the idea that IT is simply an adjunct to the delivery of university operations and strategy. We recognise that core enterprise systems and digital technologies are as much part of the fabric of a higher education institution as classrooms, books and labs are. They are the vehicles for the student journey, from enquiry to graduation, and the means, mode and often subject of much of our research.
Hence, almost universally, senior managers accept that digital strategy is our collective responsibility. Accountability for it sits at the top, not in the IT department, and we need a shared overview of the data and technology that underpin key processes. That is why discussions of IT systems, innovation and cybersecurity regularly consume as much time in executive meetings, audit committees and governing boards as finances, estates and HR matters.
Admittedly, many universities are still not fully mature in how they manage and govern IT within their overall management frameworks. Management teams typically lack specialist expertise and rely heavily on their chief information officers to do the work of explanation and translation. Nor are they always willing to take on board carefully benchmarked data demonstrating (as data tend to do in UK higher education) systemic underinvestment in the IT estate. Even when they do, they often cannot commit the resources to do much about it.
But we are making progress. We are learning to oversee IT decisions made in the interests of the whole organisation, ensuring that risks are mitigated, resources are deployed effectively and benefits are realised and tracked. We are getting better at anticipating nasty surprises, such as systems that do not integrate with legacy technology, as well as over-customisation and over-complexity.
Above all, we have grasped that technological change is a people-centred phenomenon. We are paying more attention to the labour market and skills scarcity in the IT sector when recruiting, as well as the need to invest in the digital capabilities of our own staff.
Universities like mine also recognise that student strategy and digital strategy are inseparable, whether we are heavily engaged with online degrees or seeking to enhance in-person learning. For some years, we have been talking about digital strategies that deliver a “seamless” student experience – “device-neutral”, interactive, assistive and community-building, combining elements of synchronous and self-paced learning.
We have tried to learn from the best. One example is the University of Arizona’s pioneering work in online learning pedagogies and remote exam proctoring. Others are Imperial College London’s use of augmented reality headsets for medical education and the National University of Singapore’s digitisation of applicant journeys.
At Durham, we have implemented a 24/7 AI assistant, Holly, which has answered thousands of questions and freed staff to add value elsewhere. Yet we know that this seamless customer experience does not always continue once students enter university and are handed on to less friendly student record systems (a market dominated by just two main providers), clunky timetabling systems and virtual learning environments of variable quality.
Some universities, recognising students’ native grasp of technologies, have succeeded in positioning students as digital change makers. UCL, for example, recently involved them in tackling the implications of generative AI for our shared educational endeavour.
We are also harnessing the shared power of the sector. We have seen how consortia such as UCISA and Jisc can negotiate greater value for money with big technology vendors. And we have seen how cloud computing offers huge possibilities for sharing resources and services across regional boundaries. For example, the Open University of Catalonia reduced its operating costs by €300,000 per year by moving to the cloud.
We can also use our combined purchasing power and sustainability expertise to devise more efficient cooling and energy capture in high-performance computing centres. At Durham, the UK home of the Cosma supercomputer, we are using a grant of about £1 million from UK Research and Innovation to install photovoltaic power generation. The University of York, meanwhile, is migrating its data to an EcoDataCenter in Sweden. As we continue to host and lead a revolution in research computing power and quantum computing, we will collaborate to improve sustainability.
Universities are, rightly, places of multiple voices and priorities. Yet it is vital that the voice of IT and digital is heard clearly and consistently – and that can’t all be left to the CIO. We need an ethos of collective ownership of this agenda at senior levels. Ideally, boards should include at least one trustee with IT governance expertise, just as they typically include expertise in accountancy and financial management.
A seamless, straight-to-smartphone student experience is still some way off, but we are all part of the “IT crowd” now.
Karen O’Brien is vice-chancellor and warden of Durham University. This is an edited version of her contribution to a collection of essays, Technology Foundations for Twenty-First Century Higher Education, published by the Higher Education Policy Institute and LearningMate.