Nelson Ody explains how monitoring content can allow for sensitive research and protect the campus community
Preventing access to extreme material on the internet is a no-brainer, isn’t it? After all, most of that stuff’s illegal.
Well, yes and no.
The uptake of web filtering in universities is patchy. Mention web filtering to higher education institutions and out comes the academic freedom placard.
Most institutions think web filtering is a bad idea. That hard line softens, though, when I explain how sensitive research can still be conducted with web filtering in place, and how filtering can play a role in the well-being of staff and students.
Sensitive research may include subjects such as terrorist recruitment tactics or sexual psychology. With the right permissions from the university and the authorities, it’s possible to unblock content that would normally be filtered out. This can be done for one person, a group of people or for particular machines. For everyone else, web filtering will ensure that no illegal material is accidentally seen, which protects both the curious and the vulnerable.
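The exemption model described above — unblocking categories for a named person, a group, or a particular machine — could be sketched as a simple policy lookup. This is an illustrative assumption of how such a policy might be structured, not any vendor's actual API; all names and categories here are hypothetical.

```python
# Hypothetical sketch of per-user, per-group and per-machine research
# exemptions to a web filter. All identifiers are illustrative.

BLOCKED_CATEGORIES = {"terrorist-material", "illegal-imagery"}

# Approved exemptions: who may access which normally blocked category.
EXEMPT_USERS = {"dr.smith": {"terrorist-material"}}
EXEMPT_GROUPS = {"sexual-psychology-lab": {"illegal-imagery"}}
EXEMPT_MACHINES = {"research-room-pc-01": {"terrorist-material"}}

def is_allowed(category, user=None, groups=(), machine=None):
    """Return True if a request in this category should pass the filter."""
    if category not in BLOCKED_CATEGORIES:
        return True  # ordinary content is never filtered
    if user and category in EXEMPT_USERS.get(user, set()):
        return True
    if any(category in EXEMPT_GROUPS.get(g, set()) for g in groups):
        return True
    if machine and category in EXEMPT_MACHINES.get(machine, set()):
        return True
    return False  # everyone else is protected by default
```

The default-deny shape matters: for everyone without an explicit permission, the filter still blocks, so accidental exposure is prevented by default.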
Safety first
Without the right controls in place for sensitive research, a slip-up is all too possible. Let’s say a researcher is in her office, legitimately looking at some illegal content. She closes the laptop and takes it with her to lunch. In the canteen she decides to check her email, opens her laptop and up pops a graphic image in full view of other diners.
It is an offence to expose other people, accidentally or not, to illegal content. My advice is to create a safe space for research. People with the right permission could book to work at desktop computers in an access-controlled room. To get buy-in from researchers, universities can demonstrate the value for well-being: the university recognises such research is valuable and is putting in a process to allow freedom of study, while also taking care of the campus community.
It’s not healthy for anyone to be viewing illegal material, so building care plans for researchers to make sure they’re not detrimentally affected is sensible. Preferably, this should be done in partnership with the university’s occupational health department. Web filtering solutions have built-in reporting systems that can be set up to help with this. For example, they can monitor and control the time an individual spends looking at illegal content.
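The kind of time-monitoring report mentioned above could work along these lines. This is a minimal sketch under my own assumptions (the two-hour threshold and the session format are invented for illustration), not the reporting interface of any particular filtering product.

```python
from datetime import datetime, timedelta

# Illustrative sketch: flag researchers whose cumulative time viewing
# filtered material in a day exceeds a care-plan limit agreed with
# occupational health. The threshold below is an assumed example.

DAILY_LIMIT = timedelta(hours=2)

def over_limit(sessions, limit=DAILY_LIMIT):
    """sessions: list of (start, end) datetime pairs for one researcher, one day."""
    total = sum((end - start for start, end in sessions), timedelta())
    return total > limit
```

A report like this gives occupational health something concrete to act on, rather than relying on researchers to self-report.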
As a means of taking care of staff, processes such as these are in place at the two organisations in charge of monitoring illegal content in the UK: the Internet Watch Foundation (IWF) and the Counter Terrorism Internet Referral Unit (CTIRU). Universities could easily adopt similar systems.
Watch your back
Web filtering can also protect an organisation’s reputation. Imagine this scenario: a member of staff gets an email that looks genuine but is actually a phishing attempt. One click brings up an illegal image with a message threatening to report the person to the police unless they pay up. Bearing in mind web filtering could have blocked both the link and the image, the member of staff might argue they weren’t properly protected by their employer. Similarly, anyone who saw that graphic image on the researcher’s laptop in the canteen could also complain.
Accessing illegal material is an offence that is likely to be noticed by the authorities and spark an investigation. That would make a great story for the media.
Researchers and their employers can be further protected by keeping a record of when illegal content is accessed. Earlier in my career, I used hacking tools because it was part of my job to unlock servers, so I had an unfiltered account. But I would record the tools I was using, when I used them and why, in case I was ever questioned about it.
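A record like the one I kept — which tool, when, and why — amounts to an append-only audit log. The sketch below shows one plausible shape for such a log; the field names and format are my assumptions, not a prescribed standard.

```python
import json
from datetime import datetime, timezone

# Illustrative append-only audit log: one JSON record per line, noting
# who used which tool, when, and why. Field names are assumptions.

def log_access(logfile, tool, reason, user):
    """Append a timestamped audit record to an open file-like object."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "tool": tool,
        "reason": reason,
    }
    logfile.write(json.dumps(entry) + "\n")
    return entry
```

Writing one self-contained record per line keeps the log easy to append to and easy to search if you are ever questioned about a particular date.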
A flexible friend
The flexibility of most web filtering systems lets organisations control both what can be accessed from certain machines and who has access. Controlling what can be opened on a mobile phone, however, is virtually impossible, because its owner doesn’t have to connect to the university network to use the device. On the plus side, mobile phone providers block illegal content at source.
Web filtering is a protective measure, not a preventative one. It can help protect a network through malware detection, it can protect students and staff from viewing distressing and potentially harmful material and it can protect the reputations of institutions by keeping them on the right side of the law. Web filtering solutions can also be so flexible and tailored that there’s no reasonable argument against using one.
The IWF and the CTIRU each produce a filtering list, which Jisc’s web filtering framework supports.
Nelson Ody is security services manager at Jisc and one of the speakers at Jisc’s cyber security conference in London, 7-8 November 2018.
This article was commissioned by Times Higher Education in partnership with Jisc, the UK body for digital technology and resources in higher education, further education, skills and research.