How decolonised is my curriculum? There’s an app for that

Imperial creates tool to analyse geographic distribution of authors on reading lists and socio-economic status of their country

May 13, 2021
Smiling robot illustrating a digital tool that can tell academics how decolonised their curricula are
Source: Getty

Researchers have created the first digital tool that can tell academics how decolonised their curricula are.

Imperial College London’s computer program analyses the geographic distribution of journal authors on reading lists, as well as the socio-economic status of their country.

Its creators say that scrutiny of reading lists is an important tool in understanding who is and is not excluded from academic knowledge and how this perpetuates inequality.

Previous analyses of Western reading lists have found heavy bias towards the Global North. A 2019 study at another UK institution found that core texts were “dominated by white, male, Eurocentric authors”.

However, such studies are usually time-consuming exercises, involving manually researching titles’ authors or place of publication.

In contrast, the Imperial tool converts reading lists into machine-readable code, and then cross-references bibliographic and author data with Web of Science and World Bank records.

It calculates a “citation source index” between 0.0049 and 1 for each article, and an overall mean score for the course, with a higher CSI indicating a higher prevalence of authors from the Global North.
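For illustration, the aggregation step is straightforward: the course-level figure reported below is the mean of the per-article scores. The short Python sketch that follows assumes the per-article CSI values have already been derived from the Web of Science and World Bank cross-referencing (the article does not give the underlying formula), so the titles and scores shown are hypothetical placeholders.

from statistics import mean

# Hypothetical reading-list entries; each article carries a pre-computed
# citation source index (CSI) between 0.0049 and 1, where a higher value
# indicates a greater prevalence of authors affiliated with the Global North.
reading_list = [
    {"title": "Article A", "csi": 0.93},
    {"title": "Article B", "csi": 0.87},
    {"title": "Article C", "csi": 0.41},
]

# The course-level score is the mean of the per-article CSI values.
course_csi = mean(entry["csi"] for entry in reading_list)
print(f"Course CSI: {course_csi:.4f}")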

So far the tool has been used on Imperial’s master’s in public health, where it found “a skew toward research produced by researchers affiliated with institutions predominantly in high-income countries”, according to a preprint published on SocArXiv. Last year’s reading list had a CSI of 0.8803.

Mark Skopec, a research assistant in Imperial’s department of primary care and public health, said that they were “not trying to say there is an ideal CSI or that if your course has a CSI of 0.96, then it is a bad course, but to show people where the information that they use comes from, as a way to start a conversation about this”.

“[We want people] to think: is this [score] representative of the information that is out there? And could this be changed in any way?” he said.

Currently, the program can only analyse journal articles, but the team is looking to expand it to cover books, too. Imperial plans to make it available to other courses soon.

Kehinde Andrews, professor of black studies at Birmingham City University, said that while the tool was interesting, he would caution against looking at decolonisation as simply a “Global North/South divide”.

“Just adding up numbers is far too simplistic, as it doesn’t include how those reading lists are used. There are also the much broader issues of pedagogy, staffing etc that may be more important when thinking about the university,” he said.

The Imperial team say that they “do not intend quantitative data to supplant the primacy of experience and theory in decolonisation, rather to support it”.

“The program allows us to analyse course reading lists quickly and efficiently, but the data alone doesn’t mean anything until it is interpreted by the course leaders, the students or experts in the field,” said Robyn Price, Imperial’s bibliometrics and indicators manager.

anna.mckie@timeshighereducation.com

Postscript

Print headline: Computer says: too Eurocentric


Reader's comments (8)

A somewhat meaningless metric. You can only add to your reading list things that have actually been written, so if authors in the global south/from poorer nations don't write about your subject - or indeed if authors of the appropriate heritage have gone to work in the global north - you'll appear less 'decolonised' than you actually are. It also leaves no scope for academic excellence - do you really want to select what you recommend to your students based on ethnicity or global origin? Of course not, you want them to access the best writings in your domain!
Totally agree. This was being written as I composed my comment below.
Author response - This method is not intended, and indeed cannot tell you the global output of publications in your field or the 'academic excellence' of them. There are already many bibliographic databases that provide those things. Our study has produced data indicating the geographic distribution of authors chosen for inclusion on curricula. This is of interest to some course leaders and students who otherwise have no reasonable method to access this data aggregated at a module or course level, and can use it as an evidence-based insight to encourage discussions and reflection.
In STEM, it is inevitable that most of the resources exist in the most developed countries. Ditto the discoveries of the past. Such problems may (or may not) bedevil the Arts and Humanities but the same is not true of all subjects.
This is not the first such tool, as claimed in the article. https://jlsumner.shinyapps.io/syllabustool/ https://doi.org/10.1017/S1049096517002074
The preprint cites the Gender Balance Assessment Tool, which performs a different analysis (name-based gender and ethnicity prediction) to that of our study (distribution of declared author country and country GNI per capita).
Not sure how it would work somewhere like Australia, which is heavily colonised but in the southern hemisphere.
Thank you, this is an important point. The preprint acknowledges that the data can only have meaning if interpreted within the historical and demographic context of the institution and country, so it would be really interesting to apply it to a reading list from an Australian institution.