Hindsight’s hoard: What academics most wish they had known earlier

The acquisition of wisdom about life and career can be just as long and challenging a journey as any research project. Luckily, many wheels have already been invented. Here, seven academics offer the pieces of advice that could have given them a head start 

November 12, 2020
[Image: DeLorean car on the steps of a building. Source: Alamy/iStock montage]

Don’t defer life too long

When I was a girl, my mother would say to me, “When you grow up, you can be anything you want.” It was a clichéd platitude, offered in complete sincerity, but I realise now that it wasn’t true. Because, at 33, I’m all grown up and what I want to be is a person with a full-time job.

In the past three years, there have been three creative writing vacancies advertised nationally in Australia. That’s an overwhelming average of one job per year. I applied for two of the positions and, on both occasions, managed to make my way into The Final Four.

After my first interview, I was hopeful. I breezed through my presentation, blitzed the informal conversation, and even managed to crack a joke – which, to my surprise, the emeritus professor on the panel appeared to find funny (although I discovered a few weeks later that he’s completely deaf in one ear). Afterwards, the chair of the panel called to say I was “the new style and future of academia”.

“You interviewed better than the successful candidate,” she said, “but he was a bit more advanced. Let’s talk again when the time is right.”

I completely flunked the other interview, but was confident about my application – a 30-page manifesto that clearly responded to the selection criteria, complete with research awards and supervisions, teaching prizes and glowing testimonials. All the things. However, after six months had passed – along with the commencement date of the job itself – I realised I wasn’t The One.

When Job #3 finally came around, I was in the middle of marking 100 essays on The Uncanny and I ran out of time to apply.

When I recently met with my mentor – a woman only a few years ahead of me in age but light years ahead in terms of success and, more importantly, kindness – she offered some advice. As we chatted over Zoom with our babies on our knees – her one-year-old son, sick with chickenpox, my one-year-old puppy, sick on chicken treats – she said in passing, and without irony, “One more thing: don’t put your life on hold for work.”

Her counsel was offered with the same conviction my mum had offered hers all those years ago. And it resonates.

It resonates because when my girlfriend asks gently, “When will we have a baby?”, I say, with more optimism than I feel, “When I have a full-time job.” And when she asks, “When will we get married?” and “When will we take a holiday?” and “When will we buy a house?”, my response is always the same: “When I have a full-time job.” Because a full-time job is paid maternity leave and annual holidays; it’s the ability to afford a mortgage and safe childcare. It’s also – although Covid is changing things – a form of sponsorship in the Academic Hunger Games.

[Image: Family at a beach near the Tardis. Source: Alamy/Getty montage]

But my mentor helped me realise that there’s no such thing as the “right” time to pursue what you want – which is comforting because it means there’s no such thing as the “wrong” time either. When you’re a young woman in academia, and you want both a career and a family, you’re competing with two clocks. There’s the ordinary clock that ticks above you as you teach, reminding you that there will come a time when you’ll be considered too old for a full-time job. And there’s the internal clock that ticks so loudly it wakes you up at night. Neither clock stops, and you can’t tune them out or synchronise them.

So, in hindsight, perhaps I should have made a more concerted effort to stop deferring life for work. I should have taken more annual leave and left the marking at home on our anniversary. I should have spent more time writing and less time perfecting my lecture slides. I shouldn’t have worried about rejection because applying for a lectureship is like playing poker – you can do everything right and still lose.

Most of all, I should have stopped searching my horoscope on Google and found myself a mentor sooner.

Kate Cantrell teaches creative writing and literature at the University of Southern Queensland. 


[Image: Doctor Who with schoolchildren. Source: Getty]

Write for the reader

Career development as a scientist typically means a narrowing of expertise, from knowing a little about a lot to becoming incredibly knowledgeable about a few specific areas.

A reality inherent in this specialisation is that as your own familiarity and confidence with a topic increase, so does your assumption that others also appreciate and comprehend your particular niche. Recognising this fallacy is key to success given that peer review is the means by which you progress.

In my early career, I didn’t appreciate the mind-numbing effect of my over-zealous use of acronyms. I also thought the data spoke for themselves. Yet presentation can make an enormous difference, whether in a talk, a grant proposal or a manuscript.

It was only when I began to review manuscripts and sit on peer review panels 35 years ago that I realised that my own early failures were more the rule than the exception. Too often, very good grant applications are packed with scientific detail or written at such density that they are impermeable to anyone less familiar with the topic than the applicant – which usually means everyone on the review panel or study section.

Manuscript reviewers and grant adjudicators may be fellow researchers working in a generally similar area, but, inevitably, they have their own precise foci of understanding. It is the responsibility of the author/applicant to communicate the meaning and context of their study in language accessible to the reader.

Another thing I have learned is that many of the best researchers are not only superb scientists but engaging storytellers. That ability doesn’t necessarily come naturally, especially for non-native speakers, but learning to weave a narrative by laying stepping stones to clear a path for the reader is a valuable skill. Like charting a challenging countryside hike, you have to ensure there are a few bridges to help navigate leaps of logic or twists in the terrain.

Why is all this not obvious to people? I think the problem is that the feedback given to someone who presents their precious ideas or data as an impenetrable lump does not typically alert them to the issue, because reviewers don’t like to admit that they struggled to follow the logic as presented.

If a manuscript or proposal is opaque, the reviewer will struggle and lose enthusiasm. Unless that reviewer is a champion of your field, your chances of approval are slim. This isn’t necessarily fair, but it’s the reality. A relatively easy solution is to ensure you have the work read by colleagues – and the further they are from the topic, the better.

Of course, bias against those unable to communicate with lucidity is not the most egregious of the biases that can blight peer review. But while many reviewers are working hard to overcome racial and gender prejudice, their preference for a good yarn is likely to remain. Easing a path to appreciation and evaluation is likely the best way to ensure your work gets its just reward.

Jim Woodgett is director of research at the Lunenfeld-Tanenbaum Research Institute, Toronto.



[Image: George Washington’s face superimposed on dancers. Source: Getty/iStock montage]

Talk about the money

In 2003, I received my first full-time academic job offer – a limited-term appointment with a 4/4 teaching load (four courses per semester). I signed the contract without any discussion of compensation. During my eight years of graduate school, no one had ever mentioned money, contracts or negotiations. The oversight was costly.

Two years later, my faculty union distributed a salary survey. I couldn’t complete the survey because my base salary was $20,000 below the lowest salary range listed. I decided to pay a visit to the faculty union president. Nearly two decades later, I still recall her audible gasp when I disclosed my salary.

An investigation revealed that I wasn’t alone. Another young, female faculty member was also being shortchanged. The faculty union also discovered that we were making less than half the salary of several older male faculty members without PhDs or publication records.

In the end, my short-changed colleague and I were lucky. The union fought on our behalf, and within three months our salaries got a $20,000 bump. In my current workplace – a private US college without a full-time faculty union – I would have lived with this pay differential for life. After all, no one ever talks about salaries, and without public disclosure, there is no way to find out. So the advice I wish I had had at the start of my career is obvious: I wish someone had bothered to talk to me about money and told me how to negotiate a decent contract.

I am now on the other side of the table. As a department chair, I don’t do salary negotiations. I do have conversations about salary expectations, but I don’t just ask. I also tell job candidates what they can reasonably expect to make at my college. If someone asks – but few do – I am also candid about how far their salary will stretch. Since it is New York City, I let candidates know that without a trust fund or rich spouse, they will likely never live within walking distance of our downtown Manhattan campus.

I realise that for my wealthier colleagues, this may sound a bit crass. But herein lies the real problem with academics. The professorship’s ranks may lean to the political left, but they contain an overrepresentation of people privileged enough to never feel a burning need to talk about income. And because talking about money remains taboo, so does talking about privilege and people’s lived material realities.

This silence is precisely what has permitted the profession to become an increasingly untenable career choice for anyone who isn’t already wealthy.

Kate Eichhorn is associate professor and chair of culture and media studies at The New School, New York City.


[Image: Edge Hill University and a time machine. Source: Getty/Alamy montage]

To thine own self be true

Christopher Price was a UK Member of Parliament and chair of the Education Select Committee who went on to head up Leeds Polytechnic as it converted into Leeds Metropolitan University in 1992. I met him just once, as I was taking over as director of Edge Hill College of Higher Education at about that time. He offered just one piece of advice: “Never lose your personality.”

After growing up in a council house, I became the first in my family to attend a university – but it was an unfashionable one. Next, I did a PhD in a polytechnic – albeit benefiting from periodic engagements with researchers at the universities of Oxford and Warwick. Then I found myself in an interview for a postdoctoral position at the University of Cambridge.

I can still smell the furniture polish. I can still see the sweep of the staircase, feel the thickness of the carpet. It was wonderful. But it just was not me. I needed to be in a place where I had to fight. Two days later, I took a lecturing job at Edge Hill.

Edge Hill had previously been saved by the then education secretary, Shirley Williams, but it was now listed for closure by the local authority. We convinced the council to change its mind, then fought for nine years to get our own degree-awarding powers, followed by university title and eligibility for research funding.

I didn’t meet Christopher Price until a good decade after my Cambridge interview, but it was only after he had spoken to me that I realised what I had felt at the time – and why. Working alongside colleagues from Warwick and Oxford during my PhD had been a wonderful experience, but, for me, being in such places had always felt like an out-of-body experience.

I’ve now worked for 40 years in a place where I feel comfortable and where I know I can make a difference. Edge Hill is a university where 70 per cent of students receive full maintenance loans and most are also the first in their family to experience higher education. The joy of their success is palpable and I feel it very personally. Education changes lives and transforms life chances. Only much later did I realise that I had been part of the Robbins generation, an early beneficiary of the broadening of higher education. I was part of the first cohort on my undergraduate course, and I was the first doctoral student in that polytechnic department.

My job, in these incredibly challenging times, is to give others the chance to grasp the opportunities that come their way, to know themselves and to be true to their principles and values.

There is no more rewarding career than one that demands no distinction between your work persona and your out-of-work personality – or the best bits of it, at least. Let people see you as a person, with all your strengths and flaws, not as a job title or role. And wherever you are, whether it be an Edge Hill or a Cambridge, recognise why you’re there.

John Cater is vice-chancellor of Edge Hill University.



A patron is not a prerequisite

Academic Twitter is filled with laments about how much your ability to establish yourself as an academic relies on your being a member of the requisite networks of scholarship and patronage. Indeed, as an international student from India who obtained an MA and PhD in the UK, I already knew this well. What I didn’t know was how much career progress relies on your continued proximity to particular forms of power and privilege.

That was particularly true of the US, where my field of South Asian studies was firmly grounded in the work of “star” academics from elite schools, alongside their coterie of students and sycophants. Indeed, it would be fair to say that the field was guarded by such figures – and, ironically, scholars of Indian origin often engaged in the fiercest forms of gatekeeping. It was clear from conferences, journal publishing and job openings that the US academy was almost impossible to penetrate if you were not studying at an elite university there, under the wings of one of these established patrons – or some other eminent academic “parent”.

At no point did my own lack of a patron matter as much as when it was time to publish my first book. My British doctoral training included little guidance on how to convert a dissertation into a book or where to publish it (we were meant to figure these things out on our own). So I relied on peers, especially those who had just published their first monographs in South Asian studies.

Even as they readily shared advice and portfolios of book proposals, sample material and letters to editors, they warned me about the “two-pile problem”. If your proposal was mediated by someone influential in your field, it would land on Pile A, under the editor’s nose. If it wasn’t, it was destined for Pile B – also known as the straight-to-recycling pile. I realised only much later in my career that cold calling the editor only puts you further down Pile B.

But I also learned something else: that there are, in fact, ways to progress and even thrive in the academy that are not reliant on patronage. Despite my lack of influential connections, my applications – for jobs, grants and books – found their way to editors and readers who took my work seriously. Both the tenured jobs I obtained – first in the UK and then in South Africa – were in departments where I knew no one, but where colleagues found my work interesting and thought I’d fit.

Alongside colleagues and collaborators, I have built my own networks – not of patronage, obligation and debt, but of equality, reciprocity and care. While an academic life without an academic parent can seem impossible, transcending the system of privilege is truly liberating.

Srila Roy is associate professor of sociology at the University of the Witwatersrand.


[Image: Guy Pearce in “The Time Machine” (movie still). Source: Alamy]

People can’t help believing nonsense

It started with a simple academic debate about how pharmaceutical drugs get into cells.

The evidence here was (and is) so overwhelmingly on one side – the one I profess, obviously – that I could not understand how the other side could even try, with a straight face, to dissent. And this was not the first time I had experienced such intransigence in a scientific dispute.

The question that sits at the heart of most scholarship, whatever the field, is why we believe particular things to be “true”. But there is also a subtly different question, to which academics probably pay scant specific attention: “Why do people believe in things despite all evidence that they are false?”

There are many answers. Faith, by definition, transcends logic. The topical phenomenon of “fake news” presents an alternative reality that political partisans often prefer – or that they are exposed to so often that they come to regard it as the true reality. But what about in domains, such as science, in which ostensibly true facts are freely available and in which logic is supposed to play a dominant role?

Some people have vested intellectual or financial interests, of course, in particular theories, while others may simply not wish to admit that they are or were wrong. But it runs deeper than this.

Researching human belief about five or six years ago led to the watershed moment when I discovered that the main reason that people – all people – are credulous is simply because evolution selected them to be so. If something looks like a sabre-toothed tiger, it may or may not be one, but nature selects those who assume that it is and run like hell without bothering to check.

This is beautifully explained in much more detail in Daniel Kahneman’s wonderful book, Thinking, Fast and Slow. The Nobel prizewinning psychologist and economist designates reacting to an instant impression, such as seeing a possible sabre-toothed tiger, as “System 1” (“fast”) thinking; evolution simply has not had time to let “System 2” (slower, logical) thinking take over as much as we might wish.

Ben Goldacre encapsulates the stark consequence in his book, Bad Science: “You cannot reason people out of positions they didn’t reason themselves into.”

This, in many ways, is the thing I wish I had known when I was younger. Much fruitless time – in scientific seminars, as well as life in general – might have been saved by not trying to reason people out of unreasoned positions. But at least exploring the reasons for all that wasted time made for an interesting research project.

Douglas B. Kell is research chair in systems biology at the University of Liverpool. His book, Belief: The Baggage Behind Our Being, co-authored with Rick Welch, can be freely downloaded at http://osf.io/pnxcs/.



Read the newspaper

The best piece of advice that I was given as a student and a junior faculty member is advice that I saw fit to ignore for decades. Today, I offer that same wisdom to my own students – and am also ignored for my troubles.

That advice is simple: read the newspaper. Read it daily; better still, read two – or more. Only when I began routinely reading national news sources did I begin my extracurricular education, which has turned out to be just as critical as my scientific training to my success in academia.

From newspapers, I learned that some US adults reject the most basic scientific principles and often substitute magical thinking for facts. From newspapers, I know that healthcare – my field – is rapidly changing, as are many professions in medicine and pharmacy. And, from newspapers, I know that students are increasingly disengaged from education and that if we as educators want to address this, we should ask what young adults want from a university experience.

I can easily extrapolate the benefits of an education by the press to my students, who want to be physicians, pharmacists and chemical engineers: they, too, need to understand how their desired professions are changing and how best they can fit themselves into that future. However, I cannot convince anyone that this is crucial.

When I give them my words of wisdom, I get polite nods, smiles and gestures of agreement. But each week, when I ask, “What was in the paper about your field?”, I get silence, stares at the floor and close examinations of fingernails.

The irony is that from those same newspapers I cannot get them to read, I learned that this generation of students’ unhealthy addiction to their phones and tablets is having a very worrying side-effect: a rapid diminution in their ability to distinguish sensational clickbait from genuine reporting – which, of course, is all available online, even if it is sometimes behind a paywall.

Even back in 2007, a survey by Harvard University’s Kennedy School of Government found that only 16 per cent of Americans between the ages of 18 and 30 read a newspaper daily. Among those between 12 and 17, the figure fell to 9 per cent; who knows how low it is for the generation after that, who are today’s college students. This situation is bad news. Social media posts cannot replace the hard work of the well-trained reporter, so I worry about the increasing influence of bad information on my students’ biases and belief systems.

From the paper, too, I learn that young adults despise anything old-fashioned. Perhaps that explains why I am subjected by my students to the “Thank you. Next” treatment. Even so, I will go on recommending daily news reading to those who still care to listen. After all, the news is history in the making – and a newspaper is its first edition. Even if history isn’t your field, isn’t that at least worthy of a little of your precious attention?

Jennifer Schnellmann is an associate professor in the College of Pharmacy and associate director of undergraduate studies in pharmaceutical sciences at the University of Arizona.

POSTSCRIPT:

Print headline: What I wish I had known earlier
