I was an academic for 10 years. Here’s how I escaped.

This is not a “how to”. I’m an over-privileged white guy lucky enough to have researched and taught in an area that translates fairly well to industry practices. Nonetheless, this is my story. Maybe it can help you think through your own.

I never intended to be an academic.

I studied computer science as an undergrad in Brisbane. It was the late years of last century. I learned to program in Smalltalk (which no-one uses), Ada (which the US Military uses) and C (which everyone uses). Towards the end of my degree I got interested in human-computer interaction (HCI), artificial intelligence and multi-dimensional databases (together these are the design and engineering roots of any “uber-for-X” startup). My honours supervisor said to me that I was “good at this stuff” and should stick around for a PhD. I declined because I wanted to go and earn lots of money being an IT consultant.

After graduating I got a job at a Very Large Consulting Firm. It was 2001 and the dot-com bubble was bursting. There was no work. I spent ten months “on the bench”. Eventually I learned a little ABAP so I could do something. The firm, in their wisdom, decided I would best be used as a programming problem solver for consultants who were actually working on projects. I quit instead.

I ended up in Canberra and for something to do I enrolled in a PhD. (It was now the post-bubble crash, so there was no work.) I was lucky enough to get a scholarship for my PhD and to be newly married. My wife was working her way up the federal public service ladder, so between my scholarship and her salary I was very fortunate to be able to treat my PhD as a 9-5 job. I imagined I would be doing a traditional design-build-test computer science/HCI PhD. My primary supervisor was a speech recognition engineer so I made that my application focus.

As I dug into how people talk to computers it became clear that my research needed more than engineering: it needed people. So I picked up a sociologist as a co-supervisor. She was a Latourian, more or less, and insisted that I teach sociology of science with her so that I could understand everything that Latour, Callon, Akrich, Suchman and the rest were on about. I learned to talk to end-users of products and software, and to analyse what they told me with reference to theory and in a “grounded” way. My PhD thesis ended up being more sociology than computer science, or even HCI.

Eventually it was time to get a post-PhD job and we were a bit over the cold of Canberra. A job as a research fellow came up at a university in Brisbane in a design school. The ad used synonyms for what I thought I did, so I applied. I imagine I was up against people with actual design backgrounds but I was the successful applicant.

And suddenly I was an academic. Specifically, I was a research fellow, working with industrial design researchers. I used to say to people that I had all the privileges of being a lecturer without having to actually lecture.

With a background in understanding how people talk to computers, it was natural that the first research I did in my academic job was to try to find out how to determine expertise, or the lack of it, in nurses bandaging people’s legs. (That’s a joke. It took me about six months of floundering about to trust that this was my job and my professor wanted me to do it my way, rather than my interpretation of her way.)

That project led to others. I was always on someone else’s “soft money” so the projects were varied. I was very fortunate that the contracts were long. My colleagues and I tried to figure out how doctors would use stethoscopes over video conference and what those stethoscopes should be like. My professor and a bunch of others won a huge grant about airports so I followed people around airports for four years and helped five other people get their PhDs in following people around airports, too.

Somewhere in there the soft money ran out. I was still on a contract so the Dean at the time decided I needed to teach to be able to pay my way. I’m organised (through sheer force of will — I’m actually an inveterate procrastinator) so I was able to cope with the management side of teaching. And I’d been tutoring since 1999 so I was fairly accustomed to being in front of a class. It was not as difficult a transition as it could have been.

After teaching for a while, and still researching, I applied for what was basically my current job but in a permanent capacity. I didn’t get it (neither did anyone else who applied). When I asked why, I was told I was both over-experienced and under-performing. And so I went full shields down.

I still had a contract. I still had undergrad teaching commitments. I still had PhD students. And I started looking to get into industry.

I had maintained contact with a bunch of people who I had tutored with during my PhD. They had become fairly senior in the user experience (UX) industry in Australia. Some of them had started a conference, UX Australia, that had become highly respected. Because I like to hang out with them, I’d presented at it a few times. Because I’d been on-stage, a lot of industry people knew me, or at least knew of me. That made it vastly easier to start looking for jobs because I could see what it was they did and find parallels in my experience so I could explain it to them.

After near enough to ten years in basically the same job my resume looked a little thin compared to industry people who switch jobs every 1-3 years. But, if I presented my experience in a skills-based way, suddenly I had people paying attention. I was getting interviews at least. I could have moved to Sydney or Melbourne and got work but my wife and I, and our kids, wanted to stay in Brisbane.

In 2015 when I was presenting at UX Australia I took some almost-finished PhD students along and was shopping them around to people I knew. I found myself waiting for a session to finish, making small talk with someone from a UX consulting firm that was fairly established in Sydney and Melbourne and had just opened a Brisbane office. I asked if they’d take me on. Remarkably, they didn’t immediately say that was ridiculous.

Eighteen months of coffee meetings and a Skype interview later, I was in my head of school’s office saying that I’d had an offer and I wanted to quit.

Universities tend to be reluctant to let lecturers go too quickly. In my case there were 80 undergrads, five tutors and a couple of research students I was responsible for. Technically I was required to give four months notice. The firm wanted me to start in two weeks. I negotiated two weeks left full time and then six weeks of part-time at both the university and the firm.

And that’s how I escaped.

The New Liberal Arts

I like the idea of The Liberal Arts. I sometimes say that I studied Computer Science but accidentally ended up with a liberal arts education. Design is supposed to be a new liberal art, too.

The great blog Snarkmarket released a book eight years ago called “The New Liberal Arts”. It’s kind of like a speculative course book for a fictional university. You can get it as a free PDF or .prc file (which Kindles can read).

I read it the other day. It’s short. Here are some parts I liked.

Attention Economics, by Andrew Fitzgerald

Fitzgerald says that in the new world of scarce attention we’ll need some new skills:

  • multitasking (doing more than one thing at once)
  • ambient consumption (figuring out how to cope with the stream)
  • focus (how to stop multitasking)
  • stillness (how to stop consuming from the firehose and turn it all off)

If you follow any of the social media discussion on task management and mindfulness, you’ll see these broader concepts sneaking in.

Food, by Gavin Craig and Theresa Mlinarcik

Food is a great lens through which to view the world, according to Craig and Mlinarcik, because:

Food is intrinsically connected to nature, ecology, politics, pop culture, family values, and economics.

It’s hard to disagree. I think this might be why design students (in my experience) often end up doing projects about food.

Micropolitics, by Matt Thompson

Micropolitics explores the idea that:

So much of the texture of everyday life is hashed out in obscure municipal backchannels, by small groups of engaged citizens getting together on weekday evenings. The buildings you see every day, the restaurants you dine at, the closing time of your neighborhood bar, the bus routes to and from your home—these things are the way they are because of a complex system of professional networks and planning meetings that few have the know-how to navigate.

You can also see this idea of small, kind of hidden, aspects of everyday life having outsized influence in books like Dan Hill’s Dark Matter and Trojan Horses.

Photography, by Tim Carmody

I am a very bad photographer, but I’m fascinated by it. Carmody says that photography is basically the way that we write the world today. Snap (and Facebook) think of themselves as camera companies, which is partly right and a good enough reason to think harder about what photography means.

If I went back to uni, I’d study photography, I think.

Translation, by Rachel Leow

This is fascinating:

Every student would declare at least two languages: their native tongue and one or more languages of their choosing, however firm or tenuous their grasp of them. Seminar groups would consist of students who declared the same two languages, so that discussions could take place in two mutually intelligible languages, at varying levels of ability. These are the groups they’d work in, communicating in online forums and discussion groups, live chat, and video conferences.


Moving past design as a service

Julie Zhuo’s post looking back on what she learned in 2016 was widely shared in my social media designery bubble. I thought this was a good insight:

you will always be treated as a service if you assume your role is to wait around for others to come to you with some specific problem to solve. The path to getting out of being a service is to have an opinion about which problems are worth solving and convincing other people of that.

This was good too:

Being a designer is like having a superpower that allows you to show other people the future.

With great power comes great responsibility, of course.

Scientific concepts you should know about

Every year Edge.org asks a bunch of smart people a question. This year they asked “what scientific term or concept ought to be more widely known?”. There were 206 responses; these are my favourites.

The Premortem, suggested by Richard Thaler

Assume we are at some time in the future when the plan has been implemented, and the outcome was a disaster. Write a brief history of that disaster.

Affordances, suggested by Daniel Dennett

The concept of Affordances was developed by JJ Gibson in the 1970s and popularised in design by Donald Norman in the late 80s. Dennett says:

The basic idea is that the perceptual systems of any organism are designed to “pick up” the information that is relevant to its survival and ignore the rest. The relevant information is about opportunities “afforded” by the furnishings of the world: holes afford hiding in, cups afford drinking out of, trees afford climbing (if you’re a child or a monkey or a bear, but not a lion or a rabbit), and so forth.

But Dennett goes further, suggesting that the concept of Affordances helps us try to figure out what consciousness is. I’m still thinking about this one.

The Law of Small Numbers, suggested by Adam Alter

Alter tells a good story about what happens when you try to think about outcomes based on limited information.

The solution is to pay attention not just to the pattern of data, but also to how much data you have. Small samples aren’t just limited in value; they can be counterproductive because the stories they tell are often misleading.
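Alter’s point can be made concrete with a tiny simulation (a sketch of mine, not Alter’s; the function name and the 70% “looks biased” threshold are my own choices):

```python
import random

random.seed(42)

def extreme_rate(sample_size, trials=10_000, threshold=0.7):
    """Fraction of trials in which a fair coin *looks* biased,
    i.e. the proportion of heads is >= threshold or <= 1 - threshold."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        proportion = heads / sample_size
        if proportion >= threshold or proportion <= 1 - threshold:
            extreme += 1
    return extreme / trials

# Small samples "discover" a biased coin far more often than large ones:
print(extreme_rate(10))    # roughly 0.34: a third of small samples mislead
print(extreme_rate(1000))  # essentially 0: the "pattern" vanishes with data
```

With ten flips, about a third of experiments show a lopsided result from a perfectly fair coin; with a thousand flips, almost none do. That is the law of small numbers in one picture: the less data you have, the more confident-looking patterns you will find by accident.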

Case-Based Reasoning, suggested by Roger Schank

Reminding, based on the examination of an internal library of cases, is what enables learning and is the basis of intelligence. In order to get reminded of relevant prior cases, we create those cases subconsciously by thinking about them and telling someone about them. Then, again subconsciously, we label the previously experienced cases in some way.

The task for the motivated reader is obviously to find a way to have a single coherent concept that encompasses both case-based reasoning and the need for a lot of data presupposed by the law of small numbers.

Embodied Thinking, suggested by Barbara Tversky

We are bodies moving in space. You approach a circle of friends, the circle widens to embrace you. I smile or wince and you feel my joy or my pain, perhaps smiling or wincing with me. Our most noble aspirations and emotions, and our most base, crave embodiment, actions of bodies in space, close or distant.

Decentering, suggested by Gary Klein

Decentering is not about empathy—intuiting how others might be feeling. Rather, it is about intuiting what others are thinking. It is about imagining what is going through another person’s mind. It is about getting inside someone else’s head.

Class Breaks, suggested by Bruce Schneier

Something about Class Breaks seems relevant to both case-based reasoning and the law of small numbers. But I haven’t figured it out yet.

Picking a mechanical door lock requires both skill and time. Each lock is a new job, and success at one lock doesn’t guarantee success with another of the same design. Electronic door locks, like the ones you now find in hotel rooms, have different vulnerabilities. An attacker can find a flaw in the design that allows him to create a key card that opens every door. If he publishes his attack software, not just the attacker, but anyone can now open every lock. And if those locks are connected to the Internet, attackers could potentially open door locks remotely—they could open every door lock remotely at the same time. That’s a class break.


My pick is “epistemology”, which tends to get defined as “theory of knowledge”. But I would tend to use it more to think about what counts as knowledge. For example, if your epistemology favours lots of data you probably wouldn’t be persuaded by an example based on case-based reasoning.