Last week, a debate flared up on Twitter about working hours in academia, including the claim that it is irrational to work more than 40 hours per week because output actually goes down beyond that point. I do not believe this claim.
A few starting notes:
- I am happy to be contradicted with data, but too often I see this issue being discussed with links to web articles citing other web articles, finally citing studies which suffer from the issues listed below.
- Maximum output at work is traded off against other valid personal goals. It is fine to argue that you prefer to produce less and spend more time with family or have more hobbies. Seriously, it’s a good argument. I just want people to make it instead of claiming a free lunch.
- I’m using mIF (milli Impact Factor points) as the unit of academic output below. This is a joke. If you want to talk about the impact factor, we can talk about it, but this is not what this post is about.
- I agree that presenteeism (i.e., measuring or valuing how long people are present at a job) is an idiotic system (or cultural trait). It is an even worse system than measuring impact factor points. Again, this is not what this post is about.
Mostly, I think that every time a scientist says “Research shows…” when the research shows no such thing, or uses it to boost their political/personal beliefs, anti-science activists deserve a point.
Measurement is hard
People lie about how much they work. They lie to conform to expectations and lies go in multiple directions. Thus, even though I do think that Americans (on average) work more than Europeans, I also think that Americans exaggerate how much they work and some workaholic Europeans exaggerate how much time they take off.
Cross-country studies will also often impute the legal work hours to workers in different countries even though these may not correspond to hours worked (officially, I work less now than during my PhD, but I actually work way more now).
Even well-meaning self-reports are terribly inaccurate. People count time spent at work even though much of it goes to non-productive activities. It can even be hard to define the boundary between work and non-work. There is obvious work (me, writing a rebuttal letter to reviewers). There is obvious non-work (me, spending 30 minutes in the morning reading the newspaper online at my work desk). But there is a vast grey zone: me, reading about Haskell bioinformatics libraries, or me, writing a utility package in my free time that I end up using intensively at work. Often the obviously productive work ends up using ideas from the not-so-obviously productive bits.
This should lead us down the path of distrusting empirical studies. Not completely throwing them out the window, but being careful before claiming that “research shows …”.
It should also lead us to distrust the anecdotal reports of people who say they work 60 hours per week or those who have impressive CVs and claim to work only 35 hours and take long holidays.
What do you mean by productivity?
Often a game is played in these discussions with the word productivity, as it is not always clear whether it refers to output per hour or output per week. For the moment, let’s be strict and use it in the output-per-hour sense.
Marginal productivity starts going down well before it turns negative. Thus, if you are optimizing for average productivity, you end up at a lower number of hours than if you are optimizing for total output. Here is what I mean (see an earlier post on the shape of this curve):
Let’s say that academics produce impact factor points (the example carries over to most other knowledge work). Because there are fixed time costs in academia (as in almost all knowledge work), the first hours of the week produce 0 mIF. It will depend on the exact situation, but 10 hours a week can easily be spent on maintenance work (up to 20 or 30 if one is not careful). Then, the very productive hours produce 15 mIF/hour. As more hours are worked, one becomes tired, and the additional hours produce less than 15 mIF/hour (thus, marginal productivity is diminishing). Taken to the extreme, our academic becomes so tired that he cannot produce anything at all or even produces negative mIF (for example, by disrupting other people’s projects).
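To make this concrete, here is a minimal sketch of such a curve in Python. The overhead, peak rate, and decay numbers are assumptions invented for illustration, not estimates from any study:

```python
import numpy as np

OVERHEAD = 10     # weekly fixed-cost hours that produce 0 mIF (assumed)
PEAK_RATE = 15.0  # mIF/hour for the first productive hours (assumed)
DECAY = 0.02      # per-hour decline in marginal productivity (assumed)

def marginal_mif(hour):
    """mIF produced by the n-th hour worked in a week (can go negative)."""
    if hour <= OVERHEAD:
        return 0.0
    return PEAK_RATE * (1.0 - DECAY * (hour - OVERHEAD))

hours = np.arange(1, 81)
total = np.cumsum([marginal_mif(h) for h in hours])  # output per week
average = total / hours                              # output per hour

print("output/hour is maximized at", hours[average.argmax()], "hours/week")
print("total output is maximized at", hours[total.argmax()], "hours/week")
```

With these made-up numbers, output per hour peaks in the low 30s while total output keeps growing until close to 60 hours/week. The exact values are meaningless; the point is that the two optima are far apart.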
If you are hiring people by the hour, you want them to work to the point where output/hour is optimized, which is the traditional justification for why companies should have shorter work weeks. However, this can be well below the point at which output is maximal.
Looking at some empirical work, it does seem that while the point of maximum output per hour is at about 40 hours per week, the point of maximum total output is above 50 hours/week.
Thus, if you are managing a widget factory, you may not want your workers working more than 40-45 hours for your own selfish reasons. But this does not mean that this is the point of maximum output.
Anecdotally, it does seem that many people work 40 hours at their main jobs and still take on either a second, lower-paying job or non-leisure cost-saving activities (with lower implied wages than their main job, although these are untaxed).
Averages hide variances
Again, work that is directed at managers of widget factories is not necessarily a guide to your behaviour. Perhaps some workers peak (in their average productivity) at 30 hours, others at 40, still others at 50. If you are managing a group, go for the average (look at the spread in the empirical work above).
Maybe this is not where your maximum is. Maybe, too, one can train to increase one’s maximum. Maybe your maximum this week is at 20 hours and the next week at 60.
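A minimal simulation of this point, with an invented spread of individual peaks:

```python
import numpy as np

rng = np.random.default_rng(42)
# Each worker's personal peak (hours/week of maximum average productivity);
# the mean and the spread are invented for illustration.
peaks = rng.normal(loc=40, scale=8, size=1000)

group_policy = peaks.mean()  # the one-size-fits-all work week a manager would pick
misfit = np.abs(peaks - group_policy) > 5

print(f"group policy: {group_policy:.1f} hours/week")
print(f"workers whose own peak is >5h from the policy: {misfit.mean():.0%}")
```

With this made-up spread, over half the workers end up with a personal optimum more than five hours away from the group policy.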
Also, as I write above, many people take either formal second job or undertake secondary cost-saving activities. Often these can be more flexibly scheduled than their main jobs. For example, someone who regularly does a longer trip to a cheaper grocery store to save a few bucks may skip that “second job” in the weeks where they are tired or have good leisure alternatives. Or they may only get around to fixing their own washing machine when they have a few hours without any better things to do.
As free-range knowledge workers, we get all of this flexibility already (remember the old joke that in academia you can work whichever 80 hours of the week you want). Perhaps this already alleviates many of the drawbacks of going above the widget-maker’s optimum. I certainly know that I enjoy the flexibility and that, while, on average, I do work longer weeks, this is not true of every single week.
In a competition, payoffs can be heavily non-linear
It remains a great injustice that even though I can run 100m in just twice as much time as Usain Bolt, I cannot get even a tenth of his pay.
Sports are the extreme case as they are almost pure competition, but they do make the point clear: in competitive fields, just a bit more output can make a huge difference. In science, getting a project finished in 10 months instead of 11 months may be the difference between getting or not getting scooped. A paper that is just slightly better may get accepted while one that neglected that one extra experiment does not. A grant that scores two percentage points higher gets funding. And so on.
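A minimal sketch of this sort of payoff structure, with an invented grant cutoff and award size:

```python
def grant_payoff(score, cutoff=89.0, award=250_000):
    """Winner-take-most: grants scoring at or above the cutoff percentile
    are funded, the rest get nothing (both numbers are invented)."""
    return award if score >= cutoff else 0

for score in (88.0, 90.0):
    print(f"score {score}: payoff {grant_payoff(score):,}")
```

A two-point difference in the score, which may correspond to only a small difference in effort, flips the payoff from nothing to the full award.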
Unfortunately, in most cases, we cannot know what would have happened if we had just added that one extra experiment to the paper or submitted the grant without that bit of preliminary data we collected just before submission. But saying that we can never know is an epistemological argument; the reality remains that a little extra effort can have a big payoff.
Conclusion
I keep reading/hearing this claim that “research shows that you shouldn’t work as much” or that “research shows that 40 hours per week is the best”. It would be good if it were true: it would be a free lunch. But I just do not see it in the research. What I often see is a muddling of the term “productivity” that fails to distinguish between maximum average output per hour and maximum total output per week.
I am happy to be corrected with the right citations, but do make sure that they address the points above.