I was approached by a client to research the concept of 20% time for engineers, and they graciously agreed to let me share my results. Because this work was tailored to the needs of a specific client, it may have gaps or assumptions that make it a bad 101 post, but in the expectation that sharing it is more useful than not publishing at all, I’m publishing it here (with their permission).
Side project time, popularized as 20% time at Google, is a policy that allows employees to spend a set percentage of their time on a project of their choice, rather than one directed by management. In practice this can mean a lot of different things, ranging from “spend 20% of your time on whatever you want” to “sure, spend all the free time you want generating more IP for us, as long as your main project is completely unaffected” (often referred to as 120% time) to “theoretically you’re free to do whatever, but we’ve imposed so many restrictions that this means nothing”. I did a 4-hour survey to get a sense of what implementations were available and how they felt for workers.
A frustration here is that almost all of what I could find via Google searches consisted of puff pieces, anti-puff pieces, and employees complaining on social media (plus one academic article). The single best article I found came not through a Google search, but because I played D&D with the author 15 years ago and she saw me talking about this on Facebook. She can’t be the only one writing about 20% time in a thoughtful way, and I’m mad that such writing has been crowded out by work that is, at best, repetitive, and at worst actively misleading.
There are enough anecdotal reports that I believe 20% time exists and is used to good effect by some employees at some companies (including Google) some of the time. The dearth of easily findable information on specific implementations, managerial approaches, trade-offs, etc, makes me downgrade my estimate of how often that happens, vs 20% time being a legible signal of an underlying attitude towards autonomy, or a dubious recruitment tool. I see a real market gap for someone to explain how to do 20% time well at companies of different sizes and product types.
But in the meantime, here’s the summary I gave my client. Reminder: this was originally intended for a high-context conversation with someone who was paying me by the hour, and as such is choppier, less nuanced, and has different emphases than ideal for a public blog post.
My full notes are available here.
To the extent it’s measured, utilization appears to be low, so the policy doesn’t cost very much.
In 2015, a Google HR exec estimated utilization at 10% (meaning it took 2% of all employees’ time).
In 2009, 12 months after Atlassian introduced 20% time, recorded utilization was 5% (meaning employees were measured to spend 1.1% of their time on it) and estimated actual utilization was at most 15%. (Notably, nobody complains that Atlassian’s 20% time is fake, and I confirmed with a recently departed employee that it was still around as of 2020.)
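(A quick note on the arithmetic, in case the conversion isn’t obvious: “utilization” here means the share of the 20% allocation that actually gets used, so 10% utilization works out to roughly 0.10 × 0.20 = 2% of total employee time.)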
How 20% time interacts with management and performance evaluation is key. A good compromise is to let people spend up to N hours on a project, and require a check-in with management beyond that.
Googlers consistently (although not universally) complained on social media that even when 20% time was officially approved, you’d be a fool to use it if you wanted a promotion or raises.
However, a manager at a less famous company indicated this hadn’t been a problem for them, and that people who approached performance reviews the way everyone does at Google would be doomed anyway. So it looks like the right culture can get you out of this.
An approval process is the kiss of death for a feeling of autonomy, but letting employees work on garbage for 6 months and then holding it against them at review time hurts too.
Atlassian requires no approval to start, 3 uninvolved colleagues to vouch for a project to go beyond 5 days, and founder approval at 10 days. This seems to be working okay for them (but see the “costs” section below).
Costs of 20% time:
Time cost appears to be quite low (<5% of employee time, some of which couldn’t have been spent on core work anyway)
Morale effects can backfire: sometimes devs make tools or projects that are genuinely useful, but not useful enough to justify expanding or sometimes even maintaining them. This leads either to telling developers they must give up on a project they value and enjoyed (bad for their morale) or to an abundance of tools that developers value but are too buggy to really rely on (bad for other people’s morale). This was specifically called out as a problem at Atlassian.
Employees on small teams are less likely to feel able to take 20% time, because they see the burden of core work shifting to their co-workers. But being on a small team already increases autonomy, so that may not matter.
Benefits of 20% time:
New products. This appears to work well for companies that make the kind of products software developers are naturally interested in, but not otherwise.
The gain in autonomy generally causes the improvements in morale and thus productivity that you’d expect (unless it backfires), but no one has quantified them.
Builds slack into the dev pipeline, such that emergencies can be handled without affecting customers.
Lets employees try out new teams before jumping ship entirely.
Builds cross-team connections that pay off in a number of ways, including testing new teams.
Gives developers a release valve: they can pick up bug fixes and feature requests that their boss rejected from the official roadmap.
There are many things to do with 20% time besides new products.
Small internal tools, QOL improvements, etc (but see “costs”).
Learning, which can mean classes, playing with new tools, etc.
Decreasing technical debt.
Non-technical projects, e.g. charity drives.
Other notes:
One person suggested 20% time worked better at Google when it hired dramatically overqualified weirdos to work on mundane tech; as Google started hiring people more suited to the task, with less burning desire to be working on something else, utilization and results decreased.
20% or even 120% time has outsized returns in industries with very high capital costs but minimal marginal costs, such that employees couldn’t pursue these projects at home. This was a big deal at 3M (a chemical company) and, for the right kind of nerd, in big data.
Thanks to the anonymous client for commissioning this research and allowing me to share it, and my Patreon patrons for funding my writing it up for public consumption.
Excellent write-up. Thanks, Elizabeth.
I’m a software engineer at a company that implements a “20%”. Every couple of months, we have a one (sometimes two) week sprint for the 20%. As you’ve pointed out, it works out to be less than 20%, and many engineers choose to keep working on their primary projects to catch up on delivery dates.
In the weeks leading up to the 20% sprint, we create a collaborative table in which engineers propose ideas and pitch those ideas in a meeting on the Monday morning of the sprint. Proposals fall into two categories:
Reducing technical debt. E.g. deprecating the usage of an old library.
Prototyping a new idea. E.g. trying out the performance of a new library.
I find the 20% sprints very valuable. A lot of the time, there is work I would like to see done that doesn’t fit well within “normal” priorities. I believe such work to be valuable based on my experience and knowledge; however, it doesn’t have sufficient visibility at higher levels. Therefore, this sort of work would never make its way into our everyday work without the 20% sprint.
Ironically, it seems to me that “agile” development took autonomy away from software developers, and “20%” gives it partially back.
My guess is that where this holds, the benefit can also carry over to employees who don’t take the 20% time. Merely having the option to do so may increase morale.
Thank you for taking the time to publish this. It’s kind of sad to see companies painting a picture of some kind of internal intellectual vibrancy or freedom or something when in fact it’s more of a recruiting or morale gimmick, or is just dominated in practice by performance demands. I have the sense that utilization numbers are low because it’s actually quite hard to formulate something compelling to work on for oneself, even absent any demands for justification or approval, and one of the reasons that people work at companies is to be given something compelling to work on (though this often isn’t what actually happens).
Your post triggered the following thoughts in me:
At best you might hope that 20% time would harvest insights from “line-level” employees about (1) what tools could be built to improve their own productivity, (2) what features customers would like (especially when the line-level employees are a representative sample of the customer base itself), and (3) what super cool things could be built that are just hard to understand unless you’ve pondered them over and over for years. Companies attempt to harvest and filter such insights (to whatever extent they really exist) through the ordinary reporting structure, but there are going to be some such insights that are good but systematically fail to make it through, especially in category (3).
So we have employees who are doing a job that gives them, as a byproduct, some kind of insight that we want to get access to. This is really a lot like the problem of eliciting latent knowledge, in which we have some powerful machine learning system that has demonstrated competence in some domain (e.g. predicting the sensor-visible consequences of plans) and due to its competence in that domain we suspect that it has an internal understanding of something that would be useful to us (e.g. knowing whether its own sensors have been tampered with). This really seems like a non-vacuous connection to me. Interesting.
This doesn’t cover my major issues with 20% time:
From most of what I have seen, 20% time mainly ends up being taken by the same people who use their free time to work on projects anyway, which then means that 20% time for those employees ends up largely being “just” a worse way of working a 4-day week.
Your stats are consistent with this, with only a fairly small percentage of people actually using said policy. (Although they do not rule out other hypotheses.)
All of the same concerns I have with any vacation policy that doesn’t require everyone to take all of it. (This includes, but is not limited to, every ‘unlimited vacation’ / ‘flex time off’ plan under the sun.) You do tangentially mention some of this, but it’s worth explicitly noting the parallels.
The restrictions that employers put on projects generally mean that you as an employee can’t commit to anything. This most affects the more careful employees, who in turn see it as less of a benefit, which in turn means you have a benefit weighted towards attracting less careful employees.
If Atlassian’s policy is truly “no approval to start, 3 uninvolved colleagues to vouch for a project to go beyond 5 days, and founder approval at 10 days”, no wonder they see a giant number of buggy tools.
A bunch of people each working on a software project for 5 (or even 10) days is a great recipe for a maintenance nightmare.
If anything, I’d expect this to increase technical debt.
See also your proposed benefit for an employer of “Builds slack into the dev pipeline, such that emergencies can be handled without affecting customers”. You say that and my immediate reaction is “so an employee cannot actually treat this as a reliable benefit, because they are expected to drop it at a whim”.
“Careful” is, I freely admit, likely not the correct term. I don’t know of a better one offhand.
20% time when well-implemented can be decent. Having the ability for someone to go “nope, I know you keep pushing off the debug documentation for module X in favor of putting out fires because we can’t debug module X, but I’m going to put my 20% time into fixing this module up into a coherent whole” is great, and often under-appreciated. Having that then shot down because we’re currently too busy putting out fires because we can’t debug module X rather defeats the point. Having that shot down after 5 days, meaning that instead of a single coherent debug document written by one person over a year of 20% time we get a Frankenstein mess put together by ten people, each of whom gets pulled off after five days because, again, the debug documentation isn’t seen as important? Again, rather defeats the point.
My company wouldn’t consider 20% time, but we do have a mechanism for new ideas. There is a fund of set size that you can apply to if you think you have an innovative idea worth exploring. The process is simple, and the proposal structure needs to be “fast-fail”: the initial proposal is expected to focus on feasibility and proof-of-value. Of course it is always over-subscribed, but at least many things get to be tried. If the first phase comes up promising, you can get a lot more than 20% of your time funded.
This is very interesting, thank you for sharing it.
I find the 5-day limit (without approval) quite insane, even assuming that means 5 actual days (and not 20% of 5 days = 1 full day). Let’s say you have an employee who has now put 5 days into their preferred passion project. You end it. They then put 5 days into their second-favourite passion project. The end result is an annoyed employee who has half-finished a train of side-projects and is still putting 20% of their time to one side from core duties.
My current work (university) is thankfully very flexible, so maybe I am seeing things from the wrong perspective.
May I ask for more detail on what this means? All I got from this is that since employees couldn’t work from home (on weekends, say, as one might do in a software role), the net effect of 20% time was that 3M got 20% overtime from employees for free. And that the returns on the 20% time were very significant because process improvement/intensification (a primary enjoyable skill of 3M engineering talent) has very high ROI in general.