Life hacking and the infinite workspace

One day I walked into the Student Union office and saw Annie with her ginormous Toshiba laptop with its 17" screen of massive resolution hitherto unknown to mere mortals. On it she had one web page taking up the entire screen. Of course, being the savvy Windows user, she knew how to use the Alt-Tab keyboard shortcut to switch between her other open programs, like her Word documents, each also expanded to take up the entire screen. As she worked on her paper, she’d keep tabbing back and forth between the web browser and the document, but it never once occurred to her to shrink the two windows down so that she could view them both at once, side by side. Meanwhile, her concentration, focus, and productivity were just vanishing.

Sunday’s New York Times Magazine contains a great piece on a movement called life hacking, an attempt to better analyze the distractions of the modern workplace and develop strategies for organization and work that make people more productive.

Years ago I thought that wearable computers were just around the corner. I was hideously excited about the idea of a "remembrance agent," an interactive software program that would constantly monitor my surroundings as I went about my daily routine, recording everything and constantly analyzing, looking for correlations. A small computer worn like just another gadget — or piece of clothing — would provide important and relevant information through a display in my glasses. It would tell me where I was and when I was last there and what I was doing and with whom and why. It would remind me of people’s names and of appointments and bring up old emails and papers and web pages based on its understanding of what I needed at any particular moment. It would know enough to understand when things required my attention and it needed to be obtrusive, and when it should just fade into the background and leave me in peace. It was the next evolution of computing technology, and it was just around the corner.

Well, it’s been eight or ten years and we haven’t seen much progress on the wearables front; meanwhile, society has continued to get more and more distracting. I need not list off the litany of devices that we all know are sapping our lives away — cell phones, PDAs, PDA cell phones, cameras, PDA cameras, PDA cell phone cameras, BlackBerrys, laptops, iPods. Things keep getting more complex, our environment more demanding, information more pervasive, and yet not many people have put much serious thought into user-focused organization, prioritization, and productivity boosting.

Or, to put it another way, Windows is very concerned about my virus definitions being out of date, but why should I care? Why is it popping up that stupid annoying bubble and breaking my concentration right when I’m in the middle of a sentence? And just when I’m about to get back to what I was doing, the phone rings. An email arrives. Someone IMs. And by now I’m so distracted that I don’t even remember what document I was working on, much less how I was planning to finish that sentence. Or look at how the researcher in the Times article calculated it:

When Mark crunched the data, a picture of 21st-century office work emerged that was, she says, "far worse than I could ever have imagined." Each employee spent only 11 minutes on any given project before being interrupted and whisked off to do something else. What’s more, each 11-minute project was itself fragmented into even shorter three-minute tasks, like answering e-mail messages, reading a Web page or working on a spreadsheet. And each time a worker was distracted from a task, it would take, on average, 25 minutes to return to that task. To perform an office job today, it seems, your attention must skip like a stone across water all day long, touching down only periodically.

So we see the problem, and clearly wearables aren’t going to be coming along anytime soon to start mediating our reality, so what can we do in the meantime? Well, remember my original example with Annie and her laptop, and how she had been conditioned by software to maximize every window, with its big candy-colored borders and rows upon rows of buttons and widgets, to cover the full screen? Microsoft Research ran a study where they plunked users in front of a normal-sized computer monitor and monitored their habits. They found that most users had about eight different windows open at the same time, and they would spend only 20 seconds or so looking at each before flipping to the next. The writer likens it to a desk so small that you can only view one piece of paper at a time, and have to keep shuffling through the stack as you work. An apt analogy. So what happens when people are given huge screens, with room to spread things all over the place?

Remarkably, or perhaps not, their productivity goes way, way up:

On the bigger screen, people completed the tasks at least 10 percent more quickly – and some as much as 44 percent more quickly. They were also more likely to remember the seven-digit number, which showed that the multitasking was clearly less taxing on their brains. Some of the volunteers were so enthralled with the huge screen that they begged to take it home. In two decades of research, Czerwinski had never seen a single tweak to a computer system so significantly improve a user’s productivity. The clearer your screen, she found, the calmer your mind.

People around the office have Word open with a menu bar on top and then five rows of tiny buttons. Do you even use half of them? A quarter? The first thing I do when I get a new program is pull as many buttons out of the toolbars as I can. If the program won’t let me, I don’t use it. There is beauty in something so finely crafted, so in tune with its users, so specialized and powerful in its application that it needs only five or six buttons to do everything you need. Small, simple, highly specialized programs that do what you need and nothing more — that is how to be productive.

When computers moved relatively slowly and the Internet was little used, raw productivity – shoving the most data at the user – mattered most, and Microsoft triumphed in the marketplace. But for many users, simplicity now trumps power. Linda Stone, the software executive who has worked alongside the C.E.O.’s of both Microsoft and Apple, argues that we have shifted eras in computing. Now that multitasking is driving us crazy, we treasure technologies that protect us. We love Google not because it brings us the entire Web but because it filters it out, bringing us the one page we really need. In our new age of overload, the winner is the technology that can hold the world at bay.

At work I set myself up with two monitors next to each other. It is wonderful. On one side I have all of my communications — emails, IMs, a program to jot down quick notes. On the other I have what I’m working on — terminals, documents, web pages. I can feel myself being more productive, and I knew it was working the other day when I thought about how nice it would be to have some more screen space, a third monitor, on which to shove some of my work. Oh, but that would just be ostentatious.

What makes productive people productive? Good organization. Good concentration. Ways of dealing with and channeling distractions. Methods of using the computer as a tool to serve the user, and not the other way around. That’s really the key. Everyone has their own methods, their own ways of organizing data. Rich at Maintex writes everything onto an easel, remembers exactly where everything is, and can flip back twenty pages to the right set of scribblings in the middle of a meeting. Some people use their email, sending themselves notes and reminders and pieces of information, constantly culling their inbox and putting things in folders. A lot of people I know seem to love those little pocket notebooks.

Some of the most highly productive people subscribe to a method known as "Getting Things Done," created by a personal productivity guru named David Allen. I’ve heard about GTD from the blog 43folders, but haven’t yet devoted the time (aha!) to learning more about it. The Times tells me I should.

At the core of Allen’s system is the very concept of memory that Mark and Czerwinski hit upon: unless the task you’re doing is visible right in front of you, you will half-forget about it when you get distracted, and it will nag at you from your subconscious. Thus, as soon as you are interrupted, Allen says, you need either to quickly deal with the interruption or – if it’s going to take longer than two minutes – to faithfully add the new task to your constantly updated to-do list. Once the interruption is over, you immediately check your to-do list and go back to whatever is at the top.

"David Allen essentially offers a program that you can run like software in your head and follow automatically," O’Brien explains. "If this happens, then do this. You behave like a robot, which of course really appeals to geeks."

Places like Microsoft are working on "solutions" to these problems, mostly involving more sophisticated software that attempts to track what the user is doing and tailor distractions based on that. It sounds a lot like my Remembrance Agent, but I worry that it will just end up as one more distraction, a program that never behaves the right way and never figures out its place. I don’t want any more bubbles popping out of my system tray, thank you very much. Rather, an effective system will be one that also controls scheduling, that can auto-negotiate with other people for time blocks, that can make my busy state known to the outside world.

One of the things I really like about the Mac is that software developers have realized there is a market for simple, specialized applications that do one or two things really well. Programs like OmniOutliner, Adium, Quicksilver, and SubEthaEdit are just amazing. It sounds like people on the Linux side are also starting to catch up. But I have less hope for Windows. This sort of thing just doesn’t seem to fit into the Windows "LET ME HELP YOU!" world view.

Then again, the real innovation in this field probably won’t come from the usual suspects. Geeks have been talking for years about breaking out of the "desktop paradigm" for computers, but no one has yet come up with a Better Way™ to do things.

Some experts argue that the basic design of the computer needs to change: so long as computers deliver information primarily through a monitor, they have an inherent bottleneck – forcing us to squeeze the ocean of our lives through a thin straw. David Rose, the Cambridge designer, suspects that computers need to break away from the screen, delivering information through glanceable sources in the world around us, the way wall clocks tell us the time in an instant. For computers to become truly less interruptive, they might have to cease looking like computers. Until then, those Post-it notes on our monitors are probably here to stay.

I suspect that when we least expect it, and probably at some point soon, something really, really neat and innovative is going to come along from a completely unexpected source, and it will be the sort of thing where we all scratch our heads wondering why we didn’t think of it sooner. I’m looking forward to it.