Sunday, February 28, 2010

Microattention for Secondary Screens

Think about a desk calendar, the kind that has a clever quote for every day of the year. Now replace it on your desk with a little screen. The little screen not only has the clever quotes, it rotates through other stuff: headlines, stock updates, Twitter, weather, and so on.

You ignore it most of the time, but it’s great for the occasional glance when you’d otherwise be waiting for something to happen on your computer. Or when a conference call meanders. Or when you just have extra microattention, meaning you can keep doing what you’re doing while sneaking a peek, like seeing a billboard as you drive.

Here is a real-world example: On my desk, away from my computer display, I have a Chumby One, a small WiFi device that I can configure via the Web to display various content. Across a few minutes, it cycles through a rotating lineup of widgets.

These are widgets like those you’ve seen on computer desktops (Microsoft gadgets, MacOS Dashboard, and so on). I’ve also got Chumby widgets for Twitter, stock prices, and SmugMug’s most popular pictures of the day.
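At its core, this kind of device is just a timed loop over a list of widgets. Here is a minimal sketch of that idea; all of the names are hypothetical, and the Chumby's actual firmware is of course more involved:

```python
import itertools
import time

# Hypothetical widget callbacks -- each returns the content to display.
def quote_of_the_day():
    return "Quote: ..."

def headlines():
    return "Headlines: ..."

def weather():
    return "Weather: ..."

def rotate(widgets, dwell_seconds=20, cycles=1):
    """Show each widget in turn, dwelling briefly on each one."""
    shown = []
    steps = cycles * len(widgets)
    for widget in itertools.islice(itertools.cycle(widgets), steps):
        shown.append(widget())       # a real device would draw this to the screen
        time.sleep(dwell_seconds)    # the "glanceable" pause between widgets
    return shown
```

The dwell time is the whole design: long enough that a glance catches one widget, short enough that the next glance likely catches a different one.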

Initially this all may seem like a big time-waster, a weapon of mass distraction. But from my experience, it can be a time-expander, allowing you to do more with the time you have. To understand why, go back to the analogy of seeing billboards on the highway. What if those billboards showed stuff you chose? You would be driving anyway, so anything useful the billboards communicated would be a bonus, if you felt like looking at them.

That last part—your discretion to look or not—is key. It’s why widgets on a peripheral display like a Chumby are a different experience from widgets on your computer’s normal display. If I had the equivalent of my Chumby’s rotating widgets on my computer desktop, they would drive me crazy because they would not be fully ignorable; they would be too much in my field of view. Yet if I hid them—requiring a click or key-press to view them and then another to hide them again—they would not be as effortlessly glanceable as on a peripheral display.

So will Chumbies take over the world? Chumby is a quirky, early entrant in the “connected screens” market, and I wish it/them the best. But a bigger trend is afoot:

  • Small, external USB displays let you sequester desktop widgets to a peripheral screen.
  • Digital photo frames are evolving into connected screens, with content delivered by companies like FrameChannel.
  • Screenphones, while they sit in a dock on your desk or nightstand, are going the same way.
  • Dashboard screens in cars? They are going there too, if you don’t end up using a dashboard-docked phone instead.
  • Next-generation remote controls for TVs will have screenphone-like screens, so count them in.
  • Idle TV screens on walls or, in the farther future, walls that are screens? Yep.

In other words, there will be no shortage of screen devices that can enable good use of your microattention. With ever-falling costs of flat screens, and greater use of microattention, expect to see second, peripheral screens where you already have a main screen, and other screens in what today would be unlikely places.

That said, it is interesting to ask what else screen devices could do to support microattentive uses. On one hand, interactivity is helpful if you want to get more info on something you see (the Chumby One has a touchscreen, and some widgets have touchable controls). On the other hand, interactivity will be an exceptional use case, like stopping the car to learn more about a billboard’s content. In my case, if I see something on the Chumby that I want to know more about, I just Google it on my desktop rather than interacting with the Chumby.

More interesting to me would be a peripheral screen with a video camera that tracks when my eyes are on the screen and what they are viewing. For example, if a headlines widget eye-tracked which headlines I saw, it could show other headlines on subsequent rotations.
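The bookkeeping for such a widget is simple; the hard part is the eye tracking itself, which this sketch assumes exists (something external would call `mark_seen` when the camera detects the viewer reading a headline). All names here are hypothetical:

```python
class HeadlinesWidget:
    """Rotates headlines, preferring ones the viewer has not yet looked at."""

    def __init__(self, headlines):
        self.headlines = list(headlines)
        self.seen = set()

    def mark_seen(self, headline):
        # Assumed to be driven by the (hypothetical) eye tracker.
        self.seen.add(headline)

    def next_batch(self, size=3):
        unseen = [h for h in self.headlines if h not in self.seen]
        # Once everything has been seen, fall back to repeating old headlines.
        return (unseen or self.headlines)[:size]
```

Each rotation would call `next_batch`, so headlines the viewer already glanced at quietly drop out of the cycle.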

And to those who make the widgets and/or content in them, what new experiences can be had for an audience that consumes your media via a sequence of occasional glances?

My microattention and I will be eager to find out.

Saturday, February 20, 2010

Cloud and Jaffe’s The Fourth Star

The Fourth Star by David Cloud and Greg Jaffe is about the long struggle, and eventual victory, of an idea that is now the core of the U.S. strategy in Iraq and Afghanistan: that if the enemy isn’t going to fight a conventional war, then we need to win by means other than just fighting. That idea is embodied by General David Petraeus, who took command of U.S. troops in Iraq from General George Casey Jr. in 2007.

The authors use that transition to summarize their larger theme:

Ever since [Vietnam], senior Army leaders had tried, and ultimately failed, to keep their force from becoming too deeply embroiled in messy political wars that defied standard military solutions. It was a pattern that had repeated itself in Haiti, Somalia, the Balkans, Afghanistan, and then Iraq, where generals often focused more on exit strategies than plans for victory. Petraeus wasn’t interested in the drawdown plans often advanced by Casey. Instead he wanted to push U.S. troops into cities and leave them there. Only a heavy and sustained American presence could win the war, he believed.

Moreover, this heavy and sustained presence was not about engaging the enemy. Rather, it was primarily about protecting and policing the population, controlling sectarian violence, and turning the pragmatic majority against the insurgents.

Whereas U.S. forces previously concentrated themselves in highly fortified bases, they would now disperse among smaller posts within population centers. Whereas rebuilding Iraq was often about multiyear, multibillion-dollar reconstruction projects, it would now include immediate band-aids like getting the sewage out of the streets. Whereas conventional tactics were about applying overwhelming force to win the immediate battle, the new thinking included paradoxes such as “Sometimes, the more force is used, the less effective it is.”

To tell the story of this paradigm shift, The Fourth Star follows the careers of Petraeus and three other generals in the Iraq war: Casey, John Abizaid, and Peter Chiarelli. The four biographies make clear that each man was an extraordinary soldier, yet Casey and Abizaid ended up being the last of the old guard; Petraeus and the lesser-known Chiarelli ended up the first of the new.

Abizaid is a particularly tragic figure. An Arabic speaker with a long history in the Middle East, he knew the region better than any senior soldier, not to mention his civilian superiors. Like Petraeus, he had long ago concluded that the military needed to get much better at winning the peace in addition to the war. Yet he took charge in Iraq in 2003, after the initial rout, in the “Mission Accomplished” era. Taking orders from the then-swaggering Bush/Rumsfeld political leadership, he and Casey became the executors of policies that initially were in denial about the insurgency—thus giving it time to build—and then tried to fight it with conventional means.

Conversely, Petraeus was in the right place at the right time. As a regional commander in Iraq, he employed his signature counterinsurgency tactics to good effect, all the while working personal relationships with the press to document his rising star. By late 2006—after years of deteriorating conditions, the Republicans’ loss of Congress, and Rumsfeld’s resignation—Petraeus was the obvious choice for change. However, it’s interesting to ask what would have happened if Petraeus had been in Abizaid’s role several years earlier: whether he would have been able to accelerate America’s adaptiveness, or whether he would have been cast aside by the civilian leadership, as was General Eric Shinseki when he requested too many troops for post-war Iraq operations.

While The Fourth Star invites such conceptual questions, it’s first and foremost a storytelling book. It is especially strong conveying the stories of what actually was happening in Iraq behind the headlines, from the bureaucratic infighting to the real fighting on the battlefield to the twisted relationships among the U.S. and various Iraqi factions.

As an example of Cloud and Jaffe’s eye for the telling detail, here is an aside about an Iraqi army unit being trained for self-sufficiency:

They couldn’t feed themselves without U.S. help or repair broken equipment. When one of their soldiers was killed by insurgents, the unit wasn’t even able to ship the body home. Instead the battalion commander ordered his men to put the decomposing corpse in a room with the air conditioning turned on full blast. In a scene reminiscent of a Faulkner novel, the Iraqis then passed a hat hoping to collect cab fare for the 500-mile trip to the dead soldier’s family home in Basra. Eventually [the U.S. officer in charge] paid the fare.

As with this quote, the daunting nature of the American—and Iraqi—challenge in Iraq pervades the book, not in a polemical way, just as myriad matters of fact.

Before Petraeus’ promotion in 2007, things were going from bad to worse. By embracing change, and investing in it with an additional troop surge, the United States reversed the momentum. Three years later Iraq has seen progress, but a decisive win is a long-term proposition. Near-term success amounts to transitioning most of the military burden from the United States to native Iraqi forces, while maintaining political and economic stability—any aspect of which will strike a reader of The Fourth Star as a formidable task.

Yet amid the ongoing uncertainty in Iraq and Afghanistan, The Fourth Star makes the case that an important victory of ideas has already occurred. It took unusual leaders like Petraeus, who creatively bucked the system from within, and it took a trip to the edge of failure in Iraq, but the U.S. military learned and adapted. The Fourth Star is an engaging chronicle of that slow, hard path toward change.

[Here is the Amazon link to The Fourth Star.]

Monday, February 15, 2010

When Reporters Don’t Want to Hear It

From an interview with robotics expert Noel Sharkey:

Isaac Asimov said that when he started writing about robots, the idea that robots were going to take over the world was the only story in town. Nobody wants to hear otherwise. I used to find when newspaper reporters called me and I said I didn’t believe AI or robots would take over the world, they would say thank you very much, hang up and never report my comments.

This brought back a memory from my SRI days, when I was regularly interviewed by reporters. I remember a reporter called me on the day of some big news (I forget what it was), wanting my take.

The normal routine would be for me to provide a pithy quote, which the reporter would use as the voice of an independent expert. However, halfway through my commentary, his keyboard stopped clickety-clacking. “That doesn’t get me where I need to go,” he sulked, more to himself than to me.

Reporters usually like contrarian views, but apparently the expert slot in this story was already tailored for a concurring opinion. He was on deadline, as most reporters are when they call. It was easier to find another expert than redo the story.

I’m not naming the reporter because this run-in was the exception, not the norm, with him. He was a quality reporter who later became the technology bureau chief for one of the biggest U.S. papers—which makes the point stronger: Even quality reporters can succumb to finding only the facts and opinions that fit the story they want to tell.

The good news is, in my experience with a wide range of reporters, this situation was rare. But then again, I wasn’t daring to question whether robots would take over the world.

Saturday, February 6, 2010

Review: A History of the World in 6 Glasses by Tom Standage

In Tom Standage’s A History of the World in 6 Glasses, civilization is what civilization drinks: “Just as archaeologists divide history into different periods based on the use of different materials—the stone age, the bronze age, the iron age, and so on—it is also possible to divide world history into periods dominated by different drinks.”

Standage starts with beer, which was fundamental to early agricultural civilizations in Mesopotamia and Egypt. As the liquid form of plentiful grains, and safer than the local water to drink, beer was a staple of diets. It was also a popular form of payment and currency. By its ubiquity, beer became, and to a large extent still is, the drink of the everyday worker.

In contrast, wine was emblematic of Greek and Roman civilizations, which stratified wines by type and age. Everyone drank wine, but the elites had elite wine, the commoners had common wine, and so on in between. This association of wine with connoisseurship persists today, as does the Mediterranean region’s cultural preference for wine as its main social drink.

The other four epochal drinks are distilled spirits (their bang-for-buck compactness made them a key trading currency in the Age of Exploration), coffee (the Age of Reason played out in coffeehouses), tea (British Empire), and Coca-Cola (the American Century and globalization).

Each drink gets a cultural biography that explores “the ramifications of who drank what, and why, and where they got it from.” Standage covers that territory broadly, visiting the histories of agriculture, religion, philosophy, and commerce (among other topics) at opportune moments.

So, for those interested in an offbeat, eclectic take on history, A History of the World in 6 Glasses should slake thy thirst.