Sensor Stories: The Post-PC Era, Context and Wearables
At Atomicdust, we try to think through the entire experience a person will have while using a website or flipping through an annual report. To do this, we use contextual information about our clients and their audiences: who we’re talking to, what they need and want, where their pain points are. All of it informs the messages we create.
A similar contextual revolution is underway inside the devices we use every day, and their inevitable successors. A smart device today is pretty smart, but a smart device that knows more about the person using it? That’d be something else entirely.
Which is exactly where wearables come in.
What our devices know now
Smart devices are gadgets stuffed full of other gadgets, hardware designed to tell the device about its environment in ways desktop computers and laptops haven’t traditionally been able to achieve.
They’ve got accelerometers, gyroscopes and compasses to tell them about their physical orientation. They’ve got GPS and cameras to tell them where they are, and what’s around. They’ve got microphones and ambient light sensors to tell them more about their immediate environment.
That’s Post-PC in a nutshell: networks that come with you, and computing that reacts to the context in which it’s being used.
Contexts are powerful in computing: they can shed light on how, where or why a user wants to accomplish a particular task. Take a smartphone’s turn-by-turn directions: the device doesn’t have to ask where you’re coming from. It knows where you are and can assume you’re starting there. Camera apps embed the time, GPS coordinates and shooting conditions into a photo’s EXIF data, so users don’t have to remember the location and conditions of a specific memory. Note-taking apps can capture where a note was created, when and where it was last edited, and so on.
Then there’s Siri (or Google Now, or Cortana): she knows where you are and remembers the last few things you’ve asked her to do. With Siri, you don’t have to ask for the weather in a specific city; she’ll assume you mean the city you’re standing in. When those assumptions — all based on device context and user patterns — are correct, the interactions are like magic.
Boom, context at work.
So, today’s devices can learn a lot about their environment. But they learn next to nothing about us, their users.
What wearables could offer in terms of context
Wearables (nerdspeak for “devices worn on the body, likely containing an array of sensors”) could be a great way to offer context about ourselves and improve our entire computing experience, rather than becoming just another screen for email notifications. (We don’t really need one of those.) Think Google Glass, smart watches and other similar products.
But what if something like Siri could, in essence, read your mind?
Actually, Siri would be reading your body, but follow me here. Imagine: your device gets a feed of raw biometric data from your watch (thanks to a bevy of high-tech sensors). The service anonymously compares this data against patterns in the cloud, and a Siri-like assistant can tell you’re asking a question without you adding “question mark” to the end of your dictation.
Or, if you’re stressed, Siri could automatically mute your email notifications for a while.
Or, she could tell you it’s time for a walk. You’ve been sitting still for two hours.
Or… I think you get the picture. This could be a fascinating future, one that opens up new ways of interacting with technology. If only there were a way for people to grow comfortable wearing sensors.
Biometric data from wearables could offer our Post-PC devices context on us for any number of uses, not just health tracking.
There’s a lot of chatter about wearables and personal health and fitness: the most successful wearable devices (so far) have been geared toward activity tracking, like the Nike FuelBand, Jawbone UP and Fitbit’s entire device lineup. If rumors are to be believed, Apple is planning major pushes into this area. I find that a little odd: Apple has been cozy with Nike in the past, and the market for health-focused gadgets just doesn’t seem very big compared to Apple’s current markets.
A strict health focus in any hypothetical Apple wearable strikes me as too limited in scope. I have to think Apple is focusing on the health applications of this device to lay the foundation for something much bigger and more valuable. It’s a solid strategy, especially from a marketing perspective: the general public can easily understand the benefits of a health-focused device, given a clear enough explanation, and Apple is very good at awareness- and education-focused marketing. The applications of human-context-aware computing are fascinating, but far more hypothetical right now; most people would struggle to see the benefit. Plus, it would take time to gather enough raw data to reliably extract behavioral patterns.
The health focus wouldn’t be a smokescreen, per se, but fitness tracking can’t be the only reason Apple and other tech giants are interested in wearables. The potential for fundamentally changing how computers understand us is there, and it makes perfect sense for companies to explore it.
As for me? I’m ready for a smarter Siri.