Liz Stinson, in Wired, previews Lightform, a “projection-mapping” device that can read a room and project images (or interfaces) onto any surface, no matter how irregular. In a nutshell, it’s augmented/mixed reality projected directly onto the environment:
Lightform’s technology sets the stage for more complex and immersive forms of interaction. The company aims to develop high-resolution augmented reality projections that track objects and respond to human input in real time. Its ultimate goal: Make projected light so functional and ubiquitous that it replaces screens as we know them in daily life. “Really what we’re doing is bringing computing out into the real world where we live,” Sodhi says.
What I like about emerging technologies like this one is that the tech comes to you. Your surroundings simply become digital; no need to strap on a headset or peer through a screen.
Writing for MEX last week, Marek Pawlowski made a similar observation:
Virtual, augmented and mixed reality products like HoloLens and Daydream are often seen as being in the vanguard of this evolution, but the level of immersion required by these experiences is a somewhat misleading guide to the future.
The larger concept at play here is the notion that digital capabilities – through projection, augmentation or other more subtle forms of ingress – will become woven into the physical fabric of life. The dream of ubiquitous computing will not come in boxes, but rather will hover and shimmer in transient spaces around us.
“Woven into the physical fabric of life.” That’s what’s exciting about the physical interface, whether embodied in IoT gadgets, projected UI, or augmented reality: it literally grafts onto the world around us, on our terms. It’s tech that promises to bend to our lives, rather than the reverse.