This month, we’re winding the clock back to the moment computers learned to see. Welcome back to Input Origins, where we journey through the history of human-computer interaction, tracing the steps that brought us from buttons to gesture control!
Myron Krueger, VIDEOPLACE, 1985, at the University of Connecticut's Artificial Reality Lab.
It Started with an Artist in 1970
What if computers could understand movement the way humans do?
In the mid-1970s, Myron Krueger answered that question with VIDEOPLACE, a digital playground where your body became the controller. With nothing but a camera and some clever code, Krueger created a world where gestures shaped virtual environments - decades before touchscreens and motion tracking went mainstream. It was interactive art, but also a glimpse into the future of human-computer interaction.
The Microsoft Kinect - Sold over 35 million units.
Three Decades Later...
It took three decades after VIDEOPLACE for computer-vision control to finally go mainstream. In November 2010, Microsoft introduced Kinect for Xbox 360, allowing players to control games using their entire body - no controllers needed. Equipped with an RGB camera and an infrared depth sensor, Kinect could track players' full-body movements and respond in real time. While Kinect was a gaming sensation, selling over 8 million units in its first 60 days, it also hinted at something bigger: a world where gestures could control everything from TVs to robots.
David Holz, co-founder & CTO of Leap Motion, in a CNET article from 2012
Leap Motion - In The Palm of Your Hand
Humans have always used their hands to shape the world - crafting, building, and expressing. When computer vision first recognized full-body movement, it was a revelation, but it was never the final destination. Leap Motion, introduced in 2013, redefined interaction by recognizing the hands for what they are—an extension of thought, capable of shaping both physical and virtual spaces. No controllers, no gloves—just natural, intuitive movement. For the first time, we could interact with the digital world as effortlessly as we do the physical one.
And with it, the foundation was laid for the future of AR and VR, where our hands wouldn’t just interact with the virtual world - they would control it.
Meta's Oculus Quest
The Future of AR & VR is in Your HANDS, Literally.
Our hands are our first and most powerful tools. They build, create, and express with effortless precision. Yet, for years, computers reduced them to clicks and keystrokes—an unnatural limitation. Leap Motion shattered that boundary. In 2013, it brought our hands into the digital world, making them the ultimate controller. It was a return to the most natural form of interaction we’ve ever known.
As AR and VR gained momentum, tech giants took notice. Microsoft’s HoloLens introduced hand-tracking for mixed reality, letting users grab and manipulate virtual objects. Magic Leap pushed the idea further, blending digital overlays into the real world with gesture-based interaction. Meanwhile, Meta Quest refined controller-free tracking, making hands a primary input method for VR. And then came Apple’s Vision Pro, elevating gestures into the heart of spatial computing, where a simple pinch could summon menus or move content with precision.
But do we really need cameras for gesture control?
"Look Mom, No Cameras."
Beyond Cameras: Neural Input Wristbands
Cameras enabled gesture tracking but came with limitations. They need ideal lighting—too bright or too dim, and tracking fails. Move your hands out of frame, and you lose control. Even slight latency can break immersion. Privacy is another concern—always-on cameras scanning hands and surroundings raise security issues. Plus, processing gesture data requires significant power, leading to heavier, more power-hungry devices like the Vision Pro.
What if there was a better way?
Enter Neural Input—a technology that doesn’t just see movement but understands intent.
By detecting subtle electrical signals at the wrist, neural interfaces like the Mudra Link and the wristband Meta demonstrated alongside its Orion glasses allow for seamless, touch-free control—whether your hands are in view or not. This shift from external tracking to internal intent is redefining how we interact with digital worlds, bringing us closer than ever to frictionless, intuitive control.
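For intuition, the idea of reading intent from wrist signals can be caricatured as energy detection over a window of sensor samples. This is a toy sketch only - real neural wristbands rely on trained classifiers over multi-channel signals, and every value, threshold, and gesture name below is invented for illustration:

```python
import math

def rms(window):
    """Root-mean-square energy of one window of wrist-signal samples."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def classify(window, threshold=0.5):
    """Map a window of samples to a coarse intent: 'pinch' vs 'rest'.
    Threshold and labels are made up for this example."""
    return "pinch" if rms(window) > threshold else "rest"

# Simulated streams: a quiet baseline vs a burst of muscle activity.
rest_window = [0.05, -0.04, 0.06, -0.05, 0.04, -0.06]
pinch_window = [0.9, -1.1, 1.0, -0.8, 1.2, -0.9]

print(classify(rest_window))   # rest
print(classify(pinch_window))  # pinch
```

The point of the sketch: unlike a camera, nothing here depends on lighting or line of sight - the signal originates at the body itself.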
Order your own Mudra Band or Mudra Link now!