Input Origins: Gaze Tracking - Controlling With Your Eyes (#16)

Input Origins: Controlling Devices With Your Eyes

Welcome back to Input Origins, our monthly time-travel through the input methods that changed how we control our devices.

This month: Eye Tracking, AKA Gaze Tracking. Point your eyes at something on a screen, hold your gaze for a moment to click, no hands needed. While not widely known, this technology is a lifeline for people with physical disabilities; ALS patients who can't move their hands rely on it daily. And today? It's gaining traction in the XR space, where input methods are either converging... or vying for control, pun intended.

Editor: Ariel Amar.

 

From Jason Orlosky's YouTube Channel 

 

It Started in 1879 with Microphones on Eyelids

It all started in 1879 with a simple question: How do our eyes actually move when we read? Early researchers got... creative. And by creative, I mean crude. They taped microphones to eyelids to pick up each microscopic jump during reading. Yeah. Microphones. On eyelids.

But it worked (The Psychology and Pedagogy of Reading, 1908, pages 25-26). Psychologists and educators discovered that tracking eye movement was a powerful tool for understanding perception, attention, and learning patterns.

Fast forward to 1980: NASA took it to the next level. They conducted a groundbreaking study analyzing pilots' eye movements to understand how they scan cockpit instruments mid-flight. The goal? Improve display design and operating procedures to reduce pilot error. The study was called "Instrument Scanning and Controlling: Using Eye Movement Data to Understand Pilot Behavior and Strategies" (NASA CR-3306, 1980). For all you UX designers out there? This is required reading.

 

1981 - Richard Bolt's demo at SIGGRAPH 

 

The First Real Use: Aiming with Your Eyes

So... if you could track where someone's eyes are looking, why not use that as a command? That thought popped into many minds. And it didn't take long for someone to act on it. Not surprisingly, the first real-world application came from the military. During the 1970s, engineers at Honeywell, a defense contractor, built eye-tracking systems for fighter jets. Pilots could aim weapons just by looking at targets on their cockpit displays.

Then came the creative side. In 1981, researcher Richard Bolt showed a demo at SIGGRAPH (the annual Conference on Computer Graphics and Interactive Techniques) that flipped the script. Multiple video windows played different content simultaneously. The system tracked the user's gaze in real time and used it as an input signal, automatically giving focus to or zooming in on whichever window the user was looking at. It even switched which audio stream was active based on the user's gaze!

 

Tobii Dynavox’s TD I‑Series enables eye‑gaze communication and independence for people with conditions such as cerebral palsy, Rett syndrome, or ALS

 

Talking with Your Eyes

The most impactful use of eye tracking has been giving people the ability to communicate back. People with ALS, cerebral palsy, or severe paralysis, who have lost the ability to use their limbs but can still move their eyes, can finally use this technology to write. To compose a message, a user simply looks at specific letters or icons on a screen, "clicking" them by dwelling their gaze on a key for a set amount of time.

It started in the early 1980s with the EyeTyper at Carnegie Mellon University and LC Technologies' Eyegaze System. By the 1990s, projects like EagleEyes at Boston College used this technology to capture tiny eye movements and map them to mouse control for children with limited mobility. In recent years, systems such as GazeDriver have even let users steer power wheelchairs with their eyes alone, using a screen-less interface. Today, Tobii Dynavox devices and Windows Eye Control make technology more accessible by bringing gaze tracking to the PC.

 

 

When XR Got Eyes

Eye tracking initially made its way into VR as a way to save processing power when rendering graphics. Developers discovered they could render sharp graphics only where the user is looking and blur the rest, a technique known as foveated rendering, making VR run faster without anyone noticing.
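To make the idea concrete, here is a minimal sketch of how render quality could fall off with distance from the gaze point. It is illustrative only; the normalized coordinates, radii, and quality levels are assumed tuning values, not numbers from any real headset.

```python
import math

def shading_quality(tile_center, gaze_point, inner_radius=0.10, outer_radius=0.30):
    """Relative render resolution for a screen tile, given the gaze point.

    Both positions are normalized screen coordinates in the 0..1 range.
    Tiles near the gaze render at full detail; the periphery is cheaper.
    The radii here are made-up values for illustration.
    """
    distance = math.hypot(tile_center[0] - gaze_point[0],
                          tile_center[1] - gaze_point[1])
    if distance <= inner_radius:
        return 1.0        # foveal region: full resolution
    if distance >= outer_radius:
        return 0.25       # far periphery: quarter resolution
    # Blend smoothly between the foveal and peripheral regions.
    t = (distance - inner_radius) / (outer_radius - inner_radius)
    return 1.0 - 0.75 * t

# The user looks slightly left of center: a nearby tile stays sharp,
# while a corner tile drops to low resolution.
print(shading_quality((0.45, 0.5), (0.4, 0.5)))   # 1.0
print(shading_quality((0.95, 0.05), (0.4, 0.5)))  # 0.25
```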

But eye tracking also became a way to control XR. It started with research teams in the early 2010s. Groups at Microsoft Research and NVIDIA experimented with using gaze to select objects in VR, pairing eye position with a button press to confirm actions. By 2016, a Japanese startup called FOVE released the first commercial headset built specifically for gaze control: users could aim, select, and interact just by looking. HTC followed in 2019 with the Vive Pro Eye for enterprise and training.

Today, it's everywhere: Meta Quest Pro uses it for menu navigation, PlayStation VR2 for UI targeting, and Apple Vision Pro made it the primary input (look, pinch, and select). In the AR space, Everysight uses gaze tracking to navigate digital overlays completely hands-free.

Medusa's Midas Touch

When Everything You Look at Turns into a Click

Medusa would have a hard time with eye tracking. If every time you look at something the system thinks you want to click it... well, you've got a problem whenever you're reading, scanning, or just exploring.

This is called the Midas Touch problem, named after King Midas from Greek mythology, who turned everything he touched into gold. Researcher Robert Jacob pointed it out back in the 1990s. So how did they solve it?

One solution is dwell-time activation: the system only triggers if your gaze stays on a target for a set duration, filtering out quick glances. Another is the multi-modal input approach: your eyes provide the spatial coordinates, and a second input method decides when to commit the action. A rough sketch of the dwell-time idea follows below.
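For the curious, here is roughly what dwell-time activation looks like in code. This is an illustrative sketch only; the DwellClicker class and the 0.8-second threshold are assumptions made for the example, not taken from any real eye-tracking SDK.

```python
import time

DWELL_SECONDS = 0.8   # assumed threshold; real systems let users tune this

class DwellClicker:
    """Fire a gaze 'click' only after the eyes stay on one target long enough."""

    def __init__(self, dwell_seconds=DWELL_SECONDS):
        self.dwell_seconds = dwell_seconds
        self.current_target = None
        self.gaze_started_at = None

    def update(self, target, now=None):
        """Feed the target currently under the gaze.

        Returns the target to 'click' once the dwell threshold is reached,
        otherwise None. Quick glances never reach the threshold, so they
        never trigger anything.
        """
        now = time.monotonic() if now is None else now

        if target != self.current_target:
            # Gaze moved to something new: restart the dwell timer.
            self.current_target = target
            self.gaze_started_at = now
            return None

        if target is not None and now - self.gaze_started_at >= self.dwell_seconds:
            # Dwell completed: fire, then reset so the same dwell doesn't fire twice.
            self.current_target = None
            self.gaze_started_at = None
            return target
        return None

# Example: the user glances at "Cancel" briefly, then settles on "Send".
clicker = DwellClicker()
clicker.update("Cancel", now=0.0)        # glance starts
clicker.update("Send", now=0.2)          # gaze moves on, timer restarts
print(clicker.update("Send", now=1.1))   # held long enough -> "Send"
```

In the multi-modal variant, the same gaze target would be used, but the commit would come from a button press or a finger pinch instead of a timer.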

 

Input Convergence?

Eye tracking may be an excellent input solution for XR, and perhaps the future will see different input methods working together rather than competing.

But what if you could also point and click with more than just gazing?

The Mudra Link turns hand movements into pointing commands, and a simple tap into a click.
