Ph.D. Candidate · KAIST UVR Lab
Ph.D. candidate working on one of the most deceptively hard problems in HCI: making XR interactions good enough for everyday life.
Real-world use demands more than clever demos. Interactions need to be quick, subtle, and robust against false positives, especially when you're wearing smartglasses for hours, not just five minutes in a lab study. That's the thread running through my work on gesture, EOG, touch, and force-based input.
Advised by Prof. Woontack Woo · Collaborators: Thad Starner, Kai Kunze, Hui-Shyong Yeo
Each research thread is driven by a concrete problem in everyday XR use.
Glasses are already on your face. As an always-available, eyes-free platform, smartglasses offer input without the friction of reaching for a device.
The wrist is persistent, socially acceptable, and always within reach. A subtle wrist gesture is faster and less disruptive than pulling out a phone.
Gestural input enables eyes-free, hands-busy control: letting technology respond to how you naturally move, without demanding your full attention.
Electrooculography turns eye movement into intentional input. Unlike gaze tracking, EOG works in bright or dark environments and does not require a front-facing camera.
Not every command needs to be visible. Subtle interaction keeps users in context: no loud voice commands, no conspicuous gestures that draw attention in public.
An interface that misfires by accident is worse than no interface. Robust activation cleanly separates deliberate commands from the noise of everyday movement.
Tools, programs, and resources I help build and maintain.
Maintaining a page with a database of ISWC publications, as a fan of wearable computing.
Aggregating NASA-TLX workload scores across HCI studies for meta-analysis.
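As a sketch of the kind of aggregation involved: the "raw TLX" (RTLX) score commonly used in HCI studies is the unweighted mean of the six NASA-TLX subscale ratings. A minimal example, where the participant ratings are hypothetical placeholder data:

```python
from statistics import mean

# The six NASA-TLX subscales, each rated on a 0-100 scale.
SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def raw_tlx(ratings: dict) -> float:
    """Raw (unweighted) TLX: the mean of the six subscale ratings."""
    return mean(ratings[s] for s in SUBSCALES)

# Hypothetical per-participant ratings from one study condition.
participants = [
    {"mental": 70, "physical": 20, "temporal": 55,
     "performance": 30, "effort": 60, "frustration": 40},
    {"mental": 65, "physical": 25, "temporal": 50,
     "performance": 35, "effort": 55, "frustration": 45},
]

scores = [raw_tlx(p) for p in participants]
condition_mean = mean(scores)  # one summary score per study condition
```

The full NASA-TLX procedure also supports weighted scores from pairwise subscale comparisons; many studies report only the raw mean, which is why a cross-study aggregation has to track which variant each paper used.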
Managing the CT-AR program at KAIST.
Web Development & Academic Program Support | KAIST Graduate School of Metaverse