Extended Reality (XR) encompasses Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR): technologies that blend virtual and physical worlds to create new forms of spatial computing. However, while XR can deliver rich spatial content, users struggle to interact with it naturally due to reliance on abstract controller mappings and transplanted 2D interaction metaphors, creating a significant interaction bandwidth bottleneck. To address this challenge, I propose Embodied XR, which accelerates the sensing-interaction cycle by leveraging the full affordances of the human body as the interaction medium, enabling natural interaction across diverse user populations. My dissertation approaches this vision from two complementary perspectives: Interaction and Sensing. From the interaction perspective, turning the body itself into an interface enables users to transfer their innate bodily knowledge to novel XR interactions. From the sensing perspective, advanced sensing technologies unlock richer design spaces for embodied interaction. My contributions expand the scope of embodied interaction design from concrete digital entities (Hand Interfaces) to abstract digital entities (Finger Switches) and physical entities (Arm Robot), while developing novel sensing technologies such as contact-free force sensing (ForceSight), and addressing accessibility challenges (Embodied Exploration). These efforts collectively create more intuitive, efficient, and accessible interactions for next-generation spatial computing.