Ching-Wen Hung | 洪靖雯 (HOSI)

Ching-Wen Hung is a Ph.D. candidate in the Interactive Graphics Lab and the Communications and Multimedia Lab at the Graduate Institute of Networking and Multimedia, National Taiwan University, advised by Prof. Bing-Yu Chen. Her research interests include Embodied Interaction, Haptic and Gesture Interfaces, Virtual/Augmented Reality (VR/AR), Inclusive Design, Perception and Behavior, and Novel Interaction Techniques in the field of Human-Computer Interaction (HCI).

Email: ching-wenh@sigchi.org
[CV] / [Google Scholar] / [ORCID] / [LinkedIn] / [Twitter]

Research Direction

Designing Embodied Interaction for Accessible and Expressive Systems

I am interested in designing embodied interaction systems that treat the human body not only as an input interface, but also as a medium for shaping perception, experience, and social connection.
My research focuses on three complementary dimensions: embodied input (gesture), embodied output (force), and accessibility-oriented perception. Through gesture-based interaction, I explore how bodily movement can serve as intuitive and expressive input for virtual and physical systems. Through force and haptic feedback, I investigate how physical sensations can shape users’ perception, cognition, and behavior. Beyond interaction efficiency, I am also interested in how embodied systems can support accessibility and inclusive design by leveraging alternative sensory modalities and adapting interaction to diverse users’ abilities.
Across virtual reality, wearable devices, and everyday interactive systems, my work aims to create technologies that foster meaningful engagement, reduce barriers, and support shared experiences between people with different abilities and backgrounds. I believe embodied interaction can move beyond abstract control mechanisms and become a foundation for more inclusive, expressive, and human-centered computing.

Publications

ElderPlay: Supporting Age-Inclusive Gameplay for Older Adults via Real-Time Gesture-to-Controller Translation

Ching-Wen Hung, Che-Wei Hsu, Wei-Tang Hsu, Tzu-Chin Chiu, Li Lin, Yao Cheng Lee, Ting-Wu Chang, Hsien-Hui Tang, Bing-Yu Chen, Mike Y. Chen
DIS 2026 Paper (to appear)

TBD
TacNote: Tactile and Audio Note-Taking for Non-Visual Access

Wan-Chen Lee, Ching-Wen Hung, Chao-Hsien Ting, Peggy Chi, Bing-Yu Chen
UIST 2023 Paper

[paper] [video] [30s video]

Puppeteer: Exploring Intuitive Hand Gestures and Upper-Body Postures for Manipulating Human Avatar Actions

Ching-Wen Hung, Ruei-Che Chang, Hong-Sheng Chen, Chung-Han Liang, Liwei Chan, Bing-Yu Chen
VRST 2022 Paper

[paper] [video] [30s video]

OsciHead: Simulating Versatile Force Feedback on an HMD by Rendering Various Types of Oscillation

Ching-Wen Hung, Hsin-Ruey Tsai, Chi-Chun Su, Jui-Cheng Chiu, Bing-Yu Chen
MobileHCI 2022 Paper (published in PACMHCI)

[paper] [video] [30s video]

ElastOscillation: 3D Multilevel Force Feedback for Damped Oscillation on VR Controllers

Hsin-Ruey Tsai, Ching-Wen Hung, Tzu-Chun Wu, Bing-Yu Chen
CHI 2020 Paper

[paper] [video] [30s video] [presentation]

Projects & Demos

FingerPuppet: Finger-Walking Performance-based Puppetry for Human Avatar

Ching-Wen Hung, Chung-Han Liang, Bing-Yu Chen
CHI 2024 Late-Breaking Work

[paper] [video]

Puppeteer: Manipulating Human Avatar Actions with Intuitive Hand Gestures and Upper-Body Postures

Ching-Wen Hung, Ruei-Che Chang, Hong-Sheng Chen, Chung-Han Liang, Liwei Chan, Bing-Yu Chen
UIST 2022 Adjunct

[paper]

Tonopoly: A Monopoly-like Board Game with Toio

Chi-Huan Chiang, Ching-Wen Hung
UIST 2021 Student Innovation Contest

[paper] [video]

Demonstration of ElastOscillation: A VR Controller Providing 3D Multilevel Feedback for Damped Oscillation

Ching-Wen Hung, Tzu-Chun Wu, Hsin-Ruey Tsai, Bing-Yu Chen
CHI 2020 Extended Abstracts

[paper]