I will soon begin my PhD studies under Prof. Dr. Xin Tong at HKUST(GZ). Previously, I was a research assistant at the DKU HCI Lab, also supervised by Prof. Dr. Xin Tong. I earned my master's degree in Digital and Interaction Design from Politecnico di Milano, where I completed my thesis under the supervision of Prof. Dr. Xin Tong, Prof. Dr. Ray LC, and Prof. Mario Covarrubias Rodriguez. My research interests include Extended Reality (XR), generative AI, and embodied interaction design; for example, I have built dynamic prototypes such as projection-based AR for language learning and AI-driven virtual pets. If you share a passion for these areas, I would be delighted to connect, exchange ideas, and collaborate.
In Proceedings of Chinese CHI 2023
We designed virtual pets' visual appearances in voxel format, following design guidelines derived from the Five-Factor Model (FFM). Through two user studies, we investigated how people perceive the personalities and appearances of these virtual pets.
In Proceedings of Chinese CHI 2023
Research on privacy education for typically developing (TD) children often overlooks autistic children's unique needs. Our study therefore aims to understand the challenges autistic children face in learning about privacy and to design an effective privacy-education game for them. We designed a serious game, RedCapes, and recruited 9 autistic children and 6 TD children to evaluate it.
Proceedings of the 22nd Annual ACM Interaction Design and Children Conference
We explored the design of WooGu, a prototype combining tangible user interfaces (TUIs) and embodied interactions to improve young children's food literacy. WooGu has three design features: a cube displaying user interfaces, step-by-step tasks guiding children to learn about food from farm to table, and hands-on cardboard props enabling embodied interactions.
2023 IEEE International Conference on Multimedia and Expo Workshops (ICMEW)
In this work, we propose PetGen, an application that generates varied virtual pet appearances following the Five-Factor Model (FFM) using AI algorithms. The 3D voxel pets' appearances are initialized from four archetypes and then pass through recombination, dyeing, and texturing stages, in which AI algorithms are applied as filters.
2022 International Conference on Cyberworlds (CW)
In this research, we designed a 360° VR video-based prototype that depicts real-life scenarios from a refugee camp in southern Bangladesh. Our study had two conditions: active (freely selecting thematic video clips) and passive (watching clips in a fixed sequence).