Michelle B. Xu
Using EEG sensors, we aim to capture fluctuations in brain waves associated with calmness and use them to trigger transformative changes within the XR environment. These changes may include walls visually "cracking" upon touch and the emergence of natural elements such as trees and flowing water. This project explores the creation of dynamic spaces that adapt in both function and form, inspiring users to interact with their surroundings in innovative and mindful ways.
March - April 2024
- EEG Sensor
- Unreal Engine
- Blender
- Meta Quest 3
- AR Development
- Signal Processing
Credits:
Lead Artist/Technologist: Michelle B. Xu, Wei Wu
Unreal Artist: Yoki Ding
- Yale University CCAM Ultra Space Symposium, New Haven, April 2024
- NYC XR Guild and Global Events, New York, Oct 2024
Project Overview
In an age where digital and physical realms increasingly converge, our project reimagines human-space interaction within extended reality (XR). Inspired by the philosophies of Arakawa and Gins, we envision a biologically responsive environment that blurs the traditional boundaries between indoor and outdoor spaces. Using EEG sensors to monitor brain activity associated with calmness, this environment adapts dynamically to its users.
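To make the interaction loop concrete, the sketch below shows one common way such a pipeline can be wired up. It is a minimal illustration under stated assumptions, not the project's published code: the sampling rate, the use of relative alpha-band (8-12 Hz) power as a proxy for calmness, and the UDP/JSON bridge into Unreal Engine are all assumptions introduced here for clarity.

```python
# Hypothetical sketch of an EEG-to-XR pipeline: estimate a "calmness"
# value from an EEG window and stream it to a listener in the game
# engine. Not the project's actual implementation.
import json
import socket

import numpy as np
from scipy.signal import welch

FS = 256             # assumed EEG sampling rate, Hz
ALPHA = (8.0, 12.0)  # alpha band, commonly associated with relaxed states


def band_power(samples: np.ndarray, fs: int, band: tuple) -> float:
    """Integrate the Welch power spectrum over a frequency band."""
    freqs, psd = welch(samples, fs=fs, nperseg=min(len(samples), fs * 2))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(np.trapz(psd[mask], freqs[mask]))


def calmness(samples: np.ndarray, fs: int = FS) -> float:
    """Relative alpha power in [0, 1]: alpha power / total spectral power."""
    total = band_power(samples, fs, (1.0, fs / 2 - 1))
    alpha = band_power(samples, fs, ALPHA)
    return alpha / total if total > 0 else 0.0


def send_to_unreal(value: float, host: str = "127.0.0.1", port: int = 7000) -> None:
    """Push the calmness value to a UDP listener in the XR scene, where it
    can drive events such as walls cracking or trees emerging."""
    payload = json.dumps({"calmness": round(value, 3)}).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))


if __name__ == "__main__":
    # Stand-in for a 2-second window of live EEG data.
    window = np.random.randn(FS * 2)
    send_to_unreal(calmness(window))
```

Inside Unreal Engine, a listener bound to the matching port could then map the incoming value onto material or geometry parameters, for example by interpolating a crack-decal opacity or foliage growth factor as calmness rises.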
This project responds to a question posed by the Yale University CCAM Ultra Space Symposium.
Theoretical Framework
Experience Diagram
Challenges and Solutions
While XR excels in crafting responsive environments, the bulkiness of current headsets remains a significant technical barrier to achieving truly immersive interactions.
Project Details
Presentation Video