MIT.nano Immersion Lab UROP Openings

The MIT Undergraduate Research Opportunities Program (UROP) cultivates and supports research partnerships between MIT undergraduates and faculty—whether you join established research projects or pursue your own ideas. The following are open UROP positions at the MIT.nano Immersion Lab for Fall 2021.

Note: You must be logged in to elx.mit.edu with your MIT Kerberos account to view the links.

1. Physiological manipulation of XR avatars for sports training

(Tags: Biofeedback, XR, Simulation, Motion Capture)
Department: CSAIL

The goal of this project is to create a virtual training environment for fencing. Movements from experienced fencers will be captured using motion capture technology at the Immersion Lab, and presented to beginner fencers in VR. Movements and physiological responses of beginner fencers in the immersive environment will be compared to their responses in a more traditional learning setting.

The UROP student will be responsible for processing motion capture data and transferring the movements of experienced fencers onto virtual avatars captured in the lab using photogrammetry. The student will also be responsible for manipulating the movements of the virtual avatar to create a self-paced training app for beginner fencers. Mentorship and guidance will be provided at every step.
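As a rough, hypothetical illustration of one step of that motion capture processing, the Python sketch below loads marker trajectories from a C3D export; the ezc3d library and the file name are assumptions for illustration only, not a prescribed toolchain.

    # Minimal sketch: load marker trajectories from a motion capture export.
    # Assumes the ezc3d library and a hypothetical file "fencer_lunge.c3d".
    import numpy as np
    import ezc3d

    c3d = ezc3d.c3d("fencer_lunge.c3d")
    points = c3d["data"]["points"]            # shape (4, n_markers, n_frames): x, y, z, residual
    labels = c3d["parameters"]["POINT"]["LABELS"]["value"]
    rate = c3d["parameters"]["POINT"]["RATE"]["value"][0]

    xyz = points[:3]                          # drop the residual row
    print(f"{xyz.shape[1]} markers, {xyz.shape[2]} frames at {rate} Hz")

    # Example per-marker statistic: range of motion along each axis (file units).
    rom = xyz.max(axis=2) - xyz.min(axis=2)   # shape (3, n_markers)
    for label, r in zip(labels, rom.T):
        print(label, np.round(r, 1))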

2. Event-triggered multi-camera capture photography

(Tags: Photography, Bullet-time, Freeze Frame, Motion Capture)
Department: EECS

In this project, we aim to create a networked Raspberry Pi camera system with over 100 cameras that can capture key moments during performances and/or sports training. The moments to be captured will be defined using motion capture and wireless physiological monitoring technology, for example when the activity of a certain muscle increases beyond a specified threshold. The hardware and software required to achieve triggered captures are already set up in the lab.
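As a rough sketch of the triggering logic described above (and not the lab's actual implementation), the Python snippet below broadcasts a capture command to networked cameras once a streamed muscle-activity value crosses a threshold; the UDP broadcast address, threshold, and message format are hypothetical.

    # Sketch of threshold-triggered capture: when a streamed EMG sample exceeds a
    # threshold, broadcast a "capture" command to the networked Raspberry Pi cameras.
    # The broadcast address, threshold, and message format are illustrative assumptions.
    import socket
    import time

    CAMERA_BROADCAST = ("255.255.255.255", 5005)   # hypothetical UDP broadcast target
    EMG_THRESHOLD = 0.6                            # assumed normalized activation level
    COOLDOWN_S = 2.0                               # avoid re-triggering on the same burst

    def read_emg_sample():
        """Placeholder for the lab's wireless EMG stream; returns a normalized value."""
        raise NotImplementedError

    def main():
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        last_trigger = 0.0
        while True:
            sample = read_emg_sample()
            now = time.monotonic()
            if sample > EMG_THRESHOLD and now - last_trigger > COOLDOWN_S:
                sock.sendto(b"CAPTURE", CAMERA_BROADCAST)
                last_trigger = now

    if __name__ == "__main__":
        main()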

Two UROP positions are available with this project: one technology-oriented and one art-oriented. The technology-oriented UROP student will be responsible for linking the triggering to the sensing equipment in the Immersion Lab, which includes motion capture and wireless electromyography, electrocardiography, and force sensors. The art-oriented UROP student will be responsible for unleashing their creativity to make full use of this technology.

3. Immersive game development for lab-based training and visualization

(Tags: Unreal, 4D data, Microscopy, XR, Simulation)
Department: MIT.nano

The goal of this project is to develop lab-based training simulations that teach researchers, scientists, and students best practices and safety measures for complex and costly research methods such as cryo-EM, MRI, and other clinical and non-clinical techniques.

The UROP student should be well equipped and knowledgeable about laboratory instruments, with ideas on how to gamify and create immersive experiences that train users on what to focus on during experiments, where issues can arise, and the dangers that can result from negligence or mistakes.

4. Physiology and musical performance

(Tags: Guitar, Motion Tracking, EKG)
Department: Biology

In this project, we aim to capture the physiological effects of a musical performance. Many people find playing music inherently rewarding, and this project is inspired by preliminary observations in the lab that point to a physiological state common to musical performance and meditation. For this study, several guitarists will be recruited to perform at the Immersion Lab while their movements, physiological responses, and audio are captured. The data will be analyzed to identify biomarkers indicative of enjoying one’s own performance, and biomarkers indicative of stress induced by the guitar strap.

The UROP student will assist with data collection during musical performances, data organization, and analysis of physiological signals (using Python).
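To give a flavor of that signal analysis, here is a minimal Python sketch that estimates heart rate from a recorded ECG trace using standard SciPy filtering and peak detection; the sampling rate and input file name are assumptions, not details of the study protocol.

    # Sketch: estimate heart rate from an ECG recording made during a performance.
    # The sampling rate and input file are assumed for illustration.
    import numpy as np
    from scipy.signal import butter, filtfilt, find_peaks

    FS = 1000.0  # Hz, assumed ECG sampling rate

    ecg = np.loadtxt("performance_ecg.csv")          # hypothetical single-column export

    # Band-pass filter to suppress baseline wander and high-frequency noise.
    b, a = butter(3, [0.5 / (FS / 2), 40.0 / (FS / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)

    # Detect R-peaks and convert inter-beat intervals to beats per minute.
    peaks, _ = find_peaks(filtered, distance=int(0.4 * FS), prominence=np.std(filtered))
    rr = np.diff(peaks) / FS                         # seconds between beats
    print(f"mean HR: {60.0 / rr.mean():.1f} bpm, SDNN: {rr.std() * 1000:.1f} ms")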

5. Immersive environment for manipulation of social experiences

(Tags: XR, Discrimination, Social Psychology)
Department: CSAIL

The purpose of this study is to understand the implications and psychological phenomena that arise through interactions among people in immersive simulations and virtual social experiences. The study seeks to determine which behavioral and physiological responses can be measured during photorealistic social situations, and how virtual human interactions influence them. Much of the data will come from avatar representation and users’ decisions during the experiences, captured through eye tracking, head movements, and other physiological monitoring.

The UROP student will be responsible for constructing specific social experiences for participants to take part in, and for assessing their reactions and behavior. Realism is of high importance, so the student will be able to configure many of the social behaviors used within the simulation from real motion tracking data, facial tracking, and haptic feedback. The simulation will also need to develop dynamically in response to participant decisions, since eye gaze interactions and heart rate monitoring should alter its state.
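As a purely hypothetical sketch of that last idea, the Python snippet below shows a simple rule that maps gaze and heart rate readings to a change in scene state; all names and thresholds are placeholders, not part of the study design.

    # Hypothetical sketch: let eye gaze and heart rate alter the simulation state.
    # All names and thresholds here are illustrative placeholders.
    from dataclasses import dataclass

    @dataclass
    class ParticipantSignals:
        gaze_on_avatar: bool     # is the participant looking at the virtual human?
        heart_rate_bpm: float    # from wireless physiological monitoring

    def next_scene_state(current_state: str, signals: ParticipantSignals) -> str:
        """Simple rule-based transition with assumed thresholds."""
        if signals.heart_rate_bpm > 100:
            return "de-escalate"          # ease social pressure when arousal is high
        if current_state == "neutral" and signals.gaze_on_avatar:
            return "engage"               # avatar initiates conversation when looked at
        return current_state

    # Example: a calm participant fixates on the avatar.
    print(next_scene_state("neutral", ParticipantSignals(gaze_on_avatar=True, heart_rate_bpm=72)))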

6. Influence of music on human locomotor coordination

(Tags: Biomechanics, Physiology, EMG)
Department: Mechanical Engineering

The purpose of this study is to develop a more comprehensive understanding of the effects of rhythm on human locomotor coordination, muscle activation, and muscle length during everyday movements such as walking, running, jumping, and reaching. The intuitive motivation behind this study is the enjoyment of training (e.g. running) with music. In this study, we are interested in monitoring state changes in the physiology of the muscle and understanding how they relate to efficiency and coordination. State changes in muscle length modulation will be monitored using motion capture, and state changes in muscle activity will be monitored using electromyography.
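For context on what the electromyography processing typically looks like, the sketch below computes a standard EMG linear envelope (band-pass, rectify, low-pass) with SciPy; the sampling rate and file name are assumptions for illustration.

    # Sketch of standard EMG pre-processing: band-pass filter, rectify, and
    # low-pass to obtain the activation envelope. Sampling rate and file assumed.
    import numpy as np
    from scipy.signal import butter, filtfilt

    FS = 2000.0  # Hz, assumed EMG sampling rate

    emg = np.loadtxt("gastrocnemius_walking.csv")    # hypothetical single-channel recording

    # Band-pass 20-450 Hz to remove motion artifact and high-frequency noise.
    b, a = butter(4, [20.0 / (FS / 2), 450.0 / (FS / 2)], btype="band")
    emg_filt = filtfilt(b, a, emg)

    # Rectify and low-pass at 6 Hz for the linear envelope.
    b_env, a_env = butter(4, 6.0 / (FS / 2), btype="low")
    envelope = filtfilt(b_env, a_env, np.abs(emg_filt))

    print(f"peak activation (arbitrary units): {envelope.max():.3f}")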

The UROP student will help their mentor with subject recruitment, data collection, processing and analysis. This is an excellent project to get hands-on experience with human subjects research.

7. Biofeedback user interfaces and their impact on movement training

(Tags: Delsys, Programming, XR, UI)
Department: EECS

In this project, we are looking to develop efficient, easy-to-use user interfaces for biofeedback in simulation training and XR experiences. Movements and details about human physiology can be captured using motion tracking and biosensors such as EKG and EMG. A user interface where users can visualize and interpret their body movements can provide real-time biofeedback for training purposes in sports, music, and education, and can help display how the body is changing during events.

There are two UROP positions available for this project: one to develop the user interfaces (developer), and another to assess the efficacy of biofeedback on movement training (researcher). The developer will shadow their mentor when interacting with sports coaches to understand what types of biofeedback are useful for improvement in their field, and will then be responsible for programming in Unity/Unreal Engine to bring those ideas to fruition. The researcher will design a study with their mentor to assess whether the biofeedback is indeed producing an improvement in the athlete, and will be responsible for recruiting athletes, data collection, and analysis.

8. Influence of training on posture

(Tags: Photogrammetry, 3D Mesh Alignment, Anatomy)
Department: CSAIL

Tracking the posture and body shape of an individual over time can yield useful insights into habits and training that exert a positive influence on the body and perhaps well-being, and can also be used to track the progress of recovery from injuries. In this project, we aim to acquire 3D scans of a body at different time points and align the scans to quantify and track changes in posture over time.

The UROP student will be taught to use the photogrammetry scanner to acquire raw data, create 3D models from the data, align 3D scans acquired at different time points, and extract quantifiable information from the 3D models.
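As a rough illustration of what aligning two such scans could involve, here is a Python sketch using point-to-point ICP; the use of the Open3D library, the file names, and the distance threshold are assumptions, not the project's prescribed tools.

    # Sketch: align two photogrammetry scans from different time points with ICP.
    # The Open3D library, file names, and threshold are illustrative assumptions.
    import open3d as o3d

    scan_t0 = o3d.io.read_point_cloud("scan_january.ply")
    scan_t1 = o3d.io.read_point_cloud("scan_april.ply")

    # A coarse initial alignment is assumed; ICP refines it.
    threshold = 0.02  # max correspondence distance (assumed meters)
    result = o3d.pipelines.registration.registration_icp(
        scan_t1, scan_t0, threshold,
        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())

    aligned = scan_t1.transform(result.transformation)

    # Per-point distances between the aligned scans give a simple change measure.
    distances = aligned.compute_point_cloud_distance(scan_t0)
    print(f"fitness: {result.fitness:.3f}, mean change: {sum(distances) / len(distances):.4f}")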