Pilot phase: 2019-2020
In Fall 2019, MIT.nano began to fit out the Immersion Lab with infrastructure, hardware, and software to introduce the space as a working laboratory.
The Immersion Lab will evolve with advances in technology and user needs. If there are tools that you would like to see in the Immersion Lab, please let us know.
Current tools & capabilities
Motion capture:
- OptiTrack system
Virtual Reality (VR) head-mounted displays and controllers:
- HP Reverb G2 (the workstations can also drive the HTC Vive Pro Eye or Oculus Quest) with hot-swappable batteries
- Alienware Workstations
- HP Z8 Data Science Workstation (coming soon)
3D digital asset creation & photogrammetry:
- Lenscloud for scanning human bodies and large objects (coming soon)
- Matterport Pro2 for interior environments
Video and multimedia creation tools:
- Green screens
- Video cameras
- Lighting kits
Digital Twins of the Immersion Lab:
- Matterport scan. Email us to request access.
- Unity: MIT.nano maintains a virtual model of the Immersion Lab as a Unity build. The model is registered to the physical dimensions of the lab, enabling accurate tracking and boundaries, and can serve as a template for your own projects that will use the Immersion Lab space. To get access to the model, please:
- Create an MIT GitHub account if you do not already have one (http://github.mit.edu)
- Send an email with your GitHub username to email@example.com to request to be added to the MIT.nano Immersion Lab GitHub organization: https://github.mit.edu/mit-nano-immersion-lab/
- Once you’ve been added, you will see two versions of the model:
- ImmersionCVBE: Built for the HTC Vive Pro Eye, with OptiTrack and SteamVR plugins
- ImmersionCVBE_QuestAndroid: Built for Oculus Quest (coming soon)
- You may create your own fork on GitHub and start developing right away.
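As a rough sketch, getting a local working copy after you have been added and have forked the repository might look like the following. The `<username>` placeholder and the exact `ImmersionCVBE` repository URL are assumptions based on the organization path above; check the actual repository page for the correct clone URL.

```shell
# Clone your fork of the Vive Pro Eye build (replace <username> with your
# MIT GitHub username; the repository name is assumed from the list above).
git clone https://github.mit.edu/<username>/ImmersionCVBE.git
cd ImmersionCVBE

# Track the upstream MIT.nano repository so you can pull future updates
# to the registered lab model.
git remote add upstream https://github.mit.edu/mit-nano-immersion-lab/ImmersionCVBE.git
git fetch upstream
```

You can then open the cloned folder as a project in the Unity Editor (for example via Unity Hub's "Add" option) and build for your target headset.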
Don't see a tool you need for your research? Email us with feedback.