The process of creating a 3D scene for display in an immersive environment has been a research topic since the first VR devices. We address the case of fulldomes, which can host up to tens of spectators. Creating content for such an environment requires frequent back and forth between the creator's workstation and the dome, due to the video format. Similar concerns apply to sound, as fulldomes often feature multi-channel sound systems.
In this talk, we present a platform built around Blender, named EIS, which aims at shortening the content production pipeline by letting creators prototype while inside the dome. Using VR controllers, the user can import objects, images, and sounds; transform them; sculpt and paint; navigate through the scene; and record an animation.
The EIS platform is a Blender addon that relies on other addons for: