AR Performance art using Game Engine (Python solution?)

Game Engine, Players & Web Plug-in, Virtual Reality, support for other engines

Moderators: jesterKing, stiv

jbrem003
Posts: 4
Joined: Tue Jan 01, 2013 11:39 am
Location: NYC

AR Performance art using Game Engine (Python solution?)

Postby jbrem003 » Tue Jan 01, 2013 11:59 am

Hello board, I have been brainstorming an idea for quite some time and am hoping to toss it out here to see if anyone has ideas for a problem I've been facing in my work as a video designer and theater director.

I have been developing a case study for "augmented reality theater" — using projectors to create immersive environments in my work — for about 2 years now, and I have reached the limit of my knowledge in coding and engineering.

What I would like to accomplish is to use a 3D rendering program (like Blender) and a data-manipulation program (like Isadora) to run a theatrical show that lets an actor exist within a virtual world, which is then video-mapped with projectors into an actual live space/theater.

After researching the capabilities of Blender and Python, this is the solution I propose:

Utilize Blender to create a scale model of the theater/lab space.

Create a suit and a prop that transmit OSC position data (think Wiimote).

Use Isadora to receive position data and relay it into Blender's game engine (compositing the position data with game effects).

Hang projectors in the lab space to match the positions of virtual cameras in the game engine, so the game engine's environment can be video-mapped directly through the projectors, recreating the "in-game" environment in real space in real time. (I'm thinking of using the Syphon framework into Isadora, which would then send the video out to the projectors.)
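The OSC step above could be prototyped in pure Python before wiring up Isadora. Below is a minimal sketch of a parser for simple OSC messages (address string, type tags, float/int arguments); the address "/performer/pos" and port 9000 in the usage comments are placeholders I made up, not anything Isadora mandates:

```python
import struct

def parse_osc(data):
    """Parse a simple OSC message: address, type tags, then arguments.
    Handles only 'f' (float32) and 'i' (int32) arguments."""
    def read_padded_string(buf, offset):
        end = buf.index(b'\x00', offset)
        s = buf[offset:end].decode('ascii')
        # OSC strings are null-terminated and padded to a multiple of 4
        return s, (end + 4) & ~3

    address, offset = read_padded_string(data, 0)
    tags, offset = read_padded_string(data, offset)
    args = []
    for tag in tags.lstrip(','):
        if tag == 'f':
            args.append(struct.unpack('>f', data[offset:offset + 4])[0])
            offset += 4
        elif tag == 'i':
            args.append(struct.unpack('>i', data[offset:offset + 4])[0])
            offset += 4
    return address, args

# Usage inside the game engine (hypothetical port, polled each logic tick):
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.bind(("0.0.0.0", 9000)); sock.setblocking(False)
#   data, _ = sock.recvfrom(1024)
#   address, (x, y, z) = parse_osc(data)
```

Inside the BGE you would bind the non-blocking UDP socket once, poll it every logic tick, and apply the parsed (x, y, z) to the tracked object's position.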

Theoretically, this would allow a character to exist within and control the virtual world created in Blender. I have a theater in downtown NYC where I'm able to workshop these ideas, but I've reached my limit on the programming end. If anyone feels like offering constructive input on this proposed setup (improvements/changes) or would like to help build this framework, I'm in eager need of it. In the meantime I'll be scanning the forums for any other aid and doing what I can to learn more Python :p
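Matching a virtual camera to a physical projector (step 4 of the list above) is largely a matter of matching lens geometry: the virtual camera's horizontal field of view should follow from the projector's throw ratio. A small sketch — the formula is standard projector geometry, and nothing here is specific to Blender:

```python
import math

def fov_from_throw_ratio(throw_ratio):
    """Horizontal field of view (degrees) for a projector with the given
    throw ratio (throw distance / projected image width). The matching
    virtual camera in the game engine should use the same FOV."""
    return math.degrees(2 * math.atan(0.5 / throw_ratio))

# e.g. a 0.5:1 short-throw projector corresponds to a ~90 degree FOV,
# while a long-throw 2.0:1 lens is a much narrower camera.
```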


Reference:
Isadora
Syphon
Video Mapping Wiki Article
Videomapping Demo Video

Thanks!
-Jbrem003
Lucidity and know-how are required to be a revolutionary

~Eugenio Barba

jbrem003
Posts: 4
Joined: Tue Jan 01, 2013 11:39 am
Location: NYC

Tracking people location in a room Thread

Postby jbrem003 » Tue Jan 01, 2013 5:27 pm

Found this in a later posting; it's also very applicable, and I'm looking into the methods described here. Some of this capability is in Isadora, but not all of it. A visit to MIT sounds in the works :)

CoDEmanX wrote:Paper about closed world tracking:
http://vismod.media.mit.edu/pub/tech-reports/TR-403.pdf

Paper Multi-cam multi-user tracking:
http://research.microsoft.com/en-us/um/ ... inal01.pdf

Paper Multi-Object tracking:
https://docs.google.com/viewer?a=v&q=ca ... 7AsQFAxpeA
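For a feel of what those papers automate, the simplest version of closed-world tracking on a fixed camera is background subtraction: diff the current frame against an empty-room reference and take the centroid of the changed pixels. A toy sketch in pure Python (frames as 2-D lists of grayscale values; the threshold of 30 is an arbitrary guess):

```python
def track_blob(background, frame, threshold=30):
    """Return the (x, y) centroid of pixels that differ from the
    background by more than the threshold, or None if nothing moved."""
    xs, ys = [], []
    for y, (brow, frow) in enumerate(zip(background, frame)):
        for x, (b, f) in enumerate(zip(brow, frow)):
            if abs(f - b) > threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

The multi-camera, multi-person methods in the papers exist precisely because this naive approach breaks down under occlusion and changing light — but it is enough to drive a first prototype.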


StompinTom
Posts: 5
Joined: Mon Jan 30, 2006 4:47 pm

Postby StompinTom » Sat Jan 26, 2013 6:32 pm

Hey jbrem003, I'm working on something very similar, also through the game engine.

I've currently got perspective-corrected projections in a room (via a Kinect camera) and am trying to move the code over to the Game Engine to better control it in real-time.

Would be interested in chatting further about what you're up to. Feel free to email at tom.svilans@gmail.com.

Best,

Tom Svilans
tomsvilans.com

rty
Posts: 6
Joined: Fri Jan 25, 2013 6:31 pm

Postby rty » Sat Jan 26, 2013 8:34 pm

It depends; I don't know the details of your project, but Blender uses Python, and Python (CPython, to be fair) is probably the slowest scripting solution I know of.

The Blender Game Engine is also really old, and it takes a deprecated approach built on old OpenGL APIs, so what you produce with it isn't even guaranteed to keep working.

gillespr
Posts: 1
Joined: Sun Feb 17, 2013 6:15 pm
Location: Eugene, Oregon, USA

Re: AR Performance art using Game Engine (Python solution?)

Postby gillespr » Sun Feb 17, 2013 7:51 pm

jbrem003 wrote:Hello board, I have been brainstorming an idea for quite some time and am hoping to toss it out and see if anyone has any ideas for a solution to a problem I've been facing in my work as a video designer and director of theater.


I am a playwright and software engineer who is interested in similar problems. It might be interesting to compare notes.

I am working on slightly more modest problems with Blender and theater.

First, making a 3D model of the theater in Blender.

Second, constructing backdrops, props, et cetera with monochromatic fabric/paint.

Third, making a video of a staged performance with cameras that are calibrated with respect to the location of the stage.

Fourth, using the known geometry of the objects to create UV maps, which are used at compositing time to project images or video onto the objects. A mask is created so that only the portion of the UV map corresponding to a specified chroma key is used (so that actors occluding the backdrops and props don't have the UV-mapped image composited over them).
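The chroma-key masking in that fourth step could be sketched like this — pure Python for clarity, with a placeholder key colour and tolerance; real footage would want a perceptual colour distance rather than this crude channel sum:

```python
def chroma_mask(frame, key=(0, 255, 0), tolerance=60):
    """Per-pixel boolean mask over an RGB frame (2-D list of (r, g, b)
    tuples): True where the pixel is close enough to the key colour
    that the UV-mapped image should be composited in, False where an
    actor or other foreground object occludes the keyed surface."""
    kr, kg, kb = key
    return [[abs(r - kr) + abs(g - kg) + abs(b - kb) <= tolerance
             for (r, g, b) in row]
            for row in frame]
```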

So far I've done proof of concept work around doing each of these steps manually, but automating the process is something of a challenge.

Although I'm continuing to look into using Python within Blender for the whole process, I'm also seriously considering just using Python to export the UV map information in some fashion for use with CImg or FFMPEG or some other more C++-centric platform. As a software engineer, I find that developing large applications in Python poses some serious challenges that are more easily overcome in C++.
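One low-friction way to hand UV data to a C++/CImg/FFMPEG pipeline is a flat text export. The CSV layout below is a hypothetical format invented for illustration; the commented loop in the docstring shows where the records would come from in Blender's Python API:

```python
def export_uvs_csv(uv_records):
    """Serialize UV records to CSV text for an external tool.

    uv_records: iterable of (polygon_index, loop_index, u, v), e.g.
    gathered inside Blender with:
        for poly in mesh.polygons:
            for li in poly.loop_indices:
                u, v = mesh.uv_layers.active.data[li].uv
    """
    lines = ["poly,loop,u,v"]
    for p, l, u, v in uv_records:
        lines.append(f"{p},{l},{u:.6f},{v:.6f}")
    return "\n".join(lines)
```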

I am also interested in the whole process of video mapping, in terms of using projectors both within a traditional proscenium and outdoors at festivals. However, I have been looking into a control process more along the lines of having the sound person (who has more free time than the light person) cue specific video sequences, using a customized panel of buttons on a dedicated computer, based on annotations in the script (similar to the way Foley is done).
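That cue-panel idea could start as nothing more than a lookup table from script annotations to clips; the cue names and filenames below are invented placeholders:

```python
# Hypothetical cue table keyed by annotations written into the script.
CUES = {
    "Q1": "act1_backdrop.mp4",
    "Q2": "storm_sequence.mp4",
}

def fire_cue(name, player):
    """Look up a cue by its script annotation and hand the clip to a
    playback callback (e.g. a function that starts a video player)."""
    clip = CUES.get(name)
    if clip is None:
        raise KeyError(f"unknown cue: {name}")
    player(clip)
    return clip
```

Each button on the panel would then just call `fire_cue` with its annotation.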

Blender figures into the Video Mapping process as a way of projecting onto irregular objects and/or accurately mapping onto the backdrops based upon limitations of projector placement.

Anyway, your project sounds more ambitious from a control standpoint.

One of the things I am trying to get at is efficiently being able to composite the video from the stage with the UV mapped images so that the video of a production on the stage can be composited and routed to the web as quickly as possible. Right now finding some way of using FFMPEG seems to be the most promising approach.
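If FFMPEG ends up being the route, the compositing can be expressed as a filter graph built up in Python and handed to a subprocess. The filter chain and streaming target below are illustrative guesses, not a tested pipeline — `alphamerge` and `overlay` are real ffmpeg filters, but the exact graph would need tuning against the actual footage:

```python
def ffmpeg_overlay_cmd(stage_video, uv_overlay, mask, rtmp_url):
    """Build an ffmpeg command line that merges a grayscale mask into
    the overlay's alpha channel (alphamerge), composites the result
    over the stage footage (overlay), and streams it out as FLV."""
    return [
        "ffmpeg",
        "-i", stage_video,   # camera feed of the stage
        "-i", uv_overlay,    # UV-mapped image/video render
        "-i", mask,          # grayscale mask from the chroma key
        "-filter_complex", "[1][2]alphamerge[ov];[0][ov]overlay",
        "-f", "flv", rtmp_url,
    ]
```

The command list would be passed to `subprocess.run` (or `Popen` for a long-lived stream).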

Would love to discuss your project further.

Cheers!

Robert

jbrem003
Posts: 4
Joined: Tue Jan 01, 2013 11:39 am
Location: NYC

Re: AR Performance art using Game Engine (Python solution?)

Postby jbrem003 » Tue Jun 25, 2013 7:43 am

gillespr wrote:I am a playwright and software engineer who is interested in similar problems. It might be interesting to compare notes.


Completely agreed! Sorry I disappeared for a bit; the thread didn't get many hits for a while, so I hadn't been checking back as often as I should've. Feel free to email me at jbrem003@gmail.com — I'd love to hear how your project ended up going. I've been using a program called Isadora to meet my needs for now, but I hope to add some mocap graphics with an attached physics engine at some point. The video mapping you described is something Isadora is built to handle decently well.

Best,
-Jon

