I don't know if this is the right place to post, so sorry if it isn't.
3D is becoming more popular now, and I wonder whether it will be possible to render more than one cam in a single scene.
This is necessary for stereoscopic 3D images.
Therefore one has to produce one image for the left eye and one image for the right eye, and combine both, e.g. into an anaglyph (red/cyan) image.
Right now I use two scenes with linked objects. One scene (LEFT) has its active camera at the left eye's position, and the other scene (RIGHT) has its active camera at the right eye's position. Then I use the node editor to combine the images of both scenes into an anaglyph image, which can be viewed with red/cyan glasses.
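For anyone curious what that combine step actually does: outside of Blender's node editor, the red/cyan anaglyph merge can be sketched in a few lines of numpy. This is just an illustration of the channel mixing, not Blender's implementation; the image arrays here are made-up placeholders.

```python
import numpy as np

def make_anaglyph(left, right):
    """Combine left/right RGB images (H x W x 3 float arrays) into a
    red/cyan anaglyph: red channel from the left-eye image, green and
    blue channels from the right-eye image."""
    anaglyph = right.copy()          # start with green + blue from the right eye
    anaglyph[..., 0] = left[..., 0]  # take the red channel from the left eye
    return anaglyph

# Tiny synthetic example: a pure-red left image and a pure-blue right image
left = np.zeros((2, 2, 3)); left[..., 0] = 1.0
right = np.zeros((2, 2, 3)); right[..., 2] = 1.0
out = make_anaglyph(left, right)
```

Viewed through red/cyan glasses, each eye then only sees "its" channel(s), which is what produces the depth effect.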
This workflow is time-consuming, since all objects need to be copied and linked into both scenes, and every scene setting has to be done twice, once for the left scene and once for the right.
It would be very helpful if Blender allowed rendering two or even more cameras in one scene. They wouldn't have to render at the same time; one by one is fine as a first approach. The images from all cameras would then have to be available in the node editor (in a kind of render stack) so they can be combined.
Do you think this could be done in a future release of Blender?
Thanks for any information!
Seems this is the wrong thread, but anyway, have a look in the search engines.
The first link I got was
Thanks for the reply, but sorry, the tutorial you posted did not really help.
It still requires TWO scenes to get the render results for the left-eye cam and the right-eye cam, and that is exactly what I wanted to avoid!
So I'll keep my feature request open.
Just add both Cameras to the same scene and switch between them. Parenting them both to an Empty makes it easier to position them.
Thanks, but how does adding two cameras to the same scene help if that is exactly the problem?
With two cams in one scene I can only render them one at a time, and that is the problem for the node editor.
How am I supposed to get the render results of both cameras into the nodes at the same time?
I can render the image from the first cam, save it, and then load it into an image node, but that is also very complicated compared to the two-scene solution I am using now.
A new feature to render more than one cam in the same scene would be great.
Blender should render the cams one by one and put the render results into some kind of image stack. This stack would then have to be available as an input source in the node editor.
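In the meantime, something close to this "render one cam after the other" idea can already be scripted with Blender's Python API (bpy): loop over the cameras, make each one the active camera, render, and save each result to its own file, which the compositor can then load via Image nodes. This is a rough sketch that only runs inside Blender, and the output directory is just an assumed example path.

```python
import bpy
import os

scene = bpy.context.scene
output_dir = "/tmp/render_stack"  # assumed output location, change as needed

# Render the scene once per camera object, one by one, writing each
# result to its own file -- a poor man's "render stack".
for obj in scene.objects:
    if obj.type != 'CAMERA':
        continue
    scene.camera = obj                                   # make this cam active
    scene.render.filepath = os.path.join(output_dir, obj.name + ".png")
    bpy.ops.render.render(write_still=True)              # render and save

# The saved per-camera images can then be loaded into Image nodes in
# the compositor and combined, e.g. into a red/cyan anaglyph.
```

It's not the built-in render stack being requested here, but it avoids duplicating the whole scene.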
This would help a lot. I don't need a complicated stereo 3D camera rig (with focal plane, vergence, camera distance, parallel or converged cams and so on); just this render stack would already be a big help. The stereo rig itself I can build using an empty and so on!
So this request is still open.
I'm wondering how Blender's stereoscopic imaging could be used with VR goggles like the Vuzix VR920 (including head tracking). Does anyone know?