"Mount" a Blender camera as system device

Skadge

Posted: Fri Sep 12, 2008 10:22 am
Joined: 08 Mar 2008
Posts: 6
Hello!

I'm a scientist in the robotics field, and we're thinking of starting a new simulation tool based on the BGE.
I was wondering if it *could* be possible to "mount" a Blender camera on the file system as a standard V4L device (ie a device accessible through /dev/video).

Is it currently doable (...I don't think so, but...), or can you think of a "path" to implementing it?

It would be awesome to provide our robot with a video stream similar to "real world" vision.

Séverin


malCanDo

Posted: Fri Sep 12, 2008 7:08 pm
Joined: 21 Oct 2002
Posts: 206
Hi there,


You could check out this link, which relates to getting video working within the GE...
http://www.ash.webstranka.info/?page_id=4

Here is a video showing it in action...
http://www.ash.webstranka.info/?p=41

(Ash has also worked on getting Augmented Reality working within the GE, using ARToolkit.)


I'm not sure if the code is compatible with the latest version of Blender / GE, but I'm sure that if you contact the author he could give you more info (especially if you are able to fund his time to update it, if it's not compatible).

It's great to see the GE being considered for more scientific projects!
Mal


spamagnet

Posted: Fri Sep 12, 2008 7:20 pm
Joined: 01 Jul 2008
Posts: 18
Skadge wrote:
I was wondering if it *could* be possible to "mount" a Blender camera on the file system as a standard V4L device (ie a device accessible through /dev/video).


Blender has a built-in frame server you may be able to hook into. I have no idea if it works with the BGE.
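
As a rough, untested idea (the default port 8080 and the /images/ppm/ URL scheme below are taken from the 2.4x frame server docs and may differ in other versions), pulling one rendered frame out of the frame server over HTTP could look something like:

Code:
# Minimal sketch: fetch a frame from Blender's frame server over HTTP.
# Assumes the frame server is enabled as the render output and listening
# on its default port 8080; the URL scheme may differ between versions.
import urllib.request

FRAMESERVER = "http://localhost:8080"

def grab_frame(frame_number):
    """Fetch one rendered frame from the frame server as raw PPM data."""
    url = "%s/images/ppm/%d.ppm" % (FRAMESERVER, frame_number)
    with urllib.request.urlopen(url) as response:
        return response.read()

# Example: pull frame 1 and hand the PPM bytes to whatever consumes them.
ppm_data = grab_frame(1)
print("got %d bytes of PPM data" % len(ppm_data))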
_________________
OpenFilmmaking / BlenderAVC


malCanDo

Posted: Fri Sep 12, 2008 7:51 pm
Joined: 21 Oct 2002
Posts: 206
Hi there,

I read your message wrong there...

> "mount" a Blender camera on the file system as a standard V4L device (ie a device accessible through /dev/video).

So, you want to have the output of the GE appear as a real-time video, which you can feed into your robot?

I'm not sure if you can access the OpenGL frame buffer directly from the GE / Python, but it is definitely possible. You could then feed that OpenGL image a frame at a time, either directly to the visual input of your robot, or to the input of an intermediate virtual camera device.
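
As a very rough, untested sketch (assuming PyOpenGL is installed next to Blender, and that the script runs from a Python controller inside the GE, where the 2.4x Rasterizer module is available), grabbing the frame buffer could look something like this:

Code:
# Minimal sketch: read the GE's frame buffer from a Python controller.
# Assumes PyOpenGL is importable from Blender and that this runs inside
# the game engine, where its OpenGL context is current.
from OpenGL.GL import glReadPixels, GL_RGB, GL_UNSIGNED_BYTE
import Rasterizer  # BGE 2.4x module, only available inside the game engine

width = Rasterizer.getWindowWidth()
height = Rasterizer.getWindowHeight()

# Read the currently rendered frame as raw RGB bytes (rows are bottom-up).
pixels = glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE)

# 'pixels' could now be pushed out of Blender each frame, e.g. over a pipe
# or socket, towards the robot's vision input or a virtual camera device.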

Mal


Skadge

Posted: Fri Sep 12, 2008 11:57 pm
Joined: 08 Mar 2008
Posts: 6
Thanks for these ideas. I'll investigate them more in depth!


myselfhimself

Posted: Mon Sep 15, 2008 11:05 am
Joined: 30 Jan 2005
Posts: 7
Hi,
I've worked on a robot simulator (comparable to a small car video game) for the Eurobot 2006 competition, built with the BGE, with a view to helping our robotics association get some ideas for their AI development (but the project was never used by them, because I was still discovering Blender day after day).

This camera thing is really something I've been looking for, and I think it could help our association as well.

Here's a setup I thought of early on for a Eurobot competing-robot development toolkit:
* model & texture the environment and the robot in Blender, with things set up to follow physics (i.e. if there are balls, make them physics-enabled; if your robot has wheels, make it controllable...)
* make something able to control the robot externally and fetch signals (camera images and sensor data) out of Blender.

I wrote this quickly and it's a bit of a mess.
Basically, you have Blender as an entry point for: controlling the robot, and reading back data from a fake camera and fake sensors.
The brain of it all is written outside Blender in Python/C, which can then be ported to microcontrollers/processors (see the sketch below).
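
As a rough illustration of the "fetch signals out of Blender" part (the port number and message format here are made up for the example), the BGE side could push its fake sensor readings to the external brain over a socket, something like:

Code:
# Minimal sketch: send fake sensor readings from a BGE Python controller
# to a hypothetical external "brain" process over UDP.  The address and
# the JSON message layout are assumptions for illustration only.
import json
import socket

BRAIN_ADDR = ("127.0.0.1", 9999)  # hypothetical address of the brain process
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_sensor_data(robot_position, wheel_speeds):
    """Serialize fake sensor readings and push them to the brain process."""
    message = json.dumps({"position": robot_position, "wheels": wheel_speeds})
    sock.sendto(message.encode("utf-8"), BRAIN_ADDR)

# Example call, e.g. from a controller run every logic tick:
send_sensor_data([1.0, 2.0, 0.0], [0.5, 0.5])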

@Skadge
Looking your name up on Google shows you're part of Planete Sciences.
If you develop some toolkit similar to the one I've described, would you make it open? I think many engineering schools would be fond of it and would use it, instead of making up yet another development setup each year for Eurobot.

sincerely,
jonathan


Skadge

Posted: Mon Sep 15, 2008 11:29 am
Joined: 08 Mar 2008
Posts: 6
Hello Jonathan,

Yes, I'm also part of Planete Sciences (actually, if you attended some of the Eurobot or Coupe de France events over the past few years, you probably know me... the one with the pink hair).

Of course, if we start a simulator in Blender, it will be GPL'd, like any software from the LAAS lab, where I work.


myselfhimself

Posted: Mon Sep 15, 2008 5:22 pm
Joined: 30 Jan 2005
Posts: 7
Ok, nice! Thank you.

I've found this:
http://www.thedirks.org/v4l2/v4l2dwg.htm
see also: http://www.cryptomath.com/~doug/2007-eos/V4L.pdf
http://v4l.videotechnology.com/
You (or people...) could write a v4l2 driver (that's what you proposed... but maybe v4l2 is easier/cleaner to write than v4l).
OpenCV does support v4l2 and v4l, as far as I've seen: http://osdir.com/ml/lib.opencv/2006-01/msg00699.html
Maybe you use something else for image processing.
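
For the consumer side, once the Blender camera shows up as a V4L/V4L2 device, reading it with OpenCV's Python bindings is just the usual webcam code (device index 0, i.e. /dev/video0, is an assumption here):

Code:
# Minimal sketch: open a V4L/V4L2 device and grab one frame with the
# modern cv2 Python bindings.  Device index 0 maps to /dev/video0 on
# a typical Linux setup; adjust to whatever device the driver exposes.
import cv2

capture = cv2.VideoCapture(0)   # open /dev/video0
if not capture.isOpened():
    raise RuntimeError("could not open the video device")

ok, frame = capture.read()      # grab one BGR frame as a numpy array
if ok:
    print("got a frame of size %dx%d" % (frame.shape[1], frame.shape[0]))

capture.release()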

Jonathan


 