Prototyping a robotics simulator: some questions

Game Engine, Players & Web Plug-in, Virtual Reality, support for other engines

Skadge
Posts: 0
Joined: Sat Mar 08, 2008 4:39 pm

Post by Skadge » Wed Oct 15, 2008 5:18 pm

Hello,

I'm working in a French robotics lab (LAAS-CNRS), and we are currently surveying several technologies to develop a new simulation platform.

I have been using Blender for a while for other purposes, and I'm now investigating the BGE for this simulation project.

Two technical questions first: my first attempt was to simulate a "laser scanner" (a device that casts laser rays and gets back the distances of the objects around the robot). First, a simple Python script generates a mesh (a half disc made of, let's say, 20 vertices). Then I use an "Always" sensor linked to another script that updates the mesh according to collisions with the surrounding objects (using the KX_GameObject.rayCast() method of Blender 2.47).
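
The generation step looks roughly like this (a simplified sketch against the 2.4x Mesh API, not the exact script; the object name and the 20-unit radius just match what the update script below expects, the other names are illustrative):

Code:

import math
import Blender
from Blender import Mesh, Scene

SEGMENTS = 20   # one vertex per laser ray
RADIUS = 20.0   # matches the 20.0 ray range used in the update script

# Build a flat half disc: a centre vertex plus SEGMENTS rim vertices
me = Mesh.New('LaserScanMesh')
coords = [[0.0, 0.0, 0.0]]
for i in range(SEGMENTS):
    angle = math.pi * i / (SEGMENTS - 1)
    coords.append([RADIUS * math.cos(angle), RADIUS * math.sin(angle), 0.0])
me.verts.extend(coords)

# Triangle fan between the centre and consecutive rim vertices
me.faces.extend([[0, i, i + 1] for i in range(1, SEGMENTS)])

# Link the mesh to the scene under the name the update script looks for
sce = Scene.GetCurrent()
ob = Blender.Object.New('Mesh', 'RobotLaserScanner')
ob.link(me)
sce.link(ob)
Blender.Redraw()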

Here's the update script:

Code:

import Blender

def updateLaser():
    # 'owner' is the KX_GameObject running this controller; it is set at
    # module level elsewhere in the script.
    global owner

    # Get the laser beam mesh
    laser = Blender.Object.Get('RobotLaserScanner')
    mesh = laser.getData()

    # Update the mesh's vertices: cast a ray through each vertex and move
    # the vertex to the hit point (expressed in the scanner's local frame)
    for v in mesh.verts:
        rayDirection = [v.co[0] + owner.getPosition()[0],
                        v.co[1] + owner.getPosition()[1],
                        owner.getPosition()[2]]
        hit = owner.rayCast(rayDirection, owner, 20.0, "")
        if hit[1]:  # something collided
            v.co[0] = hit[1][0] - owner.getPosition()[0]
            v.co[1] = hit[1][1] - owner.getPosition()[1]
    mesh.update()
This works perfectly, but once in Game Engine mode ("P") the mesh is not updated; I have to quit it ("Esc") to see the updated mesh.
Do you know a way to update the mesh dynamically?

Another question: I'd like to use Blender's modular GUI to visualize the various sensors' data and the robot's cameras.
But if I launch the simulation ("P"), only one viewport starts the simulation. Is there a way to start the simulation globally (i.e., in all viewports)?

Then, I have some more general questions:
- is it possible, with Bullet, to closely follow real time (I mean, physical time)? We want to be able to do hybrid simulation (with both simulated and real robots), and that requires the simulator to be able to skip simulation steps in order to stay synchronised with the physical world.
- do you already have a nice set of IPC (over network) tools, or should I start implementing something (for instance, based on Google's very efficient Protocol Buffers)? We'll need, amongst other things, to send images from Blender to clients (robots or simulated robots).
- last (a more technical question): how can I store images from a Blender camera to disk via Python?

Thanks a lot for your answers,
Severin Lemaignan

spikegomez
Posts: 0
Joined: Sat Nov 08, 2008 7:21 am

Post by spikegomez » Sat Nov 08, 2008 7:32 am

Hello, let me look into your problem... I hope I can help you.

ideasman
Posts: 0
Joined: Tue Feb 25, 2003 2:37 pm

Post by ideasman » Sun Nov 09, 2008 1:38 am

You can't use the 'Blender.*' modules in the BGE for anything other than read-only data, and they won't be updated for changes made from the BGE.

For editing a mesh, see http://www.blender.org/documentation/24 ... class.html
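
Something along these lines should work instead (an untested sketch against the 2.47 GameTypes API, run as a Python controller on the laser object; the ray-cast maths stays the same as in your updateLaser()):

Code:

import GameLogic

cont = GameLogic.getCurrentController()
owner = cont.getOwner()          # KX_GameObject running this controller
mesh = owner.getMesh(0)          # KX_MeshProxy of the laser object

for mat in range(mesh.getNumMaterials()):
    for i in range(mesh.getVertexArrayLength(mat)):
        vert = mesh.getVertex(mat, i)       # KX_VertexProxy
        x, y, z = vert.getXYZ()
        # ... compute the new local position with owner.rayCast(),
        # exactly as in the original script ...
        vert.setXYZ([x, y, z])

As far as I know this only changes the displayed mesh, not the physics shape, which should be fine for a laser visualisation.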
