Tracking people's locations in a room

Game Engine, Players & Web Plug-in, Virtual Reality, support for other engines


FabioFrosaa
Posts: 17
Joined: Tue Mar 06, 2012 1:01 pm

Tracking people's locations in a room

Postby FabioFrosaa » Fri Oct 05, 2012 6:31 pm

Hello everyone.

I'm trying to make an installation with Blender. For that I need to track people's locations in a room: a flat location with x and y, no z needed. I'm wondering how that would be possible and what I need to study to achieve it.

Would a camera, or more than one camera, be enough, using Blender's tracking system?

Or would ultrasound sensors with an Arduino be a better fit?

I'm not asking for a complete solution; a direction for what I should study is my goal.

Thanks in advance.

CoDEmanX
Posts: 894
Joined: Sun Apr 05, 2009 7:42 pm
Location: Germany

Postby CoDEmanX » Fri Oct 05, 2012 10:56 pm

Blender isn't a good choice for real-time tracking of people...

It can recognize motion, but not people.

Maybe google for face tracking / OpenCV.

The Microsoft Kinect is an interesting option too, but it can recognize at most two people, and only within a certain frustum.
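
For illustration, here is a minimal sketch of that face tracking / OpenCV route (an example added here, not from the original post; it assumes OpenCV's Python bindings from the opencv-python package and a webcam at index 0, and the cascade path depends on your install):

```python
# Minimal face-detection sketch with OpenCV's bundled Haar cascade.
# Assumes the opencv-python package and a webcam at index 0.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()
```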
I'm sitting, waiting, wishing, building Blender in superstition...

FabioFrosaa
Posts: 17
Joined: Tue Mar 06, 2012 1:01 pm

Postby FabioFrosaa » Mon Oct 08, 2012 10:26 pm

Thanks. OpenCV looks like a good way to go; it's been a headache to install, though. I'm no programmer, and it asks for C/C++, compilation and all that stuff. But I'm reading through it all and trying to understand.

I wouldn't need facial tracking exactly; something closer to object tracking seems more like it. I imagine a camera on the ceiling pointing down, counting people as dots and using their movement to move objects around the space.
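
As a rough sketch of that ceiling-camera idea (an illustration only, assuming OpenCV 4's Python API and an overhead webcam at index 0), background subtraction can reduce each moving person to an (x, y) dot:

```python
# Subtract the background, find moving blobs, reduce each blob to a centroid.
# Assumes the opencv-python package (OpenCV 4) and an overhead camera at index 0.
import cv2

cap = cv2.VideoCapture(0)
subtractor = cv2.createBackgroundSubtractorMOG2()

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    mask = cv2.medianBlur(mask, 5)                     # remove speckle noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    people = []
    for c in contours:
        if cv2.contourArea(c) < 500:                   # ignore small blobs
            continue
        m = cv2.moments(c)
        cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
        people.append((cx, cy))                        # one dot per person
    print(people)                                      # x/y positions in pixels
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
```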

I'll post progress if I have any.

The Kinect seems like a solution, but it would require me to buy one and to have more programming knowledge, which I do not have.

I have a Wiimote, if there is a chance it could be used.

CoDEmanX
Posts: 894
Joined: Sun Apr 05, 2009 7:42 pm
Location: Germany

Postby CoDEmanX » Mon Oct 08, 2012 10:46 pm

I'm sitting, waiting, wishing, building Blender in superstition...

FabioFrosaa
Posts: 17
Joined: Tue Mar 06, 2012 1:01 pm

Postby FabioFrosaa » Tue Oct 09, 2012 12:43 am

I'm really grateful for your pointers, CoDEmanX.

That is some deep study to get into.

I'll comment on what I learn along the way, but it seems I'll be gone for some days to get through it all.

raoramesh
Posts: 1
Joined: Tue Oct 16, 2012 11:45 am
Location: USA

Postby raoramesh » Tue Oct 16, 2012 11:50 am

Could you tell me what a good choice for real-time tracking of people would be?

Also, if Blender recognizes real-time motion, where can I find info on what input I can give it from an external sensor, camera or otherwise?
Thanks... :)


CoDEmanX wrote:Blender isn't a good choice for real-time tracking of people...

It can recognize motion, but not people.

Maybe google for face tracking / OpenCV.

The Microsoft Kinect is an interesting option too, but it can recognize at most two people, and only within a certain frustum.

FabioFrosaa
Posts: 17
Joined: Tue Mar 06, 2012 1:01 pm

Postby FabioFrosaa » Wed Oct 17, 2012 8:38 pm

Well... I'm no programmer, as I said. I am finishing a visual arts degree, and I'm trying to find a solution more suited to people like me.

After a long search I found the EyesWeb software.
http://www.infomus.org/eyesweb_eng.php

It has a visual programming interface and a library of blocks which you can click and drag onto the workspace to start "programming".

I also found that people have connected it to Blender.
And it has a Wiimote block.

So I was wondering about using the Wiimote and IR LEDs to track people's locations. I don't exactly need a tracking system; I need a system that knows where people are at the moment, with no need to know where they have been.

So far I have got EyesWeb to recognize my Wiimote, and I used an output matrix block, "MatrixValueVsIndexDisplay", to show the information, and it worked. But I got a graph, not a matrix with dots showing the LEDs' locations as I expected.

So I'm far from the solution I hope to get.

When I get some progress I'll post here.
If anyone already knows solutions, like how to connect EyesWeb to Blender, please help.

Thanks.

FabioFrosaa
Posts: 17
Joined: Tue Mar 06, 2012 1:01 pm

Postby FabioFrosaa » Tue Oct 23, 2012 12:40 am

Reporting my latest research; I hope it helps someone in the future, or helps someone to help me now!

I discovered Pure Data. It's nice because it works on Linux, unlike EyesWeb, and it seems to have been used to connect a Wiimote to Blender more often than EyesWeb has.


It seems you need to install "cwiid",
http://abstrakraft.org/cwiid/
which is a Wiimote library for Linux.
It comes with "wmgui", which you can run to test the Wiimote; with it I can see that all of my Wiimote's functions are working properly.
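
For reference, reading the IR blobs directly with cwiid's Python bindings might look roughly like this (an untested sketch, assuming the python-cwiid package on Linux and a working Bluetooth adapter):

```python
# Poll the Wiimote's IR camera via cwiid and print the visible IR blob positions.
# Assumes the python-cwiid bindings on Linux and a paired Bluetooth adapter.
import time
import cwiid

print("Press 1+2 on the Wiimote to pair...")
wm = cwiid.Wiimote()                   # blocks until the Wiimote connects
wm.rpt_mode = cwiid.RPT_IR             # ask for IR camera reports

while True:
    sources = wm.state.get('ir_src') or []
    blobs = [s['pos'] for s in sources if s]   # up to 4 (x, y) IR blobs
    print(blobs)
    time.sleep(0.05)
```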

I followed this step-by-step guide by Winko Erades to install Pure Data and cwiid and to make Pure Data open the Wiimote patch:
http://www.winko-erades.nl/index.php?op ... nt&catid=1

On this site I found a step-by-step guide to making Pure Data send the Wiimote data to Blender:
http://wiki.labomedia.org/index.php/Wii ... _pure_data

It has another pd file that seems to be more up to date than the one from the other site.

Although with both pd files I could get the buttons, LEDs and rumble to work, I got neither the IR nor the accelerometer working.

I still don't know if it is possible to use the IR blobs from Pure Data as inputs for the x and y positions of objects inside Blender, but its output is already closer to what I imagined than the EyesWeb graph I got.
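
As one possible way to close that gap (a sketch under assumptions, not a verified setup: it assumes the x/y pair arrives as a plain-text "x y" message over UDP on port 9000, e.g. from Pure Data's netsend set to UDP, or any small sender), a Blender Game Engine module controller could read the values and move its object:

```python
# bge_udp_position.py -- hypothetical Blender Game Engine side of the link.
# Hook it up as a Python controller in Module mode (bge_udp_position.update),
# triggered by an Always sensor with true-level triggering. Python 3.
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("127.0.0.1", 9000))
sock.setblocking(False)                 # never stall the game loop

def update(cont):
    own = cont.owner                    # the object this controller is attached to
    try:
        data, _ = sock.recvfrom(1024)
    except BlockingIOError:             # no new packet this frame
        return
    try:
        # accept "x y" or Pd-style "x y;" messages
        x, y = map(float, data.decode().replace(";", " ").split()[:2])
    except ValueError:
        return                          # ignore malformed packets
    own.worldPosition.x = x
    own.worldPosition.y = y             # z stays untouched
```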

NZavaloff
Posts: 1
Joined: Fri Oct 26, 2012 10:52 am
Location: Astrakhan

Postby NZavaloff » Fri Oct 26, 2012 10:53 am

The Kinect seems like a solution, but it would require me to buy one and to have more programming knowledge, which I do not have.

I have a Wiimote, if there is a chance it could be used.

FabioFrosaa
Posts: 17
Joined: Tue Mar 06, 2012 1:01 pm

Postby FabioFrosaa » Mon Oct 29, 2012 3:06 pm

NZavaloff wrote:The Kinect seems like a solution, but it would require me to buy one and to have more programming knowledge, which I do not have.

I have a Wiimote, if there is a chance it could be used.


I meant that I would have to learn much more about programming AND I would have to buy a Kinect.

NZavaloff, you can write your useful post now.

jbrem003
Posts: 4
Joined: Tue Jan 01, 2013 11:39 am
Location: NYC

Re: Tracking people's locations in a room

Postby jbrem003 » Tue Jun 25, 2013 8:02 am

FabioFrosaa wrote:Hello everyone.

I'm trying to make an installation with Blender. For that I need to track people's locations in a room: a flat location with x and y, no z needed. I'm wondering how that would be possible and what I need to study to achieve it.


Definitely possible, but Blender might not be your only solution. I'm currently looking into connecting Blender to a program called Isadora. Isadora allows camera-based point tracking using what are called "blobs". A camera alone, without any processing, can't track anything in a room; but if your camera takes a "snapshot" and then compares that snapshot to what the camera is currently seeing through a difference filter, it will detect anything moving and give its shape. If that shape is a top-down, slightly blurred person, then it is a recognizable point that can be tracked in two axes. Getting this data is all possible in Isadora (as long as the lighting stays pretty much the same), but getting it into Blender as a value is the tricky part.
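
For what it's worth, the snapshot-plus-difference-filter step described above can also be sketched in Python with OpenCV (an illustration of the idea added here, not Isadora's internals; it assumes the opencv-python package and a fixed camera at index 0):

```python
# Grab a reference frame of the empty room, then mark whatever differs from it.
import cv2

cap = cv2.VideoCapture(0)
ok, snapshot = cap.read()              # the empty-room "snapshot"
snapshot = cv2.cvtColor(snapshot, cv2.COLOR_BGR2GRAY)
snapshot = cv2.GaussianBlur(snapshot, (21, 21), 0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)
    diff = cv2.absdiff(snapshot, gray)                 # the "difference filter"
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 500:                   # a person-sized blob
            x, y, w, h = cv2.boundingRect(c)
            print("blob at", x + w // 2, y + h // 2)   # trackable 2D point
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
```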

Best of luck!
-Jon
Lucidity and know-how are required to be a revolutionary

~Eugenio Barba

stiv
Posts: 3646
Joined: Tue Aug 05, 2003 7:58 am
Location: 45N 86W

Postby stiv » Tue Jun 25, 2013 5:27 pm

Interesting topic. A minor note: OpenCV, like many libraries, has Python bindings. This makes it easier to integrate other code with Blender. (don't forget: Blender uses Python 3)
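
A quick way to check this on a given machine (an illustration, not an official recipe) is to run the following in Blender's Python console or text editor; it only succeeds if cv2 was installed for the same Python 3 that Blender bundles:

```python
# Confirm which Python Blender runs and whether OpenCV's bindings are visible to it.
import sys
print(sys.version)        # Blender's bundled Python 3

import cv2
print(cv2.__version__)    # if this imports, OpenCV can be used from Blender scripts
```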

