Texturing idea: TextureMontage style from SigGraph05

The interface, modeling, 3d editing tools, import/export, feature requests, etc

Moderators: jesterKing, stiv



Post by NeARAZ »

While reading the publicly available Siggraph 2005 preprints, I found the "Texture Montage" paper/video from Microsoft Research and Caltech guys. Check it out here: http://research.microsoft.com/users/kunzhou

In essence, the workflow idea is this: you have a 3D model and a set of input images (e.g. photos). Now, you specify correspondences - mark surface area on the mesh, mark area on one of the images. You don't have to cover the whole mesh with these marked areas.

Then the system textures the mesh: it calculates unwrapped UVs, semi-automatically textures the unmarked regions, matches colors/gradients along area boundaries, and so on.
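Just to illustrate the boundary color-matching step: the paper does something far more sophisticated, but as a much simpler stand-in (all names here are made up, not from the paper), you could shift one patch's colors so that the mean color along the shared boundary agrees with the neighboring patch:

```python
def match_boundary_colors(patch_a, patch_b, boundary_a, boundary_b):
    """Shift patch_b's colors so its boundary mean matches patch_a's.

    patch_a, patch_b: lists of (r, g, b) pixel colors.
    boundary_a, boundary_b: indices of the pixels lying on the shared seam.
    Returns a shifted copy of patch_b; patch_a is left untouched.
    """
    def mean(indices, patch):
        n = len(indices)
        return tuple(sum(patch[i][c] for i in indices) / n for c in range(3))

    mean_a = mean(boundary_a, patch_a)
    mean_b = mean(boundary_b, patch_b)
    # Per-channel offset that makes the two boundary means coincide.
    offset = tuple(a - b for a, b in zip(mean_a, mean_b))
    return [tuple(p[c] + offset[c] for c in range(3)) for p in patch_b]
```

A real implementation would blend the correction smoothly into the patch interior instead of applying a constant offset, but this shows the basic idea of what "matching colors along boundaries" means.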

For me that seems like a big step forward, compared to the traditional "UV map your model, draw the texture and somehow make it look good along the seams" approach.

I'm not an experienced Blender user (I'm a coder; I've only used Blender to UV-map some objects, because LSCM is cool). How difficult would it be to implement this in Blender? I understand the algorithms themselves, even though they're relatively complex, but I don't know Blender's internals. To me, it seems like this would require:

* The UI to mark "areas" on the mesh (select vertices, the system would find shortest edge paths between them).
* The UI to mark "areas" on the image (draw 2D polygons - in which window type?)
* The UI to nicely select multiple images (they don't have to be materials/textures, just the input images).
* The system/algorithms themselves; this part is pretty much Blender-independent.
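For the first point, the "shortest edge paths between selected vertices" part is just Dijkstra over the mesh's edge graph, weighted by edge length. A minimal sketch (my own hypothetical function, not anything from Blender's code):

```python
import heapq
from math import dist

def shortest_edge_path(verts, edges, start, goal):
    """Dijkstra over a mesh's edge graph.

    verts: dict mapping vertex id -> (x, y, z) position.
    edges: list of (a, b) vertex-id pairs.
    Returns the list of vertex ids from start to goal, or None if unreachable.
    """
    # Build an adjacency list with Euclidean edge lengths as weights.
    adj = {}
    for a, b in edges:
        w = dist(verts[a], verts[b])
        adj.setdefault(a, []).append((b, w))
        adj.setdefault(b, []).append((a, w))

    best = {start: 0.0}   # best known distance to each vertex
    prev = {}             # back-pointers for path reconstruction
    pq = [(0.0, start)]
    while pq:
        d, v = heapq.heappop(pq)
        if v == goal:
            break
        if d > best.get(v, float("inf")):
            continue  # stale queue entry
        for u, w in adj.get(v, []):
            nd = d + w
            if nd < best.get(u, float("inf")):
                best[u] = nd
                prev[u] = v
                heapq.heappush(pq, (nd, u))

    if goal not in best:
        return None
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1]
```

Chaining such paths between consecutively selected vertices would give the closed boundary of a marked surface area.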


Post by tiggs »

Pretty cool, thanks for the link. I think this has a pretty good chance of replacing the traditional approach at some point.

