interactive lighting for rendered images

General discussion about the development of the open source Blender

Moderators: jesterKing, stiv

TorQ
Posts: 29
Joined: Wed Jan 29, 2003 2:03 am

interactive lighting for rendered images

Post by TorQ » Tue Aug 09, 2005 8:54 pm

Really cool paper from this year's SIGGRAPH. Would be nice to see this in Blender in the future!

http://tinyurl.com/9nb9u

http://www.vidimce.org/publications/lpi ... ressed.pdf

http://www.vidimce.org/publications/lpi ... ggraph.mp4


Abstract

In computer cinematography, the process of lighting design involves placing and configuring lights to define the visual appearance of environments and to enhance story elements. This process is labor intensive and time consuming, primarily because lighting artists receive poor feedback from existing tools: interactive previews have very poor quality, while final-quality images often take hours to render.

This paper presents an interactive cinematic lighting system used in the production of computer-animated feature films containing environments of very high complexity, in which surface and light appearances are described using procedural RenderMan shaders. Our system provides lighting artists with high-quality previews at interactive framerates with only small approximations compared to the final rendered images. This is accomplished by combining numerical estimation of surface response, image-space caching, deferred shading, and the computational power of modern graphics hardware.

Our system has been successfully used in the production of two feature-length animated films, dramatically accelerating lighting tasks. In our experience interactivity fundamentally changes an artist's workflow, improving both productivity and artistic expressiveness.

malefico
Posts: 43
Joined: Mon Oct 14, 2002 6:51 am

Post by malefico » Thu Aug 11, 2005 4:17 pm

:shock: :shock: :shock: :shock: :shock: :shock: :shock:

Toon_Scheur
Posts: 0
Joined: Sat Nov 06, 2004 6:20 pm

Not a chance

Post by Toon_Scheur » Thu Aug 11, 2005 5:03 pm

I always skip those hardware/GPU-assisted papers. You can't put that in Blender.

I think that this will stand a much better chance to be implemented in Blender:
http://graphics.ucsd.edu/papers/plrt/

This is like radiosity on steroids. I've read the paper, and the solution is well explained and very elegant.

solmax
Posts: 86
Joined: Fri Oct 18, 2002 2:47 am
Contact:

Re: Not a chance

Post by solmax » Fri Sep 09, 2005 10:50 pm

Toon_Scheur wrote:I always skip those hardware/GPU-assisted papers. You can't put that in Blender.
Why is that?

z3r0_d
Posts: 289
Joined: Wed Oct 16, 2002 2:38 am
Contact:

Re: Not a chance

Post by z3r0_d » Sat Sep 10, 2005 2:14 am

solmax wrote:
Toon_Scheur wrote:I always skip those hardware/GPU-assisted papers. You can't put that in Blender.
Why is that?
Because it will not work everywhere.

[To some extent there are features in some versions of Blender that aren't in others, like Windows codec support only existing on Windows... but this would be annoying. What if someone has an ATI card? What about a GeForce 2 instead of a 6800? What about Intel chipsets? SiS? A Voodoo card?]

joeri
Posts: 96
Joined: Fri Jan 10, 2003 6:41 pm
Contact:

Post by joeri » Sat Sep 10, 2005 10:13 am

Don't make more out of it than what it is.

This, or something like this, would be perfectly possible in blender.

Don't let the 0.1 sec mislead you; that's the second render pass. The first one still takes the full 1000 sec.

It's a "where shall I put the light" tool.
You render an image without lighting and store it, then render a hardware-only light image (standard OpenGL) and multiply the two. Now the artist can adjust the light and see in real time what that will do to the image.
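The store-then-multiply idea described above can be sketched in a few lines. This is a toy illustration, not code from the paper or from Blender; the function name and the tiny 1x2 "images" are made up for the example:

```python
# Toy sketch of two-pass relighting: one expensive unlit pass is rendered
# once and cached, then a cheap light-only pass is recomputed each time the
# artist moves a light and multiplied in per pixel.

def relight(unlit, light):
    """Per-pixel multiply of a cached unlit pass and a light-only pass.

    Both arguments are lists of rows of (r, g, b) tuples with channel
    values in [0, 1]; the result is clamped to 1.0.
    """
    return [
        [tuple(min(1.0, u * l) for u, l in zip(up, lp))
         for up, lp in zip(urow, lrow)]
        for urow, lrow in zip(unlit, light)
    ]

unlit = [[(0.8, 0.5, 0.2), (0.1, 0.1, 0.1)]]   # the slow "1000 sec" pass, cached
light = [[(0.5, 0.5, 0.5), (1.0, 1.0, 1.0)]]   # the fast "0.1 sec" light pass
preview = relight(unlit, light)                 # interactive preview image
```

Moving a light only requires re-rendering the cheap light pass and redoing the multiply, which is why the update feels instant even though the cached pass was expensive.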

rcas
Posts: 0
Joined: Tue Aug 31, 2004 6:08 pm
Location: Portugal
Contact:

Post by rcas » Sun Sep 11, 2005 2:27 pm

joeri wrote:Don't make more out of it than what it is.

This, or something like this, would be perfectly possible in blender.

Don't let the 0.1 sec mislead you; that's the second render pass. The first one still takes the full 1000 sec.

It's a "where shall I put the light" tool.
You render an image without lighting and store it, then render a hardware-only light image (standard OpenGL) and multiply the two. Now the artist can adjust the light and see in real time what that will do to the image.
Isn't lighting the most time-consuming part of any render?
How to use a Blender:
Put your model, rig, animation and textures in the Blender, turn the Blender on and wait for it to Render, then turn the Blender off and show it to your friends.

Toon_Scheur
Posts: 0
Joined: Sat Nov 06, 2004 6:20 pm

Post by Toon_Scheur » Sun Sep 11, 2005 4:06 pm

Isn't lighting the most time-consuming part of any render?
Shadow calculation is the most expensive part.

Joeri: to explain those GPU-assisted solutions: it cannot be implemented in Blender. GPU means Graphics Processing Unit, which means that Pixar had custom hardware designed and built for that particular implementation.

Even if it is off-the-shelf hardware, it isn't guaranteed to be available for all platforms, all OSes, all users, etc.

Since Blender is CPU rendering only (which is one of the factors that makes it platform independent), I cannot see how this could ever be implemented in Blender.

LetterRip
Posts: 0
Joined: Thu Mar 25, 2004 7:03 am

Post by LetterRip » Sun Sep 11, 2005 8:21 pm

Toon,
GPU means Graphics Processing Unit, which means that Pixar had custom hardware designed and built for that particular implementation.
I seriously doubt they used custom hardware. Any decent graphics card contains a GPU.
Even if it is off-the-shelf hardware, it isn't guaranteed to be available for all platforms, all OSes, all users, etc.
Ton has stated that GPU acceleration is acceptable as long as there is a non-GPU-accelerated path.

Also, there are high-level GPU programming languages that can compile to CPU code if an adequate GPU is unavailable.

LetterRip

joeri
Posts: 96
Joined: Fri Jan 10, 2003 6:41 pm
Contact:

Post by joeri » Sun Sep 11, 2005 10:27 pm

" Joeri:To explain those GPU assisted solutions: It cannot be implemented in Blender. GPU means Graphics Proccesing Unit, which means that Pixar had custom hardware designed and built for that particular implementation. "

I don't get your point.
Why would blender not use anything like Cg ?
Ton once told me blender would never run under windows...
It just needs to become beyond standard and then he will goble behind the fact.

btw "GPU means Graphics Proccesing Unit..."
Thats not what they use, they use Cg (shader language), blender already uses GPU, thats why the wireframe, gouraud and texturing is realtime, without your nvidia/ati board even your blender gui would be set back to a grinding slow.

Toon_Scheur
Posts: 0
Joined: Sat Nov 06, 2004 6:20 pm

Post by Toon_Scheur » Mon Sep 12, 2005 1:41 am

Blender doesn't use the GPU for rendering. It does use the GPU through OpenGL (needless to say, a graphics library available for every platform), but only for its interface and 3D view. Wuuuu, Gouraud shading. That is so 80's.

On a serious note, right now it is possible to do some GPU bump mapping and other fairly ordinary stuff with a good ATI or NVIDIA card, and I heard that the PS3 will have GPU translucency. But when I said before that GPU-assisted technology will not make its way into Blender, it was in the spirit that there is no off-the-shelf (or at least widely available) video card that could produce SSS, full GI, caustics, path tracing and whatnot. Please correct me here if I'm mistaken. So if I read a research paper about GPU-assisted caustics, SSS or whatever, I'd bet my life that they are talking about custom-built hardware.

joeri
Posts: 96
Joined: Fri Jan 10, 2003 6:41 pm
Contact:

Post by joeri » Mon Sep 12, 2005 11:56 am

Toon_Scheur wrote:Blender doesn't use the GPU for rendering. It does use the GPU through OpenGL (needless to say, a graphics library available for every platform), but only for its interface and 3D view. Wuuuu, Gouraud shading. That is so 80's.

On a serious note, right now it is possible to do some GPU bump mapping and other fairly ordinary stuff with a good ATI or NVIDIA card, and I heard that the PS3 will have GPU translucency. But when I said before that GPU-assisted technology will not make its way into Blender, it was in the spirit that there is no off-the-shelf (or at least widely available) video card that could produce SSS, full GI, caustics, path tracing and whatnot. Please correct me here if I'm mistaken. So if I read a research paper about GPU-assisted caustics, SSS or whatever, I'd bet my life that they are talking about custom-built hardware.
I'm sorry, I think there is a mixup over the word "rendering".
Currently Blender does use the GPU for image rendering: just press the little picture icon for an image, or Shift+picture icon for an animation.
It is perfect for rendering, for example, masks, or an extra light pass. (Maybe you don't use it that often, but that doesn't mean it's not there.)
So... "GPU-assisted technology" (tm) is already in Blender.

Cg makes it possible to program any shader you want on an ordinary factory gfx card, and multiple passes make it possible to combine any material setting. I don't see why Pixar should build their own gfx board, so I would not bet your life on it if I were you.

Another issue in that PDF document is colorspace. The images look very much the same in that document, but that's also due to JPEG colorspace. Pixar renders for film, not 24-bit video. That's one of the issues Blender will face in project Orange. Hmm... I'm getting off track here.
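The film-versus-24-bit point above can be shown with a tiny sketch. This is a hypothetical illustration (the function name is made up): quantizing float radiance to 8 bits per channel clips highlights and collapses nearby dark values, so two images that look alike on screen can differ in the underlying data:

```python
# Toy sketch: film pipelines keep floating-point radiance, while 24-bit
# video (8 bits per channel) clips HDR values above 1.0 and quantizes the
# rest to 256 levels.

def to_8bit(value):
    """Quantize one linear float channel to an integer in [0, 255]."""
    return max(0, min(255, round(value * 255)))

hdr = [0.001, 0.0012, 1.0, 2.5, 3.0]   # float radiance values (HDR)
ldr = [to_8bit(v) for v in hdr]        # what 24-bit video keeps of them
# 2.5 and 3.0 both clip to 255; 0.001 and 0.0012 collapse to the same level.
```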

The point is to render, very fast, an image that looks like the image you are going to render later on, so the artist can interactively position the lights. And I don't see why the Blender developers can't come up with a solution for that, using the GPU or not. The need for it is maybe not that big, as Blender images don't take 4 hours to render. Once they do (with SSS, GI, caustics and X-hype), then surely a pixel-shader programmer will have joined the Blender dev team?

Money_YaY!
Posts: 442
Joined: Wed Oct 23, 2002 2:47 pm

Post by Money_YaY! » Mon Sep 12, 2005 8:57 pm

My question is: why can't anyone just 'try' to build a Blender that uses the GPU's extra features? Whatever the platform, even if it does not work on all hardware, just get a prototype working and show what it can do?

Because from what I have seen, with the right tooling they can really, really show some nice images, even at low poly counts.

So many people have good cards. Even basic ones for $100-200 can run pixel shaders now, and that means realtime normal maps and antialiasing... two HUGE time savers for detail, and even lighting and other stuff.

LetterRip
Posts: 0
Joined: Thu Mar 25, 2004 7:03 am

Post by LetterRip » Mon Sep 12, 2005 9:15 pm

my question is why cant anyone just 'try' and build a blender that uses the gpu cards extra features?
If you find someone with the time and inclination, then they can. All programming work takes time, time spent implementing such a feature is time that could be spent implementing another feature. So until a programmer comes around who has a desire and skill to implement it, it probably won't happen.

LetterRip

Money_YaY!
Posts: 442
Joined: Wed Oct 23, 2002 2:47 pm

Post by Money_YaY! » Mon Sep 12, 2005 9:40 pm

LetterRip wrote:
my question is why cant anyone just 'try' and build a blender that uses the gpu cards extra features?
If you find someone with the time and inclination, then they can. All programming work takes time, time spent implementing such a feature is time that could be spent implementing another feature. So until a programmer comes around who has a desire and skill to implement it, it probably won't happen.

LetterRip
I know, I know :P but I have seen a few coders talk about trying it, and having the free time, but it gets picked apart so much by other coders that it kind of scares them away.

The fact that Blender even has that GPU render button shows that there is some kind of link that could be made with a little tweaking.

Hell, I am even trying again. That's not much to say, but I am more than willing to try to get GPU power in, because damn, I can't stand waiting to see a one-hour render screw up. Even a 10-minute render pisses me off.

Post Reply