Hardware assisted rendering

Blender's renderer and external renderer export

Moderators: jesterKing, stiv



Post by unixminion » Fri Feb 20, 2004 1:47 am

I learned Cg over the weekend and I thought it was very cool... (have you guys seen the totally sweet demos with the Fresnel water and that totally cool refracting bunny? ok.. well..)

If I could assemble a team, here's what we'd do:

We could write Blender's shaders as a series of fragment shaders in Cg or GLSlang (or both), and then I guess we could write a vertex shader for smoothing.

Then, with the aid of OpenGL and the GL_ARB_fragment_program and GL_ARB_vertex_program extensions, we could write a very fast OpenGL renderer.

Since we are not going for realtime, we can take a lot of liberties with speed when writing the shaders. These shaders also execute on the GPU (note: hardware assisted), and in my experience the GPU is quite fast. :-D
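To make the idea concrete, here's a rough sketch of what one such fragment shader could look like — a minimal GLSlang shader for a Lambert diffuse + Phong specular material, loosely in the spirit of Blender's default material. All the names here (varyings, uniforms, the "hardness" parameter) are my own illustrative assumptions, not anything from Blender's actual code:

```glsl
// Sketch: one Blender-style material (Lambert diffuse + Phong specular)
// as a GLSlang fragment shader. Illustrative only; names are assumed.
varying vec3 normal;      // interpolated surface normal (from vertex shader)
varying vec3 lightDir;    // direction from fragment to light
varying vec3 eyeDir;      // direction from fragment to viewer

uniform vec3 diffuseCol;  // material diffuse colour
uniform vec3 specCol;     // material specular colour
uniform float hardness;   // Phong exponent, like Blender's "Hard" value

void main()
{
    vec3 N = normalize(normal);
    vec3 L = normalize(lightDir);
    vec3 E = normalize(eyeDir);

    // Lambert diffuse term
    float diff = max(dot(N, L), 0.0);

    // Phong specular term: reflect the light about the normal
    vec3 R = reflect(-L, N);
    float spec = pow(max(dot(R, E), 0.0), hardness);

    gl_FragColor = vec4(diffuseCol * diff + specCol * spec, 1.0);
}
```

Each of Blender's material options would become a term (or a variant shader) like the two above, and since we're offline rendering, multi-pass tricks are fine where a single pass won't fit.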

I know both Cg and C++; I'll probably end up doing most of the Blender coding while the rest of the team is busy with the shaders. ;)

And yes, this is possible; I've had much success with Cg on my FX5200 Ultra.

Please reply with your thoughts on this matter.. I'd LOVE to hear them!


Post by z3r0_d » Fri Feb 20, 2004 5:57 am

Raytracing is just as feasible on the GPU as on the CPU, but probably wouldn't be much (if any) faster; IIRC, GPUs are optimized for running small programs over large amounts of data, not for the heavy branching that raytracing would require.

However, if I know it well enough, all of the features of the pre-2.32 renderer could be done on a video card, though the loss of precision (what precision does Blender work at behind the scenes?) may cause some interesting artifacts.


Post by sten » Fri Feb 20, 2004 9:46 am

This thread is about rendering, not GUI enhancements or tools, so I will move it.

Please check closely which forum a topic belongs in before you post...



Post by Darqus » Fri Mar 05, 2004 12:03 pm

I say go for it. Here's some food for thought: http://graphics.stanford.edu/papers/photongfx/ My only point of concern is that Cg is NVIDIA-only and probably Windows-only too, but it's a great starting point anyway. It should set things in motion.

If you would like some contribution of the green kind (cash, not weed! :lol: ), then I'd be happy to help. Just set up a project page with a PayPal account and some kind of development forum, and you should get both brainpower and some funding; after all, time = money.


Post by Darqus » Fri Mar 05, 2004 1:02 pm

Oh, I almost forgot: http://www.gpgpu.org/ is THE place to start. There is a large group of GPU developers there, not to mention the reviews and references of GPU programming tutorials, books, resources and whatnot. :idea:


Post by green » Fri Mar 05, 2004 3:26 pm

Stay clear of Cg... it's dead; even NVIDIA doesn't want to touch it anymore.


Post by solmax » Fri Mar 05, 2004 4:58 pm

This sounds great in terms of speed. I imagine the following scenario:
- Blender's pre-RT renderer runs completely on the GPU.
- Selective raytracing is done by the CPU.

If this combination is possible, we'll have almost-realtime, production-quality renderings. Does this sound realistic?

