I learned Cg over the weekend and I thought it was very cool... (have you guys seen the totally sweet demos with the Fresnel water and that totally cool refracting bunny?) OK, well...
If I could assemble a team to do this:
We could write a series of Blender shaders as fragment shaders in Cg or GLSL (or both).
Then I guess we could write a vertex shader for smoothing.
Then, with the aid of OpenGL and the GL_ARB_fragment_program and GL_ARB_vertex_program extensions, we could write a very fast OpenGL renderer.
Since we are not going for realtime, we can take many liberties with speed when writing the shaders. Also, these shaders are executed by the GPU (i.e. hardware-assisted), and from my experience the GPU is quite fast.
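To make the idea concrete, a minimal Cg fragment shader in the spirit of what's proposed above might look like this. This is only a sketch; the struct, semantic bindings, and parameter names (`v2f`, `diffuseColor`, `lightDir`) are illustrative and don't correspond to anything in Blender's code:

```cg
// Hypothetical Lambert-style fragment shader sketch in Cg.
// Interpolated inputs arrive from the vertex program via TEXCOORD semantics.
struct v2f {
    float3 normal   : TEXCOORD0;   // surface normal, eye space
    float3 lightDir : TEXCOORD1;   // direction to the light, eye space
};

float4 main(v2f IN,
            uniform float4 diffuseColor) : COLOR
{
    float3 N = normalize(IN.normal);
    float3 L = normalize(IN.lightDir);
    float  ndotl = max(dot(N, L), 0.0);   // clamp backfacing light to zero
    return diffuseColor * ndotl;          // simple diffuse term
}
```

A full Blender-material shader would layer specular, textures, and so on onto this, but each term is the same kind of small per-fragment computation, which is exactly what the ARB fragment-program path runs natively.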
I know both Cg and C++, so I'll probably end up doing most of the Blender coding while the rest of the team works on the shaders.
And yes, this is possible; I've had much success with Cg on my FX5200 Ultra.
Please reply with your thoughts on this matter... I'd LOVE to hear them!
Raytracing is just as feasible on the GPU as on the CPU, but probably wouldn't be much faster (if at all; IIRC, GPUs are optimized for running small instruction streams over large amounts of data, not the heavy branching that raytracing requires).
However, if I know it well enough, all of the features from the pre-2.32 renderer could be done on a video card, though the loss of precision (what precision does Blender work at behind the scenes?) may cause some interesting artifacts.
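On the precision question: Blender computes internally with 32-bit floats, while fragment hardware of this generation often evaluates at 16- or 24-bit precision. Cg actually exposes the distinction in its type system, so the mismatch is easy to demonstrate (a sketch; actual behavior depends on the card, e.g. NV3x parts run `half` faster but with only ~10 mantissa bits):

```cg
// Inside a fragment program: same expression, two precisions.
half  h = (half)(1.0 / 3.0);   // ~3 significant decimal digits
float f = 1.0 / 3.0;           // ~7 digits, matching CPU-side floats
// The difference (f - h) is the kind of error that can show up as
// banding in smooth gradients or acne in lighting terms.
```

So a GPU port of the renderer could look subtly different from the CPU output even when the math is identical.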
This thread is about Rendering
and not GUI
enhancements or Tools,
so I will move it.
Please check closely which forum a topic belongs to before you post...
I say go for it. Here's some food for thought: http://graphics.stanford.edu/papers/photongfx/
My only point of concern is that Cg is NVIDIA-only and probably Windows-only too, but it's a great starting point anyway; it should set things in motion.
If you would like some contribution of the green kind (cash, not weed!), then I'd be happy to help. Just set up a project page with a PayPal account and some kind of development forum, and you should get both brainpower and some funding; after all, time = money.
Oh, I almost forgot: http://www.gpgpu.org/
is THE place to start. There is a large group of GPU developers there, not to mention the reviews and references for GPU programming tutorials, books, resources, and whatnot.
Stay clear of Cg... it's dead; even NVIDIA doesn't want to touch it anymore.
This sounds great in terms of speed. I imagine the following scenario:
- Blender's pre-RT renderer runs completely on the GPU.
- Selective raytracing is done by the CPU.
If this combination is possible, we'll have almost-realtime production-quality renderings. Does this sound realistic?