I would like to see the scanline renderer stay for preview renders and, as Angelo said, for animations as well.
The scanline renderer will still be very powerful for animations if others (not me, since I am no coder) can get it to work like 3DS Max's scanline renderer, which can do raytracing.
For the GI/caustics/raytracing alone, I think the best way is to be able to internally select an external renderer (POV-Ray, Lightflow, etc.) and then just press the render button. But that is just my opinion, and it is probably very difficult to implement. But we will see.
How about both... scanline and raytrace... and the user can choose which one to use... but not external renderers... that's what plugins are for... export and render with other software, for example BMRT...
It would only be necessary to be able to write POV-Ray files, which shouldn't be that difficult, considering there are already several Python scripts that do that (though not very well).
I suppose it would be possible to directly output some kind of intermediate format and skip the file altogether, but the POV-Ray syntax is fairly simple and well documented.
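To make the point that such an exporter is not much work: here is a minimal sketch in Python of what writing a .pov file could look like. The triangle data and the camera/light placement are placeholders, not anything pulled from Blender's actual structures, and only the basic POV-Ray `camera`/`light_source`/`mesh`/`triangle` syntax is used.

```python
def export_pov(path, triangles, color=(0.8, 0.2, 0.2)):
    """Write a minimal POV-Ray scene containing one mesh.

    `triangles` is a list of 3-tuples of (x, y, z) vertex tuples --
    placeholder data standing in for whatever a real exporter would
    pull out of Blender's mesh structures.
    """
    def vec(v):
        return "<%g, %g, %g>" % v

    lines = [
        "// generated scene -- minimal POV-Ray export sketch",
        "camera { location <0, 2, -5> look_at <0, 0, 0> }",
        "light_source { <4, 8, -3> color rgb <1, 1, 1> }",
        "mesh {",
    ]
    for a, b, c in triangles:
        lines.append("  triangle { %s, %s, %s }" % (vec(a), vec(b), vec(c)))
    lines.append("  pigment { color rgb %s }" % vec(color))
    lines.append("}")
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")

# one placeholder triangle
export_pov("scene.pov", [((0, 0, 0), (1, 0, 0), (0, 1, 0))])
```

A real exporter would of course also have to handle materials, lamps, and transforms, which is where the existing scripts fall short.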
RiNiK wrote: How about both... scanline and raytrace... and the user can choose which one to use... but not external renderers... that's what plugins are for... export and render with other software, for example BMRT...
I think it's a good idea ("and the user can choose which one to use"), but if we could also use external renderers, that would be very good.
There was some discussion on this during the conference, and also during the last days of NaN. As I said before, I'll help out if help is wanted, but somehow I didn't receive much feedback about continuing work on the renderer... I gather other renderers would be welcome. Is anyone interested in getting new stuff into the current renderers?
Anyway, putting in support for multiple renderers is a good idea. There's no reason to kick out the old renderer. Tightly coupling Blender to other apps is mainly a question of how difficult it is to embed the other program in external code. You can always throw around some system calls in Python, Perl, etc., so if the renderer can be called from the command line, that should be good enough. Another question is how tightly you _want_ to embed. The render daemon also only used Blender from the command line, so why not other renderers?
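The "throw around some system calls" approach could be as small as the sketch below. It just builds a POV-Ray command line (the `+I`/`+O`/`+W`/`+H` switches are POV-Ray's classic input/output/size options) and hands it to `subprocess`; actually invoking the binary is behind a flag so the sketch runs even where POV-Ray isn't installed.

```python
import subprocess

def render_with_povray(scene="scene.pov", image="out.png",
                       width=640, height=480, run=False):
    """Build (and optionally execute) a POV-Ray command line.

    Assumes POV-Ray's classic +I (input), +O (output), +W/+H (size)
    command-line switches. With run=False the command is only built,
    so the sketch works without POV-Ray installed.
    """
    cmd = ["povray", "+I" + scene, "+O" + image,
           "+W%d" % width, "+H%d" % height]
    if run:
        subprocess.run(cmd, check=True)  # raises if the render fails
    return cmd

print(render_with_povray())
```

The same pattern works for any renderer with a command-line front end; only the option syntax changes per backend.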
I'm not an artist, so I'm not very well acquainted with the data formats of other renderers. If anyone can help me out with that, I can help out with the code. Any URLs? I prefer very formal, mathematical specs :p (If you have Python scripts for translating scenes, those are easily translated into C/C++...)
One thing that will become very important very soon is how to translate material properties, lamp settings, etc. For RenderMan: what is a nice interface for defining shaders? Not everyone will want to code his/her own shaders, but some will... So: what should change in the interface to enable multiple, possibly very different, renderers without making it a mess, or a horrible headache to use? Does anyone have something to add to this?
You can start looking in src/blender/renderconverter and src/blender/render if you feel like it, and you'll need to understand what the main data structures look like (hint: look at src/blender/blenkernel/BKE_main.h, and the stuff it uses from src/blender/makesdna). Um, enough for now... /me shuts up.
It's best, IMHO, to keep the scanline renderer for quick previews and animation, but for stills (what I'm most interested in) a good raytracer (either internal or external) is necessary. Lightflow seems to be a good one that's still available (as opposed to BMRT) and has both C++ and Python interfaces. Aqsis and others support the RenderMan format, but from what I've seen of them, they are not as far along as Lightflow is. Whatever is supported should be free, so that it's in line with the rest of the development of Blender.
Due to its interfaces, Lightflow is probably much easier to support, and some work has been done; however, materials and lighting were very difficult to get right in previous efforts.
I have nothing against using Lightflow, I have seen examples, and it looks good.
I know a little about POV-Ray; it has a clean C-style file syntax, which means no complicated app integration is needed. It works well, is proven and very well supported, and POV-Ray files are easily hand-edited/created.
The POV-Ray export scripts take a recursive approach and are run directly from within Blender. I can upload them and post links if anybody is interested.
That way, when Blender is ported to other OSes like IRIX and Solaris, EVERYONE will be able to have a raytracer, not just Linux, OS X, and Windows users.
I wish we could have an accurate poll depicting the usage stats on different OSes, so we could make a good decision on what to do.
What about render plugins? Let's make Blender's renderer the default plugin and, over time, write other plugins for Blender. Compared to interface plugins, this is quite a sane plan. Every plugin should define its own material interface and render window, so with BMRT we could edit BMRT materials. Additionally, every plugin should provide a fallback subset of its material data for the default plugin and for exporting.
I think that some raytracer is really necessary, but I don't think that pure POV-Ray and pure Blender working together is a good solution, because there are many things that aren't "compatible". For example, lights: Blender's square spot lamps cannot be created in POV-Ray. POV-Ray also has many things that would not be useful when working together with Blender, for example its scripting, ...
I think that POV-Ray could be a good base for some NEW raytracer.
1) The renderer should be improved with better (and probably faster) antialiasing
2) more export formats (.RIB, .POV, .VIB) should be added IN Blender, not just as scripts.
I think that for stills, at least, we should definitely be using raytracing. And I don't think it's necessary to reinvent the wheel, not when there are MANY excellent free renderers out there (VirtuaLight, 3Delight, BMRT, POV-Ray, Aqsis, Lightflow).
If we're going to integrate Blender with other renderers, the FIRST STEP, I think, is to somehow allow the export of materials as RENDERMAN SHADERS!!! This is so important; most of the best free renderers out there are RenderMan-compliant anyway. Once Blender's materials can interface with good renderers, that's when we should start worrying about integration. Just my opinions.
Yeah, RenderMan shaders along with multiple output formats would solve most of the problem anyway. A renderer doesn't NEED to be integrated, as long as it's easy to output supported files for it and then run it externally.
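The material-to-shader mapping could start very small. Here is a hedged sketch that emits a diffuse-only RenderMan surface shader (standard matte-style RSL) from a couple of Blender-ish material parameters; the shader name and the single `Kd` parameter are illustrative, and a real exporter would have to map specularity, textures, and so on as well.

```python
RSL_TEMPLATE = """surface %(name)s(float Kd = %(kd)g;)
{
    normal Nf = faceforward(normalize(N), I);
    Ci = Os * Cs * Kd * diffuse(Nf);
    Oi = Os;
}
"""

def material_to_rsl(name, diffuse_strength):
    """Emit a trivial diffuse-only RenderMan shader for one material.

    Only the diffuse term is mapped here (a matte-style RSL shader);
    everything else about a Blender material is ignored in this sketch.
    """
    return RSL_TEMPLATE % {"name": name, "kd": diffuse_strength}

print(material_to_rsl("blender_default", 0.8))
```

The generated .sl file would then be compiled with whatever shader compiler the target renderer ships (each RenderMan-compliant renderer has its own).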