I think the reasoning goes something like:

cessen wrote:
The internal Blender renderer is based on an old rendering architecture, and is thus limited in a lot of ways. However, there are still ways in which it can be improved without changing the rendering architecture itself.
In my case, I don't really feel like bothering with external renderers at the moment, and thus I am trying to improve Blender's internal renderer by means of adding a shading system and other such features.
RenderMan shaders are very nifty, but they are horribly organized in terms of the types of shaders and how they work together. For instance, they make no distinction between texture maps and BRDFs ("materials") in RenderMan shaders. And I have no idea why they consider geometric displacement to be a shading concept. Bah... that's my rant for the day.
I've always had this sort of love/hate relationship with the RenderMan Interface Spec'. On the one hand, it has a lot of neat--and important--concepts in it. On the other hand, it's old enough that it's getting very patch-worky... and if there's one thing I hate, it's patchy standards/programs. I have a whole theory about patch-work programs and standards, which I won't go into in detail right now. But the basic concept is that in order to avoid self-inconsistencies and general disorder, a program/standard has to be re-written from scratch every once in a while.
In the end, what I'm really looking forward to is Blender 3.0. Everything can be thought through and re-done, including rendering.
There are no textures; everything is either math or files. So what you would call a texture is simply a file. Why shouldn't a shader be able to read a file directly? As for patterns and math: when you write shaders, the math is very often tightly integrated with the shading model (only natural, since the shading model is in 99% of cases very mathy), and if you want to separate the two you can use include files. This is the standard way shaders are written for other apps such as mental ray as well. It might not look nice from a "programming of the architecture" point of view, but it's very sweet when you, as an artist, actually create the shaders.
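To illustrate the point, here is a minimal RenderMan Shading Language sketch (the shader and parameter names are hypothetical) of a "texture" being nothing more than a file the shader reads with the built-in texture() call:

```rsl
/* textured.sl -- hypothetical surface shader: the "texture" is just a file */
surface textured(float Kd = 0.8; string mapname = "")
{
    color base = Cs;
    if (mapname != "")
        base = color texture(mapname);   /* file lookup at the default (s, t) */

    normal Nf = faceforward(normalize(N), I);
    Oi = Os;
    Ci = Oi * base * Kd * diffuse(Nf);
}
```

If mapname is empty the shader falls back to the surface color, so the same shader covers both the "math" and the "file" case with one string parameter.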
Displacement shaders are shaders because you only have to copy the surface shader to a new file and make the output affect the normal instead of the color/specularity/etc. BMRT even let you affect the normal in a surface shader, so the same shader worked as both a surface shader and a displacement shader. This again makes the shader writer's job quite nice: they can put all the functions in an include file, and then only have to write the shading model in the surface shader and the bump mapping/displacement in the displacement shader.
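A sketch of that workflow, with hypothetical names throughout: the pattern lives in one shared include, and a surface shader and a displacement shader both call it (in practice the function would sit in stripes.h and each .sl file would pull it in with an #include):

```rsl
/* stripes.h -- hypothetical shared include holding the pattern function */
float stripes(float ss; float freq)
{
    return step(0.5, mod(ss * freq, 1));
}

/* striped.sl -- surface shader: the pattern drives the color */
surface striped(float Kd = 0.8; float freq = 10)
{
    normal Nf = faceforward(normalize(N), I);
    Oi = Os;
    Ci = Oi * Cs * mix(0.3, 1.0, stripes(s, freq)) * Kd * diffuse(Nf);
}

/* stripedisp.sl -- displacement shader: the same pattern drives P */
displacement stripedisp(float Km = 0.05; float freq = 10)
{
    P += Km * stripes(s, freq) * normalize(N);
    N = calculatenormal(P);
}
```

The shading model and the displacement stay in separate shader files, but the shared stripes() function keeps the two in sync.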