N.G.B or Future 3D technology DISCUSSION (We need one)

General discussion about the development of the open source Blender

Moderators: jesterKing, stiv

cmccad
Posts: 0
Joined: Mon Apr 07, 2003 11:58 pm

Post by cmccad » Wed Jun 04, 2003 12:13 am

First, let me say there's a lot of good stuff going on here. Personally, I am happy with the direction the current blender technology is headed. I haven't really formed an opinion on Verse yet: I think that, if handled correctly, it could be a good thing.

On to the topic of the NGB's technology. Since CAD was mentioned as one of the uses of NGB, I thought I would comment on that part of the design.

I think that the only aspect of CAD that should be supported is I/O of the common formats (IGES, DXF, STEP, etc). I believe that making a fully featured CAD system that would be competitive with existing technology (the way that Blender is competitive with modeling/animation software) would require a HUGE shift in design philosophy, so huge that the result just wouldn't be Blender. And Verse in its current form (as I understand it) would be almost useless for CAD operations. To avoid a long post, I'll not go into just how huge the rift between modeling/animation and CAD is. If anyone is interested in why I think so, I'll write it up and post a link to it :) .

Don't get me wrong, the world needs a good open source CAD package, but I don't think that blender is (or ever will be) it.

Just my $0.02
Casey

dreamerv3
Posts: 119
Joined: Wed Oct 16, 2002 10:30 am

Post by dreamerv3 » Wed Jun 04, 2003 3:26 am

Edit(sorry about size)
{
Sorry about the size of the post, I'll link as soon as my web host comes back up.
}

Did you know that Mozilla is not one app?

Did you know that Internet Explorer has no single executable? (even though it looks and feels like there is one)

When you click on IE or Mozilla or any one of a dozen modular applications like them, what you're really loading is a shared object structure from which different modules enter and exit based on user demand (you load only what's needed, not the whole program).

Similarly, if Blender is turned into a Mozilla-like suite of 3D technologies, based around a kernel and modules for different functions, then a CAD program can branch out of this FAMILY of Blender apps.

Just as Galeon and Phoenix share the Mozilla rendering engine, a CAD app could share the Blender kernel services and rendering modules, calling a GUI front-end package for display management while providing its own CAD-specific modules for the actual CAD work. This way we consolidate technology into a family that shares code, freeing everyone up to write more and different things.

Sure, the core package would be a bit large, but we're living in an era of 100 GB hard drives and multi-megatransistor 3D accelerator cards that can do per-pixel shading in realtime.
It makes more sense to make rapid application development easier and grow a family than to keep every app monolithic, singular and small. That just makes everybody rewrite the same code, and re-introduce the same bugs, which then have to be fixed all over again.

Maybe a CAD application would then find this blender environment a fertile place to grow.

It's about how you think about Blender. Blender has tons of potential; there could even be a Blender::Genesis or Blender::Alpha (although I prefer the former),
which would look and feel like the original Blender, complete with the original GUI et al.
It would function very differently under the hood, though, away from the users' eyes...

If you don't have to write a kernel, a renderer and other modules (file I/O, geometry creation, drawing tools), then you're freed up to focus on the GUI of derivative members of the Blender family.

Mozilla is an excellent model of how code can be shared thus speeding development of peripheral applications...

CAD/CAM, finite element analysis, etc....

There are so many scientists using Linux that they would fall head over heels in love with such core Blender technologies. This would make it attractive to code 3D for Linux; heck, the rendering engine could even power games on Linux. Imagine that: an OpenGL 2 rendering engine on Linux with tons of shaders ready for use.

Combine all this with the content creation/management scheme I'm working on, and this modular concept feels not so bad at all...

I think in that respect, an NGB effort would do well to establish core modules like the kernel and realtime rendering engine, and then branch out from there, with the core group maintaining the core modules and other coders adding features via their own plugins...

Opensource everything and it would become famous...

Blender would become a prefix in this context,


Enabling technology packages(enabling)
{

Blender::Kernel -- The core every module links to and uses for connection to services.
Blender::Core --Contains all the basic tools which include(Core)
{
Modelling tools
Modification tools
Material tools
Lighting tools
Animation tools
Rendering tools
2D image tools
(other tools I cannot think of right now...)
}
Blender::Render -- The OpenGL 2 realtime rendering engine
Blender::Karma -- The fully configurable OpenGL 2 powered gui front end
Blender::Force -- The Opensource physical simulation engine
}

dependent applications (dependents)
{
Blender::Cine -- A collection of modules written to facilitate 3D filmmaking
Blender::Edit -- An awesome video editor powered by OpenGL 2 for realtime effects
Blender::Draft -- The premier CAD solution based on the blender suite of services.
Blender::Verse -- A Virtual Reality/Digital Content Creation environment based on the Verse networking protocol. //See there's room for everybody!
Blender::(your idea here)
}
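
To make the kernel/modules idea a bit more concrete, here's a minimal sketch of how module registration could work. All of the names below are hypothetical, purely an illustration of the concept, not actual or proposed Blender code:

# Hypothetical sketch of a Blender::Kernel style module registry (Python).
# An application loads only the services it actually needs.

class Kernel:
    """Central registry that every module links to."""
    def __init__(self):
        self._services = {}

    def register(self, name, service):
        self._services[name] = service

    def get(self, name):
        return self._services[name]


class RenderService:
    def draw(self, scene):
        print("rendering %d objects" % len(scene))


class CadService:
    def extrude_profile(self, profile, depth):
        return {"profile": profile, "depth": depth}


# A hypothetical Blender::Draft app pulls in only the kernel, the renderer
# and its own CAD-specific module:
kernel = Kernel()
kernel.register("render", RenderService())
kernel.register("cad", CadService())

solid = kernel.get("cad").extrude_profile([(0, 0), (1, 0), (1, 1)], depth=2.0)
kernel.get("render").draw([solid])

A derivative application (Blender::Cine, Blender::Draft, a game) would then be little more than a GUI plus a handful of its own modules registered against the same kernel.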

Games could link to the front end for easy GUI setup and to the rendering and simulation engines for awesome gameplay.
They could even tap into the power of the core package and pull the tool collection out to do original things inside the game with the content creation tools...

Doesn't that sound exciting?

It would turbocharge interest in linux, and any other platform on which it is unleashed.
It would bring blender as a brand into the mainstream.

It just makes sense....

Just think about it...

Jamesk
Posts: 239
Joined: Mon Oct 14, 2002 8:15 am
Location: Sweden

Post by Jamesk » Wed Jun 04, 2003 8:47 am

dreamerv3 wrote:It just makes sense....
Just think about it...
It makes a lot of sense, of course. Not to mention how relatively easy it would be to hook on an external scripting host to pass instructions for anything and everything straight into the core and kernel. MEL-scripting, here it comes!

xype
Posts: 127
Joined: Tue Oct 15, 2002 10:36 pm

Post by xype » Wed Jun 04, 2003 1:38 pm

Lots of concepts sound exciting, but you have to keep in mind that in the end someone will be coding NGB, and if it's not you, he/she will probably use other concepts and only code stuff he/she thinks is worth spending time on. Once NGB is well into its development process, the only way to change it the way you want is to get down and dirty yourself.

All this talk is great for getting ideas and stuff, but as long as you don't actually contribute to the code, I think that many developers won't take you seriously (and thus won't consider your concepts).

And I think it should be started small first; after a few things are working, there will be more experience we can base decisions/thinking on.

dreamerv3
Posts: 119
Joined: Wed Oct 16, 2002 10:30 am

Post by dreamerv3 » Wed Jun 04, 2003 11:05 pm

absolutely

cmccad
Posts: 0
Joined: Mon Apr 07, 2003 11:58 pm

Post by cmccad » Thu Jun 05, 2003 1:49 am

The long version: click me!

The short version:

The toolset for CAD and the one for modeling/animation do not have a tremendous amount of overlap. The only things they share are geometry display, file translation, low-level modeling descriptions, and physics simulation (for FEA and certain interference (collision detection) checks).

The core feature of a CAD engine is parametric, feature-based modeling. IMHO it would be difficult to implement this functionality outside of the core (as a CAD package based on Blender would most likely have to do).
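
For readers who haven't run into the term, here is a toy sketch of what parametric, feature-based modeling means in practice (hypothetical names, nothing to do with any real CAD kernel): the model is stored as a history of features driven by parameters, and editing a parameter replays everything downstream. That replay loop is exactly what a mesh editor, which stores final geometry directly, doesn't have.

# Toy illustration of parametric, feature-based modelling (hypothetical API).
import math

class Box:
    def __init__(self, w, h, d):
        self.w, self.h, self.d = w, h, d
    def apply(self, solid):
        return {"volume": self.w * self.h * self.d}

class Hole:
    def __init__(self, radius, depth):
        self.radius, self.depth = radius, depth
    def apply(self, solid):
        solid["volume"] -= math.pi * self.radius ** 2 * self.depth
        return solid

def rebuild(history):
    """Replay the whole feature history to regenerate the part."""
    solid = None
    for feature in history:
        solid = feature.apply(solid)
    return solid

history = [Box(10, 10, 5), Hole(radius=1, depth=5)]
print(rebuild(history))      # the original part
history[1].radius = 2        # edit one parameter...
print(rebuild(history))      # ...and everything downstream regenerates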

I have no objection to using blender components in a CAD app (in fact, I've been thinking of using GHOST and some low-level blender libs in a CAD package), but I require convincing that the result still would be "Blender" in anything other than name.

And when I say CAD, I don't mean Computer Aided [Detailing/Drafting] (making drawings), I mean Computer Aided Design. A detail/drafting package could probably be implemented from the current blender code.

Casey

thorax
Posts: 320
Joined: Sun Oct 27, 2002 6:45 am

Post by thorax » Thu Jun 05, 2003 1:59 am

xype wrote:Lots of concepts sound exciting, but you have to keep in mind that in the end someone will be coding NGB, and if it's not you, he/she will probably use other concepts and only code stuff he/she thinks is worth spending time on. Once NGB is well into its development process, the only way to change it the way you want is to get down and dirty yourself.

All this talk is great for getting ideas and stuff, but as long as you don't actually contribute to the code, I think that many developers won't take you seriously (and thus won't consider your concepts).

And I think it should be started small first; after a few things are working, there will be more experience we can base decisions/thinking on.
Yeah but coders can be stupid too.. I think I will write a random coding program to persistently screw up the sources so that we can test the potential outcome of hacking sources at random.. Not serious..

But anyhow if you don't have a design, it might as well be that way..

Some ideas for doing a specific design (I mean to such a level
that the coders can't imagine anything better):

- impossible to patent (it's public knowledge), impossible to copyright (it's copylefted, and public)
- it can be done faster than coding
- it can be changed faster than coding
- openly discussed (groups of coders will be able to spot futile designs before they are coded, and come up with better ones)
- no need for creative description (to keep code discussions from turning into an Abbott and Costello "Who's on first" skit)
- make the future obvious to the users and new developers
- help new developers know where to start
- keep the sources from going in an unproductive direction (source hijacking can happen: just replicate the sources, change them a little bit, and name the distribution similarly to an existing distribution, plus other methods stolen from the brand hijacking used in unfair business practices).
- distinguish a good design from a bad one (the knowledge of the developers is pooled into the design, and they stick to it unless a new developer comes into the design). Developers who do not wish to stick to the design are seen as outsiders and can continue development their own way, but they get no cooperation from the other developers. They can work on their own code distributions of Blender.

I've been told by a number of people that the developers will disagree; it's like herding cats. Well, that may be true, but if it is, you'd better have some darn good developers, because more than likely the sources will progress slower and slower over time. Eventually they will slow to a snail's pace and bugs will happen more frequently; developers will leave the distribution because it's getting worse and worse, and few new developers will come into the project because of the lack of docs to help them get started and make decisions about the sources. There will be no overall consistency in the sources; it will look like Igor, with one big eye here and a hunchback there. It doesn't take an ability to predict the future to see this.

But a stable design can be enforced; the developers won't stop developing it, because they like Blender, they want a free open source 3D package, they want it to be there (place motivation here).
If it's being coded by users with no programming background, it could at best be a learning experience, but somewhere down the road someone will see a need for doing a redesign.

Even Ton knows it has to happen, ask him..

cessen
Posts: 109
Joined: Tue Oct 15, 2002 11:43 pm

Post by cessen » Thu Jun 05, 2003 7:16 pm

I will reply to this with a more comprehensive list of my ideas, but for now I will just summarize a basic concept that I think should govern Blender3 design and development: users come first. The design process should be done initially from the user perspective (i.e. what do we want the program to be like for the users).

cessen
Posts: 109
Joined: Tue Oct 15, 2002 11:43 pm

Post by cessen » Thu Jun 05, 2003 9:28 pm

Ok, here is my more comprehensive post.

First of all, I think that Blender 3.0 should actually be a suite of programs, not just one program. Of course, we would want to make sure that the way we split it really does make sense (we don't want to end up with a Lightwave-style system, where modeling and animation are done in separate programs).
I also think it would be neat to add other programs to the Blender suite that are not represented in the current Blender's feature set. For instance, a full-featured paint program; although, granted, that would be rather redundant since there is already The GIMP.
So, my idea for the Blender suite would potentially have the following programs:

Blender: Modeler/Animator
Blender: Renderer
Blender: RenderMan
Blender: Video-Editor/Post-Processor
(?)Blender: Painter
(?)Blender: Game-Maker

For the renderer, I think it should be Reyes-based (perhaps with support for raytracing) and would be designed and written specifically to integrate well with the rest of the suite. (I am thinking that Blender:Renderer would not use the RenderMan interface, but a Blender-specific interface that would be developed for the sake of tighter, more intuitive, more efficient integration.)
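
For anyone who hasn't met Reyes before, the heart of the pipeline is: dice each primitive into a grid of micropolygons, shade the grid, then hide (sample the shaded grid into the framebuffer with a z-test). Below is a deliberately tiny, purely illustrative sketch of that loop; a real renderer would also bound, split, bucket and sample properly, and none of these names come from any actual codebase.

# Toy sketch of the Reyes dice/shade/hide loop (vertex splats stand in for
# real micropolygon sampling, purely to show the flow of the algorithm).
WIDTH, HEIGHT = 32, 32
framebuffer = [[0.0] * WIDTH for _ in range(HEIGHT)]
zbuffer = [[float("inf")] * WIDTH for _ in range(HEIGHT)]

def dice(patch, rate=64):
    """Evaluate a parametric patch into a dense grid of micropolygon vertices."""
    return [[patch(u / rate, v / rate) for u in range(rate + 1)]
            for v in range(rate + 1)]

def shade(x, y, z):
    """Trivial 'surface shader': brightness falls off with depth."""
    return max(0.0, 1.0 - z)

def hide(grid):
    """Write shaded grid vertices into the framebuffer with a z-test."""
    for row in grid:
        for (x, y, z) in row:
            px, py = int(x * (WIDTH - 1)), int(y * (HEIGHT - 1))
            if z < zbuffer[py][px]:
                zbuffer[py][px] = z
                framebuffer[py][px] = shade(x, y, z)

# A slanted quad as the parametric primitive: (u, v) -> (x, y, z)
hide(dice(lambda u, v: (u, v, 0.2 + 0.5 * u)))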

Blender:RenderMan would basically be an API to allow Blender to communicate with alternate renderers that conform to the RenderMan spec', and to allow other programs to communicate with Blender:Renderer using the RenderMan spec'.

Blender:Modeler/Animator would serve the same purpose as the current Blender does, minus the sequence-editor and the game-engine.
I have a few specific ideas for this program, but by no means a comprehensive vision for the program as a whole.

The first idea is that I want non-linear animation to be the basis of the entire animation system, not just character animation. Linear animation would then be a sub-set of the entire system.

The second idea is that I want everything to be able to be done procedurally (as in procedural textures). For instance, there would be procedural geometry and procedural animation. Procedural geometry would allow for things such as tree generation and terrain generation. And procedural animation would allow for things such as planet orbits, or bouncing balls, without having to key-frame things specifically (and in combination with a fully non-linear animation system, this would be extremely powerful). Procedural "stuff" could be supported either via a plugin system or a scripting system (both have their advantages and disadvantages--perhaps both could be supported).
The main advantages of allowing for procedural "stuff" are that it keeps the file-size down (the settings for the procedural generation would be stored, rather than the generated stuff itself--unless, of course, the user specifically tells it otherwise, perhaps for the sake of hand-editing it), and that it is very useful for things where the modelers/animators don't need complete control over a large group of complex things.
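
To make the procedural-animation idea concrete, here is a hedged sketch of what a procedural channel could look like. The interface is invented here purely for illustration; the point is that the file only stores a rule and a few parameters instead of baked keyframes.

# Hypothetical procedural animation channels: position is a pure function
# of time, so nothing needs to be keyframed or baked into the file.
import math

class OrbitChannel:
    """Circular planet orbit, driven by two parameters."""
    def __init__(self, radius, period):
        self.radius, self.period = radius, period

    def evaluate(self, t):
        angle = 2.0 * math.pi * t / self.period
        return (self.radius * math.cos(angle), self.radius * math.sin(angle), 0.0)

class BouncingBallChannel:
    """Ball dropped from a height, bouncing elastically forever."""
    def __init__(self, height, gravity=9.81):
        self.height, self.gravity = height, gravity

    def evaluate(self, t):
        t_fall = math.sqrt(2.0 * self.height / self.gravity)
        t = t % (2.0 * t_fall)
        dt = min(t, 2.0 * t_fall - t)   # time since the nearest apex
        return (0.0, 0.0, self.height - 0.5 * self.gravity * dt ** 2)

# At render time the animation system simply evaluates the rule per frame:
orbit, ball = OrbitChannel(radius=5.0, period=10.0), BouncingBallChannel(height=2.0)
for frame in range(0, 50, 10):
    t = frame / 25.0                    # 25 fps
    print(frame, orbit.evaluate(t), ball.evaluate(t))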

The third idea has to do with curved-surface based modeling. I won't explain the whole thing here, but the basic idea is that the system would stick to an analogy of sewing (where you stitch patches together with various "types" of stitches, cut patches, etc.).

The fourth idea (which is actually both for the modeler/animator and the renderer) is that there should be four general types of geometry supported:

(0D) Points (particles and such)
(1D) Strands (lines and curves)
(2D) Surfaces (polygons and curved surfaces)
(3D) Volumes (err... um... volumes).

Each of those types is useful for various different things (for instance, strands are useful for hair, volumes are useful for smoke/clouds, etc.).
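
Sketched as code (names made up, just to illustrate the point), the four kinds would all sit behind one common interface so the renderer and tools can treat them uniformly:

# Hypothetical common base for the four geometry types.
class Geometry:
    dimension = None

class Points(Geometry):        # 0D: particles
    dimension = 0
    def __init__(self, positions):
        self.positions = positions

class Strands(Geometry):       # 1D: hair, lines, curves
    dimension = 1
    def __init__(self, polylines):
        self.polylines = polylines

class Surface(Geometry):       # 2D: polygons and curved surfaces
    dimension = 2
    def __init__(self, vertices, faces):
        self.vertices, self.faces = vertices, faces

class Volume(Geometry):        # 3D: smoke, clouds (e.g. a density grid)
    dimension = 3
    def __init__(self, density_grid):
        self.density_grid = density_grid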

Anyway, I have other ideas for the modeler/animator as well, but they're less "over-all" types of things, and listing them would be pointless.

I don't really have much of any ideas for the video-editor/post-processor, other than it should be heavily plugin-based.

Well, those are the basic thoughts that I have. Comments and insults are welcome. ;-)

Jamesk
Posts: 239
Joined: Mon Oct 14, 2002 8:15 am
Location: Sweden

Post by Jamesk » Thu Jun 05, 2003 9:50 pm

cessen wrote:Ok, here is my more comprehensive post. - - -Comments and insults are welcome. ;-)
A great and indeed comprehensive post :D

Comments---> Just one really, and it's comprehensive::..

About 'Blender:Renderer' and 'Blender:RenderMan'

Now how about this: add an intermediate module instead, Blender:PreRenderer, that grabs input (raw scene data from a Blender file, in whatever format that may be in the future), processes this data according to the user's settings/plugins and outputs a stream of data to whatever renderer the user wants to use.

So the PreRenderer will perform the actual 'recasting' of raw data to the desired output format, which could be Blender:Renderer-data or it could be RenderMan-data (using the RenderMan-plugin) or any other scene description format (using the "AnySceneDescription"-plugin).

By breaking up this flow of communication it would be easier to provide possible "hook-in" nodes for applications/libraries that perform the actual conversion of renderer instructions. With a clearly defined interface for how to hook up a converter in this PreRenderer, it would be much easier for people to write their own render support for whatever engine they want to use. Someone could write plugins for VirtuaLight, MentalRay, FinalRender, Radiance... whatever - using the same interface to get access to the original raw scene data.
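
A hedged sketch of what such a hook-in interface could look like. All names here are invented for illustration, not a proposal for the real API; it only shows the shape of the idea, where the generic PreRenderer walks the raw scene data once and hands it to whichever converter plugin the user selected:

# Hypothetical Blender::PreRender converter hook.
class RenderConverter:
    """The small interface a renderer plugin (RenderMan, VirtuaLight, ...) implements."""
    def begin_scene(self, camera): ...
    def write_mesh(self, name, vertices, faces, material): ...
    def end_scene(self): ...

class RibConverter(RenderConverter):
    """Toy converter that emits something vaguely RIB-flavoured to stdout."""
    def begin_scene(self, camera):
        print("WorldBegin  # camera:", camera)
    def write_mesh(self, name, vertices, faces, material):
        print("# %s: %d verts, %d faces, shader %s" % (name, len(vertices), len(faces), material))
    def end_scene(self):
        print("WorldEnd")

def prerender(scene, converter):
    """The generic part: recast raw scene data through the chosen plugin."""
    converter.begin_scene(scene["camera"])
    for obj in scene["objects"]:
        converter.write_mesh(obj["name"], obj["verts"], obj["faces"], obj["material"])
    converter.end_scene()

scene = {"camera": "Camera.001",
         "objects": [{"name": "Cube", "verts": [(0, 0, 0)] * 8,
                      "faces": [(0, 1, 2, 3)], "material": "matte"}]}
prerender(scene, RibConverter())

Supporting another engine would then mean writing one more converter class; nothing in the generic PreRenderer changes.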

And I couldn't think of any insults, so... umm... your socks are smelly. :D

cessen
Posts: 109
Joined: Tue Oct 15, 2002 11:43 pm

Post by cessen » Thu Jun 05, 2003 11:33 pm

Jamesk: good idea. I should have thought of that. :-)

Just to summarize what you said: 'Blender:PreRender' would essentially be the same thing as what 'Blender:RenderMan' would have been, except that it would be generalized (via a plugins system, or scripting language, or some such thing) to be able to support any rendering interface standard.

Is that about right?

Thanks for the comment! :-D
Jamesk wrote:And I couldn't think of any insults, so... umm... your socks are smelly. :D
They must be, if you can smell them from that far away! I'd better go wash them... ;-)

Jamesk
Posts: 239
Joined: Mon Oct 14, 2002 8:15 am
Location: Sweden

Post by Jamesk » Fri Jun 06, 2003 12:08 am

cessen wrote:'Blender:PreRender' would essentially be the same thing as what 'Blender:RenderMan' would have been, except that it would be generalized - - - Is that about right?
Exactamundo, dear Watson! The generalization is the key, of course. It would mean that those wanting to code support for a particular engine only need to interface with the PreRender API, which would contain a well-designed set of functions returning chunks of scene data in a convenient format. Meshdata, for instance, would come back to the caller as huge BLOBs for maximum efficiency.

This should be somewhat like working with the Python API, only faster, more complete, more consistent and... ummm... more everything. Since it is separate from the Blender core itself, it would probably not suffer from the severe 'versionitis' seen in the Python API.

It wouldn't even have to be limited to conversions for rendering. Since PreRender would read native Blender data, it could facilitate all sorts of export. It could be considered a general Blender scene decoder.
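
To illustrate the BLOB point (again with made-up names): instead of handing the caller one small object per vertex, the decoder packs each attribute into a single contiguous buffer that can be passed around, or handed to C code, in one call:

# Hypothetical sketch of returning mesh data as one packed buffer.
import array

def mesh_coords_blob(mesh):
    """Flatten vertex coordinates into one contiguous float buffer."""
    coords = array.array("f")
    for (x, y, z) in mesh["verts"]:
        coords.extend((x, y, z))
    return coords.tobytes()     # one opaque chunk, cheap to hand to an exporter

mesh = {"verts": [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]}
blob = mesh_coords_blob(mesh)
print(len(blob), "bytes for", len(mesh["verts"]), "vertices")   # 48 bytes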

And about the socks: I'm sorry. It was in fact my own socks that were a bit smelly. We've had a really hot day today in southern Sweden, and combined with a pair of my old sneakers... well, you can probably figure out the rest. :D

xype
Posts: 127
Joined: Tue Oct 15, 2002 10:36 pm

Post by xype » Fri Jun 06, 2003 1:11 am

Jamesk wrote:It wouldn't even have to be limited to conversions for rendering. Since PreRender would read native blenderdata, it could facilitate all sorts of export. It could be considered as a general blender scene decoder.


http://www.blender.org/modules.php?op=m ... =8652#8652

What you are describing is what Verse is supposed to be used for. A Verse server holds the data and whatever application "understands" the Verse protocol can work with the data.

theeth
Posts: 500
Joined: Wed Oct 16, 2002 5:47 am
Location: Montreal

Post by theeth » Fri Jun 06, 2003 4:33 am

cessen wrote:The second idea is that I want everything to be able to be done procedurally (as in procedural textures). For instance, there would be procedural geometry and procedural animation. Procedural geometry would allow for things such as tree generation and terrain generation. And procedural animation would allow for things such as planet orbits, or bouncing balls, without having to key-frame things specifically (and in combination with a fully non-linear animation system, this would be extremely powerful). Procedural "stuff" could be supported either via a plugin system or a scripting system (both have their advantages and disadvantages--perhaps both could be supported).
The main advantages of allowing for procedural "stuff" are that it keeps the file-size down (the settings for the procedural generation would be stored, rather than the generated stuff itself--unless, of course, the user specifically tells it otherwise, perhaps for the sake of hand-editing it), and that it is very useful for things where the modelers/animators don't need complete control over a large group of complex things.
I'm with you on this. Procedural modelling and animation is something I would really look forward to, and since procedural animation (more precisely, behavioral animation) is more or less one of my pet projects, you can be sure that I'll be working on this as much as I can.
cessen wrote:Comments and insults are welcome. ;-)
well, since Jamesk already mentioned your smelly socks, I guess I'm fresh out of insults then :wink:

Martin
Life is what happens to you when you're busy making other plans.
- John Lennon

modron
Posts: 0
Joined: Thu Jun 05, 2003 10:13 am

bromadrosis

Post by modron » Fri Jun 06, 2003 5:28 am

...My python boots are too tight...I couldn't get 'em off last night...a week went by...now it's July...I finally got 'em off and my girlfriend cried, "You' got StinkFoot....darlin'....your foot puts a hurt on my no-o-ose"---FrankZappa

I have a couple of ideas for animation futures, but I am not at all technically inclined, and most things that are technical, or consist of an abbreviation, either go over my head or register as something that they are not. I am more used to working with solid things like clay, puppets, or stringed instruments, so forgive me if I stick to abstractions... It seems to me that what Blender and other 3D programs do is take a whole bunch of parameters, which you define, and transform them into a rendering using a bunch of calculations. All these parameters, as far as I can tell, have to do with the interactions of materials and optics. What about the introduction of audio as a physical parameter? Linking a sound to an object, and using it to define the object's behavior, would be interesting...
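
That idea is actually quite practical: sample some property of the sound (its amplitude, say) once per frame, and drive an object parameter from it. Here is a toy sketch, with a synthesized waveform standing in for a real audio file (everything below is invented for illustration, not an existing Blender feature):

# Toy sketch of audio-driven animation: per-frame amplitude drives object scale.
import math

FPS = 25
SAMPLE_RATE = 8000

def waveform(i):
    """A fake 220 Hz tone with a slow tremolo, standing in for real audio samples."""
    t = i / SAMPLE_RATE
    return math.sin(2 * math.pi * 220 * t) * (0.5 + 0.5 * math.sin(2 * math.pi * 2 * t))

def amplitude_for_frame(frame):
    """Average absolute amplitude over the samples belonging to one frame."""
    start = frame * SAMPLE_RATE // FPS
    end = (frame + 1) * SAMPLE_RATE // FPS
    return sum(abs(waveform(i)) for i in range(start, end)) / (end - start)

for frame in range(5):
    scale = 1.0 + amplitude_for_frame(frame)    # the object "pulses" with the sound
    print("frame %d: scale %.3f" % (frame, scale))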
