N.G.B or Future 3D technology DISCUSSION (We need one)

General discussion about the development of the open source Blender

Moderators: jesterKing, stiv

xype
Posts: 127
Joined: Tue Oct 15, 2002 10:36 pm

Post by xype » Sat Jun 07, 2003 10:57 pm

On Verse:

First and foremost, it's not "internet" based but network based. For people working in a company doing 3D animations, the speed of a 100 Mbit network is quite enough, and considering most new "pro" hardware comes with Gigabit network cards, the bandwidth problem is even smaller.

Second, there are lots of ways to "cut" bandwidth - you can send commands describing which operations to perform instead of the end-result data (subdivide@2 instead of the subdivided model), so you have a constant flow of data that is not that bandwidth-intensive.
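The savings can be sketched with toy numbers. The message layout and per-polygon sizes below are made up for illustration - Verse's actual wire format is its own thing - but the orders of magnitude are the point:

```python
import struct

def command_bytes(op_id: int, level: int) -> bytes:
    """A hypothetical "subdivide @ level" command: 2-byte op code + 2-byte parameter."""
    return struct.pack("!HH", op_id, level)

def mesh_bytes(quad_count: int) -> int:
    """Rough size of shipping the result instead: 4 vertices per quad,
    3 floats (x, y, z) of 4 bytes each, ignoring vertex sharing."""
    return quad_count * 4 * 3 * 4

# One subdivision step turns each quad into 4, so two levels applied
# to a 10,000-quad cage yield 160,000 quads.
result_quads = 10_000 * 4 ** 2

cmd = command_bytes(op_id=7, level=2)    # op_id 7 is invented
print(len(cmd))                  # 4 bytes on the wire
print(mesh_bytes(result_quads))  # 7,680,000 bytes if you send the result
```

Either way every client ends up with the same model; the command version just makes each client redo the (cheap) computation instead of downloading the (huge) result.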

Verse is not for games only. And NGB, as discussed by Ton et al. during the conference, could use Verse for storing data, whether local or not - and if someone is interested in the data, Verse can be accessed over the network.

On OpenGL 2 GPUs:

An OpenGL 2 GPU should calculate whatever you tell it to, 1:1, at the precision you specify. There are no shortcuts in the math; it stays the same, as does the result. You can do raytracing on modern GPUs (not used in practice since it's not 60 fps, but ATI has already shown a proof-of-concept application). Today's programmers want the GPU to act exactly as they tell it to, and at least the next generation of GPUs will.

The difference between the GPU and the CPU is that:

a) The next version of a GPU can have a totally different architecture than the previous one, since it's accessed via a driver, so the design can change much more drastically than a Pentium revision (a Pentium still has to be x86 compatible and run old software).

b) GPUs are packed on a card with their own memory and can be really flexible as far as adding extra registers etc. goes (see above). The GPU memory is usually a lot faster than the system memory (for the GPU), and since its size is reaching over 256 MB, it's now viable to store high-resolution textures/data on it for manipulation. Also, the path to the memory can be, as with ATI's 9700 and nVidia's latest FX, 256 bits wide or even wider - meaning the GPU can "move" more data around.

c) GPUs are, indeed, optimised for specific tasks - like anti-aliasing - using "tricks" that don't lose quality, whereas the CPU has to use a single general approach, not "knowing" any tricks to speed things up without losing quality.

Last but not least a decent new GPU will cost you $250-300, which is better than buying a new system altogether.
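The bus-width point in (b) converts directly into bandwidth numbers. A back-of-the-envelope sketch (the 310 MHz DDR clock is roughly the Radeon 9700 Pro's; treat the figures as illustrative):

```python
def peak_bandwidth_gb_s(bus_bits: int, clock_mhz: float, ddr: bool = True) -> float:
    """Peak memory bandwidth = bytes per transfer * transfers per second."""
    bytes_per_transfer = bus_bits // 8
    transfers_per_s = clock_mhz * 1e6 * (2 if ddr else 1)  # DDR moves data twice per clock
    return bytes_per_transfer * transfers_per_s / 1e9

print(peak_bandwidth_gb_s(256, 310))  # 256-bit bus: 19.84 GB/s
print(peak_bandwidth_gb_s(128, 310))  # same clock, 128-bit bus: 9.92 GB/s
```

So doubling the bus width doubles the peak rate at the same clock, which is why a "256-bit or wider" path matters for moving big textures around.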

dreamerv3
Posts: 119
Joined: Wed Oct 16, 2002 10:30 am

Post by dreamerv3 » Sun Jun 08, 2003 7:24 am

Xype: we basically agree at this point.

Ok done with that...

I think it's a great time to take another deep breath and say...

What are our goals for N. G. B?

I hear so many references to the Blender conference last year, which so many of us couldn't attend, and so many ideas about N.G.B. I can't wait till SIGGRAPH, when a bigger bunch will show up and hopefully represent the userbase better.

Please realize that it is quite impossible to be equally good at everything all of the time, but you can come pretty close if you design it right.

Autodesk uses the same basic underlying technology to build 3D Studio Max, VIZ, AutoCAD and gmax.


Verse is more or less a decentralization strategy, but modularity is also a decentralization strategy. I hear people saying "lots of little apps coded in things like C" - it sounds like some people don't want to go the OO route.

I've been crawling around in the source code of Chromium B.S.U. for the last week (I'm sick and tired of the levels, the weapons and the monotonous music, so I'm changing the program to be more fun!), and frankly I love the implementation: it's pure OO written in C++. Bear in mind it is only a scrolling shooter based on OpenGL, but it rocks both as an implementation and as a fun game to play.

It's a great way to learn the C++ school of thought in a simple environment...

It's about 13,500 lines of code (420 pages) without the headers, and 15K with them.

A single person can work on "chapters" of such a work.

But I can read through it and NOT feel overwhelmed, everything is an object.

True, Blender 3 or N.G.B could be viewed as huge, but if everything is parceled into subsystems and objects, then the core team only has to spend the grunt work on the kernel and the main subsystems. Coders can be "plugged into" the project, and since they follow the API and spec, their work is in the same language and requires NO VM or other silliness - it "just works". One of the great strengths of C++ is that it "just works" more than any other language, and it "just works" fast too....

Enough about coding this and that - what do we want it to do?

This is the hard question, because nobody wants to answer it.

Do we want 3d online MOO/MUD's?

Do we want movies?

Do we want still images?

Do we want 3D model creation prowess?

And in what priority do we want these things?

I would venture to stick my neck out and say:

1.) Animation

2.) 3d stills rendering

3.) Modelling/content creation

4.) Design simulations

5.) Online MUD/MOO's

The first four do useful work; they can help people further careers in 3D and also better the world through the benefits of 3D design. Games, while profitable (and hard as hell to create on a shoestring budget) and fun to play, are just too hard to get "right" without some good coders behind them.

Maybe this will change with time, as tools which mimic coding functionality in a visual way become available; then you need time and patience, and then you need a content creation/management system...


As cool-sounding as collaboration is, we have to assess the probability that most people will NOT log in with N.G.B to Verse and swap models. Most people will model, then try to texture and shade, and then animate if they're brave enough to try.

I know what I expect from N.G.B technology - why don't some of you bring up your ideas and we can find some common ground? Then we can work out from there.

So what do we want to DO? List something productive.

Jamesk
Posts: 239
Joined: Mon Oct 14, 2002 8:15 am
Location: Sweden

Post by Jamesk » Sun Jun 08, 2003 10:10 am

dreamerv3 wrote:what do we want it to do?
This is the hard question, because nobody wants to answer it.
OK, here are my personal desires - without claiming that they're what the majority wants:

1. Rendering: Animations - NGB should, as far as possible, get closer to what 'the big ones' are capable of. The current Blender is indeed a very capable 3D suite, but there are lots of areas that need improvement - both the animation system and the rendering technology. I'm pretty sure that the current codebase is too messy for major improvements, and that's why NGB needs to start from scratch - a total rewrite. I think we all agree on that. In my opinion, a strong OO approach in C++ is the way to go.

2. Rendering: Stills - This is mainly about improvements in the rendering department: a REYES-based 'default' renderer, specific to NGB, and the previously suggested modular connection to external renderers, with a solid and well-documented API to encourage the creation of plugins supporting other external renderers, both free and commercial.

3. Modeling tools - This is tricky. Personally, I'm quite happy using external modelers (e.g. Wings), but I still like to have the internal tools as well - NURBS should be improved (surface blending and more), we need some sort of patch-based modeling too, and the subdivision needs refinement (local sub-object subdivision, creasing). Then simple things like backface culling, edge and face modes, and some better manipulation-along-axes (Wings does this very well today - 'anything along normal', for instance, and 'anything along a user-defined default axis').

IMPORTANT NOTICE! Yes, I think Wings represents the ultimate way to do polymodeling, BUT it is a very slow application. Blender is way, way superior as far as speed goes, and I would want it to stay that way in its future form too. On my poor old peecee, I get a noticeable slowdown in Wings as I reach about 3000 polys, while Blender easily handles ten times that amount without slowing down. Blender can deal with 100K polys on one visible layer - the same scene loaded in Wings would shut it down completely.

To find out what to improve: grab a copy of Maya PLE. That functionality, but with the Blender look and feel - that's the way to go. I'm sure that a lot of stuff - research, white papers etcetera - is available for grabs: not the proprietary A|W implementations, of course, but the general algorithms and technologies. And as far as rendering goes, the same situation applies. Standard raytracing is well documented everywhere (I even found a JavaScript raytracer the other day...), and there are lots of clever code snippets to grab in countless open-source 3D applications - basically anything should be possible without a lot of time-consuming research.
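For the "modular connection to external renderers" in point 2, the plugin API could take roughly this shape. Every name here is hypothetical - this is a sketch of the idea, not a proposal for the actual interface:

```python
from abc import ABC, abstractmethod

class RendererPlugin(ABC):
    """Hypothetical base class an external-renderer plugin would implement."""

    @abstractmethod
    def name(self) -> str:
        """Renderer name shown in the UI."""

    @abstractmethod
    def export_scene(self, scene: dict) -> bytes:
        """Translate the app's scene into the renderer's native input
        (e.g. a RIB stream for a REYES renderer, or a POV-Ray file)."""

    @abstractmethod
    def render(self, exported: bytes) -> bytes:
        """Run the renderer on the exported scene and return image data."""

class StubRenderer(RendererPlugin):
    """A do-nothing plugin, just enough to exercise the interface."""
    def name(self) -> str:
        return "stub"
    def export_scene(self, scene: dict) -> bytes:
        return repr(sorted(scene)).encode()   # fake "export": just the node names
    def render(self, exported: bytes) -> bytes:
        return b"P2 1 1 255 0"                # fake render: a 1x1 grey PGM image

plugin = StubRenderer()
print(plugin.name())   # stub
```

The win is that the core app only ever talks to RendererPlugin; supporting a new renderer, free or commercial, means writing one new subclass rather than touching the core.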

eskil
Posts: 140
Joined: Tue Oct 29, 2002 10:42 pm

Post by eskil » Sun Jun 08, 2003 4:23 pm

OK, let's make some things clear about Verse.

Verse is ONLY a network spec.

The entire idea with Verse is that we don't need to have this discussion. Each component can use different technology, language and so on. OpenGL 2 or not? I don't care! You don't have to discuss the merits of object-oriented programming with me. Just write your stuff object-oriented.

How large should each component be? Start programming, and once you are done, look at the size of your executable - and that's it. If many people want to come together and make large components, then fine; if you want to make tiny components, then go ahead.

What is it going to do? It's going to do what you program it to do. If someone wants to implement an editor for CAD-like editing, then just do it. If you believe there is a market/area that Blender should excel in in the future, then write components that make it happen.

I don't think it is very constructive when a few people try to decide which components should do what and how they should be written. This is not a commercial project where people just do what they are told. While you argue over what should be supported, Zarf wrote a Python module, so now Verse supports Python whether you like it or not. Great for him, great for Verse. He has made the first decision about what Verse should support, and you guys are way behind him.

So start a new project and start coding what you want to code - a UV editor, a rendering engine, animation or whatever. I have already told you that I will help you if you can just decide what kind of component you want to write.

If you think this thread is your shot at shaping the future of blender then I think you are wrong.

E

Jamesk
Posts: 239
Joined: Mon Oct 14, 2002 8:15 am
Location: Sweden

Post by Jamesk » Sun Jun 08, 2003 5:08 pm

Ergo: non-coders like myself have no say in the matter of NGB. I get it.

xype
Posts: 127
Joined: Tue Oct 15, 2002 10:36 pm

Post by xype » Sun Jun 08, 2003 8:06 pm

Jamesk wrote:Ergo: non-coders like myself have no say in the matter of NGB. I get it.


Wrong again. Just don't expect anyone to write code to your ideas. Give input, beta test, give feedback, and then coders will likely start implementing your ideas as well. It's just that since Blender developers are doing this in their free time, they won't spend the next 3 years implementing some huge object-oriented framework just because a non-coder thinks it sounds fun and should be implemented.

dreamerv3
Posts: 119
Joined: Wed Oct 16, 2002 10:30 am

Post by dreamerv3 » Sun Jun 08, 2003 9:16 pm

Eskil: For some strange reason I think that was aimed at me, and I don't really mind shots @ me, but I can't really shape the future of Blender because:

1.) I'm only one person

2.) A bunch of people don't really take my lead (and that's a very good thing, since I'm not the most advanced programmer around here - I'm more like a junior)

3.) The first two really sum it up

I think the best "shot" at shaping Blender's future is to subscribe to the dev mailing list, read the docs, and then get familiar with doxygen, because there is sooo much that hasn't been documented.

Attend the IRC meetings, get involved.

I do these things.

I won't try and haven't tried to convince you about C/C++ preferences.

I'm not taking shots @ Verse either, but I am questioning its design concepts (not all of them, just some of them), and I would hope people would question mine as well. Murphy's Law etc...

This thread is all about talking about what we want; I want to hear what people want.
Because it's not all about what I want or what Jamesk wants - it's about what everybody wants. It's also about logical design, so that we can work together.

This isn't a race. And if it is, who's the other team?

I've heard a lot of people say they want solid polymodelling functionality, something like Wings maybe. I agree with that. I want other things as well, but I agree with that. I agree with Verse too. I think N.G.B should be built in such a way as to allow people to agree and also to disagree, but I also think the architecture shouldn't make N.G.B code unshareable.

That's just me, my solitary opinion. Don't take it as an attack - it's not, it's a suggestion.
And I stick with it too. Maybe next year I'll find all this good Python code written for a GL 2 rendering application with calls to a Verse server; I can use the application, but what I really want is to integrate the shading models into my material editor... I want the code.



One of the big problems with Linux was lack of standards. Now we've settled a bit: we have desktop environments like GNOME and KDE as the big two choices, SDL is the preferred library for multimedia, OpenGL is 3D graphics - people agreed on these things, and on all the libs like libart, libpng and the sound libraries.

I can write a game with SDL and OpenGL and I will KNOW it will work on every Linux, because you can download SDL & OpenGL for every Linux, and it's not clunky like a VM or interpreter - it's just a library.

I value that, so I'm not trying to hijack the future of Blender - far from it.
But I am trying, and will keep on trying, to figure out and change the code to add new features; can't blame me there...

I'll run it on my machine, and then if it sounds good I'll upload it...

As far as Xype was saying to Jamesk:

It wouldn't hurt to pick up a book about C/C++ and start reading. IMO the syntax in C++ is easier for a non-programmer to pick up, but that's just me.

But eskil is right about shaping the future of Blender: you/I/person X can post all day, but it won't make a coder write the ideas into code - mine either, for that matter. Which is why, before I opened up my mouth, I studied C++, because damn near everything is written in C/C++. I might not be as advanced as some others in the community, but at least I can read the code and grasp what's going on.

Now all I have to do is read more code, and more, and more, and more, ad infinitum....

But this thread is all about discussion, because if we don't talk about what we have on our minds and then average that to a common conceptual denominator (then branch out), how can we hope to be representative?

There's a reason companies have focus group testing and random surveys done:
they want to know what's in demand, what people want.

At least users are talking, that means there are ideas about what we all want.

I don't find it controversial at all; I find it beneficial.

I'm glad people said: "Hey Dreamerv3: your idea is going to make 3D look cookie-cutter and bland and uninteresting."

Good! That drives me to think of ways to break the cookie-cutter mold and find a way to create originality without huge expenditures of time on the user's part.

I want people to rip my designs apart; it only makes them stronger and more flexible on the next iteration. It's like natural selection - survival goes to the fittest.

To imply that this wastes time is strange to me.

cmccad
Posts: 0
Joined: Mon Apr 07, 2003 11:58 pm

Post by cmccad » Sun Jun 08, 2003 10:48 pm

eskil wrote:
If you think this thread is your shot at shaping the future of blender then I think you are wrong.

E
Heh. "N.G.B or Future technology DISCUSSION (We need one)" is the title of the thread. Where else would you suggest posting about the future of Blender? ;) Shaping the direction of the coding efforts is what I thought these forums in general, and these posts in particular, are all about.

Personally, I don't identify Verse with Blender. What I mean is, Blender and Verse should be separate. Blender can do things against Verse's design philosophy and vice versa.

I think that Blender should have a unified, coherent vision as far as the direction of development goes. If Verse is separate, then this allows Blender to be OOP in C++, with OpenGL 2, Verse, etc. as supported specs, while allowing anything else that uses Verse to be written with any other design philosophy in any other language and still communicate with Blender. I think of Verse as enabling technology (it allows communication) and Blender as core technology (it facilitates creation).

Switching gears, all this talk about network bandwidth and 50 billion polys deserves comment.

1) Any network protocol will have trouble moving this amount of data quickly. It's not a limitation of the protocol, but of the network. To dismiss the use of a network protocol on these grounds seems short-sighted. Better to have it (even if it's slow) than not.

2) As xype pointed out, Verse is not limited to (sometimes slow) internet connections. I believe that the majority of studios use ethernet (100 Mbit/sec and/or Gigabit/sec varieties), and some may even use fiber-optic networks. Having the protocol built into a 3D app would be, IM(relatively uninformed)O, a great thing.
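Those link speeds translate into rough transfer times as follows (illustrative arithmetic, ignoring protocol overhead):

```python
def transfer_seconds(size_mb: float, link_mbit_s: float) -> float:
    """Seconds to move size_mb megabytes over a link rated in megabits per second."""
    return size_mb * 8 / link_mbit_s   # 8 bits per byte

print(transfer_seconds(500, 100))   # 100 Mbit ethernet: 40 s for a 500 MB model
print(transfer_seconds(500, 1000))  # gigabit: 4 s
print(transfer_seconds(500, 1))     # 1 Mbit home link: 4000 s, over an hour
```

Which is the whole argument in a nutshell: the same protocol is painful over a modem and perfectly comfortable on studio ethernet.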

Just FYI, Pro/Engineer by PTC has as one of its latest features "collaborative design", where a model is distributed to clients (who are also connected via voice and/or chat) and one or more people can view (spin/pan/zoom) the design and make changes to it. It works over the internet and other networks, and if you don't need it you'd never know it was there. Sound familiar? :) I know from personal experience that Pro/E handles 500+ MB models, which could conceivably be used in the collaborative design process. PTC, btw, is a major player in the CAD arena.

Casey

eskil
Posts: 140
Joined: Tue Oct 29, 2002 10:42 pm

Post by eskil » Mon Jun 09, 2003 2:10 am

Xype:
Good point. Fully agree.

At this point in time, features don't matter and modules don't matter; right now, architecture matters. This means that it is very coder-oriented. I have written a few apps and user input has been very useful - but not when it comes to evaluating something like an API (like vll).

You must understand that people will code whatever they want, and that is why I want to give the coders as much freedom as possible. If I say everyone has to code in C++, only the C++ fans will code for Blender. The rest will do something else; they certainly won't switch language because I tell them to.

I have been doing graphics for 13 years, and only for the last 4 have I been programming. One day I woke up and said "No one else does the things I want, so I will do them myself". And I regret that I didn't do that sooner. You can learn C in 2 weeks. It's not a big deal.

About network performance:

We were worried about this early on, but it turned out not to be a serious problem. Networks are very fast - usually faster than the apps, the rendering hardware and the hard drives. If the host is local, the performance of Verse is never the bottleneck.

50 billion polygons equals about 2 terabytes of data, so I think you will have problems with memory, hard drives, busses and graphics cards too. If we consider Moore's law and say that the average user has 512 MB of memory, it will take 18 years before they can fit that into their memory.
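Eskil's figures check out if you assume around 40 bytes per polygon and one memory-capacity doubling every 18 months (the usual Moore's law rule of thumb); both of those constants are assumptions for illustration:

```python
import math

BYTES_PER_POLY = 40                  # assumed: vertices + normals + indices, roughly
polys = 50_000_000_000
data_bytes = polys * BYTES_PER_POLY
print(data_bytes / 1e12)             # 2.0 -- "about 2 terabytes"

start = 512 * 2**20                  # 512 MB of RAM today
doublings = math.log2(data_bytes / start)
years = doublings * 1.5              # 18 months per doubling
print(round(years))                  # 18
```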

However, Verse servers can be implemented with zoning to cope with huge data sets too. I usually play this down because we are a very long way from needing it. But it is nice to know that something can be done if the problem arises.

E

cessen
Posts: 109
Joined: Tue Oct 15, 2002 11:43 pm

Post by cessen » Mon Jun 09, 2003 5:19 pm

eskil wrote: The entire idea with Verse is that we don't need to have this discussion. Each component can use different technology, language and so on. OpenGL 2 or not? I don't care! You don't have to discuss the merits of object-oriented programming with me. Just write your stuff object-oriented.
In my opinion, that is a really bad idea. If we don't discuss things and explicitly design how the program(s) are going to work ahead of time, we are going to end up with a mess of disorganized code (perhaps even more so than the current Blender!). And that is precisely what we - or at least I - want to avoid.
I have no problem with experimentation, but to me that is precisely what this "Verse" thing is about: experimentation. In my eyes, it is not going to be the actual Blender 3.0.

Blender 3.0, in my opinion, should be started by a small group (read: fewer than ten) of developers, should have a solid design in place, and should have a good deal of it coded before it is released to the larger group of "anyone can contribute" coders.
After all, the vast majority of successful open-source projects are the ones that were already functional (or at least mostly functional) before they went open source, and the vast majority of failed open-source projects (the ones that stagnated, where everyone stopped working on them) are the ones that weren't.

I also think that it should be limited to a single programming language (other than a scripting language for users), because forcing developers to learn five different programming languages instead of just one seems rather absurd to me (after all, in a collaborative writing project all the authors of the book stick to a single written language - otherwise every chapter might be in a different language!).

For the programmers who already know the chosen programming language: great! For the programmers who don't: at least it's only one language that you have to learn instead of five. And, of course, you don't have to worry about someone adding another component in yet another programming language that you don't know.

I am fine with pretty much any programming language, so long as it's powerful enough, even if I don't know it yet; I'm more than willing to learn it. For instance, I don't know C++ (I know C, but not C++), but I would be fine with Blender 3.0 being coded in it - I'm willing to learn a single programming language.

In short, consistency is very important to me in this project. Letting everyone do their own thing may seem like a neat idea at first, but in the end it will result in a messy patchwork of code that no one can understand.

cessen
Posts: 109
Joined: Tue Oct 15, 2002 11:43 pm

Post by cessen » Mon Jun 09, 2003 5:36 pm

eskil wrote: At this point in time, features don't matter and modules don't matter; right now, architecture matters. This means that it is very coder-oriented. I have written a few apps and user input has been very useful - but not when it comes to evaluating something like an API (like vll).
Uh... the feature set is very important to the architecture. In fact, you can think of pretty much any program as simply being a collection of features. A 3D modeling program doesn't have word-processing features, for instance, and that, obviously, very heavily influences its architecture. The same thing is true of more subtle feature differences as well, except that they influence the architecture more subtly (the more subtle the feature, the more subtle its influence on the architecture of the program).
And to say that modules don't matter while architecture does is just absurd: the module split is *part* of the architecture.

Don't get me wrong, I think that verse should be a part of Blender 3.0. But I think that it should be a means of communication between programs, not between modules of a single program.

xype
Posts: 127
Joined: Tue Oct 15, 2002 10:36 pm

Post by xype » Mon Jun 09, 2003 11:15 pm

cessen wrote: I have no problem with experimentation, but to me that is precisely what this "Verse" thing is about: experimentation. In my eyes, it is not going to be the actual Blender 3.0.


Well, duh! What difficulty do people have understanding what Verse is? It's a protocol for moving 3D data around, for Christ's sake - you don't really think anyone wants Blender 3.0 to be a protocol now, do you?

You people are trying to have a discussion about a thing that really does not warrant a discussion in the first place. Verse is a protocol; the ones who understand it think it might be a nice thing for Next Generation Blender to support, and what Eskil is saying (afaik) is that he does not want to force anyone to use any specific languages/concepts/ideas when doing NGB. This is possible because Verse is a protocol and not a program in the usual sense.

Get a group of 10 people, start doing concepts and design for NGB, and for each part that you think is not vital to NGB, you can "offload" it to an external application via Verse. So people can focus on NGB, and if someone wants to use Blender's internal data for, say, a Doom 3 UV-textured game model, he can access it via Verse. No need to make any changes to NGB for stuff that is trivial to NGB's design.

It's about connectivity, not about Blender 3+ becoming a dumb server only.
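That connectivity idea can be sketched with a toy shared-node store. None of this is the real Verse API - it only shows how two independent programs can watch the same data without sharing a codebase or even a language:

```python
class NodeStore:
    """Toy stand-in for a Verse server: named nodes plus change notification."""
    def __init__(self):
        self.nodes = {}
        self.subscribers = []

    def subscribe(self, callback):
        """Register a client callback fired on every node change."""
        self.subscribers.append(callback)

    def set_node(self, name, data):
        self.nodes[name] = data
        for cb in self.subscribers:      # every connected client hears about it
            cb(name, data)

store = NodeStore()
seen_by_exporter = []

# A hypothetical game-model exporter subscribes, exactly like a modeler would.
store.subscribe(lambda name, data: seen_by_exporter.append(name))

# The "modeler" updates a mesh; the exporter sees it without being linked in.
store.set_node("player_mesh", {"verts": 8, "uvs": True})
print(seen_by_exporter)   # ['player_mesh']
```

The exporter never needed to be compiled into the modeler; it only needed to speak the store's protocol. That is the offloading argument in miniature.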

matt_e
Posts: 410
Joined: Mon Oct 14, 2002 4:32 am
Location: Sydney, Australia
Contact:

Post by matt_e » Tue Jun 10, 2003 2:42 am

xype wrote: Well, duh! What difficulty do people have understanding what Verse is? It's a protocol for moving 3D data around, for Christ's sake - you don't really think anyone wants Blender 3.0 to be a protocol now, do you?
Yeah, I think it would cause much less confusion if people kept the concepts of Verse and a Next Generation Blender/Blender 3.0/whatever quite separate in their minds. I think a lot of the so-called controversy has been caused by Eskil (and others) using the terms NGB and Verse semi-interchangeably, when they're quite separate concepts.

thorax
Posts: 320
Joined: Sun Oct 27, 2002 6:45 am
Contact:

Post by thorax » Tue Jun 10, 2003 3:57 am

Xype and Eskil: if Verse is a protocol, then HTML/HTTP is a protocol too... is it? If it's not completely a protocol - if in fact Verse is a fuzzy association of a flexible data-storage format and a reuse of UDP packet swapping - then it's nothing more than a well-meaning HTML/HTTP format based on UDP.

Okay? Are we even now?

xype
Posts: 127
Joined: Tue Oct 15, 2002 10:36 pm

Post by xype » Tue Jun 10, 2003 7:08 am

thorax wrote: Xype and Eskil, if Verse is a protocol, then HTML/HTTP is a protocol too... is it?


Using the web analogy, Verse is HTTP and NGB is a browser that renders/edits the HTML it gets from HTTP. NGB can be anything between Mozilla and Konqueror as far as the complexity goes.

Post Reply