Notes on design study of the game engine

Game Engine, Players & Web Plug-in, Virtual Reality, support for other engines

Moderators: jesterKing, stiv

Bandoler
Posts: 53
Joined: Mon Oct 14, 2002 3:16 pm
Location: Somewhere between the 1 and the 0

Notes on design study of the game engine

Postby Bandoler » Tue Nov 05, 2002 6:33 pm

Hello,

I've spent some hours studying the code of the game engine and the little documentation available about it. I wanted to study how it works and see what could be done to make it better. The main point I'm interested in is the rendering pipeline. What I've found out so far is:
- Most subsystems have a would-be abstract interface so they can be replaced, but very few of them are well encapsulated. For the rendering system this lives in the Rasterizer classes.
- First, the scene is converted from the Blender scene to the real-time engine scene. The objects that get converted are meshes, skinned meshes, lights, cameras, scene parameters (background color, etc.) and atmospheric parameters (mostly fog). In this conversion step some helper objects are created to ease the representation: the meshes, for example, build buckets that group faces sharing the same material (see the sketch after this list).
- Then, inside the main loop, I think everything is rendered blindly, while the logic and the physics are advanced.
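
To make the bucket idea concrete, here is a minimal sketch of how faces could be grouped per material during the conversion (all names hypothetical, not the actual engine classes), so that each material only needs to be set once per frame:

Code:
#include <map>
#include <vector>

struct Material;            // engine-side material (textures, blend mode, ...)
struct Face { int v[4]; };  // indices into the mesh's vertex array

struct RenderBucket {
    Material*         material;
    std::vector<Face> faces;    // every face that uses this material
};

class BucketManager {
public:
    // Called once per face while converting a Blender mesh.
    void AddFace(Material* mat, const Face& face) {
        RenderBucket& bucket = m_buckets[mat];
        bucket.material = mat;
        bucket.faces.push_back(face);
    }

    // The rasterizer then walks the buckets, setting each material once
    // and drawing all the faces that share it.
    const std::map<Material*, RenderBucket>& GetBuckets() const {
        return m_buckets;
    }

private:
    std::map<Material*, RenderBucket> m_buckets;
};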

I think it's good to abstract the rendering from the rest, but it's not very useful the way it is done now. To get the most out of each rendering system (read OpenGL, OpenGL 2.0, DirectX, PS2, Game Boy Advance, MSX, or whatever), the conversion step should also be specific to each renderer. Although this means writing the renderer-specific scene objects for each target, it does not imply rewriting the engine for each system, only encapsulating the rendering better: the physics, the logic, the sound and the networking don't vary. A culling algorithm could also be implemented against a very simple interface that any representable object could fulfill. Summarizing: for each target system there would be an adaptor from the Blender scene plus a renderer (both can be designed using the visitor pattern to keep things clean), together with the objects that system works with. This would allow anything from the current bucket-based rendering to, say, an image-based system. Every renderer could convert the objects it is capable of representing into whatever format suits it best.
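
A rough sketch of the kind of split I mean, with purely hypothetical names (the real interfaces would of course need much more):

Code:
// Hypothetical interfaces, only to illustrate the level of the split.
struct BlenderMesh;     // data as it sits in the .blend scene
struct BlenderLight;
struct BlenderCamera;

// What the rest of the engine needs from anything drawable:
// enough information for culling, and a way to draw itself.
class IRenderable {
public:
    virtual ~IRenderable() {}
    virtual void GetBounds(float min[3], float max[3]) const = 0;  // for culling
    virtual void Render() = 0;
};

// One converter per target renderer (OpenGL, OpenGL 2.0, PS2, ...),
// acting as a visitor over the Blender scene. Each target chooses its
// own internal representation: material buckets, vertex buffers and
// shader programs, image-based structures, whatever suits it best.
class ISceneConverter {
public:
    virtual ~ISceneConverter() {}
    virtual IRenderable* ConvertMesh(const BlenderMesh& mesh) = 0;
    virtual IRenderable* ConvertLight(const BlenderLight& light) = 0;
    virtual IRenderable* ConvertCamera(const BlenderCamera& cam) = 0;
};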

I'm afraid to touch the code because everything seems tightly linked together, like a brain: the slightest attempt to separate a few neurons would mean tracking all the axons and dendrites through the whole body to avoid breaking things.

What I want to do is write an OpenGL 2.0 renderer for the real-time engine. I know I can only run it if I have a 3Dlabs Wildcat VP (I don't), but by the time I manage to finish it (a distant future) I expect I'll already have access to an OpenGL 2.0 sample implementation. The design can be started now, as the specifications are on the way. The new engine should make heavy use of shaders: vertex shaders for effects like "Wave" and for skinning, fragment shaders for the complex materials (even procedural ones). I think, but I'm not sure, that the design option of Wavk for the material editor is good enough to allow converting materials directly to OpenGL 2.0, using the 8 texture layers plus shaders to combine them: blending, bump mapping, reflections... Promising, promising...

I'm still not entirely sure what this message is for; first I need to sort out my near future, which will decide my mid-term future. One of the options would turn doing this for Blender into my job, but that is not a very attractive option yet.

Who is in charge of, or doing something with, the engine? I need to know what I can touch and what I can't. I've heard about people asking for a complete rewrite of the engine. I don't consider this necessary, although the code should be cleaned up a lot. I'd like to work on the rendering pipeline.

(Bandoler)

Cyberdigitus
Posts: 65
Joined: Tue Oct 29, 2002 3:27 pm
Location: Belgium

Postby Cyberdigitus » Tue Nov 05, 2002 7:17 pm

Wow, you really want to tackle the implementation of an OpenGL 2.0 renderer? That would be great; I was already going to ask about that. But how close is 3Dlabs' proposal to acceptance by the other OpenGL board representatives?

I also read that maybe Blender should be completely rewritten for OpenGL 2.0, its interface as well, not only the game engine. Someone was proposing this.

And would this mean we would have building blocks to create vertex/pixel shaders?

Bandoler
Posts: 53
Joined: Mon Oct 14, 2002 3:16 pm
Location: Somewhere between the 1 and the 0

Postby Bandoler » Tue Nov 05, 2002 8:12 pm

Cyberdigitus wrote: Wow, you really want to tackle the implementation of an OpenGL 2.0 renderer? That would be great; I was already going to ask about that.

I'm just considering it. But of course I would prefer not to do it alone; that's why I'm asking for other people's opinions.

Cyberdigitus wrote: But how close is 3Dlabs' proposal to acceptance by the other OpenGL board representatives?

According to 3Dlabs, they are receiving very positive feedback from the ARB; I think the shading language has already been accepted. However, there's nothing official yet. Anyway, sooner or later we will have an OpenGL 2.0, and I don't think its features will be very different from those proposed by 3Dlabs. And the features should be our concern at this early stage.

Cyberdigitus wrote: I also read that maybe Blender should be completely rewritten for OpenGL 2.0, its interface as well, not only the game engine. Someone was proposing this.

Yes, but I'm afraid this would involve a lot more changes. Trying it on the real-time engine is a good start, as it has its own, simpler rendering pipeline.

Cyberdigitus wrote: And would this mean we would have building blocks to create vertex/pixel shaders?


I'm afraid I don't understand this question.

(Bandoler)

Cyberdigitus
Posts: 65
Joined: Tue Oct 29, 2002 3:27 pm
Location: Belgium

Postby Cyberdigitus » Wed Nov 06, 2002 1:28 pm

I mean, could we then create our own shaders/texture effects through the OpenGL 2.0 shading language? Maybe we could create a realtime-buttons 'block' (I don't know the name) to access that, or write a Python wrapper. Nah, I'm just thinking out loud here; that last idea might not be feasible.

Bandoler
Posts: 53
Joined: Mon Oct 14, 2002 3:16 pm
Location: Somewhere between the 1 and the 0

Postby Bandoler » Wed Nov 06, 2002 2:29 pm

Cyberdigitus wrote: I mean, could we then create our own shaders/texture effects through the OpenGL 2.0 shading language? Maybe we could create a realtime-buttons 'block' (I don't know the name) to access that, or write a Python wrapper. Nah, I'm just thinking out loud here; that last idea might not be feasible.


Well, these are very concrete features I wouldn't address yet. However, the idea would be to deal with higher-level things: to build the shaders from the Blender material definition. It would be possible to do what you say, but I don't think anyone would develop shaders that way better than by writing the code directly, or by using specialized tools similar to NVIDIA's Effects Browser that will surely appear soon.
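
Just to show what I mean by "higher level", something along these lines could generate a fragment shader from the material's texture layers. This is only a rough sketch, with made-up structures and only a guess at what the final shading language will look like:

Code:
#include <cstdio>
#include <string>
#include <vector>

struct TextureLayer {
    enum BlendMode { MIX, MULTIPLY, ADD } blend;
    int uvChannel;                     // which texture coordinate set to use
};

struct EngineMaterial {
    std::vector<TextureLayer> layers;  // Blender allows up to 8 of these
};

// Build fragment shader source that applies the material's layers in order.
std::string BuildFragmentShader(const EngineMaterial& mat)
{
    std::string src =
        "uniform sampler2D tex[8];\n"
        "void main() {\n"
        "    vec4 col = gl_Color;\n";

    for (size_t i = 0; i < mat.layers.size(); ++i) {
        const char* op = "col = mix(col, t, t.a);";        // MIX
        if (mat.layers[i].blend == TextureLayer::MULTIPLY) op = "col *= t;";
        if (mat.layers[i].blend == TextureLayer::ADD)      op = "col += t;";

        char line[160];
        sprintf(line, "    { vec4 t = texture2D(tex[%d], gl_TexCoord[%d].st); %s }\n",
                (int)i, mat.layers[i].uvChannel, op);
        src += line;
    }
    src += "    gl_FragColor = col;\n}\n";
    return src;
}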

(Bandoler)

Cyberdigitus
Posts: 65
Joined: Tue Oct 29, 2002 3:27 pm
Location: Belgium

Postby Cyberdigitus » Wed Nov 06, 2002 7:46 pm

Have you heard of OpenML? Maybe we should consider that too, for the post-production part of Blender, and in the end as one coherent whole; that might be the way to implement video in the game engine.

Of course OpenML is still a way off, but we'd better be ahead of the game a bit.

On a side note: is OpenAL also related to OpenGL / OpenML?
Last edited by Cyberdigitus on Wed Nov 06, 2002 7:54 pm, edited 2 times in total.

Cyberdigitus
Posts: 65
Joined: Tue Oct 29, 2002 3:27 pm
Location: Belgium

Postby Cyberdigitus » Wed Nov 06, 2002 7:52 pm

Do you mean we'd create shaders in an external dev tool and provide them as adjustable presets in the material 'editor' (sorry for the quotation marks :-) ), like the materials are now?

That would be a nice start, given an option to load more of them into it.

jeotero
Posts: 107
Joined: Wed Oct 16, 2002 5:31 am

Postby jeotero » Thu Nov 07, 2002 7:22 am

interesting, thanks

erwin
Posts: 86
Joined: Mon Oct 21, 2002 9:10 pm

replacing graphics renderer in the game engine

Postby erwin » Mon Nov 11, 2002 2:51 pm

Bandoler is on the right track I think, although replacing the real-time graphics renderer with an OpenGL 2.0 or other abstraction shouldn't be that hard.
First of all, the current graphics part is very small, only about 10 classes.
Secondly, most of the modules are not dependent on the Rasterizer.

There has been no effort put into the graphics Rasterizer and optimizations, because the process of creating games without programming was considered a higher priority.

One idea I had was adding support for several renderers (including their optimizations, like portal culling). Imagine adding the Quake 2 renderer or the Crystal Space / Nebula Device renderer into the game engine, while keeping the other modules as they are. That gives an idea of the level at which the abstraction needs to sit. Usually the renderer has a large impact on the overall engine, and I think the dependency is not that bad in Ketsji.
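
Roughly, the boundary can stay as narrow as something like this (a sketch with made-up names, not the actual Ketsji classes), which is about all the main loop would ever need to see:

Code:
// Hypothetical, deliberately narrow renderer boundary.
class IGraphicsRenderer {
public:
    virtual ~IGraphicsRenderer() {}
    virtual bool BeginFrame(double time) = 0;
    virtual void SetViewMatrix(const float viewmat[16]) = 0;
    virtual void SetProjectionMatrix(const float projmat[16]) = 0;
    // Each backend does its own culling (portals, PVS, ...) in here.
    virtual void RenderScene() = 0;
    virtual void EndFrame() = 0;
};

// The main loop then only does something like:
//   renderer->BeginFrame(curtime);
//   renderer->SetViewMatrix(cam_view);
//   renderer->SetProjectionMatrix(cam_proj);
//   renderer->RenderScene();
//   renderer->EndFrame();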

jeotero
Posts: 107
Joined: Wed Oct 16, 2002 5:31 am

Postby jeotero » Tue Nov 12, 2002 3:01 am

Why shouldn't this be that hard? :) Just look at all the plugins for other applications that allow exporting models and scripts directly from them.

This could be a great idea: export the geometry, textures, logic bricks, etc. to PS2, GameCube, Xbox, Crystal Space, Quake 2, Pocket PC, Java3D and so on.

Panther
Posts: 86
Joined: Tue Mar 04, 2003 7:55 pm

Postby Panther » Thu Apr 03, 2003 1:49 pm

Hi Bandoler,

If you're serious about implementing OpenGL 2.0 in the GameEngine (or Blender in general, for that matter), you should probably talk to Erwin (the original GameEngine creator) or Hos (who manages the Tuhopuu build of Blender).

It's great to see another developer who wants to push Blender forward :D

I've been looking into NVIDIA's CgFX, but OpenGL 2.0 is definitely the way to go!!!

Personally, I'm not a true (good) developer, but if I can help you in any way, please let me know!!!

Bandoler
Posts: 53
Joined: Mon Oct 14, 2002 3:16 pm
Location: Somewhere between the 1 and the 0

Postby Bandoler » Thu Apr 03, 2003 2:16 pm

Hello,
As you can see, I posted that message back in November, but since then I haven't done much, for personal reasons (I finished and presented my graduation project, started looking for a job, etc.). In fact I've been thinking about it and examining some other engines like Crystal Space and NeoEngine (which is the one I'm currently fighting with).
OpenGL 2.0 seemed to be a very good option, but it is not being developed very fast, and right now the only way to use the incomplete beta drivers is to purchase a 3Dlabs Wildcat VP, which I'm not planning to do yet.

From my point of view, and as I've stated in some other messages, "the way to go" is to integrate an existing engine for the graphics part of the Blender real-time player. The logic and scripting seem great to me as they are. The sound I don't know about (I haven't looked at it). As I haven't had much time these past months, I've decided to wait for Blender to cool down (right now the engine is not very stable) and decide what to do later.
I was also waiting to see what human resources were available and what other ideas came up. I have to say I was a little disappointed, because it seems there are not many people who are both interested in developing the engine and capable of doing it.
Fortunately, I'll be joining a university department next month to start research towards a PhD related to computer graphics and game development. Maybe I'll find some time to look at this seriously then. The thing is, it is a small university with limited resources, so we will possibly work with some free engine.

It's good to see other people interested in this.

(bandoler)

