64-bit rendering: the quick and dirty way (works right now)

Blender's renderer and external renderer export

Moderators: jesterKing, stiv

ideasman
Posts: 0
Joined: Tue Feb 25, 2003 2:37 pm

64-bit rendering: the quick and dirty way (works right now)

Post by ideasman »

Hi, I have been looking at how to get 64-bit images out of Blender.

I can't code in the rendering area yet, so I looked at some other ways and discovered a really simple method (though dirty, as I have mentioned):

Render larger: 3x the size, with no/less AA.

Convert the image to 64 bpp in an external app (ImageMagick/CinePaint).

Scale back down by 3.

That's it.


ImageMagick is simple and can be automated easily.
The lines below convert an image to 16 bits per channel, then scale it down:
convert test.png -depth 16 test2.png
convert test2.png -resize 300x300 test3.png

I'm not sure if combining these into one command makes it compute the scale at 16 bpp; two separate commands definitely will.

And voilà: 64 bpp.

Here's the scale ratio:

1 px = 8 bits
2 px = 9 bits
3 px = 10 bits
4 px = 11 bits
5 px = 12 bits
6 px = 13 bits
7 px = 14 bits
8 px = 15 bits
9 px scaled down to 1 = 16 bits

Therefore, rendering at 3x the width/height will give a 64-bit image when scaled back down.
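The core idea, that averaging several 8-bit subpixels can produce values finer than the 8-bit grid, can be sketched in a few lines of Python. This is my own minimal illustration, not part of the method above, and the subpixel values are made up:

```python
# A minimal sketch: nine 8-bit subpixels averaged into one pixel can
# land between the 1/255 steps of the source, i.e. the result carries
# more than 8 bits of tonal precision. Pure Python, no image libraries.

STEP_8BIT = 1 / 255   # size of one 8-bit quantisation step

def downsample(samples):
    """Average a 3x3 block of subpixel samples into one output pixel."""
    return sum(samples) / len(samples)

# Nine subpixels straddling a gentle gradient: five at level 100/255,
# four at level 101/255 (values an 8-bit render could actually produce).
block = [100 * STEP_8BIT] * 5 + [101 * STEP_8BIT] * 4

pixel = downsample(block)   # 100.444... / 255

# The averaged value falls between two adjacent 8-bit levels, so storing
# it faithfully needs a deeper format (e.g. 16 bits per channel).
on_8bit_grid = abs(round(pixel / STEP_8BIT) - pixel / STEP_8BIT) < 1e-9
print(on_8bit_grid)   # False: the value is finer than the 8-bit grid
```

Of course, as noted below in the thread, this only recovers detail where the renderer's subpixels actually differ; in large flat areas the average reproduces the same quantised value.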

- Cam

Usagi
Posts: 0
Joined: Wed Jan 14, 2004 8:26 pm

Post by Usagi »

What is a 64-bit image good for? :shock:

ideasman
Posts: 0
Joined: Tue Feb 25, 2003 2:37 pm

Post by ideasman »

Okay: for film post-processing they only use 64/48-bit image processing.

24-bit images do usually look fine, but after colour correction you tend to see quantization in the colours (though subtle).

Try CinePaint: do curves on a 24-bit image and the same on a 48-bit image.
You can see the difference.
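The effect of doing curves at different depths can be shown numerically. This is my own illustration with a made-up brightening curve, not CinePaint's behaviour: the same curve applied to 8-bit data leaves fewer distinct display levels (banding) than when applied to 16-bit data.

```python
# Hypothetical colour curve applied at 8-bit vs 16-bit precision.
# The curve and the level counting are illustrative only.

def curve(v):
    """A gamma-style brightening curve on a 0..1 value."""
    return v ** 0.5

def display_levels_after_curve(bits):
    """Apply the curve to every level of a `bits`-deep channel,
    then count the distinct 8-bit display values that survive."""
    n = 2 ** bits - 1
    return len({round(curve(i / n) * 255) for i in range(n + 1)})

# At 8 bits the curve merges and skips levels (visible as banding after
# correction); at 16 bits every display level is still reachable.
print(display_levels_after_curve(8) < display_levels_after_curve(16))  # True
```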

bertram
Posts: 0
Joined: Wed Oct 16, 2002 12:03 am

Post by bertram »

@usagi: Let me say it in other words:
64-bit output is mandatory for using Blender in a professional environment. :)

@ideasman: Your approach to 64-bit output sounds interesting, although you won't be able to reverse the quantisation in areas of an image where colours blend very softly. And exactly these areas may cause the most problematic situations when doing a colour correction.
If Blender did 24-bit dithering from its (assumed) internal 64-bit calculation, your method should work perfectly.

Bertram

ideasman
Posts: 0
Joined: Tue Feb 25, 2003 2:37 pm

Post by ideasman »

That's a good point about the dithering.

This is not really a perfect solution, but it's not that bad either.
We really need 16-bit colour channels saved to a PNG from within Blender.

It's interesting that Blender can load 64-bit PNG images. I assume they are downsampled back to 32, but even so, it means that adding 64-bit support wouldn't be odd (Blender rendering images that it couldn't read).

- Cam

Usagi
Posts: 0
Joined: Wed Jan 14, 2004 8:26 pm

Post by Usagi »

Okay, I understand now. Because displays usually don't support more than 16 million colours, I couldn't think of an application for more than 8 bits per channel. (Except HDRI, but your method doesn't produce HDRI images, does it?)

ideasman
Posts: 0
Joined: Tue Feb 25, 2003 2:37 pm

Post by ideasman »

Nup, no HDRI. Though I think it could be wangled, it still wouldn't be true HDRI.

bertram
Posts: 0
Joined: Wed Oct 16, 2002 12:03 am

Post by bertram »

You got it, usagi!

Actually HDRI is just a fancier term for "higher colour depth than the usual 24-bit", isn't it?

When talking about the output of your visual information, 16.7 million colours are good enough, because the human eye doesn't have the perception for any more colours.
But when talking about processing the information, it would be a big problem to use the dynamic range of the output as the dynamic range for working.
In the digital age we can losslessly reproduce any digital media. But when digitising analogue sources and processing them, we will always have to deal with a big loss of information; in the worst cases even more than with processing the analogue media/signal itself. Therefore the working dynamic range has to be a multiple of the target (output) range.

This is why I maintain that nowadays the ability to output 16 bits per channel is no longer a nice-to-have feature. It's much more of an urgently needed basic feature for Blender.
Look ahead: more and more digital cameras and scanners, even in the consumer range, provide colour depths of 12 or even up to 16 bits/channel.


By the way: not only do displays support only up to 16.7 million colours; if your image is used for press, the dot screen allows 256 shades of each colour (Cyan, Magenta, Yellow, [Key/Black]). This is because most of the (aged) typesetters like Agfa, Linotronic, etc. are configured to work with resolutions of 2540 dpi. Using the very common screen ruling of 150 lpi (lines per inch), each halftone dot is ~16 x 16 printer dots (2540 / 150 = 16.9333), which allows building up to 256 sizes of a dot.
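The dot-screen arithmetic can be spelled out (the numbers are the ones quoted in this post; the snippet is just my illustrative check):

```python
# Halftone cell arithmetic: a 2540 dpi typesetter with a 150 lpi
# screen ruling, as quoted above.

dpi = 2540                 # typesetter resolution (dots per inch)
lpi = 150                  # screen ruling (lines per inch)

cell = dpi / lpi           # printer dots along one halftone cell edge
sizes = int(cell) ** 2     # a 16 x 16 cell gives 256 possible dot sizes

print(round(cell, 4), sizes)   # 16.9333 256
```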
In the meantime, techniques like FM screening (a frequency-modulated screen, like your inkjet printer uses, instead of an amplitude-modulated screen like a laser printer or newspaper) or Hexachrome (6 instead of the 4 primary colours) push the demand for a higher dynamic range even in press application areas.
And: sorry if I bored anyone with my essay.

Usagi
Posts: 0
Joined: Wed Jan 14, 2004 8:26 pm

Post by Usagi »

bertram wrote: Actually HDRI is just a fancier term for "higher colour depth than the usual 24-bit", isn't it?
If I'm not totally mistaken, it is not exactly the same. As I understand ideasman's method, it produces more shades of colour, but within the previous range from black to "display white". True HDRI, however, ranges from black to sunlight, which is possibly 1000 times brighter than the brightest white of a monitor screen.

Here is a good explanation: http://www.debevec.org/HDRShop/main-pages/intro.html

ideasman
Posts: 0
Joined: Tue Feb 25, 2003 2:37 pm

Post by ideasman »

HDRI would be more useful than 64-bit images.

matt_e
Posts: 410
Joined: Mon Oct 14, 2002 4:32 am
Location: Sydney, Australia
Contact:

Post by matt_e »

Bertram - I absolutely agree with you on the importance of higher precision output!
bertram wrote: You got it, usagi!

Actually HDRI is just a fancier term for "higher colour depth than the usual 24-bit", isn't it?
No - as mentioned, HDRI contains information above and below the values that you see on screen. You can bump up the exposure to see details that were under-exposed before, or reduce the exposure to see details that were over-exposed. So for example, instead of a range from 0 (black) to 1 (white), you can have a range from -5 (underexposed) to 0 (black) to 1 (white) to 5 (overexposed).

More bits per channel just means that you're representing the image you already see with more definition: there are more steps between the 0 and the 1. Which is of course very useful in its own way, but serves a different purpose.
bertram wrote: When talking about the output of your visual information, 16.7 million colours are good enough, because the human eye doesn't have the perception for any more colours.
Not necessarily. For example, if you're doing something in black and white (greyscale), you only have 256 levels of brightness; this can be clearly visible in the form of banding on gradients, even when the image hasn't been post-processed.

Most film is digitised at 16 bits per channel, mainly because film itself has a higher dynamic range than 24-bit RGB, and also because losses in the dynamic range of your image are much more noticeable when viewed on a powerful projector (which has a much higher dynamic range than your average PC display). If you look at an 8bpc image beside a 16bpc image on a film projector, the difference should be discernible. Incidentally, this is the story behind CinePaint (aka Film Gimp). The studios needed to work in 16bpc; the standard GIMP maintainers didn't want it, or wouldn't allow the patches, or something, so they forked the GIMP to make their own 16-bit version.
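The exposure idea can be sketched in a few lines (my own minimal illustration; the function names and values are made up, not from any real HDR format):

```python
# Low-dynamic-range clipping vs HDR exposure adjustment.

def to_display(value):
    """Clip a linear value into the 0..1 range an ordinary image stores."""
    return min(max(value, 0.0), 1.0)

def expose(value, stops):
    """Scale a linear HDR value by 2**stops, as an exposure control would."""
    return value * (2 ** stops)

sky = 4.0                                  # an over-bright HDR pixel value
clipped = to_display(sky)                  # LDR: detail is gone once clipped
recovered = to_display(expose(sky, -3))    # HDR: -3 stops brings it to 0.5

print(clipped, recovered)   # 1.0 0.5
```

The point is that `sky` keeps its value 4.0 in the HDR file, so pulling exposure down later recovers detail; an ordinary image stores only the clipped 1.0.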

bertram
Posts: 0
Joined: Wed Oct 16, 2002 12:03 am

Post by bertram »

matt_e wrote: No - as mentioned, HDRI contains information above and below the values that you see on screen.
But at least HDRI is plain and simple 16bit/channel RGB, isn't it?
The name "HDRI" only indicates the special way in which the gradient of the source image(s) is mapped to the final target "HDRI" image gradient.
matt_e wrote: Not necessarily. For example, if you're doing something in black and white (greyscale), you only have 256 levels of brightness; this can be clearly visible in the form of banding on gradients.
Well, surely this is true for 8-bit, although you may achieve a multiple of 256 "greyscales" in 24-bit RGB by incrementing only one channel at a time instead of all three RGB values. This gives slight variations in hue, but that is almost unnoticeable.
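The one-channel-at-a-time trick can be counted out in a short sketch (my own illustration; it uses a plain channel average instead of real luma weights):

```python
# Counting the extra "greyscales" available in 24-bit RGB when only one
# channel is stepped at a time (plain average stands in for real luma).

def brightness(r, g, b):
    """Simple average; real luma weights (e.g. Rec.601) would differ."""
    return (r + g + b) / 3

levels = set()
for grey in range(255):
    levels.add(brightness(grey, grey, grey))          # pure grey
    levels.add(brightness(grey + 1, grey, grey))      # one channel ahead
    levels.add(brightness(grey + 1, grey + 1, grey))  # two channels ahead
levels.add(brightness(255, 255, 255))

print(len(levels))   # 766: roughly three times the 256 pure greys
```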

Another reason for heavy banding is the gamma correction that is applied (1) by Blender, (2) by your graphics card, (3) by your imaging software, ... This can make the best input image look awful on screen!

But back to 24-bit: 24 bits would also be enough for film, as it would be dithered. The dithering pattern would, if at all, be noticeable as a very slight extra noise.
Look at HDCAM, which "only" works in 10bpc or 12bpc and was used for feature films like Episode 2...

In my opinion, processing and storing information at maximum resolution is a legitimate interest as long as it is economically justifiable. But exposing 16bpc to film is simply overdone, though granted, it is the standard.

matt_e
Posts: 410
Joined: Mon Oct 14, 2002 4:32 am
Location: Sydney, Australia
Contact:

Post by matt_e »

bertram wrote: But at least HDRI is plain and simple 16bit/channel RGB, isn't it?
The name "HDRI" only indicates the special way in which the gradient of the source image(s) is mapped to the final target "HDRI" image gradient.
I think it depends on the specific format. I found this SIGGRAPH presentation with a bit of googling that describes it well:

http://www.debevec.org/IBL2003/GWcourseTalk-IBL2003.pdf

bertram
Posts: 0
Joined: Wed Oct 16, 2002 12:03 am

Post by bertram »

I had a look at the presentation. This is the point where I have to quit the discussion about colour space, because it becomes too theoretical and high-level for me :wink:
The only conclusion I can draw is that my information about the perception of the human eye was obviously outdated and therefore wrong.
I assume that if Blender could generate 16bpc output, it should also be able to allow HDRI output, at least with the help of a little tweaking in S&L done by the artist himself.

loos
Posts: 0
Joined: Mon Jun 09, 2003 3:43 pm
Location: South Jordan, UT

Using HDRI vs Outputting HDRI

Post by loos »

Don't most 3D programs use HDRI instead of outputting it? I remember Dr.<?> Debevec taking a bunch of pictures at different f-stops, then combining all that information into an HDRI image. But most modelling programs seem to use that information in reflections.

For example, if you have a black pool ball on a table right beside a window on a very sunny day, you can't see anything out of the window because it's too bright, but in the reflection of the outside on the pool ball you can see that there is a tree outside, because the light has lost some of its intensity. I could be way off, but that's the way I've seen it used.

So outputting HDRI doesn't seem like something we'd need Blender to do (maybe talk to the GIMP people?). It'd be really nice if Blender could use HDRI images, though. Maybe it's already in Yafray?
