Are there any plans for HDRI use from tiff or jpg formats?

Currently, the HDRI function only works with the .hdr extension. I was able to create hdr images in jpg and tiff formats, but getting them into .hdr has been near impossible. Photoshop saves in .hdt, QTPFSGUI gets me nowhere and, supposedly, GIMP can do this, but I can't figure out how.

Any suggestions? And are there plans to add this functionality?
 
Hi,
JPG is not an HDR file format. It is LDR. But you can also use OpenEXR files, which is another very popular HDR file format.

Bye,
Martin
 
Martin,

Thank you. The GUI suggested otherwise, but I'll try that. I wonder why the tutorials I found in several places indicated that you could save an 8-bit HDR jpg. It sounded odd to me, also, but this explains a lot.
 
There are photos that people take in HDR format (multiple shots, EV bracketed), then reduce to LDR jpegs which have a certain almost surreal look to the colors. (example at http://www.flickr.com/photos/tboard/3394025867/in/photostream/ ) They call them HDR because of the look, but they really aren't. There are software filters that can put that certain look on regular photos, but they don't have the dynamic range for lighting, either. HDR lighting really needs the real dynamic range in order to do its job best, I think.

I've been messing around with HDR panoramas a little, assembling them with Autopano software, and am not too good at it yet. I'm hoping to be able to render with them. My camera will automatically bracket three shots at 2EV spacing, so the darkest and brightest shots are 4EV apart. I'll probably experiment with manual bracketing to get a wider range. If I'm doing the math right, each EV doubles the light, so adds one bit to the range from black to white.
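As a sanity check on that arithmetic, here's a tiny Python sketch (the -2/0/+2 offsets are just my camera's auto-bracket; nothing else is assumed):

```python
# Sanity check on the EV arithmetic: each EV doubles the light, so a bracket
# whose extreme shots are N EV apart covers a 2**N brightness ratio, i.e.
# roughly N extra bits of range on top of what a single exposure captures.
ev_offsets = [-2, 0, +2]                      # a 3-shot auto-bracket at 2 EV spacing

span_ev = max(ev_offsets) - min(ev_offsets)   # 4 EV between darkest and brightest shot
ratio = 2 ** span_ev                          # 16:1 extra contrast ratio captured

print(f"Bracket spans {span_ev} EV -> {ratio}:1 ratio, ~{span_ev} extra bits")
```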

Bill in MN
 
Yes, I believe I understand. The jpg I produced looks very cool, as it brings all the highs, mids and lows out with even contrast, but the file itself does not store any light source information, as an HDR would.
 
HDR isn't storing "light source" info, it's simply storing more dynamic range. When you look around a scene, your eyes are constantly changing the exposure (by varying the aperture) so you don't get blinded. Your mental image of the world is horribly inaccurate -- nothing is "white" or "black", merely lighter or darker than the other stuff you're looking at at the time, as evidenced when you naively point a camera at something and take a photo... backlit subjects look black OR have washed-out backgrounds.

HDR simply tries to capture the actual brightness of a scene. Now, the problem is that no real camera can actually capture an HDR scene, so they approximate it by shooting multiple photos (some over-exposed, some under-exposed) -- it's called "bracketing". So you might shoot five photos: one five stops under-exposed, one two stops under-exposed, one "correctly" exposed, one two stops over-exposed, and one five stops over-exposed.

Each stop is a doubling of brightness, so you then take the parts of each image that are not blown out (pure white or black) and combine them, scaling each by its relative exposure (a factor of 32 for the shot five stops under-exposed, 4 for the one two stops under-exposed, and so on). Photoshop (for example) can now do this automatically.
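In rough Python, the merge looks something like the sketch below. This is only an illustration of the idea, not what Photoshop actually does: the file names and EV values are placeholders, the gamma-2.2 decode is an approximation, and it assumes OpenCV is available and the shots are already aligned.

```python
import cv2
import numpy as np

# Sketch of the merge described above: linearize each bracketed shot, scale it
# by its relative exposure, and average the results, trusting mid-tone pixels
# most and near-black / near-white pixels hardly at all.
shots = [("under_2ev.jpg", -2), ("normal.jpg", 0), ("over_2ev.jpg", +2)]

accum = None
weight_sum = None

for path, ev in shots:
    img = cv2.imread(path).astype(np.float32) / 255.0   # 8-bit -> [0, 1]
    linear = img ** 2.2                                  # rough gamma decode
    radiance = linear / (2.0 ** ev)                      # undo the exposure offset

    # Trust mid-tones most; give near-black and near-white pixels almost no weight.
    weight = np.clip(1.0 - np.abs(img - 0.5) * 2.0, 0.01, 1.0)

    accum = radiance * weight if accum is None else accum + radiance * weight
    weight_sum = weight if weight_sum is None else weight_sum + weight

hdr = accum / weight_sum                                 # scene-referred, linear
cv2.imwrite("merged.hdr", hdr.astype(np.float32))        # Radiance .hdr output
```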

This gives you probably something like 3x the dynamic range of a typical photograph -- at least in ideal cases. It may be impossible to bracket photographs of light sources adequately (point a camera at the sun and you're not going to be able to under-expose it... 1/8000th of a second at f32, ISO 80 will STILL be blown).

So what you're getting is only an approximation of a "true" HDR image (which would require either a very expensive, specially constructed camera OR computer rendering), but it's WAY better than anything you get with a single exposure.

Close enough for government work.

You can fake up an HDR in Photoshop (or similar) by loading a non-HDR image, going into 16-bit mode and dialing up brightness by a bunch of stops and then painting bright pastel colors over light sources.
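Scripted, that fake-up might look roughly like this (Python with OpenCV; the file name, the highlight threshold, and the boost factors are all placeholder guesses):

```python
import cv2
import numpy as np

# Rough sketch of "faking" an HDR from a single LDR image: decode to linear,
# boost the overall exposure, then push the brightest pixels (the presumed
# light sources) well above 1.0 so they behave like actual lights.
ldr = cv2.imread("photo.jpg").astype(np.float32) / 255.0

linear = ldr ** 2.2                      # rough gamma decode to linear light
fake_hdr = linear * 4.0                  # dial the whole image up a couple of stops

# Anything that was nearly white probably was a light source or the sky;
# push it far beyond 1.0 so it contributes strong illumination.
highlights = (ldr.max(axis=2, keepdims=True) > 0.95).astype(np.float32)
fake_hdr += highlights * 50.0

cv2.imwrite("fake.hdr", fake_hdr.astype(np.float32))   # Radiance .hdr output
```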
 
Ever present and helpful, thank you.

Given the way I'd understood this in the past (the excerpt below is one example), I apparently took the stored radiance/luminance values to be light-source information.

"Information stored in high dynamic range images usually corresponds to the physical values of luminance or radiance that can be observed in the real world. This is different from traditional digital images, which represent colors that should appear on a monitor or a paper print. Therefore, HDR image formats are often called "scene-referred", in contrast to traditional digital images, which are "device-referred" or "output-referred". Furthermore, traditional images are usually encoded for the human visual system (maximizing the visual information stored in the fixed number of bits), which is usually called "gamma encoding" or "gamma correction". The values stored for HDR images are often linear, which means that they represent relative or absolute values of radiance or luminance (gamma 1.0).

HDR images require a higher number of bits per color channel than traditional images, both because of the linear encoding and because they need to represent values from 10^-4 to 10^8 (the range of visible luminance values) or more. 16-bit ("half precision") or 32-bit floating point numbers are often used to represent HDR pixels. However, when the appropriate transfer function is used, HDR pixels for some applications can be represented with as few as 10–12 bits for luminance and 8 bits for chrominance without introducing any visible quantization artifacts.[6]"

Source: http://en.wikipedia.org/wiki/High_dynamic_range_imaging
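The "gamma encoding" versus "linear (gamma 1.0)" point in that quote is just this conversion. A minimal sketch using the standard sRGB decode formula (the pixel value is arbitrary):

```python
# An ordinary 8-bit image stores gamma-encoded values tuned for a display;
# an HDR image stores linear values proportional to scene radiance.
def srgb_to_linear(c):
    """Decode one sRGB channel value in [0, 1] to linear light."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

pixel = 128 / 255.0                        # an arbitrary mid-grey 8-bit value
print(f"encoded {pixel:.3f} -> linear {srgb_to_linear(pixel):.3f}")
# Roughly 0.502 -> 0.216: half of "display white" is only ~22% of the light.
```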

I do understand the process, in general. I took my photo of the sky with sharp, bright clouds, using -2, 0, and +2, which is all my wife's camera could handle (she wouldn't let me buy the nicer, more professional one :frown: ). I then put it in Photoshop and bounced out the .hdt, which, of course, is not usable in Cheetah3D. I'm merely trying to get that sucker into Cheetah3D without having to buy more stuff.
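One thing I might try, without buying anything, is getting a 32-bit float TIFF out of Photoshop instead and rewriting it as a Radiance .hdr with a few lines of Python. Just a sketch, assuming OpenCV is installed and its build can read floating-point TIFFs; the file names are placeholders:

```python
import cv2

# Sketch of rewriting a 32-bit float TIFF (saved from Photoshop's HDR merge)
# as a Radiance .hdr file. IMREAD_UNCHANGED keeps the full bit depth.
img = cv2.imread("merged_from_photoshop.tif", cv2.IMREAD_UNCHANGED)

if img is None:
    raise SystemExit("Could not read the TIFF -- check the path and bit depth.")

cv2.imwrite("merged_for_cheetah3d.hdr", img.astype("float32"))
```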

Now, if I apply the new image to the HDR tag in any way, it comes out blank (as expected). If I apply it to a sphere and put my scene inside it (as I did with the River Bank scene I posted in the Gallery section), it will work. That's obviously not the way to go. Meanwhile, it appears that HDRI helps to point lighting from the appropriate directions and at the appropriate intensities. Regardless of how this comes out, I'm very happy with the information that all of you have provided. It has been an education. :)
 