Normal Map Bake

Are you guys talking about the same thing? I think all you need is to apply the directional shader, put a radiosity tag on your camera, and bake that. You'll get an unwrapped normal map.
 
Typically, baking normal maps means baking the normals of a complex model onto simpler geometry. C3D has no intrinsic support for this, so you need to UV-map your simple and complex objects in a compatible way: bake normals on one object, then map them onto the other.

See the example file (attached).

The directional shader that ships with C3D isn't well set up. I'd use it as a starting point but map it to emissive instead of diffuse, and set the shader's diffuse and specular colors to BLACK. Now you can bake without changing the lighting setup.

If you don't like the directionality of the normal map you can use vec2float and float2vec nodes to remap the RGB channels of the directional node.
 

Attachments

  • renders.png (194.6 KB)
  • Screen Shot 2013-03-11 at 5.09.45 PM.png (35.3 KB)
  • Screen Shot 2013-03-11 at 5.06.37 PM.png (53.6 KB)
  • Screen Shot 2013-03-11 at 5.06.30 PM.png (47.7 KB)
  • baking normal map.jas.zip (16.6 KB)
Hm…

Unfortunately, that doesn't work for anything more complex than planes. Say we have a character or a car and want to bake normals from a high-res mesh onto a low-res version… it looks like you can't do that in Cheetah 3D? :/
 
This is what it looks like on a torus:
Torus.png


And here's the baked normal map:
Torus-Norm.png

That's not the kind of normal map you want to bake. You want the normal of the source mesh relative to the target mesh's normal (i.e. a tangent-space map), so you'd expect the normal map for a torus to be more-or-less uniformly blue.

If you've UV-mapped your low-poly mesh, it's actually pretty straightforward to export it and the high-poly mesh to Blender and bake the normal map there. (It wouldn't be too hard to write a script that reduces baking between two given meshes to a single keystroke in Blender.) Here's a link to a tutorial:

http://cgcookie.com/blender/2010/06/30/normal_maps_blender_2_5/
 

Attachments

  • example normal map.png (194.3 KB)
@podperson: Yes, yours is what everyone on this thread is after. The normal maps you can bake in Blender (and Maya, and others) are the typical normal maps you'd expect, used to slightly "tweak" existing normals.

But looking at this thread is making me think the directional->emissive node patch might be a solution for what I'm doing. I'm making a game engine for iOS and loading from .jas files. Unfortunately, I have to calculate normals on the CPU while loading model data, because no normals are saved in a .jas file. Looking at the normal map created from the directional node, this might be a way to eliminate building normals on the CPU (decreasing load time). I could pass zeros for my normals on load and then have my fragment shader override them, using this directional->emissive kind of map to set the normals. Maybe? I'll try it out.
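For context, the CPU work being avoided is per-vertex normal generation. A minimal sketch of that step (plain NumPy, not the actual loader — the mesh layout is assumed): accumulate each face's normal onto its vertices, then normalize. This is the per-load cost a baked map would skip.

```python
import numpy as np

def vertex_normals(vertices, faces):
    """Build per-vertex normals for a mesh that ships without them:
    sum the (area-weighted) face normals at each vertex, then normalize."""
    normals = np.zeros_like(vertices, dtype=float)
    for i0, i1, i2 in faces:
        face_n = np.cross(vertices[i1] - vertices[i0],
                          vertices[i2] - vertices[i0])
        normals[[i0, i1, i2]] += face_n
    lengths = np.linalg.norm(normals, axis=1, keepdims=True)
    return normals / np.where(lengths == 0, 1.0, lengths)
```

For a single triangle in the XY plane with counter-clockwise winding, every vertex normal comes out as (0, 0, 1).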
 

It won't work if the objects ever rotate, since the baked normals are stored relative to a fixed space and will be totally wrong once the object turns.
 
Shaders can apply vector transformations, and the RGB data are just vectors. Whether they're calculated on load or read from RGB data, why should it matter where the vectors come from?
 

Of course it doesn't matter where the vector comes from. The problem is that you're conflating disparate parts of the pipeline.

While I think about it, your idea has a bigger problem: a bake stores colors clamped to [0, 1], so negative components in the vectors are simply lost unless you remap them first.
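For concreteness, standard normal maps sidestep the negative-component problem by remapping each component from [-1, 1] to [0, 1] before storing, with the shader inverting the remap on read — which is also why an undisturbed tangent-space map looks pale blue. A raw emissive bake skips the remap and just clamps (a sketch with made-up values):

```python
import numpy as np

def encode_normal(n):
    """Standard normal-map encoding: remap each component from [-1, 1] to [0, 1]."""
    n = np.asarray(n, dtype=float)
    return (n / np.linalg.norm(n) + 1.0) * 0.5

def decode_normal(rgb):
    """Inverse remap, as a fragment shader would do: rgb * 2 - 1, renormalized."""
    n = np.asarray(rgb, dtype=float) * 2.0 - 1.0
    return n / np.linalg.norm(n)

# An unperturbed tangent-space normal (0, 0, 1) encodes to the familiar
# pale blue (0.5, 0.5, 1.0):
flat = encode_normal([0.0, 0.0, 1.0])

# A raw emissive bake clamps color to [0, 1], so any negative
# component is silently destroyed:
raw = np.clip([-0.5, 0.2, 0.84], 0.0, 1.0)
```

The encode/decode pair round-trips exactly; the clamped `raw` value does not — its x component is gone.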

I think this is a pretty complicated workaround for the relatively simple approach of baking more standard normal maps using Blender (say).
 
I also use the Blender round-trip.

Is there anything coming in the future that would allow baking normal maps to UVs? Baking high-resolution detail onto low-resolution meshes is very important to me, as I work on real-time graphics.

It would catapult Cheetah 3D to the next level.
 