Cheetah3D 7.3 Beta 1

Sorry for my absence lately. Here are just some short notes about the glTF/glb support.

First of all, glTF support isn't finished yet, but my goal is full support of all glTF features. The next beta will already offer feature-complete material system support: v7.3b2 will add support for metalness, roughness and AO textures. Transparencies will also be supported.
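For anyone curious what those channels look like on the glTF side, here is a rough sketch (a hand-written Python dict for illustration, not actual Cheetah3D exporter output) of how the glTF 2.0 core spec models them: metalness and roughness share one packed texture, AO lives in a separate occlusionTexture, and transparency is switched on per material via alphaMode.

```python
import json

# Sketch of a glTF 2.0 material fragment; the name and texture indices are placeholders.
material = {
    "name": "ExampleMaterial",                      # hypothetical name
    "pbrMetallicRoughness": {
        "baseColorTexture": {"index": 0},           # albedo / base colour
        "metallicRoughnessTexture": {"index": 1},   # B channel = metalness, G channel = roughness
        "metallicFactor": 1.0,
        "roughnessFactor": 1.0,
    },
    "occlusionTexture": {"index": 2},               # ambient occlusion
    "alphaMode": "BLEND",                           # enables transparency via the base colour alpha
}

print(json.dumps({"materials": [material]}, indent=2))
```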

Animation support is also completely missing at the moment. But it won't be missing for much longer.

Features like "receive shadow" or "cast shadow" can't be saved to glTF files. At least not when using the default glTF 2.0 spec.

@podperson: PBR support in FBX is not that easy since FBX simply doesn't have support for metalness or specular level (F0) channels. But I'm quite optimistic that Unity will add glTF support soon. Just read the list of editors who wrote the glTF 2.0 spec (https://github.com/KhronosGroup/glTF/tree/master/specification/2.0). UE4 already has native glTF support. Once Unity adds glTF support I will definitely look into the command-line converter tool.

Bye
Martin
 

Features like "receive shadow" or "cast shadow" can't be saved to glTF files. At least not when using the default glTF 2.0 spec.

Not surprising. And in any event, using a naming convention to deal with any issues in the file format is a simple enough fix for workflow purposes, as long as you don't have to deal with too many of them.
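As a concrete (purely hypothetical) version of that naming-convention idea, something like this could stash the missing shadow flags in the node names and strip them again on import; the suffixes and the helper are my invention, not anything the glTF spec or Cheetah3D defines:

```python
# Hypothetical convention: append flag suffixes to node names the format can't store.
FLAG_SUFFIXES = {"_noCast": ("cast_shadow", False),
                 "_noReceive": ("receive_shadow", False)}

def parse_node_name(name):
    """Return (clean_name, settings) recovered from a suffix-tagged node name."""
    settings = {"cast_shadow": True, "receive_shadow": True}
    changed = True
    while changed:
        changed = False
        for suffix, (key, value) in FLAG_SUFFIXES.items():
            if name.endswith(suffix):
                name = name[: -len(suffix)]
                settings[key] = value
                changed = True
    return name, settings

print(parse_node_name("Wall_noCast_noReceive"))
# ('Wall', {'cast_shadow': False, 'receive_shadow': False})
```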

@podperson: PBR support in FBX is not that easy since FBX simply doesn't have support for metalness or specular level (F0) channels. But I'm quite optimistic that Unity will add glTF support soon.

Nice! (I'm guessing the PBR workflows with other apps presumably rely on writing files directly to Unity rather than FBX itself.)

Would it be possible to pick your preferred "default shader" for new materials as a preference (so that those of us so inclined could switch entirely over to a PBR workflow)?
 
PBR in Falcon

After extensive testing with Falcon I'm putting those PBR shaders to rest now.

Conclusions so far:

Lambertian diffusion, as in the Cheetah material shader, is not physically correct, because all materials become reflective at grazing angles.
The PBR shader handles this with specular reflection at 0.5 and roughness at 1.0.
Under certain conditions one can see the difference, and it may look better.
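(For anyone wondering where the grazing-angle behaviour comes from: it's the Fresnel term. Schlick's approximation, shown here only as an illustration rather than as what Falcon actually implements, is

```latex
F(\theta) = F_0 + (1 - F_0)\,(1 - \cos\theta)^5
```

and as theta approaches 90 degrees it goes to 1 for any F0, so even a matte surface turns mirror-like at the edge. If I read the Disney paper right, a specular value of 0.5 maps to F0 = 0.08 × 0.5 = 0.04, roughly the reflectance of common dielectrics.)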

The downside is that with this shader blurry reflections are always on, which means reflective caustics are needed for energy conservation.
All this increases render times, and currently the adaptive sampler doesn't catch the caustic noise well, so the images still look noisy; disabling adaptive sampling causes prohibitive render times.

So I'll use PBR in Falcon only when absolutely necessary :smile:
EDIT: So I'll use diffuse PBR in Falcon only when absolutely necessary.

[Attached comparison renders: MAT.jpg, PBR.jpg]
 
It sounds like you have some "final conclusions" on an open beta devel version.
I wrote: "conclusions so far".
But the fact that PBR shaders render diffuse surfaces via blurry reflections is a generic one, nothing that Martin could change in a coming beta, no?
Also the fact that blurry reflections put more strain on the renderer.
Only the adaptive threshold thing could be optimized, but even that won't change the conditions mentioned above.

I was quite unaware of the Disney principled shader until Martin introduced it.
After reading the Disney paper I understood that this thing promises more realism in rendering, something I'm quite interested in.
But now I found out that the old (physically incorrect) diffuse shader renders faster by design (diffuse shading is easier than specular shading for any path tracer) without looking visibly wrong (see the Cornell Box renders).
That's indeed a final conclusion I wanted to share in case anyone else is interested in using PBR shaders in Falcon.
Now for metals this doesn't apply; here I see an advantage in rendering highlights, as I posted earlier.

So I see this new shader as a very welcome addition for anyone who wants
- a better workflow wrt material transfer to other apps
- to use image based PBR material packages
- more realistic reflections
- ...

That's what I discovered so far, now I'm going to play with something else (but may come back whenever there is a new exciting development, maybe in the next beta :smile: )
 
@misoversaturated

Impressive research and testing of the newer features. I'd contribute too, but I don't think my knowledge of a rendered image's attributes' technical correctness is at the same level. That's a mouthful:rolleyes:

Thanks for sharing!
--shift studio.
 
I'd contribute too ...

Please do so :smile:

In your wish list entry you talked about metals; by all means try the PBR shader on your projects. In this case there is no cost regarding render times because you'll have specular (better: metallic) reflections no matter which shader.
You'll get nicer reflections at least.

But if you use it with diffuse materials like brick walls and find it renders slowly I'd encourage you to cheat and set the specular value to zero.
Compare and see if it really looks different.
Now you have disabled reflections and the rendering should be the same as with the material shader without reflection: faster.
You don't need to switch to the material shader and can keep any albedo and normal maps (only roughness maps won't affect the render any more) if provided.
Of course you then have departed from the path of physically based rendering, but who cares if it still looks ok?
We worked like this the whole time until PBR appeared in this beta!

This is not a Cheetah thing (or about the early implementation stage), it's how PBR works in all renderers. I just tested in Cycles: render inside a box for two minutes with the diffuse shader, then switch to the principled shader: more noise after the same render time!
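For reference, a minimal sketch of that Cycles A/B test (assumed Blender Python API; the material name and the render-time handling are placeholders, nothing Cheetah-related):

```python
import bpy

def set_surface(mat, node_type):
    """Replace the material's surface shader with a fresh node of node_type."""
    mat.use_nodes = True
    nodes, links = mat.node_tree.nodes, mat.node_tree.links
    out = next(n for n in nodes if n.type == 'OUTPUT_MATERIAL')
    bsdf = nodes.new(node_type)
    links.new(bsdf.outputs['BSDF'], out.inputs['Surface'])

mat = bpy.data.materials.new("NoiseTest")          # hypothetical test material
set_surface(mat, 'ShaderNodeBsdfDiffuse')          # first pass: plain diffuse
# ... render for a fixed wall-clock time, save the image ...
set_surface(mat, 'ShaderNodeBsdfPrincipled')       # second pass: principled BSDF
# ... render again for the same time and compare the noise levels ...
```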
 
Hi Mis

You're as bright as the morning star (and I never compliment anyone lightly), which shows in your analysis. But still, you miss a few points pro and contra.

And it's a simple ground rule: more realism equals more render time, usually exponentially. Without knowing anything about the Disney shader, this was to be expected. The fastest renders you still get with the traditional (mostly Blinn and Phong) materials in any render app. And the least realistic. Sometimes, especially in the background, you can still use them. If you use DOF, it doesn't matter at all. If you go for realism, you have to keep in mind that everything reflects light, even a brick. In some scenarios there should be a big difference between specular at 0 and realistic reflections.

So, in the end, the higher render times shouldn't be a PBR thing but a realism thing, granted that Martin will iron out quirks like the adaptive sampling problems.

Another ground rule is 'fake it if you can'. So the obvious thing to do is simply mix different shaders in a picture, but be aware that it doesn't always mix well, especially not with procedural shaders. The difference in realism does show sometimes. The user has to plan carefully before using materials and to know what will be in the foreground and what will be almost invisible (the same goes for the polycount of objects).

One big plus I mentioned before is the wealth of materials we get at our fingertips with PBR. The delivered materials in Cheetah are few and they only work as a starting point for your own materials, at least for me. And getting that done realistically is time consuming. Always. So, in a way, what you pay in render times you get back in creating materials. While you have to do the latter yourself, you can do whatever pleases you while your computer is at work, be that getting some sleep or cleaning the dishes. Of course, you always have to change the PBR at least a bit, especially the bump and normal settings. Mostly you have to tone down the specular too, as the PBRs tend to be 'over-realistic'.

And you can create PBRs easily yourself, at least with Photoshop. As soon as you have a good photo without (much) reflections (i.e. specular) you can create diffuse, AO, bump and normal maps there without much fuss (there are filters for normal and bump; the rest, like a specular map, is just working with the bump or a black-and-white version, overexposing it and toning it down in the end). Besides Photoshop, there should be some free apps that do the same (the real problem is only the normal map anyway).
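A rough sketch of that normal-from-bump step outside Photoshop (NumPy/Pillow; 'bump.png' and the strength value are placeholders, and the filters mentioned above do essentially the same thing):

```python
import numpy as np
from PIL import Image

# Derive a tangent-space normal map from a grayscale height (bump) image.
height = np.asarray(Image.open("bump.png").convert("L"), dtype=np.float32) / 255.0

strength = 2.0                                 # placeholder bump strength
dy, dx = np.gradient(height)                   # slope of the height field
nx, ny, nz = -dx * strength, -dy * strength, np.ones_like(height)
length = np.sqrt(nx**2 + ny**2 + nz**2)
normal = np.stack([nx, ny, nz], axis=-1) / length[..., None]

# Pack [-1, 1] into [0, 255] RGB, the usual normal-map encoding.
Image.fromarray(((normal * 0.5 + 0.5) * 255).astype(np.uint8)).save("normal.png")
```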

By the way, the weather around here is ugly and wet today, but as soon as it's dry again, this ugly light without much shadow, the enemy of every photographer, is near perfect for getting pictures for image maps.

And the images are another downside of PBRs.

I was away from 3D for several years; sadly I just didn't have the time left for this and couldn't even follow the new developments in this small industry. When I left, some ten years ago, procedural shaders, SSS, volumetrics and so on were the big new thing (three weak points in Cheetah). Away from image based materials and the problems they cause. So I was quite perplexed when I saw the PBRs and with them image based shaders back (even if I should have guessed). And with them the same old problems like extensive (now better and easier but still time consuming) UV mapping, image seams and above all repeated patterns. As today's pictures have a far better resolution, you very often need quite big images for the textures; 8000 px and above are quite normal, something you don't create that easily (and some 66% of my texture collection was ripe for electronic heaven, i.e. the virtual paper bin). Of course, the reason for that is the game engines, and even your procedural materials you have to bake (with a normal map, too) to get them into Unity or UE or whatever. I should have expected this, of course, but I had hoped in general for much, much better procedural materials and with them less reason for UVing (or whatever the correct verb may be), which is something I always hated and try to avoid.

For me, with this kind of material the image based factor is the biggest downside, as you have to be very careful to get the right size of the picture (an 8000 px texture that covers in reality some 50 cm of the wall doesn't help at all if the wall is some 10 m wide and 4 m high) and to avoid repeated patterns. To do this in Cheetah, you have to work on the textures in your paint app, but with layers and/or the possibility to mix shaders it could be done in Cheetah in a far better way.
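To put numbers on that example (back-of-the-envelope only, using the figures from the post):

```python
texture_px      = 8000        # pixels across the photo
texture_width_m = 0.5         # real-world width the photo covers
wall_width_m    = 10.0

repeats = wall_width_m / texture_width_m          # 20 tiles across the wall
density = texture_px / texture_width_m            # 16,000 px per metre of wall
needed  = wall_width_m * density                  # 160,000 px to avoid tiling at the same density
print(repeats, density, needed)
```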

But still, good analysis.

Have a nice day and so.
 
For me, with this kind of material the image based factor is the biggest downside, as you have to be very careful to get the right size of the picture (an 8000 px texture that covers in reality some 50 cm of the wall doesn't help at all if the wall is some 10 m wide and 4 m high) and to avoid repeated patterns. To do this in Cheetah, you have to work on the textures in your paint app, but with layers and/or the possibility to mix shaders it could be done in Cheetah in a far better way.

In Cheetah, and I assume pretty much everything else, there's no particular dependence on image-maps for PBR. You can pipe procedural maps through the channels perfectly happily.

I would love to see SVG support for image maps, where you can pick the resolution that the SVG will be rendered at (you can't render the SVG every time you need a sample for performance reasons).
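A sketch of that idea outside Cheetah (cairosvg/Pillow and the file names are my assumptions, not an existing Cheetah3D feature): rasterise the SVG once at a user-chosen resolution, then treat the bitmap as an ordinary lookup table rather than re-rendering the SVG per sample.

```python
import cairosvg
from PIL import Image

RES = 2048  # user-chosen rasterisation resolution
cairosvg.svg2png(url="pattern.svg", output_width=RES, output_height=RES,
                 write_to="pattern.png")
texture = Image.open("pattern.png").convert("RGB")

def sample(u, v):
    """Nearest-neighbour lookup for a UV coordinate in [0, 1)."""
    x = int(u % 1.0 * (RES - 1))
    y = int(v % 1.0 * (RES - 1))
    return texture.getpixel((x, y))

print(sample(0.25, 0.75))
```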
 
You can pipe procedural maps through the channels perfectly happily.

Yes, you can (and in everything else, by the way, even better, because almost everywhere you have the possibility to build layers, mix shaders and use masks), and I do that frequently with PBRs to mix noise in, etc. Only then you lose all the compatibility with game engines and other apps. You'd have to bake textures first (I'm not even sure if Cheetah can bake a normal map). And, logically, everything you can download is image based.

By the way, in Cheetah using images as textures isn't even much of a downside, as I always used at least one picture in the mix of the materials (often for bump or so).

But it really was a bit perplexing for me to see that image based materials are back big time, be that in the principled shader or in the apps' own physically based shaders in everything I looked into recently. There are more and better ways to disturb and change them, though, and new apps like Substance Painter to create them.

On the other hand, with procedural materials I didn't really find anything new in the apps I looked into recently, nothing I couldn't have done in LightWave 10 years ago. I have to admit, I had expected to find new methods of simulating materials. Today it's still mostly bringing in predefined vector graphics as textures.

So, I understand your idea with SVG support, pod, but does it really make sense when you can just create the SVG in your vector graphics app and export it at any resolution you want every time you need it?
 
Regarding procedural vs. image based maps in path tracers, it's all about render times!
Path tracers are by design very slow renderers. Rendering images at megapixel resolution requires billions of ray bounces, each of which requests a color value (and bump, mask or others).
For image maps, which are accessible in expanded/uncompressed form in system memory as lookup tables, that works rather quickly.
But a realistic procedural map is probably built from various nested fractal nodes, which means lots of expensive floating point calculations.

When Falcon became public with v7beta two years ago I tested image against procedural materials and found that the procedurals are two to four times slower.
So I gave up on that.
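A toy illustration of that lookup-table vs. nested-fractal argument (the "fBm" here is just a sin/cos stand-in for real fractal noise, and the absolute numbers mean nothing; only the relative cost is the point):

```python
import time
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((2048, 2048), dtype=np.float32)     # stand-in for an image map
uv = rng.random((1_000_000, 2), dtype=np.float32)      # one million "ray hits"

def image_lookup(uv):
    """Texture fetch: index into a precomputed table."""
    ij = (uv * 2047).astype(np.int32)
    return image[ij[:, 1], ij[:, 0]]

def fbm(uv, octaves=6):
    """Cheap stand-in for a nested fractal: several octaves of float math per hit."""
    value = np.zeros(len(uv), dtype=np.float32)
    freq, amp = 4.0, 0.5
    for _ in range(octaves):
        value += amp * np.sin(uv[:, 0] * freq) * np.cos(uv[:, 1] * freq * 1.7)
        freq *= 2.0
        amp *= 0.5
    return value

for fn in (image_lookup, fbm):
    t0 = time.perf_counter()
    fn(uv)
    print(fn.__name__, f"{time.perf_counter() - t0:.3f} s")
```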

It's no surprise to me that procedurals aren't taking over the renderers at a time when path tracers are; they are kind of antagonistic.

It's another matter with game engines which may do ray tracing but not unbiased path tracing.

Same with the PBR materials.
Obviously there is a possible workflow gain in bringing PBR materials via glTF into Unity and render them there.

But in path tracing you have to pay a price for the enhanced realism, which is quite logical and has basically always been the case, as realistic rendering was always a matter of waiting a day or two.
 
Well, neither was I anticipating path tracers.

In hindsight I understand fairly well what was going on in the last 10 years and especially how much influence the game engines have won over time. Like I said, I probably should have guessed, but I didn't. What I thought isn't important in any way, but I still think procedurals should become more important. This isn't Cheetah specific, it's more about 3D apps in general.

And I'm not so sure that procedurals are really always slower while rendering. It has a lot to do with how (and when) they are precomputed, but from there on it shouldn't take more time than image maps.

In Cheetah I used mostly image based textures, as the procedurals lack in quality imho. So I root for the new PBRs anyway.

But in the end, I wouldn't give up that easily on procedurals, especially as apps like Substance Designer also use them and in the end export bitmaps (as far as I understand, at least). Something like that incorporated into a 3D app would be great (but not realistic in Cheetah, I know); mixed with bitmaps, procedurals can be great, like creating a good brick material and then bringing in the brick tiles as procedurals. But for that you need a way to slightly disturb the procedural.
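One (purely illustrative) way to do that kind of disturbance outside the app: jitter the UV lookup of a tiled bitmap with a low-frequency procedural so the repeats stop lining up. 'bricks.png' is a placeholder tileable texture, and the sine terms stand in for proper fractal noise.

```python
import numpy as np
from PIL import Image

tile = np.asarray(Image.open("bricks.png").convert("RGB"))
h, w = tile.shape[:2]

out_h, out_w, repeats = 2048, 2048, 8.0
v, u = np.meshgrid(np.linspace(0, repeats, out_h),
                   np.linspace(0, repeats, out_w), indexing="ij")

# Low-frequency "noise" as the disturbance; real fractal noise would look better.
u += 0.08 * np.sin(2.3 * v + 1.7 * u)
v += 0.08 * np.cos(1.9 * u - 2.1 * v)

x = ((u % 1.0) * (w - 1)).astype(int)
y = ((v % 1.0) * (h - 1)).astype(int)
Image.fromarray(tile[y, x]).save("bricks_disturbed.png")
```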

And even if procedurals are always slower, you can gain time through less UV mapping and less work on the images themselves, like eliminating patterns and stuff. Path tracers and game engines aside, it's just what I had hoped for: less need for UV mapping.

Obviously there is a possible workflow gain in bringing PBR materials via glTF into Unity and render them there

Why use glTF? Things worked perfectly well without it (well, that's no argument, I know). The new format got big because of Facebook, as I see it (but I could be wrong, of course). As Martin stated, Unity doesn't even support it at the moment.

Here, too, I was hoping for something else, namely a format, be it Collada (that's exactly what it was made for) or whatever, that's really well made and a standard for all apps, so developers don't need to incorporate a dozen formats or more, just their native one plus that one exchange format that works in any app around. I hoped for that but never expected it: it would mean that some companies give up their own format, and it's always very difficult to establish a new standard (why should that be possible in 3D when it's not even reality with easy stuff like electrical outlets? You'd need an adapter to use those in my country). But at the moment some of those formats are just half-baked and they don't work as well as promised when exporting and importing them between apps. Often enough it's OBJ we have to fall back on, as the newer ones sometimes plain don't work.

realistic rendering was always a matter of waiting a day or two.

Well, that's taking it a bit far, but I get the drift. It's just that it should be possible in a maximum of a few hours (at least with resolutions up to 4K), with some post work, of course.
 
P.S.: I completely forgot to mention that if you have problems with the longer render times of the new PBRs (they really are worth it) and procedural materials, then you will have really serious problems with the things to come. Like some procedural/image mapped mix as a material, some glass, some caustics combined with SSS and volumetric lights. Now that will take a long time to render :smile:

But to keep the thread on track, at least a bit: so far the new things in the new beta are great. Beautiful caustics (they are great) and a good adaptation of the PBRs with the Disney shader. :icon_thumbup:
 
@hasdrubal procedural maps are perfectly viable using shaders in game engines; it's just that we don't have any standardization of them yet (in essence, shader graphs are all proprietary). In the meantime we use images (or bake).

The idea with SVGs is that you can reduce the size of your game binary and also make its materials scalable, including to future hardware.
 