gpu

others have already started, and once this takes off no one will use cpu-based rendering for stills any more, just like everyone already uses gpu acceleration in games.

i'm talking here about unbiased real-time rendering (cam, light and procedural textures) done on the gpu.

that means you have no separate editor view (except maybe for modeling): the view is always live, and all cam, light or texture changes appear (with radiosity) instantly in the render view, which starts noisy but gets increasingly smooth, and when you like it you take a snapshot.
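(for a rough idea of why the view starts noisy and then smooths out: a progressive renderer just keeps a running mean of random samples per pixel, so the error shrinks with every pass. a minimal sketch of that idea in plain C; the rand()-based sample is only a stand-in for a real path-traced value.)

```c
/* a minimal sketch of progressive refinement, as used by unbiased
 * preview renderers: each pass adds one noisy sample per pixel and the
 * display shows the running mean, so the image starts grainy and
 * converges over time. */
#include <stdio.h>
#include <stdlib.h>

#define PIXELS 4 /* a tiny "image", just to show the idea */

static double trace_sample(int pixel)
{
    (void)pixel;
    /* stand-in: uniform noise with true mean 0.5 */
    return (double)rand() / RAND_MAX;
}

int main(void)
{
    double accum[PIXELS] = { 0.0 };

    for (int pass = 1; pass <= 1000; pass++) {
        for (int i = 0; i < PIXELS; i++) {
            /* incremental running mean; error shrinks ~ 1/sqrt(pass) */
            accum[i] += (trace_sample(i) - accum[i]) / pass;
        }
        if (pass == 1 || pass == 10 || pass == 100 || pass == 1000)
            printf("pass %4d: pixel 0 = %.3f\n", pass, accum[0]);
        /* an edit to camera/lights/materials would reset accum and
         * restart pass at 1 here; that's the "always live" view */
    }
    return 0;
}
```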

what i'm asking for is that martin starts thinking about how this could be made possible with opencl on snow leopard, before others take over with mac solutions.

maybe there are people out there to team up with?

i would pay 500$ if cheetah offered this (i'm already thinking of buying a cheap pc with an nvidia card).

this is meant as an encouragement more than as a demand :smile:

- archie
 
I don't know where you got this insane idea (:rolleyes: :redface:) but I agree GPU-powered rendering seems to be the direction everybody is taking these days (not only is Octane in beta, but Furryball for Maya and Arion have been released, a GPU version of Luxrender is under fast development, and who knows how many others...)... time will tell if it's the right direction, of course.
so, back on topic: I think what Archie is talking about (unbiased, realtime, GPU) is very far from where we are now, and I don't even know if it's where all cheetah users really want to go... what I'm pretty sure of is that almost everybody would enjoy reduced render times thanks to GPU contribution in render calculations, even if nothing else changed.
I see this probably requires a huge effort from Martin and can't be done in a few days or weeks (also, OpenCL doesn't seem completely mature right now). At the same time, I think this effort has to be rewarded, and I imagine not all of us are willing to pay more for this extra feature... so this is not just a technical/development question.
I personally would pay more for this. Also, I tried Octane and I quite like it (and will probably buy a license if the Mac demo runs well on my system), but Cheetah is a completely different app and I wouldn't want to see C3D take that direction... I'm just wondering how hard it would be to add GPU power to this wonderful toy we have right now.

just my 2c. debate is open...

cheers,
Alessandro
 
500? No thanks. If Cheetah were at that price point I would consider going over to Modo instead.
 
500? No thanks. If Cheetah were at that price point I would consider going over to Modo instead.

dear Dranix,
I see your point: 500$ is way too much (coming from 150) and we've talked about price policy before (Martin said clearly he doesn't want to raise the current price). I think Archie was talking about the value he/she would give to Cheetah if it had some additional features he/she desires... like GPU rendering and so on.
In my opinion there could be two versions of Cheetah (with and without GPU) or an additional plugin to enable GPU rendering... of course both have some development/maintenance issues (I really have no clue here) and only Martin can say whether this is possible or not.

Now, if Martin thinks it's possible to have a CheetahGPU, for example when we upgrade to v6 or 7, without raising the price, well, I won't ask for it :rolleyes: but even though I'm no programmer I can see it's a lot of work. (It's the same for some of the plugins: I'm happy some talented users are releasing them for free, but I'm sure I would pay some fees for some of them if I had to - hope Hiroto is not listening :tongue:)

That said I love Cheetah so much I'm sure I'll stick with it in the future, with or without GPU acceleration...
 
There are obvious potential benefits to GPU rendering, but who has the appropriate GPU to really do much damage to the render times? Unless you've already spent a good amount on a really good GPU, you may not see a lot of benefit. What would be great is for Cheetah to let you select GPU, CPU, or GPU/CPU.
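For what it's worth, OpenCL already models that choice as device types, so a renderer could enumerate the CPU and GPU devices and let the user pick. A minimal sketch against the standard OpenCL 1.0 host API (plain C; the device names and memory sizes printed will of course vary per machine):

```c
/* sketch: enumerate CPU and GPU OpenCL devices, so a renderer could
 * offer a CPU / GPU / CPU+GPU choice. builds on a Mac with:
 *   cc devices.c -framework OpenCL */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

static void list_devices(cl_platform_id platform, cl_device_type type,
                         const char *label)
{
    cl_device_id devs[8];
    cl_uint count = 0;

    if (clGetDeviceIDs(platform, type, 8, devs, &count) != CL_SUCCESS)
        return; /* no device of this type on this machine */

    for (cl_uint i = 0; i < count; i++) {
        char name[256] = "";
        cl_ulong mem = 0;
        clGetDeviceInfo(devs[i], CL_DEVICE_NAME, sizeof name, name, NULL);
        clGetDeviceInfo(devs[i], CL_DEVICE_GLOBAL_MEM_SIZE,
                        sizeof mem, &mem, NULL);
        printf("%s: %s, %llu MB global memory\n", label, name,
               (unsigned long long)(mem / (1024 * 1024)));
    }
}

int main(void)
{
    cl_platform_id platform;
    cl_uint n = 0;

    if (clGetPlatformIDs(1, &platform, &n) != CL_SUCCESS || n == 0) {
        fprintf(stderr, "no OpenCL platform found\n");
        return 1;
    }
    list_devices(platform, CL_DEVICE_TYPE_CPU, "CPU");
    list_devices(platform, CL_DEVICE_TYPE_GPU, "GPU");
    return 0;
}
```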
 
How can they have such a butt-ugly website?

Unlimited Detail's tech is outstandingly impressive -- they should be able to find some halfway-decent artist who'll do stuff for them just for bragging rights.
 
i'm totally stunned by the apparent disinterest of the cheetah community in ten to a thousand times faster render times (still remembering how everyone cheered the speedup that came with v5)
 
i'm totally stunned by the apparent disinterest of the cheetah community in ten to a thousand times faster render times (still remembering how everyone cheered the speedup that came with v5)
Just here to let you know there was already a thread in the beta forum about real-time unbiased rendering: Octane, Arion, Luxrender/OpenCL, Augenblick.
I'm not here to say how far Martin's own experiments have gone, as I simply don't know.
 
Hi,
gpu raytracing is something I'm looking into, and writing a gpu pathtracer isn't too complicated. But there are some reasons why I'm in no hurry to add that feature.

1. Almost no Mac has an appropriate GPU today. Only an ATI 4870 or better, or an Nvidia 285, makes sense for GPU raytracing. Anything weaker makes no real sense.

2. Even on those cards the limited RAM is quite a problem. 512 MByte of RAM is not very much for geometry and textures!!! But the next GPU generations (Fermi is 64bit) seem to solve these limitations.

3. The OpenCL compiler is very new and, from what I read in OpenCL related forum threads, quite tricky to work with (see the build-log sketch below). But I'm sure that will be addressed by Apple soon.

4. And most important: OpenCL is a Snow Leopard feature. Supporting OpenCL means dropping Leopard, and it is way too early to do that.

So I'm following that tech closely and will experiment with it but that will definitely be no 5.x feature.:wink:

Bye,
Martin

P.S. I have absolutely no plans to take 500$ for Cheetah3D.:wink:
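(On point 3 above: the usual way to cope with a young OpenCL compiler is to fetch the build log whenever a kernel fails to compile. A minimal sketch using the standard OpenCL 1.0 host API; the kernel string is a deliberately broken placeholder, just to make the log print something.)

```c
/* sketch: compile an OpenCL kernel and print the compiler's build log,
 * the main debugging aid while the toolchain is young.
 * builds on a Mac with: cc buildlog.c -framework OpenCL */
#include <stdio.h>
#include <stdlib.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

static const char *src =
    "__kernel void shade(__global float *out) {"
    "    out[get_global_id(0)] = ;" /* syntax error on purpose */
    "}";

int main(void)
{
    cl_platform_id platform;
    cl_device_id device;
    cl_int err;

    clGetPlatformIDs(1, &platform, NULL);
    if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL)
            != CL_SUCCESS)
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_CPU, 1, &device, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, &err);
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, &err);

    if (clBuildProgram(prog, 1, &device, NULL, NULL, NULL) != CL_SUCCESS) {
        size_t len = 0;
        clGetProgramBuildInfo(prog, device, CL_PROGRAM_BUILD_LOG,
                              0, NULL, &len);
        char *log = malloc(len + 1);
        clGetProgramBuildInfo(prog, device, CL_PROGRAM_BUILD_LOG,
                              len, log, NULL);
        log[len] = '\0';
        fprintf(stderr, "build failed:\n%s\n", log);
        free(log);
        return 1;
    }
    printf("kernel built fine\n");
    return 0;
}
```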
 
thanks for your answer Martin! almost everything you said makes sense to me, especially when you talk about Macs' built-in video cards...

mmm... I suppose you mean "no" and not "now" in your last statement :smile:

cheers,
Alessandro
 
So I'm following that tech closely and will experiment with it but that will definitely be no 5.x feature.:wink:

thanks martin for quick response.

i didn't expect to get this as a 5.x feature :smile:
the octane folks have managed to get decent results on current (nvidia) cards though.
i'm not asking for a whole shift, just for an additional feature at an extra charge.
good to see it's already on your radar,

regards,
archie
 
now here is a free mac app demonstrating how that works.
when you try the sponza scene you'll see how slow it gets to respond; there should be a switch to an opengl wireframe while dragging the camera, with the render restarting on mouse release.
so i would like something like this to be not another render manager but an editor view option, so you have wireframe > shaded > live gpu pathtracer, where you see your lighting and procedural mats immediately, and very quickly on not-too-heavy scenes (a snapshot option is also needed here).
cheetah's main render is still faster and well suited for evenly lit scenes, but because of the stupid 10,000 radiosity samples limit you cannot do any natural lighting with large shadowed areas, whereas this gpu pathtrace thingy will finally (overnight) produce a decent result (as with the sponza scene).
also please note that this kind of rendering produces grainy noise of the kind common in photography, which can be effectively reduced by filters like the photoshop noise reduction, whereas the cheetah cloudy noise cannot be tackled that easily; you need to do masking and blurring and other kinds of complicated stuff.
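(to make the grainy-vs-cloudy point concrete: high-frequency monte carlo grain responds well even to a tiny neighborhood filter, something as simple as a 3x3 median, while low-frequency blotches pass through it. a toy sketch in plain C on a grayscale buffer; real denoisers are smarter, but the principle is the same.)

```c
/* toy sketch: 3x3 median filter on a grayscale float buffer.
 * high-frequency path-tracing grain is largely removed by a pass like
 * this, while low-frequency "cloudy" blotches survive it. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static int cmpf(const void *a, const void *b)
{
    float d = *(const float *)a - *(const float *)b;
    return (d > 0) - (d < 0);
}

/* in/out are w*h buffers; the one-pixel border is copied unchanged */
static void median3x3(const float *in, float *out, int w, int h)
{
    memcpy(out, in, (size_t)w * h * sizeof *in);
    for (int y = 1; y < h - 1; y++) {
        for (int x = 1; x < w - 1; x++) {
            float win[9];
            int n = 0;
            for (int dy = -1; dy <= 1; dy++)
                for (int dx = -1; dx <= 1; dx++)
                    win[n++] = in[(y + dy) * w + (x + dx)];
            qsort(win, 9, sizeof win[0], cmpf);
            out[y * w + x] = win[4]; /* the median of the neighborhood */
        }
    }
}

int main(void)
{
    enum { W = 8, H = 8 };
    float in[W * H], out[W * H];

    srand(1);
    for (int i = 0; i < W * H; i++) /* flat 0.5 image plus grain */
        in[i] = 0.5f + ((float)rand() / RAND_MAX - 0.5f) * 0.4f;

    median3x3(in, out, W, H);
    printf("center pixel: %.3f -> %.3f\n",
           in[(H / 2) * W + W / 2], out[(H / 2) * W + W / 2]);
    return 0;
}
```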

- archie
 
Octane render

Archie,

Thanks for the post and info on Octane. Here is a render done after a couple of experiments. One issue that came up is Cheetah's inability to save materials when exporting to the .obj format. This means that you get only one uniform material in Octane. I had to open the exported Cheetah .obj file with Silo and create materials so that I had a range of materials to tweak in Octane. As you can see, the image above was rendered in less than twenty minutes (19:41, or 1181 sec). This is with a GeForce 8800 GS with 512 MB of memory. The rendered image size is 1024 x 512.

Could the next release have an .obj export that would include materials?

That would be great.

Jenn109 (Mike)
 

[Attachment: GT-2 render.png, 309.1 KB]
Cool render, Mike. I worked on a car, which is way better than my previous efforts and still doesn't look that good.
 
One issue that came up is Cheetah's inability to save materials when exporting to the .obj format.

Hi Mike,
cheetah actually can export wavefront files with materials using the OBJ+MTL export.js script (you can find it in the script/macro menu; it comes bundled with the application). the problem is when you want to import obj materials...

@Dranix: you're right, 20 min is quite long, but the 8800GS is an old and slow card (96 CUDA cores... the new GTX480 has 480!) and Octane is an unbiased/physically based renderer (so any comparison between Oct/C3D is unfair ATM, I think). I use them both (and love them both) and have to say each of them has its strengths/potential... DoF comes almost free of charge in Octane, particles/instances are a great time saver in C3D, and so on.
 
One issue that came up is Cheetah's inability to save materials when exporting to the .obj format.

hi mike, cool car !

now that i've bought the expensive imac with a radeon i have effectively excluded myself from the new cuda-based renderers; i think i'll have to learn blender plus luxrender :frown:
as alessandro already mentioned, there is an obj + mtl export script you should try.

cheers,
archie
 
Hi,
4. And most important: OpenCL is a Snow Leopard feature. Supporting OpenCL means dropping Leopard, and it is way too early to do that.

I'd disagree on this one. I already have at least one program I purchased at the same time as Cheetah (Pixelmator) that has just gone Snow-Leopard-only. The only machines that can't upgrade to Snow Leopard are pre-Intel, and most of them would not be good choices for 3D work anyhow.

I'd also wonder whether you could do other tasks (filters, texture manipulation) on a not-brand-new-and-optimal GPU. We have year-old GPUs at work (a 4850 or something like that), and even before we upgraded to those, various 2D applications (Motion, etc.) were WAY faster using the GPU than the CPU.
 