is hardware rendering really out of the question ?

Here is a great piece of software I have seen for speed and quality:
http://www.giofx.net/osx/index.htm
http://www.giofx.net/osx/chimera.htm

It is a super fast app for visuals. I have talked to the dev of the project, but he is not interested in making any more than just a screensaver app.

I know there was a test GPU-rendering app here that was scrapped, but why? This app is so fast for tinkering with the graphics that it seems like a perfect feedback previz, and once you have everything animated how you want, it could export frames as images that can be stitched together in QuickTime.

I just don't understand why this is not sought after. It is fast and gets work done weeks faster than constantly making test renders at low quality, small size, and zero anti-aliasing.


I will pay damn good money if this is integrated into Cheetah. I can't stand scanline and other slow@ss CPU rendering; most times I can't even get the detail I want because I run out of time doing stupid test renders.

Please, please, what do you want? I must see this kind of tech brought to life on OS X.
 
Re: is hardware rendering really out of the question ?

tripdragon said:
Please, please, what do you want? I must see this kind of tech brought to life on OS X.

Hi,
I want graphics boards which can handle such a job. Somebody might remember when I tried to add a hardware renderer to Cheetah3D 1.2.
I failed badly due to hardware limitations on current graphics boards. My hardware renderer already supported Phong shading, bump mapping, environment mapping, shadows, four light sources, 64x anti-aliasing, etc.

But it wasn't able to implement depth peeling, for example, on ATI boards, since they only work with 24-bit floating point accuracy. Effects like ambient occlusion, real refraction and reflection, radiosity, caustics, etc. are all not implementable in a general and efficient form. Yes, I know there are demos out there which even run a raytracer on the graphics board. But everybody who has tried to run a raytracer on the graphics board comes to the same conclusion: it isn't faster than a software raytracer.

On current graphics boards you can only render a very limited number of effects at once. It is not yet possible to combine them arbitrarily. But that is a must-have for an easy-to-use renderer.
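For anyone wondering what depth peeling actually does: it renders the scene in multiple passes, each pass "peeling off" the nearest remaining transparency layer by rejecting fragments at or in front of the previous pass's depth, which is exactly why depth precision matters so much. Here is a toy software sketch of the idea in Python (made-up fragment data, grayscale colors for brevity, nothing from Cheetah3D itself):

```python
# Depth peeling in miniature: per-pixel fragments are peeled
# front-to-back, one "layer" per pass, then composited with
# standard front-to-back "under" alpha blending.

def peel_layers(fragments):
    """fragments: list of (depth, color, alpha) tuples for one pixel,
    in arbitrary order. Returns them in peeled (nearest-first) order."""
    layers = []
    last_depth = float("-inf")
    while True:
        # Each pass keeps only fragments strictly behind the last peel
        # and takes the nearest of those -- the GPU does the same with
        # the previous pass's depth buffer bound as a texture.
        candidates = [f for f in fragments if f[0] > last_depth]
        if not candidates:
            return layers
        nearest = min(candidates, key=lambda f: f[0])
        layers.append(nearest)
        last_depth = nearest[0]

def composite(layers):
    """Front-to-back blending of the peeled layers."""
    color, transmittance = 0.0, 1.0
    for _, c, a in layers:
        color += transmittance * a * c
        transmittance *= (1.0 - a)
    return color

# Three half-transparent fragments in scrambled order:
frags = [(0.7, 0.2, 0.5), (0.3, 1.0, 0.5), (0.5, 0.6, 0.5)]
layers = peel_layers(frags)
print([d for d, _, _ in layers])  # peeled nearest-first: [0.3, 0.5, 0.7]
print(composite(layers))
```

The precision problem Martin describes shows up in the `f[0] > last_depth` comparison: with only 24-bit depth values, fragments closer together than the representable precision compare equal and entire layers get skipped.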

Maybe with a future graphics board generation.

Bye,
Martin
 
OK, that is great and all. But what about simple scenes? Take for example your piggy in the HDRI scene. It is one simple object with HDRI used for lighting. True, OpenGL HDRI will not get the perfect details, but the OpenGL render view in that app blows the usability of test rendering out of the water.

What I guess I am trying to say is: can't you at least implement the hardware renderer for basic scenes? Take a look at:
http://realtimecg.com/
http://cgtalk.com/showthread.php?threadid=238007
http://sabzil.net/
http://www.cgchat.com/forum/forumdisplay.php?s=1b54a35b6392498dd1facc0e6d60ec41&forumid=58

These are all realtime tools. I am sure some are scanline rendered for the final single image, but for the most part OpenGL rendering of simple scenes, or with a low number of shaders, has a place in 3D art and animation right now. And later on, as the cards get better, more and more can be added, just like the poly counts on game machines now.

It is sad if you don't try again; I think you're really missing out on future and present tech. And besides that, a lot of new users are fed up with current 3D speed. It is like, I see these games haul @ss on graphics, and then I try to animate something in my program and... what the hell?

You could have a warning like "be careful and try to use a very small number of shaders, like make your pictures have one focus shader and keep the rest to small tidbits".

Please, please reconsider. Just start with the basics of shaders and whatever is supported on OS X. I know GLSL is not yet, but I do know DOT3 normal maps, bump maps, glows, dirt, and others are. There is a lot to tinker with, but the main ones are normal and bump maps. Oh, and anti-aliasing, gotta have that. I am in a project right now that I know could have been done in ten minutes, but instead I have spent days working, only because I have to constantly re-render and re-render and so forth :(

Hmm, what would you need, if you wanted to ask? I am sure others here would love to help out. Oh, and if the card is the issue, that is silly; this iBook can handle almost all of those features, short of pixel shaders.
 
tripdragon said:
It is sad if you don't try again; I think you're really missing out on future and present tech.

I'll admit I'm ignorant of much of what you're talking about, tripdragon, but just a thought....
In an interview I read somewhere, it noted that Martin (apart from writing Cheetah by himself :shock:) is doing it in his spare time while doing his PhD in Statistical Mathematics :eek:

I'm guessing Martin's got all the "future and present tech" he can handle at the moment :wink:
You might have to wait till he's got some more "spare time" :)

cheers,
N
 
Ya dude, I am not attacking, and I was the one before trying to rally for the hardware renderer, and I even got to test it out, but it was still too slow.

But I am just trying to tweak the focus of future features toward faster output. I don't see what is so wrong with that. He can build whatever he likes; I am just asking kindly to try to make an app that is new and fast, since the tech is there, it just needs some polish.
 
tripdragon said:
I am sure others here would love to help out. Oh, and if the card is the issue, that is silly; this iBook can handle almost all of those features, short of pixel shaders.

I think we are talking about two totally different things. If you are happy with a simple game-like graphics feature set such as bump mapping and Phong shading, then current hardware is fine. But don't expect that it is possible to combine more than two of these effects on a Radeon board whose GPU allows just 64 ALU instructions.
Such hardware is fine for hand-written, specialised game shaders, not for general purpose rendering.

Even Nvidia needs their heaviest iron, which isn't available for the Mac yet, for doing a hardware-accelerated renderer.
http://film.nvidia.com/page/home.html
And hardware accelerated doesn't mean everything is done by the GPU. They just use it as an external floating point unit. I'm sure they are still doing many things in software, and they also say that they can't do it in realtime.

If the Nvidia engineers need an NVIDIA Quadro FX®, how should I be able to handle that on your iBook?

Bye,
Martin

P.S. propstuff, I'm doing probability theory, not statistics 8) But the rest is true :wink:
 
OK, I give up. I have one last question. As you said, you have seen demos before too. They seem to be able to make nice pictures; just take a look at the basic iTunes visualizer. Why is there no artist tool, either yours or anyone's, to build and control this kind of data?

I don't really care about the quality; I just wonder why there is no tool at all that lets users tinker with that stuff. The only way to edit or make this kind of eye candy is scripting..... Why is this? Yet we can have a million and one modeling programs but no real animation program... Why? Not directed at you, just in general at all developers. I never really get a clear answer when I ask this, so I just keep trying...
 
Hi,
thanks for the link to the GPU final gathering page. That looks really interesting. But the gorilla on the main page also had a rendering time of 60 sec., which is obviously not realtime. :wink:
And for the radiosity renderings, a photon mapping prepass was necessary. Nevertheless, a very interesting technique.

The Nvidia demo page is also nice, but as I said, it is difficult to combine all these effects into one image on current hardware. That will become much simpler in the future, of course. At that point I will probably give a hardware renderer another try. Already the next graphics board generation looks quite promising. But it will take some time until those boards make it to a wider Mac audience and until GLSL is available on the Mac. The current ARB_fragment and ARB_vertex programs don't even support many of the GeForce 6800 features.

To your former post: if you want to make something like the iTunes visualizer, you probably have to code it manually, because that is actually something very, very special. And as far as I know, it is even AltiVec optimized. You are requesting a lot if you want such general purpose access to OpenGL from a GUI app.

Bye,
Martin
 
:( OK, I know I can't convince you toward an OpenGL renderer at this point in time. But how about this then?
http://www.worley.com/fprime.html

It is CPU-based and works fast (well, judging from the videos). It provides instant feedback on the lighting, and a slower but progressively refining preview of the shaders...
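The trick behind that kind of preview (as far as the videos show) is progressive refinement: put a rough image up immediately and keep averaging in more samples, so the picture sharpens the longer you leave it. A toy sketch of the accumulation idea in Python (the `sample_pixel` function is a made-up stand-in, nothing from FPrime itself):

```python
import random

def sample_pixel(rng):
    # Stand-in for tracing one random sample through a pixel;
    # here it's just a noisy estimate of a "true" value of 0.5.
    return 0.5 + rng.uniform(-0.2, 0.2)

def progressive_render(passes, seed=1):
    """Running mean over passes: the estimate after N passes is the
    average of N samples, so the preview converges as N grows."""
    rng = random.Random(seed)
    estimate, history = 0.0, []
    for n in range(1, passes + 1):
        s = sample_pixel(rng)
        estimate += (s - estimate) / n  # incremental mean update
        history.append(estimate)
    return history

history = progressive_render(1000)
# Compare the error of the first rough pass with the refined result:
print(abs(history[0] - 0.5), abs(history[-1] - 0.5))
```

The point is that the renderer never blocks on a full-quality frame: every pass is usable, and stopping early just means a noisier image.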

I won't go on about the OpenGL; I know I don't know enough about code to offer any concise thoughts on it. So I will leave it at that.
 
That GioFX stuff is really cool...

The screensaver version... especially the Grail2 one... is so cool... The only bugger is that it killed my Tiger on the PB... weird flickering of the entire screen... had to hard restart the PB and then run AppleJack to fix it... seemed to have killed the volume header in the process... pity!!!!

I think it only happens if the screensaver is left to come on by itself... if previewed in the PrefPane it is all OK... I just forgot I had it selected and then later saw my PB going ape...

cheers
Trevor
 