Texture Baking
Hey all,
Got some questions on this topic. I know it's a frequently requested feature, but my issue is this:
A lot of biased GI renderers commonly incorporate this feature, but as far as I can tell no unbiased MLT renderers have it yet (i.e. Maxwell, Indigo... Radium?), although I KNOW it has been requested for them as well.
I just wondered what the technical difference is in baking the texture info onto the polys. I'd like to begin writing a renderer myself, and this is the topic I most want to tackle. Thanks in advance, all!
Texture Baking
Perhaps I worded this wrong:
Is it easier to implement texture baking in an unbiased MLT-type renderer than in a biased, regular GI renderer?
Thanks for the reply.
Thanks for the reply, Nick.
ZomB, I don't think you understand... I don't know of a renderer or 3D program that can capture the unbiased quality of real-world data and materials (such as the Maxwell IOR materials or the Indigo .nk materials). As far as I know only Fryrender can do it; hence their VR capability thing. I use Microwave to bake my lighting info onto my models, but it would be cooler to have the LOOK of my Indigo scene for real-time use.
Also, my question was about the difficulty of implementing this functionality in a biased versus an unbiased renderer, not that Nick HAS to do it.
John,
I'm not sure there's any difference in implementation effort but, as Nick has already pointed out, it's generally only useful for diffuse surfaces that capture GI and caustics (which are view independent assuming the surface is truly Lambertian).
The only advantage I can see having it in an MLT renderer would be for caustics (either those coming directly from a lightsource or from a brightly-illuminated diffuse surface) and the advantage would be speed (over a path-tracer) or accuracy (over a photon mapper). The same applies, to a lesser extent, for indirect lighting.
In terms of implementation... the only tricky bit is reverse UV mapping ("which physical point on the object does (U,V) map to?"), with all the wrap-around problems involved for tiled/repeated textures, etc.
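To make the "reverse UV mapping" step concrete, here is a minimal sketch: for each texel centre (u, v) in the bake target, find the triangle whose UV footprint contains it, then recover the corresponding 3D surface point via barycentric interpolation. All names here (`Triangle` tuples, `bake_position_map`) are illustrative, not from any particular renderer; the wrap with `% 1.0` is a crude stand-in for proper tiled-texture handling.

```python
def barycentric_uv(p, a, b, c):
    """Barycentric coords of 2D point p in UV triangle (a, b, c), or None if degenerate."""
    v0 = (b[0] - a[0], b[1] - a[1])
    v1 = (c[0] - a[0], c[1] - a[1])
    v2 = (p[0] - a[0], p[1] - a[1])
    den = v0[0] * v1[1] - v1[0] * v0[1]
    if abs(den) < 1e-12:
        return None  # zero-area UV triangle
    w1 = (v2[0] * v1[1] - v1[0] * v2[1]) / den
    w2 = (v0[0] * v2[1] - v2[0] * v0[1]) / den
    return (1.0 - w1 - w2, w1, w2)

def bake_position_map(triangles, width, height):
    """triangles: list of (uv_a, uv_b, uv_c, pos_a, pos_b, pos_c).
    Returns a dict (x, y) -> 3D surface point for every covered texel."""
    out = {}
    for y in range(height):
        for x in range(width):
            # Texel centre mapped into [0,1)^2; wrap handles tiled/repeated UVs.
            u = ((x + 0.5) / width) % 1.0
            v = ((y + 0.5) / height) % 1.0
            for (ua, ub, uc, pa, pb, pc) in triangles:
                w = barycentric_uv((u, v), ua, ub, uc)
                if w is None or min(w) < 0.0:
                    continue  # texel lies outside this triangle in UV space
                # Same barycentric weights applied to the 3D vertex positions.
                out[(x, y)] = tuple(w[0]*pa[i] + w[1]*pb[i] + w[2]*pc[i]
                                    for i in range(3))
                break
    return out
```

Once each texel knows its surface point (and, by the same interpolation, its normal), the baker just evaluates incoming radiance there and writes the result into the texture.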
Not sure what Fryrender is planning, because details of RC4 are extremely sketchy right now. Chema has implied that it will go a bit beyond basic texture baking but has said little else on the subject. He has also said that texture baking will be inextricably linked to the RC4 engine, implying it's impossible to bake textures for external use. This would be extremely foolish IMHO, as it would force game developers either to adopt RC4 or to switch to another renderer entirely.
Christopher Kulla just added texture baking into Sunflow but I've not had a chance to look at the new code yet.
(P.S. Will reply to your e-mail soon...)
Ian.
Thanks for the Info!
Thanks, Ian, for this extremely useful info. I also appreciate all the people here who take the time to offer info to anyone (especially those of us inspired by people like Nick & Ian to create our own stuff; getting started is the hardest part, so helping out is really generous).
I see your point on baking in terms of what can and cannot be captured... it's something I'll try to experiment with. I also think it's foolish to lock a developer into an engine for real-time applications.
I'll look at Sunflow too and see how that's done. Thanks again!
- VictorJapi
I really think that texture baking could be useful for many things, mostly for interactive work.
The light distribution from an MLT-based renderer will always be better than one based on photon maps or irradiance maps, etc.; speculars and other effects could be 'easily' faked.
Another feature that would be really interesting for RT projects is box cameras like the V-Ray ones (six orthogonal cameras rendered into a cross layout) for environments, reflections, etc.
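The box-camera idea boils down to six 90-degree-FOV cameras looking down the +/- axes, whose renders tile into a cube map (often laid out as a cross). A minimal sketch of the camera setup follows; the (forward, up) pairs are one plausible convention, and cube-map conventions do differ between engines, so treat them as illustrative.

```python
# Six axis-aligned faces of a cube map: face name -> (forward, up).
# One common convention; engines disagree on face orientation details.
CUBE_FACES = {
    "+x": ((1, 0, 0), (0, 1, 0)),
    "-x": ((-1, 0, 0), (0, 1, 0)),
    "+y": ((0, 1, 0), (0, 0, -1)),
    "-y": ((0, -1, 0), (0, 0, 1)),
    "+z": ((0, 0, 1), (0, 1, 0)),
    "-z": ((0, 0, -1), (0, 1, 0)),
}

def cube_cameras(position):
    """Yield (face_name, position, forward, up) for the six 90-degree renders."""
    for name, (forward, up) in CUBE_FACES.items():
        yield name, position, forward, up
```

A real-time engine can then sample the six resulting images as an environment map for reflections without any further ray tracing.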
winmain(){
japi victorJapi;
victorJapi::Signature();
}
Rather than simply baking colours to a mesh, couldn't you also set it up to bake things like vectors or photons to the mesh, so that you could then recalculate specular and so on without having to redo intersections? Of course these vector textures wouldn't be much use to anything but a specific app designed to re-render using them, but it would still speed up a scene with fixed lighting but a moving camera, for things like animations.
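A sketch of that idea: instead of baking final colours, bake the view-independent data (surface position and normal) per texel, and let a lightweight re-renderer recompute the view-dependent specular term for any camera without redoing ray-surface intersections. Everything here is illustrative; Blinn-Phong stands in for whatever specular model the baking app would use.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    length = math.sqrt(dot(a, a))
    return tuple(x / length for x in a)

def specular_from_gbuffer(texel, light_pos, eye_pos, shininess=32.0):
    """texel: dict with baked 'pos' and 'normal' (the 'vector texture').
    Returns the Blinn-Phong specular factor for this light and eye position."""
    p, n = texel["pos"], norm(texel["normal"])
    l = norm(tuple(lc - pc for lc, pc in zip(light_pos, p)))  # towards light
    v = norm(tuple(ec - pc for ec, pc in zip(eye_pos, p)))    # towards eye
    h = norm(tuple(lc + vc for lc, vc in zip(l, v)))          # half vector
    return max(0.0, dot(n, h)) ** shininess
```

Because `pos` and `normal` were baked once, moving the camera only re-evaluates this cheap shading function per texel, which is exactly the fixed-lighting/moving-camera case described above.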
a shiny monkey is a happy monkey
I've always wanted to (after writing and mastering an unbiased renderer, first, of course) experiment with texture baking in this manner:
Setup scene, etc, as normal.
Next, for each tri in the scene:
place a camera at the position of the tri, sized to match it, facing the same way as its normal
use a FOV of 180 degrees
render()
take the framebuffer, slice it up, and use it as the texture for that tri: save to disk, UV map accordingly
After... forever (this would take a very long time):
open the scene in your favorite rasterizer (OGL, DX), using the textures you just saved.
Tada!
Navigate your scene in real time!
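The per-triangle procedure above, as a skeleton. `render_hemisphere()` is a stand-in for the actual (very slow) unbiased render; here it just returns a flat framebuffer so the plumbing can be followed end to end. All names are illustrative.

```python
def render_hemisphere(centroid, normal, res):
    # Placeholder for: place a 180-degree-FOV camera at the triangle,
    # facing along its normal, and run the renderer to convergence.
    return [[(0.5, 0.5, 0.5)] * res for _ in range(res)]

def bake_scene(triangles, res=4):
    """For each triangle (three 3D vertices), render from its surface and keep
    the framebuffer as that triangle's texture. Returns one res x res image each."""
    textures = []
    for tri in triangles:
        centroid = tuple(sum(v[i] for v in tri) / 3.0 for i in range(3))
        # Normal from the cross product of two edges (unnormalised is fine here).
        e1 = tuple(tri[1][i] - tri[0][i] for i in range(3))
        e2 = tuple(tri[2][i] - tri[0][i] for i in range(3))
        normal = (e1[1] * e2[2] - e1[2] * e2[1],
                  e1[2] * e2[0] - e1[0] * e2[2],
                  e1[0] * e2[1] - e1[1] * e2[0])
        fb = render_hemisphere(centroid, normal, res)
        textures.append(fb)  # in a real tool: slice, save to disk, write UVs
    return textures
```

As noted, the result is only correct for view-independent (Lambertian) shading; anything specular baked this way is frozen from the baking camera's point of view.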