Texture Baking

Smaug7800
Posts: 8
Joined: Wed Dec 20, 2006 3:07 pm

Texture Baking

Post by Smaug7800 » Fri Dec 29, 2006 8:03 am

Hey all,

I've got some questions on this topic. I know it's a frequently requested feature, but my issue is this:

A lot of biased GI renderers commonly incorporate this feature, but I have noticed that (I think) no unbiased MLT renderers have this function yet (i.e. Maxwell, Indigo... Radium?), although I KNOW it has been requested for them as well.

I just wondered what the technical difference is in baking the texture info onto the polys. I'd like to start writing a renderer myself, and this is the topic I most want to tackle. Thanks in advance, all!

Smaug7800
Posts: 8
Joined: Wed Dec 20, 2006 3:07 pm

Texture Baking

Post by Smaug7800 » Sat Dec 30, 2006 2:40 am

Perhaps I worded this wrong:
is it easier to implement texture baking in an unbiased MLT-type renderer than it is in a biased regular GI renderer?

Zom-B
Posts: 4701
Joined: Tue Jul 04, 2006 4:18 pm
Location: ´'`\_(ò_Ó)_/´'`

Post by Zom-B » Sat Dec 30, 2006 4:46 am

Most 3D programs support texture baking, so why should Nick work on a feature that already exists in most of our working pipelines?
polygonmanufaktur.de

OnoSendai
Developer
Posts: 6244
Joined: Sat May 20, 2006 6:16 pm
Location: Wellington, NZ

Post by OnoSendai » Sat Dec 30, 2006 4:55 am

Texture baking usually only captures view-independent light, so you don't get stuff like specular highlights and all the other nice stuff you get from an unbiased renderer.
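
As a rough illustration of that distinction (a minimal C++ sketch with hypothetical names, not anything from Indigo's code): the diffuse term below depends only on the surface point and the light, so it can be written into a texel once; the specular term also depends on the eye direction, so it changes with every camera move and can't live in a static texture.

#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// View-independent: bakeable once per texel.
float diffuseTerm(const Vec3& n, const Vec3& toLight) {
    return std::max(0.0f, dot(n, toLight));
}

// View-dependent: needs the eye vector, so it changes whenever the camera
// moves and can't be stored in a static bake.
float specularTerm(const Vec3& n, const Vec3& toLight, const Vec3& toEye, float shininess) {
    Vec3 h = { toLight.x + toEye.x, toLight.y + toEye.y, toLight.z + toEye.z };
    float len = std::sqrt(dot(h, h));
    if (len == 0.0f) return 0.0f;
    h = { h.x / len, h.y / len, h.z / len };
    return std::pow(std::max(0.0f, dot(n, h)), shininess);
}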

Smaug7800
Posts: 8
Joined: Wed Dec 20, 2006 3:07 pm

Thanks for the reply.

Post by Smaug7800 » Sat Dec 30, 2006 5:43 am

Thanks for the reply, Nick.

Zom-B, I don't think you understand... I don't know of a renderer or 3D program that can capture the unbiased quality of real-world data and materials (such as the Maxwell IOR materials or the Indigo .nk materials). As far as I know only Fryrender can do it, hence their VR-capability thing. I use Microwave to bake my lighting info onto my models, but it would be cooler to have the LOOK of my Indigo scene for real-time use.

Also, my question was about the difficulty of implementing this functionality in a biased versus an unbiased renderer, not that Nick HAS to do it.

IanT
Posts: 153
Joined: Fri Aug 25, 2006 3:13 am

Post by IanT » Sun Dec 31, 2006 12:51 am

John,

I'm not sure there's any difference in implementation effort but, as Nick has already pointed out, it's generally only useful for diffuse surfaces that capture GI and caustics (which are view-independent, assuming the surface is truly Lambertian).

The only advantage I can see to having it in an MLT renderer would be for caustics (either those coming directly from a light source or from a brightly illuminated diffuse surface), and the advantage would be speed (over a path tracer) or accuracy (over a photon mapper). The same applies, to a lesser extent, to indirect lighting.

In terms of implementation... the only tricky bit is reverse UV mapping ("which physical point on the object does (U,V) map to?") with all the wrap-around problems involved for tiled/repeated textures etc.
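
One common way to sidestep the reverse lookup is to go the other way: rasterize each triangle into the bake map using its UV coordinates as 2D positions, so every covered texel maps back to a world-space point by barycentric interpolation. Here is a minimal C++ sketch of that idea; shadePoint() and all the type names are hypothetical stand-ins (not Fryrender's or Sunflow's actual code), and it ignores the tiling/wrap-around issues mentioned above.

#include <algorithm>
#include <cmath>
#include <vector>

struct Vec2 { float u, v; };
struct Vec3 { float x, y, z; };
struct Tri  { Vec3 p[3]; Vec2 uv[3]; };

// Placeholder for whatever the renderer does to evaluate the light arriving at
// a surface point; an assumption, not an existing API.
Vec3 shadePoint(const Vec3& worldPos) { (void)worldPos; return Vec3{0.0f, 0.0f, 0.0f}; }

Vec3 lerpBary(const Vec3& a, const Vec3& b, const Vec3& c, float w0, float w1, float w2) {
    return { w0*a.x + w1*b.x + w2*c.x,
             w0*a.y + w1*b.y + w2*c.y,
             w0*a.z + w1*b.z + w2*c.z };
}

// Rasterize one triangle into the bake map using its UVs as 2D positions, so
// each covered texel gets a world-space point via barycentric weights.
void bakeTriangle(const Tri& t, std::vector<Vec3>& texels, int w, int h) {
    float minU = std::min({t.uv[0].u, t.uv[1].u, t.uv[2].u});
    float maxU = std::max({t.uv[0].u, t.uv[1].u, t.uv[2].u});
    float minV = std::min({t.uv[0].v, t.uv[1].v, t.uv[2].v});
    float maxV = std::max({t.uv[0].v, t.uv[1].v, t.uv[2].v});

    for (int y = std::max(0, int(minV * h)); y <= int(maxV * h) && y < h; ++y) {
        for (int x = std::max(0, int(minU * w)); x <= int(maxU * w) && x < w; ++x) {
            Vec2 p = { (x + 0.5f) / w, (y + 0.5f) / h };
            const Vec2 &a = t.uv[0], &b = t.uv[1], &c = t.uv[2];
            float den = (b.v - c.v)*(a.u - c.u) + (c.u - b.u)*(a.v - c.v);
            if (std::fabs(den) < 1e-12f) continue;        // degenerate UVs
            float w0 = ((b.v - c.v)*(p.u - c.u) + (c.u - b.u)*(p.v - c.v)) / den;
            float w1 = ((c.v - a.v)*(p.u - c.u) + (a.u - c.u)*(p.v - c.v)) / den;
            float w2 = 1.0f - w0 - w1;
            if (w0 < 0 || w1 < 0 || w2 < 0) continue;     // texel centre outside the triangle
            Vec3 worldPos = lerpBary(t.p[0], t.p[1], t.p[2], w0, w1, w2);
            texels[y * w + x] = shadePoint(worldPos);     // bake the result into the map
        }
    }
}

In practice the baked map also needs a few texels of padding (dilation) around each UV chart, so that bilinear filtering at the seams doesn't pick up unbaked neighbours.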

Not sure what Fryrender is planning, because details of RC4 are extremely sketchy right now. Chema has implied that it will go a bit beyond basic texture baking but has said little else on the subject. He has also said that texture baking will be inextricably linked to the RC4 engine, implying it's impossible to bake textures for external use. This would be extremely foolish IMHO and would lock games developers into using RC4 or another renderer.

Christopher Kulla just added texture baking into Sunflow but I've not had a chance to look at the new code yet.

(P.S. Will reply to your e-mail soon...)

Ian.

Smaug7800
Posts: 8
Joined: Wed Dec 20, 2006 3:07 pm

Thanks for the Info!

Post by Smaug7800 » Sun Dec 31, 2006 6:52 pm

Thanks, Ian, for this extremely useful info. I also appreciate all the people here who take the time to offer info to anyone (especially those of us inspired by people like Nick and Ian to create our own stuff; getting started is the hardest part, so helping out is really generous).
I see your point on baking in terms of what can and cannot be captured... it's something I'll try to experiment with. I also think it's foolish to lock a developer into an engine for real-time applications.

I'll look at Sunflow too and see how that's done. Thanks again!

VictorJapi
Posts: 58
Joined: Sun Jan 21, 2007 1:27 am
Location: Zamora - Algeciras || España

Post by VictorJapi » Sun Jan 21, 2007 1:38 am

I really think that texture baking could be useful for many things, mostly for interactive work.

The light distribution from an MLT-based renderer will always be better than that of one based on photons or irradiance maps, etc.; speculars and other effects could be 'easily' faked.

Another feature that would be really interesting for RT projects is a box camera like the V-Ray one (six ortho-cameras rendered in a cross layout) for environments, reflections, etc.
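
For reference, a tiny C++ sketch (hypothetical names, not V-Ray's actual implementation) of the six axis-aligned views such a box/cube camera would render, one per face, before packing them into the cross layout:

struct Vec3 { float x, y, z; };
struct FaceView { Vec3 forward; Vec3 up; };

// One view per cube face; each face is rendered square (90-degree FOV for a
// perspective cube map) and then assembled into the cross image.
static const FaceView kCubeFaces[6] = {
    { { 1, 0, 0}, {0, 1, 0} },   // +X
    { {-1, 0, 0}, {0, 1, 0} },   // -X
    { { 0, 1, 0}, {0, 0,-1} },   // +Y
    { { 0,-1, 0}, {0, 0, 1} },   // -Y
    { { 0, 0, 1}, {0, 1, 0} },   // +Z
    { { 0, 0,-1}, {0, 1, 0} },   // -Z
};
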
winmain(){
japi victorJapi;
victorJapi::Signature();
}

oodmb
Posts: 271
Joined: Thu Oct 26, 2006 5:39 am
Location: USA

Post by oodmb » Fri May 11, 2007 4:51 pm

Rather than simply baking colors to a mesh, couldn't you also set it up to bake stuff like vectors or photons to the mesh, so that you could then recalculate specular and such without having to redo intersections? Of course these vector textures wouldn't be much use to anything but a specific app designed to re-render using them, but it would still speed up a scene with fixed lighting and a moving camera, for stuff like animations.
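
A rough C++ sketch of that idea (all names hypothetical, not any existing app's format): each texel stores the surface normal, a dominant incoming light direction, and the already-integrated irradiance, so a viewer can re-add the view-dependent specular term every frame without tracing any new rays.

#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };
static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// What each texel would store: not a final colour, but the data needed to
// re-shade later. Baked once, with the expensive intersections done then.
struct BakedTexel {
    Vec3 normal;        // surface normal at the texel
    Vec3 lightDir;      // dominant incoming light direction
    Vec3 irradiance;    // diffuse/indirect light, already integrated
};

// At display time only the eye vector changes, so the specular term can be
// recomputed per frame from the baked data alone.
Vec3 reshade(const BakedTexel& t, const Vec3& toEye, const Vec3& albedo, float shininess) {
    Vec3 h = { t.lightDir.x + toEye.x, t.lightDir.y + toEye.y, t.lightDir.z + toEye.z };
    float len = std::sqrt(dot(h, h));
    float spec = 0.0f;
    if (len > 0.0f) {
        h = { h.x / len, h.y / len, h.z / len };
        spec = std::pow(std::max(0.0f, dot(t.normal, h)), shininess);
    }
    return { albedo.x * t.irradiance.x + spec,
             albedo.y * t.irradiance.y + spec,
             albedo.z * t.irradiance.z + spec };
}
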
a shiny monkey is a happy monkey

zsouthboy
Posts: 1395
Joined: Fri Oct 13, 2006 5:12 am

Post by zsouthboy » Sat May 12, 2007 12:40 am

I've always wanted to (after first writing and mastering an unbiased renderer, of course) experiment with texture baking in this manner:

Set up the scene, etc., as normal.

Next, for each tri in the scene:
place a camera at the position of the tri (scaled to its size), facing the same way as its normal
use a FOV of 180 degrees
render()
take the framebuffer, slice it up, and use it as the texture for that tri: save to disk, UV map accordingly

After... forever (this would take a very long time to do):

Open the scene in your favorite rasterizer (OGL, DX), using the textures you just saved.

Tada!

Navigate your scene in real time!
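
A minimal C++ sketch of that loop, with renderFromSurface() and saveTexture() as hypothetical placeholders rather than real Indigo calls:

#include <cstddef>
#include <string>
#include <vector>

struct Vec3 { float x, y, z; };
struct Triangle { Vec3 v0, v1, v2; Vec3 normal; };
struct Image { int width; int height; std::vector<Vec3> pixels; };

// Placeholders for the renderer and the file output; assumptions, not real calls.
Image renderFromSurface(const Triangle& tri, float fovDegrees) {
    (void)tri; (void)fovDegrees;
    return Image{};   // a real implementation would render the scene as seen from the triangle
}
void saveTexture(const Image& img, const std::string& path) { (void)img; (void)path; }

// One small render per triangle, stored as that triangle's own texture.
void bakeScene(const std::vector<Triangle>& tris) {
    for (std::size_t i = 0; i < tris.size(); ++i) {
        // Camera sits on the triangle, looking along its normal, with a very
        // wide field of view so grazing light is captured too.
        Image img = renderFromSurface(tris[i], 180.0f);
        saveTexture(img, "tri_" + std::to_string(i) + ".png");
        // The triangle's UVs would then be set to cover the saved image so a
        // rasterizer (OpenGL/DX) can display it directly.
    }
}

Per-triangle textures multiply the texture count enormously, which is why practical bakers pack everything into a single UV atlas instead.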

CTZn
Posts: 7240
Joined: Thu Nov 16, 2006 4:34 pm
Location: Paris, France

Post by CTZn » Sat May 12, 2007 5:37 am

You mean "rendering in texture space"?

:D
obsolete asset
