Big scene?

Feature requests, bug reports and related discussion
YaroslavL
Posts: 34
Joined: Wed Dec 27, 2006 7:20 am
Location: Ukraine

Big scene?

Post by YaroslavL » Wed Dec 27, 2006 7:48 am

I'm using Blender 2.42 to model an interior scene.
After exporting, I run the render (on SuSE Linux 10.2), but unfortunately after ~10 minutes
it crashes with this message:
...
-----------TriTree::build()-------------
17798 tris.
temp tris mem usage: 625.711KB
intersect_tris mem usage: 834.281KB
calcing root AABB.
max tree depth: 30
reserving N nodes: 17798(139.047KB)
leafgeom reserved: 71192(278.094KB)
tri_boxes mem usage: 556.188KB
total nodes used: 3259425 (24.867MB)
total leafgeom size: 36450053 (139.046MB)
finished building tree.
Saving tree to 'tree_cache/2391684036.tre'...
Done.
simple3d mesh vert mem used: 2.153MB (47030 verts)
simple3d mesh tri mem used: 417.141KB (17798 tris)
Couldn't find matching cached tree file, rebuilding tree...
-----------TriTree::build()-------------
17798 tris.
temp tris mem usage: 625.711KB
intersect_tris mem usage: 834.281KB
calcing root AABB.
max tree depth: 30
reserving N nodes: 17798(139.047KB)
leafgeom reserved: 71192(278.094KB)
tri_boxes mem usage: 556.188KB
SceneLoaderExcep: Failed to allocate memory while building mesh kd-tree.
Fatal Error: SceneLoaderExcep: Failed to allocate memory while building mesh kd-tree.
vijt@ideja:~/3D/indigo_06>

Scene = 49.5 MB
AMD Athlon 64 X2 3800+
RAM: 2x1024 MB
________________
Yaroslav Lebidko
"3D XATA"
Photorealistic computer projection & design
of architecture constructions, interiors, furniture.
Ternopil, Ukraine
e-mail: ideja@3dxata.te.ua
web: www.3DXATA.com.ua

CTZn
Posts: 7240
Joined: Thu Nov 16, 2006 4:34 pm
Location: Paris, France

Post by CTZn » Wed Dec 27, 2006 10:47 am

Do you have sufficient disk space available? Is your disk write-protected?

http://www.indigorenderer.com/joomla/fo ... .php?t=946

CTZn
Posts: 7240
Joined: Thu Nov 16, 2006 4:34 pm
Location: Paris, France

Post by CTZn » Thu Dec 28, 2006 12:08 am

I agree the scene file is quite huge (50 MB). Maybe try splitting the large objects into smaller ones? How many tris does the biggest object have, for instance, YaroslavL?

Also, OnoSendai, here is an excerpt from the mental ray documentation; it allows the depth and leaf size of the BSP tree to be set manually. I know these are very advanced settings, but maybe allowing them to be set on a per-object basis could help sometimes? Tweaking them lets mental ray render scenes that won't render with the automatic settings; there is virtually no limit on scene size, at the cost of render time.
bsp size size(int)

The maximum number of primitives in a leaf of the BSP tree. mental ray will subdivide BSP voxels containing more triangles, unless the maximum BSP depth (see next statement) is exhausted. Larger leaf sizes reduce memory consumption but increase rendering time. The default is 10.

bsp depth depth(int)

The maximum number of levels in the BSP tree. Larger tree depths reduce rendering time but increase memory consumption, and also slightly increase preprocessing time. The default is 40. If there are too many triangles in the scene to fit into the BSP tree with the size specified by bsp size and bsp depth, the bsp size value is disregarded and larger leaves are created. This slows down rendering significantly. Larger bsp depth values of 50 or even higher often massively improve rendering speed in BSP mode for larger scenes.
What do you think? I've been mulling this over for a few days now, and although using it would be on a trial-and-error basis (along with an understanding of the quote above), I think it would be useful.
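
Roughly, those two settings act as termination criteria during tree construction. Here is a minimal C++ sketch of the idea (hypothetical code, not Indigo's or mental ray's actual implementation; the parameter names and defaults are just taken from the doc excerpt above):

#include <algorithm>
#include <cstddef>
#include <memory>
#include <vector>

struct Tri { float centroid[3]; /* plus vertex data */ };

struct Node {
    std::vector<Tri> leaf_tris;        // non-empty only for leaves
    std::unique_ptr<Node> left, right;
};

const std::size_t MAX_LEAF_TRIS = 10;  // mr "bsp size" default
const int MAX_DEPTH = 40;              // mr "bsp depth" default

void build(Node& node, std::vector<Tri> tris, int depth)
{
    // Stop splitting when the leaf is small enough, or when the depth
    // budget is exhausted (which forces oversized leaves: less RAM,
    // slower rendering, as the mr doc describes).
    if (tris.size() <= MAX_LEAF_TRIS || depth >= MAX_DEPTH) {
        node.leaf_tris = std::move(tris);
        return;
    }

    // Median split along one axis (round-robin by depth). A real
    // builder would also duplicate straddling tris into both halves,
    // which is why the "leafgeom" count can exceed the triangle count.
    int axis = depth % 3;
    std::size_t mid = tris.size() / 2;
    std::nth_element(tris.begin(), tris.begin() + mid, tris.end(),
        [axis](const Tri& a, const Tri& b) {
            return a.centroid[axis] < b.centroid[axis];
        });

    std::vector<Tri> right_tris(tris.begin() + mid, tris.end());
    tris.resize(mid);

    node.left = std::make_unique<Node>();
    node.right = std::make_unique<Node>();
    build(*node.left, std::move(tris), depth + 1);
    build(*node.right, std::move(right_tris), depth + 1);
}

The depth cap is what bounds memory: once it is hit, the builder has to accept oversized leaves instead of allocating more nodes, trading render speed for a bounded tree.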

Does Indigo do disk swapping? (I guess it does, since it writes the tree cache to disk.) mental ray also allows setting a limit on the maximum RAM used by its processes; it's usually recommended to set it at 80% of the available free RAM, but for big scenes setting it to 50% helps a lot, causing more disk swapping but avoiding this kind of memory error...
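
That RAM limit is essentially an allocation budget for the build. A hypothetical sketch of such a guard in C++ (an assumption on my part; the log above only shows that Indigo throws a SceneLoaderExcep when an allocation fails):

#include <cstddef>
#include <stdexcept>

// Hypothetical memory-budget guard, in the spirit of mental ray's
// RAM limit (e.g. 50-80% of free RAM). Names are assumptions.
class MemBudget {
    std::size_t used_ = 0;
    std::size_t limit_;
public:
    explicit MemBudget(std::size_t limit_bytes) : limit_(limit_bytes) {}

    // Call before each large allocation during the tree build.
    void charge(std::size_t bytes) {
        if (used_ + bytes > limit_)
            throw std::runtime_error("tree build exceeds memory budget");
        used_ += bytes;
    }
};

A builder could catch that exception and retry with a smaller depth or larger leaves (or spill to disk) instead of aborting the whole render.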

What's your priority on optimizing this kind of thing, OnoSendai? Could it apply to Indigo?

Cheers !

YaroslavL
Posts: 34
Joined: Wed Dec 27, 2006 7:20 am
Location: Ukraine

Post by YaroslavL » Fri Dec 29, 2006 3:36 am

Thank you. After deleting some objects, Indigo started rendering.
Sorry for my English.

deltaepsylon
Posts: 417
Joined: Tue Jan 09, 2007 11:50 pm

Post by deltaepsylon » Fri Jan 12, 2007 9:01 am

695,420 tris and a 156 MB scene file rendered perfectly well on a Pentium 4 3.02 GHz with only 512 MB RAM. So it seems weird that a 56 MB file on a 2 GB RAM machine wouldn't work. :shock:

CTZn
Posts: 7240
Joined: Thu Nov 16, 2006 4:34 pm
Location: Paris, France

Post by CTZn » Fri Jan 12, 2007 1:10 pm

I think it depends on whether you have one big object or many average ones. One big object with 100,000 tris will take more RAM at pre-processing than ten objects of 10,000 tris each, I believe...
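
For what it's worth, the numbers in YaroslavL's first log excerpt support this. Assuming 4-byte leaf references and 8-byte nodes (which reproduces the logged sizes exactly):

36,450,053 leafgeom entries / 17,798 tris ≈ 2,048 leaf references per triangle
36,450,053 entries × 4 bytes ≈ 139.05 MB  (the logged "total leafgeom size")
3,259,425 nodes × 8 bytes ≈ 24.87 MB      (the logged "total nodes used")

So almost all of the memory goes into triangle references duplicated across the leaves of a very deep tree, not into the mesh itself.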

Also, Ono said 0.7t1 consumes less RAM than previous releases.

