Thought I would add some experiment info to this thread
I used a Blender build with the large-address-aware flag set, on an XP boot with the /3GB switch enabled, and was able to import 3 batches (layers) of 4000+ meshes sequentially.
Commit charge rose to 2.4 GB, and all in it made a .blend of 550 MB.
That was about 11 million faces all told, so I guess <12 million is the practical limit.
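A rough back-of-envelope from those figures (my arithmetic, not measured values):

```python
# Rough estimate of Blender's memory cost per face, using the numbers above:
# 2.4 GB of commit charge for ~11 million faces.
commit_bytes = 2.4 * 1024**3        # observed commit charge
faces = 11_000_000                  # faces imported

bytes_per_face = commit_bytes / faces
print(f"{bytes_per_face:.0f} bytes per face")   # roughly 230-235

# Theoretical ceiling if the full /3GB user space (~2.9 GB usable) were
# available - before leaving any headroom for the export/save steps.
max_faces = 2.9 * 1024**3 / bytes_per_face
print(f"{max_faces / 1e6:.1f} million faces")   # ~13 million
```

Which is consistent with <12 million being the practical limit once you leave headroom for saving.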
Unfortunately it wouldn't write out an include of a layer this size, even when the file was reopened (to shed the extra commit) - it ran out of memory again..
I think the limit for an include, with a .blend of this size, will be about 2.5 million faces each.. I'll let you know... and also whether Indigo will then load it.
EDIT:
well bad news
With the 3GB switch, importing a large number of files wasn't difficult, nor was moving them around (with undo and the outliner off, as before).. although doing anything with a file this large is s-l-o-w, as you might expect.
But the problem is the exporter apparently still runs out - it seems it can't use the extra address space.

Or I guess Python isn't 3GB-aware? I need to follow that up..
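One way to follow that up is to check whether python.exe actually has the large-address-aware flag set in its PE header. A sketch (the offsets and the IMAGE_FILE_LARGE_ADDRESS_AWARE flag value are from the PE/COFF format, not from anything in this thread, and the path in the usage comment is just an example):

```python
# Check whether a Windows executable has the IMAGE_FILE_LARGE_ADDRESS_AWARE
# flag set - without it, the process is capped at 2 GB even under /3GB.
import struct

IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020

def is_large_address_aware(path):
    with open(path, "rb") as f:
        if f.read(2) != b"MZ":              # DOS header magic
            raise ValueError("not a Windows executable")
        f.seek(0x3C)                         # e_lfanew: offset of PE header
        (pe_offset,) = struct.unpack("<I", f.read(4))
        f.seek(pe_offset)
        if f.read(4) != b"PE\x00\x00":       # PE signature
            raise ValueError("PE signature not found")
        f.seek(pe_offset + 4 + 18)           # Characteristics field of COFF header
        (characteristics,) = struct.unpack("<H", f.read(2))
    return bool(characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE)

# Usage (example path): is_large_address_aware(r"C:\Python24\python.exe")
```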
I redistributed my meshes into layers of 2, 2.25 and 2.5 million faces to try to find the limit..
By experiment I determined that the most I should have been able to get out was the 2.25 million case - that would have pushed memory right out to the full 2.9 GB, including the final save step.. or at least that is the case with a 550 MB file..
...but alas no...

So ATM it isn't happening.

It would seem there is only an AMD64 Python build available, so perhaps almost everyone has this limitation regardless of their setup..
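For anyone wanting to confirm which Python build they are actually running (a quick sanity check, nothing XP-specific about it):

```python
import struct
import sys

# The pointer size tells you whether the interpreter is a 32- or 64-bit
# build; a 32-bit build is limited to 2 GB of address space unless it is
# both large-address-aware and running under the /3GB switch.
bits = struct.calcsize("P") * 8
print(f"{bits}-bit Python, sys.maxsize = {sys.maxsize}")
```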