
digitalmars.D.learn - dmd crashes with "out of memory" error

reply Trass3r <mrmocool gmx.de> writes:
I tried to uncompress a file containing zlib data, which I had converted to a 
source file with bintod.


module main;

import save; // containing the file data
import std.zlib;

void main(string[] args)
{
	// this would be the correct call, but it doesn't compile either:
	//void[] data = uncompress(BINARY_SAVE_SWW[8 .. $], *cast(uint*)(BINARY_SAVE_SWW.ptr + 4));
	void[] data = uncompress(BINARY_SAVE_SWW);
}
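
For reference, the generated save module is essentially one giant array 
literal along these lines (just a sketch; the exact declaration bintod emits 
may differ):

module save;

// sketch only - the real file turns into roughly 1.7 million entries
const ubyte[] BINARY_SAVE_SWW = [
	0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07,
	// ...
];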


dmd 1.046 & 2.031 both crash with "out of memory".
Tried changing the const BINARY_SAVE_SWW to other modifiers like enum, 
this gave an error message, but still led to out of memory.

Is this normal?


Here are the files, if you want to try out:
http://ul.to/2f5zzq
Jul 14 2009
parent reply Robert Fraser <fraserofthenight gmail.com> writes:
File is probably too big. Remember that for every byte in your binary, 
DMD is likely allocating several hundred bytes for the literal expression 
object plus codegen for the expression, etc., and it frees very little 
dynamically allocated memory.

Trass3r wrote:
 Here are the files, if you want to try out:
 http://ul.to/2f5zzq
Why have people not learned about SkyDrive? I'm surprised crap like this, Megaupload, etc. is still in business. SkyDrive has no stupid "wait time", is ****ing fast (MS has a lot of data centers all over the world), and has 50 MB per-file / 25 GB total limits [and yes, it's free].
Jul 14 2009
parent reply Trass3r <mrmocool gmx.de> writes:
Robert Fraser schrieb:
 File is probably too big. Remember that for every byte in your binary, 
 DMD is likely allocating several hundred bytes for the literal expression 
 object plus codegen for the expression, etc., and it frees very little 
 dynamically allocated memory.
 
So we can't even embed a 1.7 MB file?
Jul 15 2009
parent reply grauzone <none example.net> writes:
Trass3r wrote:
 Robert Fraser schrieb:
 File is probably too big. Remember that for every byte in your binary, 
 DMD is likely allocating several hundred bytes for the literal expression 
 object plus codegen for the expression, etc., and it frees very little 
 dynamically allocated memory.
So we can't even embed a 1.7 MB file?
For embedding data, use import() (http://www.digitalmars.com/d/1.0/expression.html#ImportExpression). Or did you do this? I don't know the file size restrictions of import either.
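
A minimal sketch of how that might look (untested; you have to pass 
-J<directory containing the file> so dmd is allowed to read it):

// compile with e.g.:  dmd -J. main.d
module main;

void main()
{
	// the compiler reads save.sww at compile time and embeds its raw
	// bytes verbatim - no source-level array literal is generated
	ubyte[] raw = cast(ubyte[]) import("save.sww");
	// raw can then be sliced and passed to std.zlib as usual
}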
Jul 15 2009
parent Trass3r <mrmocool gmx.de> writes:
grauzone schrieb:
 For embedding data, use import() 
 (http://www.digitalmars.com/d/1.0/expression.html#ImportExpression).
 Or did you do this? I don't know the file size restrictions of import 
 either.
Thanks, I didn't know about that option. It compiles now, but that brings me to the error I actually wanted to test: std.zlib fails with an "out of memory" message, although another program I tested could decompress the data just fine. I can't see why this shouldn't work (the decompressed size resides at offset 4 and the zlibbed data starts at offset 8):

ubyte[] data = cast(ubyte[]) import("save.sww");
data = cast(ubyte[]) uncompress(data[8 .. $], *cast(uint*)(data.ptr + 4));

And here, as a special for Robert ;), the files on SkyDrive:
http://cid-f3bfa204ac326df0.skydrive.live.com/self.aspx/.Public/zlibtest.rar
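
For what it's worth, a quick sanity check along these lines (just a sketch, assuming the layout described above and the same -J setup) should at least show whether a zlib stream really starts at offset 8 (the first byte of a zlib stream is normally 0x78):

import std.stdio;

void main()
{
	ubyte[] raw = cast(ubyte[]) import("save.sww");
	// assumed layout: 4-byte uncompressed size at offset 4
	uint expected = *cast(uint*)(raw.ptr + 4);
	writefln("expected uncompressed size: %s bytes", expected);
	// first byte of the supposed zlib stream
	writefln("first stream byte: 0x%02x", raw[8]);
}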
Jul 15 2009