Giving a process more memory

SimpleMinded

Vault Fossil
So my brother's got a laptop with Vista and 4 GB of ram on it. I want to run an application for my research that will need about 2.5 gig of memory to run. However, whenever the process gets near 1.5 gig, even though task manager says only 60% of physical memory is in use, the process throws an error that it's out of memory and crashes.

If it helps I'm running it in Visual Studio.

I was curious if there's some kind of cap on how much memory processes can use and is there a way of changing this cap?
 
32-bit programs cap out at 2 GB of user address space by default (around 3 GB if the app is built large-address-aware and the OS is booted with the /3GB switch). 32-bit XP/Vista can only see/use about 3-3.5 GB of RAM because part of the 4 GB physical address range is reserved for hardware like the video card, not because of some larger address width.
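You can sanity-check which case you're in from inside the process itself. A minimal Python sketch (my own illustration, not from this thread) that reports the pointer width and the theoretical virtual address space that implies:

```python
import struct

# Pointer size in bytes: 4 => 32-bit process, 8 => 64-bit process.
ptr_bits = struct.calcsize("P") * 8

# A 32-bit process can address at most 2**32 bytes (4 GB) of virtual
# memory total; on 32-bit Windows only 2 GB of that is user space by
# default (3 GB with /3GB + a large-address-aware executable).
theoretical_limit_gb = 2 ** ptr_bits / 2 ** 30

print(f"{ptr_bits}-bit process, {theoretical_limit_gb:.0f} GB of virtual address space")
```

If this prints 32-bit even on a 64-bit OS, the ~2 GB ceiling the original poster is hitting is expected regardless of how much physical RAM the machine has.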

Most programs, even though designed to run in a 32-bit environment, usually have internal memory limits that depend on the language/runtime they were written in.

1) the program has a fatal memory leak

2) shitty programming (sloppy memory management)

3) it doesn't work right under 32-bit compatibility on a 64-bit OS

4) bad hardware
 
Visual Studio might also be a factor here.

Do you run it from inside VS, or do you compile it and run the .exe (or whatever your program builds to) directly?
 
I'd been running it inside visual studio simply by running the process. Do you think I would have more freedom if I ran it outside of visual studio?

I've actually resolved the issue with better programming (the files I'm working with are 300 MB where they used to be 1 MB, so I just became less sloppy), but it's going to surface again in the future when I use an even bigger data set. So for now, it's more a curiosity than a pressing issue.
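For files that size, the usual "less sloppy" fix is to stream the data instead of loading whole files into memory. A minimal sketch (hypothetical function name and file layout, assuming line-oriented numeric data):

```python
def sum_column_streaming(path, col=0):
    """Sum one whitespace-delimited column of a large file.

    Iterating the file handle reads one line at a time, so peak
    memory stays small no matter how big the file is -- unlike
    f.read() or readlines(), which pull the whole file into RAM.
    """
    total = 0.0
    with open(path) as f:
        for line in f:
            fields = line.split()
            total += float(fields[col])
    return total
```

The same pattern (process a record, discard it, move on) keeps a 300 MB or even multi-GB input well under the 2 GB per-process ceiling.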
 