Memory limited to about 2.5 GB for a single .NET process

I am writing a .NET application, running on Windows Server 2016, that does an HTTP GET on a bunch of pieces of a large file. This dramatically speeds up the download process, since the pieces can be downloaded in parallel. Unfortunately, once they are downloaded, it takes a fairly long time to piece them all back together.

There are between 2,000 and 4,000 files that need to be combined. The server this will run on has PLENTY of memory, close to 800 GB. I thought it would make sense to use MemoryStreams to store the downloaded pieces until they can be sequentially written to disk, BUT I am only able to consume about 2.5 GB of memory before I get a System.OutOfMemoryException. The server has hundreds of GB available, and I can't figure out how to use them.

MemoryStreams are built around byte arrays, and arrays currently cannot be larger than 2 GB.

The current implementation of System.Array uses Int32 for all of its internal counters, etc., so the theoretical maximum number of elements is Int32.MaxValue.

There's also a 2GB max-size-per-object limit imposed by the Microsoft CLR.

As you try to put the content into a single MemoryStream, the underlying array gets too large, hence the exception.

Try to store the pieces separately, and write them directly to the FileStream (or whatever you use) when ready, without first trying to concatenate them all into one object, as in the sketch below.
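
A minimal sketch of that approach, where the DownloadPiece helper, the piece count, and the file name are placeholders rather than a real API: each piece stays in its own byte[] (each well under 2 GB), and the pieces are streamed to disk in order instead of being concatenated in memory first.

using System.Collections.Generic;
using System.IO;

class AssembleSketch
{
    static void Main()
    {
        int pieceCount = 2000;                 // placeholder piece count
        var pieces = new List<byte[]>(pieceCount);

        // Each downloaded piece is a separate object, so no single
        // allocation ever approaches the 2 GB array limit.
        for (int i = 0; i < pieceCount; i++)
            pieces.Add(DownloadPiece(i));

        using (var output = new FileStream("result.bin", FileMode.Create))
        {
            foreach (byte[] piece in pieces)
                output.Write(piece, 0, piece.Length); // sequential write, no concatenation
        }
    }

    // Placeholder for the real HTTP GET of one piece.
    static byte[] DownloadPiece(int index) => new byte[1024 * 1024];
}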

@JPRichardson, neither 32-bit nor 64-bit .NET executables are limited to 2 GB per process. First of all, per-process address space is an OS-level restriction (2-3+ GB for a 32-bit process, and much more for a 64-bit one); second, even a 32-bit executable can use more than 2 GB if the "LargeAddressAware" flag is set on it.

According to the source code of the MemoryStream class, you will not be able to store more than 2 GB of data in one instance of this class. The reason is that the maximum length of the stream is set to Int32.MaxValue, and the maximum index of a byte array is set to 0x7FFFFFC7, which is 2,147,483,591 in decimal (= 2 GB).

Snippet from MemoryStream:

private const int MemStreamMaxLength = Int32.MaxValue;

Snippet from Array:

// We impose limits on maximum array lenght in each dimension to allow efficient 
// implementation of advanced range check elimination in future.
// Keep in sync with vm\gcscan.cpp and HashHelpers.MaxPrimeArrayLength.
// The constants are defined in this method: inline SIZE_T MaxArrayLength(SIZE_T componentSize) from gcscan
// We have different max sizes for arrays with elements of size 1 for backwards compatibility
internal const int MaxArrayLength = 0X7FEFFFFF;
internal const int MaxByteArrayLength = 0x7FFFFFC7;

The question More than 2GB of managed memory was discussed a long time ago on the Microsoft forum, and it references a blog article about BigArray, a way of getting around the 2 GB array size limit. The idea is sketched below.
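
Roughly, the idea behind such a BigArray is to split one logical array into fixed-size pages, each safely below the per-object limit, and route every index through the right page. A minimal sketch of that idea (not the blog article's actual implementation; the page size is an assumption):

// One logical array split into pages so no single CLR object exceeds 2 GB.
class BigArray<T>
{
    private const int PageSize = 1 << 20;  // 1M elements per page (assumed)
    private readonly T[][] _pages;

    public BigArray(long length)
    {
        Length = length;
        long pageCount = (length + PageSize - 1) / PageSize;
        _pages = new T[pageCount][];
        for (long i = 0; i < pageCount; i++)
            _pages[i] = new T[PageSize];
    }

    public long Length { get; }

    public T this[long index]
    {
        get => _pages[index / PageSize][index % PageSize];
        set => _pages[index / PageSize][index % PageSize] = value;
    }
}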

Update

I suggest using the following code, which should be able to allocate more than 4 GB on an x64 build but will fail below 4 GB on an x86 build:

private static void Main(string[] args)
{
    List<byte[]> data = new List<byte[]>();
    Random random = new Random();

    while (true)
    {
        try
        {
            // Allocate in 1 MB chunks; filling them with random data
            // forces the pages to actually be committed.
            var tmpArray = new byte[1024 * 1024];
            random.NextBytes(tmpArray);
            data.Add(tmpArray);
            Console.WriteLine($"{data.Count} MB allocated");
        }
        catch (OutOfMemoryException)
        {
            Console.WriteLine("Further allocation failed.");
            break; // without this, the loop would spin on the failure message forever
        }
    }
}

As has already been pointed out, the main problem here is the nature of MemoryStream: it is backed by a byte[], which has a fixed upper size.

The option of using an alternative Stream implementation has been noted. Another alternative is to look into "pipelines", the new IO API (System.IO.Pipelines). A pipeline is based around discontiguous memory, which means it isn't required to use a single contiguous buffer; the pipelines library will allocate multiple slabs as needed, which your code can process. I have written extensively on this topic; part 1 is here. Part 3 probably has the most code focus. A minimal sketch follows.
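
A minimal sketch of the pipelines model, assuming the System.IO.Pipelines package is referenced; the chunk size and iteration count are placeholders. The writer requests memory from the pipe, which serves it from internal slabs, so no single contiguous buffer ever has to hold the whole payload:

using System;
using System.IO.Pipelines;
using System.Threading.Tasks;

class PipeSketch
{
    static async Task Main()
    {
        var pipe = new Pipe();

        // Producer: ask the pipe for memory, fill it, and flush. The pipe
        // allocates slabs internally instead of one big contiguous buffer.
        Task writing = Task.Run(async () =>
        {
            for (int i = 0; i < 4; i++)
            {
                Memory<byte> memory = pipe.Writer.GetMemory(1024);
                memory.Span.Slice(0, 1024).Fill((byte)i);
                pipe.Writer.Advance(1024);
                await pipe.Writer.FlushAsync();
            }
            pipe.Writer.Complete();
        });

        // Consumer: process whatever slabs are available, then release them.
        while (true)
        {
            ReadResult result = await pipe.Reader.ReadAsync();
            Console.WriteLine($"Read {result.Buffer.Length} bytes");
            pipe.Reader.AdvanceTo(result.Buffer.End);
            if (result.IsCompleted) break;
        }
        pipe.Reader.Complete();
        await writing;
    }
}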

Just to confirm that I understand your question: you're downloading a single very large file in multiple parallel chunks, and you know how big the final file is? If you don't, this gets a bit more complicated, but it can still be done.

The best option is probably to use a MemoryMappedFile (MMF). Create the destination file via the MMF, have each thread create a view accessor to that file and write to it in parallel, and close the MMF at the end. This essentially gives you the behavior you wanted from MemoryStreams, but Windows backs the memory with the file on disk. One of the benefits of this approach is that Windows manages flushing the data to disk in the background, so you don't have to, and it should result in excellent performance. A sketch of the approach is below.
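
A minimal sketch of that approach; the chunk size, chunk count, output file name, and the DownloadChunk helper are placeholders for the real download logic:

using System;
using System.IO;
using System.IO.MemoryMappedFiles;
using System.Threading.Tasks;

class MmfSketch
{
    const int ChunkSize = 16 * 1024 * 1024; // placeholder chunk size (16 MB)
    const int ChunkCount = 8;               // placeholder chunk count

    static void Main()
    {
        long totalSize = (long)ChunkSize * ChunkCount;

        // Create the destination file at its final size, backed by disk.
        using (var mmf = MemoryMappedFile.CreateFromFile(
            "output.bin", FileMode.Create, null, totalSize))
        {
            // Each worker writes its chunk through its own view accessor.
            // The views cover disjoint ranges, so no locking is needed.
            Parallel.For(0, ChunkCount, i =>
            {
                byte[] chunk = DownloadChunk(i); // placeholder for the HTTP GET
                using (var view = mmf.CreateViewAccessor((long)i * ChunkSize, ChunkSize))
                {
                    view.WriteArray(0, chunk, 0, chunk.Length);
                }
            });
        } // disposing the MMF flushes remaining dirty pages to disk

        Console.WriteLine("All chunks written.");
    }

    static byte[] DownloadChunk(int index) => new byte[ChunkSize];
}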

In fact, a process can reserve more memory than is available on the system. Before a memory address can be used by a process, the address must have a corresponding data storage location in actual memory (RAM or disk). Commit memory is memory that has been associated with a reserved address, and is therefore generally unavailable to other processes.

Comments
  • ConcatenatedStream from How do I concatenate two System.Io.Stream instances into one? might meet your needs, as long as you don't need random seeking.
  • And make sure you are compiled to x64 (or Any CPU without the "Prefer 32-bit" flag, and running on x64).
  • It is x64; I used dumpbin to validate that large addresses were supported.
  • Here is a simple program to recreate my issue. static void Main(string[] args) { MemoryStream ms1 = new MemoryStream((int)Math.Pow(1024, 3)); MemoryStream ms2 = new MemoryStream((int)Math.Pow(1024, 3)); MemoryStream ms3 = new MemoryStream((int)(Math.Pow(1024, 3)*.95)); MemoryStream ms4 = new MemoryStream((int)Math.Pow(1024, 3)); //this errors out with an out of memory error }
  • The third MemoryStream is interesting. I multiplied the total number of allowed bytes by 0.95 and it works; if I go up to 0.96 or higher, it doesn't. The fourth MemoryStream causes the out-of-memory error if the third doesn't.
  • There is an application setting which allows the creation of arrays larger than 2 GB; see the config snippet after these comments and docs.microsoft.com/en-us/dotnet/framework/configure-apps/…
  • @ckuri I believe that even with that setting you still won't be able to create byte arrays larger than 2GB, as they are still bounded by the maximum index size. Read the remarks section: "The maximum index in any single dimension is 2,147,483,591 (0x7FFFFFC7) for byte arrays..."
  • I tried it, and you are right. It's not possible to initialize an array with a length larger than 2^31. So new long[2_000_000_000] successfully created a 16 GB array, but new byte[3_000_000_000] threw an OverflowException.
  • I tried using multiple MemoryStreams that were all smaller than 2GB and put them in a Queue. Regardless of the size I used, I could create as many as would fill about 2.5 GB, and then I would start getting out-of-memory errors when I tried to create another.
  • @JoshDayberry You are 99% likely running it as x86 then. My guess is that your app is set to Any CPU (Prefer 32-bit), which makes your code run as a 32-bit assembly too; see stackoverflow.com/a/12066861/10614791. Set it explicitly to x64 (there is no point in Any CPU if it will outright crash on a 32-bit system) and it will work. I reproduced your issue, and this solves it.
  • I wasn't storing more than 2GB in a stream. I was, however, using multiple streams, one per chunk, to hold the data. No matter what size I made the chunks, memory usage always seemed to peak at 2.5 GB before it would give out-of-memory errors. I was storing the many MemoryStreams in a Queue.
  • @JoshDayberry, I see... What is the configuration of your pagefile? I know there can be some issues with memory allocation if the pagefile size is configured the wrong way. I just searched for the article and found it again: Pushing the limits of Windows physical memory and Pushing the limits of Windows virtual memory. The last one may help you...
  • The server has around 800 GB of memory and we are getting stuck at around 3 GB. Pagefile utilization is steady at 0%. Why would the pagefile be relevant?
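
For reference, the gcAllowVeryLargeObjects setting mentioned in the comments is enabled in the application's App.config. On 64-bit platforms it lifts the 2 GB per-object limit for arrays, but the per-dimension index limit quoted above (0x7FFFFFC7 for byte arrays) still applies:

<configuration>
  <runtime>
    <!-- 64-bit only: allows arrays larger than 2 GB in total size.
         The maximum index per dimension is unchanged. -->
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>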