Solving Out of Memory exception in .NET

Last week I was assigned to solve a problem with an Out of Memory exception we were getting in our application. Oh, what a fine day it was! Something fun to solve, something useful to learn.

It’s been quite some time since I’ve bothered with memory usage in my applications. Usually it was so low that no one cared, and we concentrated on performance rather than memory footprint. Now things are a bit different. We are not using all that much memory, but OoM exceptions were pretty common at some point. That was, however, solved by setting the LargeAddressAware flag on our executable – problem solved for most of our use cases (and we have a lot of free RAM in our server boxes anyway, so it’s not like we needed to minimize memory usage).

Then, a few months later, the problem reappeared, and this time it was persistent and repeatable. One report we were generating was failing. Unfortunately, generating this report took 20+ minutes, so it was not easy to reproduce. Still – I tried.

First I looked at Task Manager while in debug mode. Memory usage usually stayed around 500MB. Not too much, given that 32-bit Windows applications get 2GB of address space. Yet, a few lines later, OoM appeared. I tried again – the same thing happened.

The code that was throwing the Out of Memory exception was in Coherence.dll, the .NET client for the Coherence cache system. We hadn’t had similar problems with Coherence before, so I looked at what our code was doing. Nothing fancy really – trying to save some binary data, 61MB of it to be specific. Not a huge amount, but 3–4 times more than the largest objects we had been saving before. Funnily enough, saving those 61MB was fine – it was saving a 4-character string in the line after this that threw the exception.

There are tools you want to use when you are having such problems. Profilers, I mean. I thought there had to be something fishy going on. ANTS Memory Profiler was the tool I picked to profile memory. It slowed down report generation to 40 minutes, but it did the job. Unfortunately, it died while trying to snapshot memory usage just before the OoM occurred. Well, I could have expected that. Looking through earlier memory snapshots did not reveal any obvious memory leaks or unexpected objects. Everything looked pretty much as expected. Still – the exception was there.

So I decided to fall back to a lower-level tool – VMMap – a tool that will map the memory of a given process, give you a colorful diagram, and show some details. Now, if you are going to use it, give it some time. It was processing my process’s memory for 5 or 10 minutes before giving me results. So patience is the key.

The results, however, are very useful, at least to me. Of course memory usage was low – 500 megs, like I said. What was not low was memory fragmentation. Under the Free memory type there was something like 1400MB free, but fragmented into close to 20 blocks, with the largest being just a little over 100MB in size.
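To see why this matters, here is a toy model of the situation (a hypothetical AddressSpace class, written in Java for illustration – the numbers are the ones VMMap reported, the block sizes are made up): total free memory can be large while no single free block can satisfy one big allocation, because an array needs one contiguous block.

```java
import java.util.List;

// Toy model of a fragmented address space: free blocks (sizes in MB)
// scattered between allocated regions.
class AddressSpace {
    private final List<Integer> freeBlocks;

    AddressSpace(List<Integer> freeBlocks) {
        this.freeBlocks = freeBlocks;
    }

    int totalFreeMb() {
        return freeBlocks.stream().mapToInt(Integer::intValue).sum();
    }

    boolean canAllocate(int mb) {
        // A single array must be contiguous, so only the largest block matters.
        return freeBlocks.stream().anyMatch(b -> b >= mb);
    }
}
```

With ~20 blocks summing to 1400MB but none larger than ~100MB, a request for a 128MB contiguous block fails even though there is plenty of free memory overall.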

How does that help me? Very much – given that Coherence.dll serializes data into some kind of stream – a MemoryStream, probably. And MemoryStream has a very interesting behavior. When it reaches its limit, it tries to allocate a new memory block (while preserving the current one, so the data can be copied into the new one, of course). The new block will be twice the current size. You have 1MB? It goes up to 2. You have 32? 64 it is. You see where I’m going with this? I already had a 64MB memory stream in memory (or something like that), and while saving those 4 string characters I reached its limit. It tried to get a new, 128MB (or something around that value) contiguous memory block (since MemoryStream is just a byte[] underneath, and arrays in C# occupy contiguous blocks of memory). But as VMMap showed, the largest free block was just 100MB!
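The doubling-growth strategy described above can be sketched as follows (a hypothetical GrowableBuffer class, in Java for illustration – .NET’s MemoryStream grows its internal byte[] in essentially the same way):

```java
// Sketch of doubling-growth buffer behavior. The dangerous moment is the
// resize: a brand-new contiguous block of twice the size must be allocated
// while the old block is still alive for the copy.
class GrowableBuffer {
    private byte[] data = new byte[256];
    private int length = 0;

    void write(byte b) {
        if (length == data.length) {
            // Needs one contiguous block twice the current size -- this is
            // the allocation that fails under address-space fragmentation,
            // even when plenty of memory is free in smaller pieces.
            byte[] bigger = new byte[data.length * 2];
            System.arraycopy(data, 0, bigger, 0, length);
            data = bigger;
        }
        data[length++] = b;
    }

    int capacity() {
        return data.length;
    }
}
```

So writing even a tiny value – those 4 string characters – can trigger a huge allocation if it happens to be the write that crosses the current capacity.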

Now, that clearly is an Out of Memory situation. What can you do about it? Free some memory, remove objects you no longer need. Or, as I did, reduce the amount of data being saved. Since my data was one big XML underneath, zipping it turned it into a small stream of around 5MB – something that could easily fit into memory. The solution was implemented in 15 minutes and works perfectly.
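The fix was done in .NET, but the idea is simple enough to sketch in a few lines (here in Java with GZIP – the class and method names are made up for illustration): compress the XML before handing it to the cache client, so the serialized payload, and the stream growth it triggers, stays small.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPOutputStream;

class XmlCompressor {
    // Compress an XML string into a byte array before caching it.
    // XML is highly repetitive, so the compressed form is dramatically smaller.
    static byte[] compress(String xml) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPOutputStream gzip = new GZIPOutputStream(out)) {
            gzip.write(xml.getBytes(StandardCharsets.UTF_8));
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return out.toByteArray();
    }
}
```

The trade-off is a bit of CPU time on save and load, which was negligible next to a 20-minute report run.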

One question is left – where does the memory fragmentation come from? Well, obviously – you allocate memory, then you free some of it in the middle, and there you go – you have a small block of free memory, but not a very usable one if your other objects can’t fit into it. The .NET platform is clever about this – it compacts memory after the Garbage Collector runs (unless you pinned the memory, but that’s a different story). After compaction you should be left with one big block of free memory at the end and all allocated objects at the beginning. There is an exception to that rule, however – the Large Object Heap. If your objects are big enough (85k and above, I think, though that’s probably not a fixed constant), .NET will put them into a separate heap and will not compact it, since doing so would hurt GC performance badly.

Was that the case? Nope. The LOH was just below 200kB – highly unlikely to cause heavy memory fragmentation. What was really going on: one of our components was working with COM objects. Those were not allocated by the .NET Framework, yet they reside in our process memory. So no compacting for them. And we had quite a lot of those (probably around 200MB out of the 500MB used by the application). Could that cause memory fragmentation? You bet it could! I didn’t touch that, however, since the impact of such a change would be much bigger, the change itself wouldn’t be as easy, and we already had another, better solution in place (zip compression).


Binary serialization is no fun

I’m currently working on an iPad project that communicates with a Java service. Nothing fancy, right? Yet, instead of using JSON or XML, communication happens over the AMF protocol with binary serialization. Doesn’t sound scary at all – binary serialization is fast, the output is small, we all like it, right?
Now, I didn’t get any automatically generated classes that I could synchronize with the service in a few clicks, like I could with WCF. I got some classes generated by the developer who was taking care of the project before me, but those classes have gotten a bit outdated recently. There is a way to convert the Java classes into Objective-C code using a converter that this guy wrote, but it’s not perfect, and it would take me some time to fix the issues the autogenerator produces. More importantly, when something breaks it takes a lot of time to find out what is wrong.
A few days back I was trying to figure out what had gone wrong so that I could no longer deserialize data from the service. In the world of XML you look at elements and attributes, and most of the time it’s pretty easy to find out what’s wrong. If the XML has some additional attributes or elements, deserialization usually doesn’t care, or at least gives you a clear error message saying what is wrong, which piece you are missing. Binary serialization doesn’t. You just get pieces of binary data and convert them into objects, hoping everything will go right. Of course AMF helps you with that up to a point, but not always.
So I was trying to fix this problem. Watching binary data, trying to figure out what’s going on. One thing that helped me was that the library reported an object of type 78. Clearly, that didn’t fall into the enum values, which got me wondering – what could it be? A quick look at the ASCII table reveals the capital letter ‘N’. And the next property should be the text value “New”. But the next property, not the current one. Clearly – I had gotten 2 bytes too far into the data. No need to go into details, but this made me go to the Java service and look through the classes piece by piece to find out which property I had missed. And there it was – a boolean property that had appeared in the service class definition.
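The failure mode is easy to reproduce in miniature. The sketch below (plain DataOutputStream/DataInputStream, not AMF itself – the format and method names are made up for illustration) shows how one field the reader doesn’t know about shifts every later read: the boolean byte gets interpreted as the next type tag, and everything after it is garbage. In my real case the shift happened to land the parser on the ‘N’ of “New”, hence type 78 (0x4E).

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;

class Misalignment {
    // Writer emits: [boolean field added on the service side][type tag 5]["New"].
    // Reader returns what it believes is the type tag of the next property.
    static int readTypeTag(boolean readerKnowsAboutBoolean) {
        try {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            DataOutputStream out = new DataOutputStream(bytes);
            out.writeBoolean(true);  // the new field the old reader doesn't expect
            out.writeByte(5);        // made-up type tag for a string property
            out.writeBytes("New");   // the property value

            DataInputStream in = new DataInputStream(
                    new ByteArrayInputStream(bytes.toByteArray()));
            if (readerKnowsAboutBoolean) {
                in.readBoolean();    // consume the new field, stay aligned
            }
            // Interpreted as the type tag: correct (5) only when aligned;
            // otherwise the boolean byte (1) is mistaken for a tag and every
            // subsequent read is off by one byte.
            return in.readUnsignedByte();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

With a text format the extra field would simply show up by name; here, nothing in the byte stream tells you where the misalignment started.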
All this took me quite some time. With JSON or XML it would have taken a minute or two.
My advice – go with JSON or XML to make your life easier. If you think it’s too big or too slow, think again. Try to redesign your service; think about whether you really need to send all this data to the client. Probably not. The time saved on coding, debugging, and looking for errors is invaluable. You would need a really, really slow device that couldn’t handle parsing normal data formats before going binary would be worth it.
Save binary formats for images, movies and music. Keep data easy to discover, easy to read, easy to use.