Memory Leaks

Intro

In garbage-collected environments, "memory leak" means objects that are no longer useful but remain reachable from GC roots — so the collector never reclaims them. The process RSS grows monotonically until an OutOfMemoryException crashes the service or the container hits its memory limit and gets OOM-killed. In production, this typically manifests as a slow climb in memory usage over days, with periodic restarts masking the underlying issue until traffic increases and the leak accelerates.

There are two categories. Managed leaks: objects held alive by forgotten references (event subscriptions, static caches, closures capturing this). The GC works correctly — it just can't collect objects that are technically still reachable. Unmanaged leaks: native memory allocated via Marshal.AllocHGlobal, P/Invoke, or wrapped OS handles that are never freed, because the GC doesn't manage memory outside the managed heap.

The diagnostic workflow: capture a memory dump (dotnet-dump collect), load it in Visual Studio or dotnet-dump analyze, and run dumpheap -stat to find the largest retained types. For event-related leaks, gcroot <address> traces the reference chain from a leaked object back to its GC root — the root is where the fix goes.

Below are 8 of the most common causes. The first 6 are managed leaks; the remaining 2 are unmanaged.

Event handlers

Events in .NET are notorious for causing memory leaks. The reason is simple: when you subscribe to an event on some object, that object keeps a reference to the object that defines the handler (unless the handler is a static method or an anonymous method that captures no instance members).

Look at this example:

public class MyClass
{
	public MyClass(WiFiManager wiFiManager)
	{
		wiFiManager.WiFiSignalChanged += OnWiFiChanged;
	}

	private void OnWiFiChanged(object sender, WifiEventArgs e)
	{
		// do something useful
	}
}

So if wiFiManager is defined outside of MyClass and outlives it, you have a memory leak: through its event, wiFiManager references the MyClass instance, which will never be collected by the garbage collector as long as the subscription exists.

Events really can be dangerous, and there is a dedicated article about this: 5 Techniques to Avoid Memory Leaks When Using Events in C# .NET That You Should Know.

What can you do in this situation? The article above describes several good practices to avoid memory leaks. Without going into details, here are some of them:

  1. Always unsubscribe from events.
  2. Use the weak event pattern (for example, WPF's WeakEventManager).
  3. If possible, subscribe using static or anonymous methods that do not capture instance members of the class.
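The first practice can be made deterministic by pairing the subscription with IDisposable. Below is a minimal sketch; WiFiManager and its SubscriberCount helper are hypothetical types invented here so the effect of unsubscribing is observable:

```csharp
using System;

// Hypothetical publisher, standing in for the WiFiManager from the example above.
public class WiFiManager
{
	public event EventHandler WiFiSignalChanged;

	// Helper for this sketch only: how many handlers are currently attached.
	public int SubscriberCount => WiFiSignalChanged?.GetInvocationList().Length ?? 0;
}

public class MyClass : IDisposable
{
	private readonly WiFiManager _wiFiManager;

	public MyClass(WiFiManager wiFiManager)
	{
		_wiFiManager = wiFiManager;
		_wiFiManager.WiFiSignalChanged += OnWiFiChanged;
	}

	private void OnWiFiChanged(object sender, EventArgs e)
	{
		// do something useful
	}

	// Unsubscribing removes the publisher's reference to this instance,
	// so the GC can collect it once nothing else points at it.
	public void Dispose() => _wiFiManager.WiFiSignalChanged -= OnWiFiChanged;
}
```

With this shape, a `using` block (or a DI container that disposes scoped services) tears down the subscription automatically.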

Capturing class members in anonymous methods

It is fairly obvious that using an instance method as an event handler creates a reference from the handler to the object that owns the method. What is much less obvious is that the same thing happens when a class member is captured in an anonymous method.

Here is an example:

public class MyClass
{
	private JobQueue _jobQueue;
	private int _id;

	public MyClass(JobQueue jobQueue)
	{
		_jobQueue = jobQueue;
	}

	public void Foo()
	{
		_jobQueue.EnqueueJob(() =>
		{
			Logger.Log($"Executing job with ID {_id}");
			// do useful work
		});
	}
}

In this example, the instance field _id is captured by the anonymous method. To make that possible, the compiler captures this, so the delegate holds a reference to the entire MyClass instance. This means that as long as _jobQueue exists and references the anonymous delegate, it also references the MyClass instance and keeps it alive.

The fix here is simple: use a local variable instead:

public class MyClass
{
	public MyClass(JobQueue jobQueue)
	{
		_jobQueue = jobQueue;
	}

	private JobQueue _jobQueue;
	private int _id;

	public void Foo()
	{
		var localId = _id;
		_jobQueue.EnqueueJob(() =>
		{
			Logger.Log($"Executing job with ID {localId}");
			// do something
		});
	}
}

If you copy the value into a local variable first, only the local is captured rather than the class member, and the leak is prevented.


Static variables

Some developers consider static variables to be a bad practice. Nevertheless, when talking about memory leaks, they are important to mention.

Before getting to the point of this section, let's briefly recall how the .NET garbage collector works. The basic idea is that the GC starts from the root objects (GC roots) and marks them as objects that will not be collected. It then marks every object reachable from those roots, and so on. Eventually, the GC collects everything that remains unmarked.

What is considered a root object?

  1. The stacks of executing threads.
  2. Static variables.
  3. Managed objects passed to COM objects via Interop.

This means that static variables, and everything they reference, will never be reclaimed by the garbage collector. Here is an example:

public class MyClass
{
	private static readonly List<MyClass> _instances = new List<MyClass>();

	public MyClass()
	{
		_instances.Add(this);
	}
}

If you write the code above, every MyClass instance will remain in memory for the lifetime of the process, because the static list roots it, causing a leak.
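If the class genuinely needs to track its instances, one option (a sketch, not the only fix; CountLiveInstances is a helper invented for this example) is to store weak references so the static list does not root them:

```csharp
using System;
using System.Collections.Generic;

public class MyClass
{
	// The static list holds weak references, so it does not keep instances alive.
	private static readonly List<WeakReference<MyClass>> _instances =
		new List<WeakReference<MyClass>>();

	public MyClass()
	{
		_instances.Add(new WeakReference<MyClass>(this));
	}

	// Count instances that are still alive; dead entries could also be pruned here.
	public static int CountLiveInstances()
	{
		int live = 0;
		foreach (var weakRef in _instances)
		{
			if (weakRef.TryGetTarget(out _))
				live++;
		}
		return live;
	}
}
```

Note that the list itself still grows by one small WeakReference per instance; in production you would also prune dead entries periodically.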

Caching

Developers love caching. After all, why perform an operation twice if you can do it once and store the result, right?

That is true, but if you cache without bounds, you will eventually exhaust all available memory. Look at this example:

public class ProfilePicExtractor
{
	private Dictionary<int, byte[]> PictureCache { get; set; } = new Dictionary<int, byte[]>();

	public byte[] GetProfilePicByID(int id)
	{
		// Ideally, you should use a synchronization mechanism here,
		// but we omit it to keep the example simple
		if (!PictureCache.ContainsKey(id))
		{
			var picture = GetPictureFromDatabase(id);
			PictureCache[id] = picture;
		}
		return PictureCache[id];
	}

	private byte[] GetPictureFromDatabase(int id)
	{
		// ...
	}
}

Caching in this example helps reduce expensive database calls, but the cost is memory bloat.

To address this, you can use the following practices:

  1. Remove items from the cache that have not been used for some time.
  2. Limit the cache size.
  3. Use WeakReference to store cached objects. A WeakReference lets the garbage collector reclaim a cached object once nothing else references it, which in some cases is exactly what you want. Since the GC promotes objects that are still in use to older generations, frequently used objects tend to survive in the cache longer, while unused ones are collected without your explicit involvement.
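Practices 1 and 2 can be combined in a small bounded cache. The sketch below uses simple FIFO eviction to keep the example short; a production service would more likely use MemoryCache from Microsoft.Extensions.Caching.Memory with a SizeLimit and expiration. BoundedCache and its members are names invented for this example:

```csharp
using System;
using System.Collections.Generic;

// Minimal bounded cache: evicts the oldest entry once capacity is exceeded.
// FIFO eviction is a simplification; LRU or TTL-based eviction is usually better.
public class BoundedCache<TKey, TValue>
{
	private readonly int _capacity;
	private readonly Dictionary<TKey, TValue> _map = new Dictionary<TKey, TValue>();
	private readonly Queue<TKey> _insertionOrder = new Queue<TKey>();

	public BoundedCache(int capacity) => _capacity = capacity;

	public TValue GetOrAdd(TKey key, Func<TKey, TValue> valueFactory)
	{
		if (_map.TryGetValue(key, out var existing))
			return existing;

		var value = valueFactory(key);
		_map[key] = value;
		_insertionOrder.Enqueue(key);

		// Evict the oldest key so memory stays bounded.
		if (_map.Count > _capacity)
			_map.Remove(_insertionOrder.Dequeue());

		return value;
	}

	public int Count => _map.Count;
}
```

In the ProfilePicExtractor example above, replacing the raw Dictionary with something like this caps memory at a fixed number of pictures instead of growing with every distinct id.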

Incorrect data binding in WPF

Data binding in WPF can also cause memory leaks. The main rule for preventing them: the binding source should be a DependencyObject or implement INotifyPropertyChanged. If it is neither, WPF falls back to a mechanism that creates a strong reference to the source object, causing a memory leak.

Example:

<UserControl x:Class="WpfApp.MyControl"
		xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
		xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml">
	<TextBlock Text="{Binding SomeText}"></TextBlock>
</UserControl>

The class below will remain in memory forever:

public class MyViewModel
{
	private string _someText = "memory leak";

	public string SomeText
	{
		get { return _someText; }
		set { _someText = value; }
	}
}

But this class will not cause a leak:

public class MyViewModel : INotifyPropertyChanged
{
	public event PropertyChangedEventHandler PropertyChanged;

	private string _someText = "not a memory leak";

	public string SomeText
	{
		get { return _someText; }
		set
		{
			_someText = value;
			PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(nameof(SomeText)));
		}
	}
}

In fact, it does not even matter whether you raise PropertyChanged or not; the key point is that the class implements INotifyPropertyChanged. This tells the WPF infrastructure not to create a strong reference.

Memory leaks occur only when the binding mode is OneWay or TwoWay. If the binding uses OneTime or OneWayToSource, there is no problem.

Memory leaks in WPF can also happen when binding collections. If the collection does not implement INotifyCollectionChanged, you will get a memory leak. You can avoid the problem by using ObservableCollection, which implements this interface.

Threads that never stop

We already discussed how the garbage collector works and what GC roots are. I mentioned that a thread's stack is considered a root; it keeps alive all local variables of the frames currently on the call stack.

If you create a thread that never terminates and keeps references to objects, you will get a memory leak. An easy way to do this by accident is incorrect use of the Timer class. Look at this code:

public class MyClass
{
	public MyClass()
	{
		Timer timer = new Timer(HandleTick);
		timer.Change(TimeSpan.FromSeconds(5), TimeSpan.FromSeconds(5));
	}

	private void HandleTick(object state)
	{
		// do something
	}
}

If you do not stop the timer, it will keep firing on the thread pool indefinitely. The active timer holds a reference to the HandleTick delegate, which in turn references the MyClass instance and prevents it from being collected.
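A common fix is to keep the timer in a field and stop it via IDisposable. A minimal sketch (TickCount is added here only to make the timer's activity observable):

```csharp
using System;
using System.Threading;

public class MyClass : IDisposable
{
	private readonly Timer _timer;
	private int _tickCount;

	public int TickCount => Volatile.Read(ref _tickCount);

	public MyClass()
	{
		// Keep the timer in a field so it can be stopped later.
		_timer = new Timer(HandleTick, null,
			TimeSpan.FromMilliseconds(10), TimeSpan.FromMilliseconds(10));
	}

	private void HandleTick(object state) => Interlocked.Increment(ref _tickCount);

	// Disposing the timer removes it from the timer queue,
	// which drops the reference chain back to this instance.
	public void Dispose() => _timer.Dispose();
}
```

Whoever creates the instance is then responsible for disposing it, typically with a using block.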

Unreleased unmanaged memory

So far, we have only talked about managed memory, which is reclaimed by the garbage collector. Unmanaged memory is a different story. Instead of just avoiding references to unneeded objects, you must explicitly free the memory.

Here is a simple example:

public class SomeClass
{
	private IntPtr _buffer;

	public SomeClass()
	{
		_buffer = Marshal.AllocHGlobal(1000);
	}

	// do something, but do not free the memory
}

In this example we used Marshal.AllocHGlobal to allocate a block of unmanaged memory (see the Marshal.AllocHGlobal documentation). If you do not explicitly free it via Marshal.FreeHGlobal, it remains allocated in the process heap, causing a leak even after SomeClass itself is collected by the GC.

To prevent such issues, you can add a Dispose method to your class to clean up unmanaged resources. For example:

public class SomeClass : IDisposable
{
	private IntPtr _buffer;

	public SomeClass()
	{
		_buffer = Marshal.AllocHGlobal(1000);
		// do something, but do not free the memory
	}

	public void Dispose() => Marshal.FreeHGlobal(_buffer);
}

Unmanaged memory leaks can be even worse than managed leaks because of fragmentation. The GC can compact the managed heap by moving surviving objects next to each other, freeing contiguous space for new allocations. Unmanaged memory, on the other hand, cannot be moved: it stays at the address where it was allocated, so the native heap fragments over time.

Dispose not called

In the previous example we added a Dispose method to release unmanaged resources when they are no longer needed. That is great, but what happens if someone uses the class and never calls Dispose?

What you can do is use the C# using construct:

using (var instance = new MyClass())
{
	// ...
}

The construct from the example works for classes that implement IDisposable and is compiled into the following code:

MyClass instance = new MyClass();
try
{
	// ...
}
finally
{
	if (instance != null)
	{
		((IDisposable)instance).Dispose();
	}
}

This is convenient because even if an exception is thrown, Dispose will still be called.
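This guarantee is easy to observe with a small tracking disposable (TrackingDisposable and UsingDemo are types invented for this sketch):

```csharp
using System;

// Records whether Dispose ran, so the try/finally behavior is visible.
public class TrackingDisposable : IDisposable
{
	public bool Disposed { get; private set; }
	public void Dispose() => Disposed = true;
}

public static class UsingDemo
{
	// Returns whether Dispose ran even though the using block threw.
	public static bool DisposeRunsOnException()
	{
		var resource = new TrackingDisposable();
		try
		{
			using (resource)
			{
				throw new InvalidOperationException("simulated failure");
			}
		}
		catch (InvalidOperationException)
		{
			// swallowed for the demo
		}
		return resource.Disposed;
	}
}
```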

For maximum reliability, MSDN suggests the Dispose implementation pattern. Here is an example of how it can be used:

public class MyClass : IDisposable
{
	private IntPtr _bufferPtr;
	private const int BufferSize = 1024 * 1024; // 1 MB
	private bool _disposed = false;

	public MyClass()
	{
		_bufferPtr = Marshal.AllocHGlobal(BufferSize);
	}

	protected virtual void Dispose(bool disposing)
	{
		if (_disposed)
			return;
		
		if (disposing)
		{
			// clean up managed objects being used
		}

		// clean up unmanaged objects
		Marshal.FreeHGlobal(_bufferPtr);
		_disposed = true;
	}

	public void Dispose()
	{
		Dispose(true);
		GC.SuppressFinalize(this);
	}

	~MyClass()
	{
		Dispose(false);
	}
}

Using this pattern helps ensure that even if Dispose is not called explicitly, it will still be called by the finalizer when the garbage collector decides to collect the object. If Dispose is called manually, the object's finalizer is suppressed and will not run. Suppressing finalization is important because running a finalizer is relatively expensive and can cause performance issues.

But keep in mind that Microsoft's Dispose pattern is not a silver bullet. If you do not call Dispose manually and the object is not collected because of a managed leak, the unmanaged resources will not be released either.

Tradeoffs

Decision: Event subscription model
  Option A: strong events (standard C# events).
  Option B: weak events (WeakEventManager, ConditionalWeakTable).
  When A: short-lived subscribers with deterministic unsubscription (e.g., a using scope).
  When B: long-lived publishers with many transient subscribers (UI frameworks, plugin systems).

Decision: Caching strategy
  Option A: unbounded Dictionary cache.
  Option B: MemoryCache with size limits and eviction.
  When A: never; unbounded caches always leak eventually.
  When B: always, for any cache that grows proportionally with input; set SizeLimit and AbsoluteExpirationRelativeToNow.

Decision: Unmanaged resource cleanup
  Option A: IDisposable only (deterministic, no finalizer).
  Option B: IDisposable plus a finalizer safety net.
  When A: all callers reliably use using/await using (internal code, DI-managed lifetimes).
  When B: the type is exposed to external consumers who may forget Dispose(); the finalizer catches the leak at the cost of an extra GC cycle.

Decision: Leak detection approach
  Option A: periodic memory dumps plus manual analysis.
  Option B: continuous monitoring with dotnet-counters / EventPipe.
  When A: post-incident investigation and deep root-cause analysis.
  When B: production monitoring; alert on Gen 2 heap size or GC handle count crossing thresholds before OOM.

Decision rule: treat every IDisposable as a potential leak. Use using statements for all disposable objects. For caches, always set size limits and TTLs — unbounded caches are the number one managed leak pattern in production .NET services.

Questions


What's next