
[Bug] Add GC memory pressure tracking for native allocations to avoid memory leaking#607

Open
Nucs wants to merge 3 commits into master from memory_leak

Conversation

Member

@Nucs Nucs commented Apr 12, 2026

Summary

Fixes #501 and implements #605.

  • Add GC.AddMemoryPressure() when allocating native memory via NativeMemory.Alloc
  • Add GC.RemoveMemoryPressure() when freeing it
  • Update documentation to reflect the new behavior

Problem

When creating many NDArrays in a loop, memory would grow to 10GB+ before GC kicked in:

for (int i = 0; i < 1_000_000; i++)
{
    NDArray array2 = np.array(new double[110]); // 880 bytes each
}
// Before: peaks at 10+ GB
// After: stable at ~54 MB

Root Cause

NumSharp allocates array data via NativeMemory.Alloc (unmanaged) but did not inform the GC about this memory. The GC only saw small managed wrappers (~100 bytes) and was unaware of the ~880+ bytes of unmanaged data per array.

Solution

Track memory pressure in UnmanagedMemoryBlock<T>.Disposer:

| Allocation Type | Tracks Pressure? | Reason |
| --- | --- | --- |
| Native (`NativeMemory.Alloc`) | ✅ Yes | NumSharp allocates → NumSharp tracks |
| External with dispose | ❌ No | Caller allocates → caller's responsibility |
| `GCHandle` (pinned managed) | ❌ No | GC already knows about managed arrays |
| Wrap (no ownership) | ❌ No | Not our memory |
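As a minimal sketch of the tracking scheme (illustrative names only — this is a simplified stand-in, not the actual `UnmanagedMemoryBlock<T>.Disposer` implementation), the Native path pairs every allocation with a pressure report and every free with its removal:

```csharp
using System;
using System.Runtime.InteropServices;

// Illustrative sketch: a simplified stand-in for the Native path of
// UnmanagedMemoryBlock<T>.Disposer; names and structure are assumptions.
public sealed unsafe class NativeBlock : IDisposable
{
    private void* _ptr;
    private readonly long _bytes;

    public bool IsDisposed => _ptr == null;

    public NativeBlock(long bytes)
    {
        _ptr = NativeMemory.Alloc((nuint)bytes);
        _bytes = bytes;
        GC.AddMemoryPressure(bytes); // tell the GC about the unmanaged footprint
    }

    public void Dispose()
    {
        if (_ptr == null)
            return; // already disposed; keeps Add/Remove calls balanced
        NativeMemory.Free(_ptr);
        _ptr = null;
        GC.RemoveMemoryPressure(_bytes);
        GC.SuppressFinalize(this);
    }

    // Safety net if the caller forgets to dispose.
    ~NativeBlock() => Dispose();
}
```

Note that `NativeMemory` requires .NET 6+ and compiling with unsafe code enabled, and `GC.AddMemoryPressure` expects a positive byte count.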

Test Results

| Scenario | Before | After |
| --- | --- | --- |
| `np.array()` × 1M (110 doubles) | ~10 GB | 54 MB |
| `np.array()` × 100K (10K doubles) | 1,193 MB peak | 46 MB |

Checklist

  • Native allocations track pressure
  • External Disposer has optional bytesCount parameter (default 0)
  • Documentation updated
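The optional `bytesCount` parameter on the external path could be used along these lines (hypothetical `ExternalDisposer` type for illustration; the real signature lives inside NumSharp's `Disposer`): pressure is reported only when the caller opts in by supplying a size.

```csharp
using System;

// Hypothetical sketch of an external-memory disposer with an optional
// bytesCount (default 0): pressure is tracked only when a size is given.
public sealed class ExternalDisposer : IDisposable
{
    private Action? _free;
    private readonly long _bytesCount;

    public ExternalDisposer(Action free, long bytesCount = 0)
    {
        _free = free ?? throw new ArgumentNullException(nameof(free));
        _bytesCount = bytesCount;
        if (bytesCount > 0)
            GC.AddMemoryPressure(bytesCount);
    }

    public void Dispose()
    {
        var free = _free;
        if (free == null)
            return; // idempotent: later calls do nothing
        _free = null;
        free(); // run the caller-provided release callback
        if (_bytesCount > 0)
            GC.RemoveMemoryPressure(_bytesCount);
    }
}
```

The default of 0 preserves the old behavior for callers who manage pressure themselves, matching the table above: external memory stays the caller's responsibility unless they hand NumSharp the size.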

Nucs added 3 commits April 12, 2026 13:09
Fixes GitHub issue #501 where memory would grow to 10GB+ when creating
many NDArrays in a loop without explicit GC.Collect() calls.

Root cause: NumSharp allocates data via NativeMemory.Alloc (unmanaged)
but did not inform the GC about this memory pressure. The GC only saw
small managed objects (~100 bytes each) and didn't know about the
~880+ bytes of unmanaged data per array, so it would not trigger
collections frequently enough.

Fix: Add GC.AddMemoryPressure() when allocating unmanaged memory and
GC.RemoveMemoryPressure() when freeing it. This informs the GC about
the true memory footprint so it schedules collections appropriately.

Before fix: Creating 1M arrays with 110 doubles each peaked at 10+ GB
After fix:  Same workload peaks at ~54 MB (stable, proper GC behavior)

Changes:
- Disposer constructor now takes bytesCount parameter for Native allocs
- Call GC.AddMemoryPressure(bytesCount) on allocation
- Call GC.RemoveMemoryPressure(bytesCount) on deallocation
Fixes GitHub issue #501 where memory would grow to 10GB+ when creating
many NDArrays in a loop without explicit GC.Collect() calls.

Root cause: NumSharp allocates data via NativeMemory.Alloc (unmanaged)
but did not inform the GC about this memory pressure. The GC only saw
small managed wrapper objects (~100 bytes) and was unaware of the
~880+ bytes of unmanaged data per array, so it wouldn't trigger
collections frequently enough.

Fix: Add GC.AddMemoryPressure() when allocating unmanaged memory and
GC.RemoveMemoryPressure() when freeing it. This informs the GC about
the true memory footprint so it schedules collections appropriately.

Only the Native path (NativeMemory.Alloc) tracks pressure. External
memory paths (np.frombuffer with dispose) are the caller's responsibility.

Before fix: Creating 1M arrays with 110 doubles each peaked at 10+ GB
After fix:  Same workload peaks at ~54 MB (stable, proper GC behavior)
- Fix incorrect "No GC Pauses" claim in Why Unmanaged Memory section
- Add GC Pressure Tracking subsection explaining how NumSharp informs GC
- Update transfer ownership example to show AddMemoryPressure best practice
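The transfer-ownership guidance mentioned in the docs commit might look roughly like this on the caller side (illustrative sketch; `Marshal.AllocHGlobal` stands in for any external allocator, and the consumer API is elided):

```csharp
using System;
using System.Runtime.InteropServices;

static class TransferOwnershipExample
{
    // Caller-side pattern: since only the Native path tracks pressure,
    // a caller handing external memory to the library reports pressure itself.
    public static void Run()
    {
        const long bytes = 880;
        IntPtr buffer = Marshal.AllocHGlobal((int)bytes);
        GC.AddMemoryPressure(bytes); // caller owns this bookkeeping

        try
        {
            // ... hand `buffer` to the consumer (e.g. a frombuffer-style API
            // with a dispose callback) and use it ...
        }
        finally
        {
            Marshal.FreeHGlobal(buffer);
            GC.RemoveMemoryPressure(bytes); // balance the earlier Add call
        }
    }
}
```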
@Nucs Nucs changed the title from "fix(memory): Add GC memory pressure tracking for native allocations" to "[Bug] Add GC memory pressure tracking for native allocations to avoid memory leaking" on Apr 12, 2026


Development

Successfully merging this pull request may close these issues.

Memory leak?

1 participant