dotMemory Snapshot Fails When Private Bytes > ~25GB

I have relocated the core temp and working directories to a drive with plenty of space because taking the snapshot was previously failing; now I continually have issues while it processes the snapshot.  It appears to be running out of memory, which is unfortunate since I need the snapshot from a machine that is low on memory in order to see where the issue lies.  Furthermore, I do not see any capability to take a snapshot from or analyze a dump file, so I cannot do it offline either (and remote options are limited since this is in a production data center).

In one case the entire app hung; in another, profiling stopped after losing CLR4 profiler connectivity, plus a few other seemingly random failures.  Is this a fundamental limitation, or am I missing something that would allow me to take this type of snapshot?

Official comment

Hello Gabe,

Please check that NTFS compression is not enabled for the core temp and working directories.
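
A quick way to verify this on Windows is to read the NTFS compression attribute of each directory. This is only a minimal sketch using the Python standard library; the directory paths are placeholders, so substitute the folders you actually pointed dotMemory at.

```python
# Check (Windows) whether the dotMemory temp/working directories are NTFS-compressed.
import os
import stat

DIRS_TO_CHECK = [
    r"D:\dotMemoryTemp",       # hypothetical relocated core temp directory
    r"D:\dotMemoryWorkspace",  # hypothetical relocated working directory
]

for path in DIRS_TO_CHECK:
    attrs = os.stat(path).st_file_attributes  # Windows-only stat field
    compressed = bool(attrs & stat.FILE_ATTRIBUTE_COMPRESSED)
    print(f"{path}: NTFS compression {'ENABLED' if compressed else 'disabled'}")
```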

What is the size of the swap file on the computer where the profiled application is running? dotMemory uses a lot of RAM for snapshot processing when the snapshot contains a large number of objects, so an out-of-memory condition may occur even though the temp and working directories were relocated.
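
If it helps to gather those numbers, the following is a small sketch (not part of dotMemory) that reports physical RAM and the commit limit (RAM plus page file) on the profiled machine via the Win32 GlobalMemoryStatusEx call, using only the Python standard library:

```python
# Report physical RAM and the commit limit (RAM + page file) on this machine.
import ctypes
from ctypes import wintypes

class MEMORYSTATUSEX(ctypes.Structure):
    _fields_ = [
        ("dwLength", wintypes.DWORD),
        ("dwMemoryLoad", wintypes.DWORD),
        ("ullTotalPhys", ctypes.c_uint64),
        ("ullAvailPhys", ctypes.c_uint64),
        ("ullTotalPageFile", ctypes.c_uint64),
        ("ullAvailPageFile", ctypes.c_uint64),
        ("ullTotalVirtual", ctypes.c_uint64),
        ("ullAvailVirtual", ctypes.c_uint64),
        ("ullAvailExtendedVirtual", ctypes.c_uint64),
    ]

status = MEMORYSTATUSEX()
status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))

gb = 1024 ** 3
print(f"Physical RAM         : {status.ullTotalPhys / gb:.1f} GB "
      f"({status.ullAvailPhys / gb:.1f} GB free)")
print(f"Commit limit (+swap) : {status.ullTotalPageFile / gb:.1f} GB "
      f"({status.ullAvailPageFile / gb:.1f} GB free)")
```

If the commit limit is not much larger than physical RAM, enlarging the page file may give snapshot processing the headroom it needs.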

Also, could you please attach a dotMemory screenshot taken when the out-of-memory issue occurs?
