Author Topic: Heap fragmentation & you

Offline chaos

  • BFF
  • Posts: 291
  • Job, school, social life, sleep. Pick 2.5.
Heap fragmentation & you
« on: April 16, 2014, 01:13:11 PM »
FYI, if you're running an LPMud and operations seem to be getting inexplicably slower, with intermittent bursts of extreme slowness, you might be running into something that dogged me for a long time before I realized what it was: heap fragmentation.

Basically, your memory allocation is taking too long, sometimes way too long, because your free lists (the data structures that track what memory is available to be allocated) are full of wee little chunks and you're sometimes searching through enormous numbers of them to find a chunk of memory you can use.  So you get operations that were fast yesterday suddenly taking forever, and then not taking forever the next time you run them, and similar glorious debugging joys.
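
To make that concrete, here's a toy first-fit free-list walk in C.  This is just the general shape of the problem, not LDMud's actual allocator:

    #include <stddef.h>

    /* One node per free chunk; fragmentation is what makes this list long. */
    struct free_chunk {
        size_t size;
        struct free_chunk *next;
    };

    static struct free_chunk *free_list;

    /* First fit: walk the list until some chunk is big enough.  Cost is
     * linear in the number of free chunks, which is exactly what blows
     * up when the heap has shattered into wee little pieces. */
    void *toy_alloc(size_t wanted)
    {
        struct free_chunk **prev = &free_list;
        for (struct free_chunk *c = free_list; c; prev = &c->next, c = c->next) {
            if (c->size >= wanted) {
                *prev = c->next;   /* unlink the chunk and hand it out */
                return c;
            }
        }
        return NULL;               /* no fit; the heap would have to grow */
    }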

The fastest, most effective way I've found to inflict horrific heap fragmentation on oneself is to quickly allocate enormous numbers of data structures, then immediately deallocate most, but not all, of them.  Intensive graph search in spaces of high combinatorial complexity works great for this; what changed on Lost Souls that made me realize what was going on was a new system for procedural generation of descriptions and configurations for combinations of damage types.
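
In C terms, the pattern that bites looks something like the sketch below; the sizes and the keep-one-in-seventeen ratio are made up for illustration:

    #include <stdlib.h>

    #define NODES 1000000

    int main(void)
    {
        void **node = malloc(NODES * sizeof *node);
        if (!node)
            return 1;
        /* A burst of small allocations, like nodes in a big graph search. */
        for (int i = 0; i < NODES; i++)
            node[i] = malloc(48);
        /* Free most of them, but keep a scattered minority alive.  The
         * survivors pin their neighborhoods, leaving the allocator with
         * hundreds of thousands of small, non-adjacent holes. */
        for (int i = 0; i < NODES; i++)
            if (i % 17 != 0)
                free(node[i]);
        return 0;
    }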

Having realized what was up (and that we'd had the problem for a while without it getting bad enough to be properly identified until this new system was deployed), I was able to get enormous performance benefits quickly by 1) making said new system cache its final results to disk, so the graph search didn't need to be recapitulated every time, and 2) reducing the core lib's reliance on intermediate data structures, meaning arrays and mappings that are quickly used and discarded, usually for passing function results around.  Retaining and reusing frequently used data structures, rather than reallocating them, along with strategic use of returning complex results by reference rather than in a fresh data structure, seems to have quickly and dramatically reduced LS's vulnerability to heap fragmentation.  So I thought I'd pass along the information in case anybody else runs into the problem.
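
The actual changes were in LPC, in our mudlib, but the second idea translates directly to C: have callers pass in storage they retain and reuse, instead of building and discarding a fresh result structure on every call.  (The function name below is made up for illustration.)

    #include <stddef.h>

    /* Before: each call returned a freshly allocated result array that
     * the caller read once and discarded, a steady diet for the free
     * lists:
     *
     *     int *damage_profile(int type);
     *
     * After: the caller supplies a buffer it keeps around, so the hot
     * path performs no allocation at all. */
    void damage_profile(int type, int *out, size_t out_len)
    {
        for (size_t i = 0; i < out_len; i++)
            out[i] = (type >> i) & 1;   /* stand-in for the real computation */
    }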

(Honestly, most people running LPMuds probably don't need to worry about the issue at all.  It's mostly really memory-intensive ones that would.  Lost Souls is, as far as I know, the only MUD to discover that things go horribly wrong with 32-bit LDMud's LPC parser if you try to address more than 2 gigabytes of memory.  If your MUD fits in a couple hundred megabytes, you'll be fine.)

Offline FallenTree

  • BFF
  • Posts: 476
Re: Heap fragmentation & you
« Reply #1 on: April 26, 2014, 11:03:51 AM »
Use a better malloc library?  Also, some visibility would be helpful (look at gperftools).
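
For example, with the driver linked against tcmalloc (-ltcmalloc), gperftools' heap profiler can be driven from C; where you hook these into the driver is up to you:

    #include <gperftools/heap-profiler.h>

    /* Start writing heap profiles with the given path prefix. */
    void start_heap_visibility(void)
    {
        HeapProfilerStart("/tmp/mud-heap");
    }

    /* Dump a snapshot, e.g. right after a command that ran slow. */
    void snapshot_heap(const char *why)
    {
        if (IsHeapProfilerRunning())
            HeapProfilerDump(why);
    }

No code changes are needed at all if you'd rather preload libtcmalloc and set the HEAPPROFILE environment variable instead.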

Offline quixadhal

  • BFF
  • Posts: 618
Re: Heap fragmentation & you
« Reply #2 on: April 28, 2014, 01:39:04 PM »
While re-engineering your code to be more efficient is never a bad thing, I agree that it shouldn't be required for a MUD these days, unless you're on vintage hardware.

There are a couple of different mallocs available out there as easy plug-in wrappers.  I'm not sure how much work it is to get ldmud to work with them, but you might try experimenting to see whether one is a better fit for your particular situation than the standard one ldmud uses.
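
The "plug-in" part is just symbol interposition: define malloc and free in a shared object, load it ahead of libc, and the driver never knows the difference.  A toy counting wrapper to show the mechanism (glibc-specific, since it forwards to glibc's internal entry points, and not thread-safe as written):

    #include <stddef.h>

    /* glibc exports its real allocator under these names. */
    extern void *__libc_malloc(size_t);
    extern void  __libc_free(void *);

    static unsigned long n_mallocs, n_frees;  /* a real wrapper would use atomics */

    void *malloc(size_t n) { n_mallocs++; return __libc_malloc(n); }
    void  free(void *p)    { n_frees++;   __libc_free(p); }

Build it with gcc -shared -fPIC -o counter.so counter.c, then run the driver under LD_PRELOAD=./counter.so.  Full replacements like tcmalloc or jemalloc load the same way.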

gcmalloc is always a favorite of mine, as it auto-frees by reference count (garbage collection) and makes free into a no-op.  There are also allocators that attempt to recycle freed blocks below a given size.
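
For a collector in the Boehm family (an assumption on the editor's part about what gcmalloc refers to; the library's details may differ), the usage pattern looks roughly like this: allocate through the collector and simply never free, and unreachable blocks get recycled for you.

    #include <gc.h>

    int main(void)
    {
        GC_INIT();
        for (int i = 0; i < 1000000; i++) {
            char *p = GC_MALLOC(64);   /* reclaimed once unreachable */
            p[0] = 1;                  /* use it... */
        }                              /* ...and never free() it */
        return 0;
    }

Link with -lgc (the Boehm-Demers-Weiser collector).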