Optimizing

Optimization is one of those programming tasks that usually isn't necessary, even though everybody thinks you have to do it.  A good rule of thumb is to keep software simple by not introducing loads of difficult-to-understand optimization routines.  War of Words has a central algorithm for finding words from a given set of random letters, and this algorithm represents the bulk of the AI's "brain".

A while back, I ran War of Words through the Xbox CLR Profiler tool to get a look at its memory use and garbage collection.  I found that the garbage collector runs a lot; at times it seems to run every 30 frames.  But I never noticed any performance issues.  The game runs smoothly at or around 60 fps without a hitch.  Occasionally there's a "hiccup", but that is understandable given the profiler's output.

Taking a look at the word search algorithm, I'm pretty satisfied with it.  However, one thing bothered me: it is very inefficient with regard to memory.  A typical word search can produce hundreds if not thousands of strings given a large enough pool of letters.  One test revealed over 28,000 possible words from just 42 letters.  If each word averages 6 letters, and each .NET char takes 2 bytes, we are looking at roughly 336K allocated for all of them!  In the end, I chose a technique that limited the number of words returned, and that helped quite a bit.  If I were writing a desktop application or a web app that used this algorithm, I probably wouldn't care about string use at all (after all, web apps are great for garbage collectors because of the request/response nature of pages: once a response is done, all the memory it used can be collected, and high latency is already expected because of the internet).

However, I was still bothered by the fact that the algorithm allocated a string for every partial word match.  For example, say I find a simple word like CAT.  As it searches within the word graph, the algorithm actually produces three strings: C, CA, and CAT.  The first two are pure garbage.  Compound this across many, many more words and we are talking about a lot of wasted allocations for partial matches that aren't words at all.  Because I was using .NET string types, these values are immutable and cannot be changed.  They just accumulate in memory until over 1 MB has been allocated.  At that point, the .NET garbage collector is invoked and the garbage is released.  That's great, except you can fill that 1 MB pretty quickly when a single search allocates hundreds of KB.  Frequent garbage collections can drop frames and cause glitches in the gameplay.
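To make the problem concrete, here is a minimal sketch of that naive approach.  This is not the actual War of Words code; the TrieNode type and the depth-first search are my own stand-ins for the game's word graph.  Note the fresh string allocated at every node:

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical word-graph node, just enough to illustrate the
// allocation pattern -- the real game's graph surely differs.
class TrieNode
{
    public Dictionary<char, TrieNode> Children = new Dictionary<char, TrieNode>();
    public bool IsWord;
}

static class NaiveSearch
{
    public static void Search(TrieNode node, string prefix,
                              List<char> letters, List<string> found)
    {
        // Iterate a snapshot of the distinct letters so the live
        // list can be mutated while backtracking.
        foreach (char c in letters.Distinct().ToList())
        {
            TrieNode child;
            if (!node.Children.TryGetValue(c, out child))
                continue;

            // A brand-new string for every partial match: "C",
            // "CA", "CAT" -- the first two are instant garbage.
            string candidate = prefix + c;

            if (child.IsWord)
                found.Add(candidate);

            letters.Remove(c);                        // consume the letter...
            Search(child, candidate, letters, found);
            letters.Add(c);                           // ...and backtrack
        }
    }
}
```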

My solution was to take out the string allocations and use a StringBuilder instead.  StringBuilder is basically a nice wrapper around a char array: it allocates space as needed and makes the string mutable (changeable).  Since the game only allows words up to 15 letters, I allocate one 15-character StringBuilder per search and add and remove characters from that buffer as the search proceeds.  Once I find words, I convert them to strings.  There are still some string allocations, but they are guaranteed to be actual words.  These words can then be sent to the word scorer and compared against one another to find optimal and suboptimal candidates for the AI.  I'm proud that this optimization got in, but in reality I didn't even need it!  I think the complexity introduced here was worth it for scalability alone, though I will admit the process introduced a bug that took me a while to track down.  So in the end, like all other things, it was a judgment call on my part.
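For comparison, here is a sketch of the search reworked along those lines, reusing the hypothetical TrieNode from above: one StringBuilder allocated per search, characters appended and removed in place, and ToString() called only when a confirmed word is found.  Again, this is an assumption-laden illustration, not the game's actual code.

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Text;

static class BufferedSearch
{
    const int MaxWordLength = 15;   // the game's 15-letter cap

    public static List<string> FindWords(TrieNode root, List<char> letters)
    {
        var found = new List<string>();
        // One buffer per search; the recursion mutates it in place.
        var buffer = new StringBuilder(MaxWordLength);
        Search(root, buffer, letters, found);
        return found;
    }

    static void Search(TrieNode node, StringBuilder buffer,
                       List<char> letters, List<string> found)
    {
        if (buffer.Length == MaxWordLength)
            return;   // never grow past the 15-character limit

        foreach (char c in letters.Distinct().ToList())
        {
            TrieNode child;
            if (!node.Children.TryGetValue(c, out child))
                continue;

            buffer.Append(c);                 // mutate, don't allocate

            // The only string allocations left are confirmed words.
            if (child.IsWord)
                found.Add(buffer.ToString());

            letters.Remove(c);
            Search(child, buffer, letters, found);
            letters.Add(c);

            buffer.Length -= 1;               // backtrack: drop the char
        }
    }
}
```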
