Chi-Coder: CPU Got What I Need.

A couple of weeks ago we talked about memory. It’s an extremely important topic and arguably the most crucial part of any game. That’s debatable though, and this week I’d like to take some time to talk about the CPU and its extremely important role in every single game we play.

First, the basics.

If you’re a programmer of any skill level it’s important to understand the physical components of a computer. The hard drive stores files long-term, memory holds those files for quick access when they need to be used, and the CPU processes instructions for manipulating that memory. As an example, when you create a Word document, you hit the New button and a document opens up in memory. You then type some letters and the keyboard inputs are translated by the program into instructions that the CPU executes. Once it’s executed all the instructions you’d like it to, you then save the file to the permanent storage of the hard drive. That’s the basic holy trinity of the computer. The thing is, you could operate without RAM (your hard drive can mimic its functionality) and you could operate without a hard drive (you just couldn’t save anything permanently), but without a CPU neither the RAM nor the hard drive would have anything to do. Powering them up would literally do nothing. So the CPU is the god of the computer.
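To make that round trip concrete, here’s a tiny C++ sketch of the same idea: a file is pulled from the hard drive into memory, the CPU executes a few instructions that change the in-memory copy, and the result is written back to disk. The file name and the edit are made up purely for illustration; this is obviously not how Word itself works.

#include <fstream>
#include <string>

int main()
{
    std::string document;                  // the "document" lives in RAM while we work on it

    std::ifstream in("notes.txt");         // hard drive -> memory
    std::getline(in, document, '\0');      // pull in whatever was saved last time (if anything)
    in.close();

    document += "Hello!";                  // the CPU executes the instructions that change the in-memory copy

    std::ofstream out("notes.txt");        // memory -> hard drive
    out << document;                       // now the change survives a power cycle
    return 0;
}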

What makes the CPU so important?

Beyond the obvious fact that it processes every single instruction that your computer does, it’s important because it can be used (and misused) to an incredible degree. What’s so dangerous about the way we use the CPU is how much effect even a small coding change can have on performance. It’s also worth noting that memory management might be equally important, but it’s much easier to understand and fix when it’s an issue. It can be far more difficult to track down some pesky code that’s causing the CPU to do more work than it needs to.

If you’re a programmer, chances are that you’re aware that most coding challenges come with more than one “right answer”.  While that’s absolutely true, there are definitely varying degrees of right when it comes to solving a coding problem.  Let’s say you wanted to give your player’s character some health.  Here’s one way you could do it:

// playerHealth is assumed to be declared elsewhere (a member or global)
void addPlayerHealth(int amountToAdd)
{
    // add one point at a time: amountToAdd separate additions
    for (int i = 0; i < amountToAdd; i++)
    {
        playerHealth = playerHealth + 1;
    }
}

Or you could do this:

void addPlayerHealth(int amountToAdd)
{
    // add all of the health in a single operation
    playerHealth = playerHealth + amountToAdd;
}

The first example is technically right but it’s ridiculously silly for a number of reasons. For the purpose of this discussion though, let’s focus on the fact that if you were to add 1,000 health to the player, you’d be forcing the CPU to add 1 a thousand separate times. This means that when giving the player 1 health the two functions perform about equally, but the second one scales with virtually no performance change whereas the first is impacted more and more severely the more health is added to the player.

While this example should be painfully obvious, what’s not so obvious are all the little catches like this that exist all over your code. And if you’ve ever programmed anything requiring even a small bit of logic, it becomes easy to realize just how out of control program flow can get. As your game or app grows in complexity, so does the opportunity for bad code, or even good code that’s used in a bad way.

Why does this matter so much for games?

Let’s get something out of the way; it’s pretty hard to tax your CPU. If you chose to execute the first example from above a million times with an input parameter of 1000 (so 1,000 x 1,000,000 = a billion loop iterations) you might not even notice a problem. That’s how crazy fast CPUs are these days. What’s more, if you’re under 35 there’s a decent chance that you’ve grown up coding in an environment where the CPU and memory have largely been ignored. Even most enterprise-level business apps have such robust performance environments that bad coding often has to be egregiously bad to even be noticed. Games, however, are unique. Games typically use every system resource available. To be honest, I think every programmer should write games, because doing so teaches you resource management in a way that most other programming environments never care about.
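If you want to see that for yourself, here’s a rough sketch of that exact stress test, with a self-contained copy of the slow function. It’s a quick-and-dirty micro-benchmark under some assumptions, not a rigorous one: playerHealth is marked volatile only so an optimizing compiler doesn’t delete the loop entirely, and the timing is just a wall-clock measurement.

#include <chrono>
#include <cstdio>

volatile int playerHealth = 0;  // volatile so the compiler can't optimize the additions away

// the "slow" version from earlier in the post
void addPlayerHealth(int amountToAdd)
{
    for (int i = 0; i < amountToAdd; i++)
    {
        playerHealth = playerHealth + 1;
    }
}

int main()
{
    const auto start = std::chrono::steady_clock::now();

    // one million calls, 1,000 additions each: a billion increments in total
    for (int i = 0; i < 1000000; i++)
    {
        addPlayerHealth(1000);
    }

    const auto elapsed = std::chrono::steady_clock::now() - start;
    std::printf("Total: %lld ms\n",
        static_cast<long long>(
            std::chrono::duration_cast<std::chrono::milliseconds>(elapsed).count()));
    return 0;
}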

To that point, it’s incredibly important that you use your CPU properly. Your game isn’t some one-off business application that is allowed to take 30 seconds to run some task because “that’s just how it works”. Your game has to be fun, and in order to be fun it has to run well. That’s not the only thing that makes a game fun, but bad performance is a surefire way to make a game not fun. Going back to the above examples, the first function could have a massive negative impact on your game’s performance if there were a lot of players simultaneously gaining health, whereas the second example is virtually ignorable in terms of performance difference. This difference cannot be overstated either. Bad programming is the quickest way to absolutely derail any game’s experience.

To make matters worse, when you write bad code it can have a cascading effect on other parts of your game. 3D graphics, for example, are usually offloaded to the GPU to be drawn to the screen, but in order to do that, the GPU first needs instructions on what’s changed. If your bad code forces the CPU to take forever to get its job done, it can leave little or no time for the GPU to do its job. So when your game starts stuttering, don’t be so quick to run out and buy a new graphics card; it might be that the programmers who created the game simply did a poor job of optimizing their code. As you can see, writing bad code has an adverse effect on CPU performance, and that effect cascades down in ways that may not be initially apparent.
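To picture that cascade, here’s a minimal sketch of a single-threaded frame loop. The function names (updateGameLogic, buildRenderCommands, submitToGPU) are hypothetical stand-ins for the real work a frame does, and the 16 ms figure simply assumes a 60 FPS target; the point is only that every millisecond the CPU wastes on game logic is a millisecond the rendering side never gets back.

#include <chrono>
#include <thread>

// hypothetical stand-ins for the real work a frame does; the names are made up for illustration
void updateGameLogic()     { /* input, AI, physics, health changes... */ }
void buildRenderCommands() { /* the CPU describes what changed so the GPU knows what to draw */ }
void submitToGPU()         { /* hand the commands off; the GPU can only start once the CPU is done */ }

int main()
{
    using clock = std::chrono::steady_clock;
    const auto frameBudget = std::chrono::milliseconds(16);  // roughly 60 frames per second

    for (int frame = 0; frame < 600; frame++)  // about ten seconds' worth of frames
    {
        const auto start = clock::now();

        updateGameLogic();       // if this overruns, everything below runs late
        buildRenderCommands();
        submitToGPU();

        // whatever is left of the 16 ms budget is free time; slow CPU code eats into it
        // until the frame simply ships late, which the player sees as a stutter
        const auto elapsed = clock::now() - start;
        if (elapsed < frameBudget)
        {
            std::this_thread::sleep_for(frameBudget - elapsed);
        }
    }
    return 0;
}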

How can we combat this?

The good news is that while tracking down bad code isn’t quite as easy as tracking down bloated memory, it’s not terribly hard. Your first weapon in the fight against bad code is your debugger. Every once in a while you should create a breakpoint at your first line of code and painstakingly walk through everything the game is doing. It’s not a fun process, but when you’ve stepped through the same function 150 times for a single frame of movement you’ll quickly realize where your bad code is.

The second weapon is unit testing. This isn’t the kind of unit testing you’re used to, though. When it comes to games, I’m talking about performance-based unit testing. Every unit of work in your game should be shoved into some type of test environment and put through its paces. If a single unit of work is too small to measure in milliseconds (which it often is), then run that unit of work a thousand or a million times. Then simply take the delta time and divide it by the number of iterations. Referencing our above examples again, we’d want to consider the entire function to be one unit of work. Performance testing it that way would yield some strikingly different results depending on which function we chose. It would be easy to see that example two was far superior, even if it weren’t obvious from looking at the source code.
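Here’s one way that kind of test could look. It’s just a sketch, assuming a hand-rolled harness rather than any particular testing framework: averageNanoseconds is a made-up helper that runs one unit of work many times and divides the elapsed time by the iteration count, exactly as described above, and playerHealth is volatile so the compiler doesn’t optimize the work away.

#include <chrono>
#include <cstdio>

volatile int playerHealth = 0;  // volatile keeps the optimizer from deleting the work

// the two versions from earlier in the post, renamed so they can live side by side
void addPlayerHealthSlow(int amountToAdd)
{
    for (int i = 0; i < amountToAdd; i++)
    {
        playerHealth = playerHealth + 1;
    }
}

void addPlayerHealthFast(int amountToAdd)
{
    playerHealth = playerHealth + amountToAdd;
}

// run one unit of work many times and return the average cost per call in nanoseconds
template <typename UnitOfWork>
double averageNanoseconds(UnitOfWork unitOfWork, long iterations)
{
    const auto start = std::chrono::steady_clock::now();
    for (long i = 0; i < iterations; i++)
    {
        unitOfWork();
    }
    const auto elapsed = std::chrono::steady_clock::now() - start;
    return std::chrono::duration<double, std::nano>(elapsed).count() / iterations;
}

int main()
{
    const long iterations = 1000000;
    std::printf("slow: %.1f ns per call\n",
        averageNanoseconds([] { addPlayerHealthSlow(1000); }, iterations));
    std::printf("fast: %.1f ns per call\n",
        averageNanoseconds([] { addPlayerHealthFast(1000); }, iterations));
    return 0;
}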

Let’s recap.

It’s incredibly important that we not ignore the role that the CPU plays in our game. It holds the keys to the kingdom in terms of performance. Good code affords the freedom to add more features to a game, while bad code can cause features to be removed, play experience to be hindered, and the whole experience to be crummy. The CPU affects more than just itself, too. The graphics hardware is often mistakenly blamed for hiccups in a game when poor CPU management is the true culprit. That said, if you use some pretty basic testing strategies it’s not all that hard to recognize bad code and fix it.

Leave a comment!
