Programming 2D Games

The official forum for "Programming 2D Games", the book by Charles Kelly

Posted: Sun Jul 12, 2015 8:09 am

Joined: Fri Oct 19, 2012 5:50 pm
Posts: 32
When I bought Prof. Kelly's book, I already understood (or thought I understood) a lot of the concepts, and already had a working engine that was influenced by other sources. Also, I always try to be my own person and code in my own style and voice. So I don't just blindly incorporate all of the professor's concepts if what I have seems to be ok and I don't understand the difference.

Such was the case with the timer. The professor's book was the first place I had ever heard of the high-resolution performance timer. It seemed needlessly complicated. Every other source I had copied...I mean learned from...just used GetTickCount(). Simple. I like simple. If you're like the me of a week ago, using GetTickCount() and a DWORD to hold a number of milliseconds, maybe I can shed some light on why that's...at best, problematic. The short answer is: just do it exactly like the professor does. It's not as scary as it looks. But let me share a little of my recent experience with this.

So my artist modeled the main character of our game with 8 frames of animation, and said the animation was supposed to last 1 second. So I set the fps to 24, because it's a multiple of 8. That way, I can show each cell of the animation for 3 game frames and everything should look nice. If I set the fps to 30 or 60 or something not divisible by the number of cells (I'm trying to avoid overloading the term "frames" here) of the animation, then it's going to play too fast or too slow, or else have a kink in it somewhere--that is, some cells are going to be visible for more frames than others. This is the main character's walking animation, so it's got to look right and be smooth and precise.
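
To make that concrete, the bookkeeping is nothing more than a division and a modulo. This is just an illustration; the names aren't from my actual engine:

Code:
// Map game frames to animation cells: 8-cell walk cycle at 24 fps.
const int CELLS = 8;            // cells in the walk animation
const int FRAMES_PER_CELL = 3;  // 24 fps / 8 cells = 3 game frames per cell

int cellForFrame(int frameCount)
{
    // Each cell stays on screen for exactly FRAMES_PER_CELL frames,
    // so the full cycle takes 24 frames = 1 second at 24 fps.
    return (frameCount / FRAMES_PER_CELL) % CELLS;
}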

So here's how that 24 fps number worked its way through my clocking code. First, it gets converted, in code, to milliseconds per frame (a DWORD) by the simple formula 1000 / 24 = mspf. I figured there was no point in using a float, because the clock only gives me whole milliseconds anyway. So I clock a frame, and then don't let the next frame happen until mspf milliseconds have elapsed. This seemed to work OK, so I left well enough alone. Until...
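
Stripped way down, the old gating looked something like this. It's a simplified sketch, and the engine functions are just stand-ins:

Code:
#include <windows.h>

bool gameIsRunning()   { return true; }   // stand-in for the real loop condition
void updateAndRender() { }                // stand-in for one game frame's work

const DWORD FPS  = 24;
const DWORD MSPF = 1000 / FPS;            // integer division: 41, not 41.67

void gameLoop()
{
    DWORD lastFrame = GetTickCount();     // milliseconds since boot
    while (gameIsRunning())
    {
        DWORD now = GetTickCount();
        if (now - lastFrame < MSPF)       // not enough time has passed yet
            continue;
        lastFrame = now;                  // start timing the next frame
        updateAndRender();                // run one game frame
    }
}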

Some of my faster-moving things, like camera pans, felt a little choppy. So I upped the fps to 48, and then doubled the speed of everything so the end result would be the same, just hopefully smoother. But to my dismay, everything was sooooo slowwwww. It was really frustrating, because I thought I had all the numbers right. And in theory, I did. I was worried that there was some larger issue with my game that was causing it to not be able to run fast enough for 48 fps. So I dusted off my old debug clocking code, and did some testing.

To start with, I removed the time limits on the frames and just told it to run 1,000 frames as fast as it could, clock each frame using GetTickCount(), store the time of that frame, in milliseconds, in an array of 1,000 DWORDs, then hit a breakpoint when it got to 1,000 so I could look at the data. Here's what the array looked like: 0, 0, 0, 0, 0, 16, 0, 0, 0, 0, 0, 0, 0, 0, 16, 0, 0, 0, 0, 0, 0, 15, 0, 0, etc. It was a big list of zeroes, mostly, with these intermittent spikes (if you can call 16 ms a spike). So then I asked myself, why is my code running so lightning fast most of the time, and spiking here and there? And then I realized that neither of those things was true. Then it hit me what the real problem was: the timer only ticks about once every 15-16 ms. So why didn't I notice this before?
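
For reference, the debug code was about as simple as it sounds; roughly this, with the frame function again being a stand-in:

Code:
#include <windows.h>

void updateAndRender() { }         // stand-in for one uncapped game frame

DWORD frameTimes[1000];            // elapsed milliseconds for each frame

void measureFrameTimes()
{
    DWORD prev = GetTickCount();
    for (int i = 0; i < 1000; ++i)
    {
        updateAndRender();               // run the frame as fast as possible
        DWORD now = GetTickCount();
        frameTimes[i] = now - prev;      // mostly 0, with 15-16 ms "spikes"
        prev = now;
    }
    DebugBreak();                        // stop here and inspect frameTimes
}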

It's also worth mentioning that, using a DWORD, 1000 / 24 was doing an integer division, the result of which was 41. So by using that number to gate the frames, I was essentially asking for 24.4 fps. Not a big difference, but it would become one later on. But since the timer only ticks once every 15-16 ms anyway, I was essentially just waiting on the timer to tick 3 times, which averaged out to about 46-47 ms. So the rate I was really getting was somewhere around 21.5 fps instead of the 24.4 I was asking for. A somewhat significant difference, but not really enough for me to notice until I bumped up to 48 fps.

At 48 fps, that's 1000 / 48 (integer division) = 20 mspf. So right off the bat, this rounding error is causing my 48 fps to be 50 fps. But then it gets much, much worse. Because 20ms is just over the 16ms it takes the timer to tick. So each frame is waiting on the timer to tick twice, which was around 31ms. So even though I said I wanted 48 fps, I was really asking for 50, and even though I was asking for 50, I was really getting 32-33 fps. Completely unacceptable.
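
Putting the two cases side by side (assuming the roughly 15-16 ms tick my machine was showing):

Code:
Target 24 fps: 1000 / 24 (integer) = 41 ms  ->  1000 / 41 = ~24.4 fps requested
               41 ms spans 3 timer ticks    ->  ~46-47 ms/frame = ~21.5 fps actual

Target 48 fps: 1000 / 48 (integer) = 20 ms  ->  1000 / 20 = 50 fps requested
               20 ms spans 2 timer ticks    ->  ~31 ms/frame    = ~32 fps actual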

I grabbed the book, looked at the professor's performance timer usage (QueryPerformanceCounter and QueryPerformanceFrequency), researched it a little more on MSDN so I was sure what was happening, and incorporated it into my code. I also started using floating-point numbers for my frame times. I'm super happy with the results. I'm getting exactly the fps I want now, and everything looks silky smooth. I don't know if that's helpful or interesting or even makes sense to anyone, but I learned a lot in the process.
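
For anyone curious about the shape of it without opening the book, here's a stripped-down sketch of the kind of loop I ended up with. This is my own simplification, not the professor's exact code, and the names are mine:

Code:
#include <windows.h>

LARGE_INTEGER timerFreq;     // counts per second; fixed for the life of the system
LARGE_INTEGER timeStart;     // counter value at the start of the current frame
LARGE_INTEGER timeEnd;       // counter value right now
float frameTime;             // elapsed time this frame, in seconds

const float MIN_FRAME_TIME = 1.0f / 48.0f;   // target: 48 fps

void initTimer()
{
    QueryPerformanceFrequency(&timerFreq);   // how fast the counter runs
    QueryPerformanceCounter(&timeStart);
}

void waitForNextFrame()
{
    do
    {
        QueryPerformanceCounter(&timeEnd);
        frameTime = (float)(timeEnd.QuadPart - timeStart.QuadPart)
                  / (float)timerFreq.QuadPart;
    } while (frameTime < MIN_FRAME_TIME);    // spin until this frame is due
    timeStart = timeEnd;                     // next frame measures from here
}

If I remember right, the book's version doesn't spin like this; when there's time to spare it sleeps it off with Sleep() bracketed by timeBeginPeriod(1) and timeEndPeriod(1), which is much kinder to the CPU. Either way, the two things that matter are the much finer resolution and keeping the frame time as a float instead of whole milliseconds.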

