Programming 2D Games

The official forum for "Programming 2D Games", the book by Charles Kelly






PostPosted: Sun Oct 15, 2017 5:35 am 

Joined: Thu Oct 05, 2017 6:53 am
Posts: 9
Currently, the TextureManager class can hold many separate textures from separate files, but the only way to take advantage of this is to call initialize() and pass it a text file listing all of the texture files to load.
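For reference, loading textures currently looks something like this (the member name textures and the file name here are just placeholders):
Code:
// textures.txt lists one texture image file per line
if (!textures.initialize(graphics, "textures.txt"))
    throw(GameError(gameErrorNS::FATAL_ERROR, "Error initializing textures"));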

I added an addTexture() function to the TextureManager class to be able to add a new texture to an already-initialized TextureManager object:
Code:
bool TextureManager::addTexture(std::string file) {
    initialized = false;
    bool success = true;
    try {
        for (UINT i = 0; i < file.size(); i++)  // convert to lowercase
            file.at(i) = tolower(file.at(i));
        fileNames.push_back(file);      // put one file name in files
        width.push_back(0);         // make room for width
        height.push_back(0);        // make room for height
        texture.push_back(NULL);    // make room for texture

        // load texture files
        UINT i = fileNames.size() - 1;
        hr = graphics->loadTexture(fileNames[i].c_str(),
            graphicsNS::TRANSCOLOR, width[i], height[i], texture[i]);
        if (FAILED(hr))
            success = false;    // texture failed to load
    }
    catch (...) { return false; }
    initialized = true;                    // set true when initialized
    return success;
}
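For example, with an already-initialized TextureManager member named textures (the name and the file are just for illustration), a texture can now be appended at run time:
Code:
if (!textures.addTexture("pictures\\enemy2.png"))   // returns false on failure
    throw(GameError(gameErrorNS::WARNING, "Error adding enemy2 texture"));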


Another issue I ran into is that the draw() function in the Image class requires you to provide the texture number (default 0) every time you draw an image.
If a particular Image (or Entity) object always uses the same non-zero texture number from its TextureManager, it is awkward to structure the game so that every call to that object's draw() passes the right textureN.

A better solution in my opinion is to store the textureNum in the Image object itself:
Code:
  protected:
    Graphics *graphics;     // pointer to graphics
    TextureManager *textureManager; // pointer to texture manager
    // spriteData contains the data required to draw the image by Graphics::drawSprite()
    SpriteData spriteData;  // SpriteData is defined in "graphics.h"
    COLOR_ARGB colorFilter; // applied as a color filter (use WHITE for no change)
    UINT textureNum;        // which texture from textureManager is currently being used
    int     cols;           // number of cols (1 to n) in multi-frame sprite
    ...


I also added a getter and setter for textureNum.
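A minimal sketch of the two (the setter name is my own choice; the getter is what the collision fix further down relies on):
Code:
UINT getTextureNum() const        { return textureNum; }     // current texture number
void setTextureNum(UINT textureN) { textureNum = textureN; } // change texture number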
Then I changed the draw() function prototype to this:
Code:
virtual void draw(COLOR_ARGB color = graphicsNS::WHITE) { draw(color, -1); }

And the function implementation to this:
Code:
void Image::draw(COLOR_ARGB color, UINT textureN)
{
    if (!visible || graphics == NULL)
        return;
    // set texture to draw
    if (textureN != -1)
        spriteData.texture = textureManager->getTexture(textureN);
    else
        spriteData.texture = textureManager->getTexture(textureNum);
    if(color == graphicsNS::FILTER)                     // if draw with filter
        graphics->drawSprite(spriteData, colorFilter);  // use colorFilter
    else
        graphics->drawSprite(spriteData, color);        // use color as filter
}


This way, the textureN arg defaults to -1, which causes the Image's own textureNum property to be used when drawing. A different textureN can still be passed at any time to override it.
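In practice that means calls like these (menu is just an example Image, and texture index 2 is illustrative):
Code:
menu.draw();                        // drawn with the Image's own textureNum
menu.draw(graphicsNS::WHITE, 2);    // drawn with texture 2 this one time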

I also added a textureN arg to Image::initialize() and Entity::initialize() so that a textureNum can be specified during initialization.
I also re-ordered the args to Image::initialize() and Entity::initialize() so that width, height, and ncols are optional and default to 0.
The prototype for Image::initialize() is now:
Code:
    virtual bool initialize(Graphics *g, TextureManager *textureM, UINT textureN = 0,
                            int width = 0, int height = 0, int ncols = 0);

The implementation is now:
Code:
bool Image::initialize(Graphics *g, TextureManager *textureM, UINT textureN,
                       int width, int height, int ncols)
{
    try{
        graphics = g;                               // the graphics object
        textureManager = textureM;                  // pointer to texture object

        textureNum = textureN;
        spriteData.texture = textureManager->getTexture(textureN);
        if(width == 0)
            width = textureManager->getWidth(textureN);     // use full width of texture
        spriteData.width = width;
        if(height == 0)
            height = textureManager->getHeight(textureN);   // use full height of texture
        spriteData.height = height;
        cols = ncols;
        if (cols == 0)
            cols = 1;                               // if 0 cols use 1

        // configure spriteData.rect to draw currentFrame
        spriteData.rect.left = (currentFrame % cols) * spriteData.width;
        // right edge + 1
        spriteData.rect.right = spriteData.rect.left + spriteData.width;
        spriteData.rect.top = (currentFrame / cols) * spriteData.height;
        // bottom edge + 1
        spriteData.rect.bottom = spriteData.rect.top + spriteData.height;
    }
    catch(...) {return false;}
    initialized = true;                                // successfully initialized
    return true;
}


Similar changes were made to the Entity class.
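For Entity::initialize() the new prototype ends up looking something like this (a sketch that simply mirrors the Image version, with the Game pointer still first):
Code:
    virtual bool initialize(Game *gamePtr, TextureManager *textureM, UINT textureN = 0,
                            int width = 0, int height = 0, int ncols = 0);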

I also added a very simple getSize() function to textureManager.h:
Code:
UINT getSize() const
{
    return texture.size();
}

This has been useful when I want to set an Image/Entity to use a TextureManager's last texture (index getSize() - 1), as in the sketch below.
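Putting the pieces together, a typical use looks something like this (textures, graphics, and an Image named boss are placeholder members):
Code:
textures.addTexture("pictures\\boss.png");          // append a new texture
UINT bossTexture = textures.getSize() - 1;          // index of the texture just added
boss.initialize(graphics, &textures, bossTexture);  // width, height, ncols default to 0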

The last thing to note on this subject is Entity::collidePixelPerfect(). That function includes these lines:
Code:
// get fresh texture because they may have been released, maybe a bad idea
ent.spriteData.texture = ent.textureManager->getTexture();
spriteData.texture = textureManager->getTexture();


I think this code has always been problematic, because it assumes that both entities should use texture 0 from their TextureManager when testing pixel-perfect collision.
If you want pixel-perfect collision to work with any texture from either TextureManager, these lines need to be changed to:
Code:
// get fresh texture because they may have been released
ent.spriteData.texture = ent.textureManager->getTexture(ent.getTextureNum());
spriteData.texture = textureManager->getTexture(getTextureNum());


Last edited by Danny-E 33 on Mon Oct 16, 2017 3:59 am, edited 1 time in total.

PostPosted: Sun Oct 15, 2017 10:23 pm 
Site Admin

Joined: Sat Jan 28, 2012 4:36 pm
Posts: 542
Thanks for sharing your code changes. They look like some nice additions.

_________________
Professor Kelly


PostPosted: Mon Oct 16, 2017 4:22 am 

Joined: Thu Oct 05, 2017 6:53 am
Posts: 9
You're welcome! Many thanks for making the engine and writing the book!

I just realized that the Image::draw() function takes a UINT for the textureN arg, yet I made it default to -1, and then in the implementation I compare the arg to -1 to decide whether the object's textureNum property should be used instead.
Since the arg is unsigned, the -1 default gets converted to 4294967295 (UINT_MAX), and the comparison 'if (4294967295 != -1)' still "correctly" evaluates to false, because the -1 on the right side is converted the same way.
I suppose I should change the default value and the comparison value from -1 to UINT_MAX to be more proper, even though the behavior is identical.
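For reference, the cleaned-up version would look something like this (UINT_MAX comes from <climits>):
Code:
virtual void draw(COLOR_ARGB color = graphicsNS::WHITE) { draw(color, UINT_MAX); }

// and inside Image::draw(COLOR_ARGB color, UINT textureN):
if (textureN != UINT_MAX)
    spriteData.texture = textureManager->getTexture(textureN);
else
    spriteData.texture = textureManager->getTexture(textureNum);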

