Read Write Destination Problem


I am creating a game called "Feathers," and I'm having a problem with asset initialization. I am using Microsoft Visual Studio for this project. I have also omitted lines of code that don't affect the following.

I'm trying to load in images via a vector array, so I created a functions.h file with the following:

#pragma once
#include <stdio.h>
#include <iostream>
#include <allegro5\allegro.h>
#include <allegro5\allegro_font.h>
#include <allegro5\allegro_ttf.h>
#include <allegro5\allegro_native_dialog.h>
#include <allegro5\allegro_primitives.h>
#include <allegro5\allegro_image.h>
#include <allegro5\allegro_audio.h>
#include <allegro5\allegro_acodec.h>
#include <allegro5\allegro_opengl.h>
#include <allegro5\allegro_video.h>
#include <vector>

#define FPS 60

/* forward declarations */

void menu_images(std::vector<ALLEGRO_BITMAP *> &menu);
void options_images(std::vector<ALLEGRO_BITMAP *> &option);

I then created a functions.cpp file that says this:

#include <stdio.h>
#include <iostream>
#include "functions.h"
#include <allegro5\allegro.h>
#include <allegro5\allegro_font.h>
#include <allegro5\allegro_ttf.h>
#include <allegro5\allegro_native_dialog.h>
#include <allegro5\allegro_primitives.h>
#include <allegro5\allegro_image.h>
#include <allegro5\allegro_audio.h>
#include <allegro5\allegro_acodec.h>
#include <allegro5\allegro_opengl.h>
#include <allegro5\allegro_video.h>
#include <vector>

#define FPS 60

/* FUNCTIONS */

void menu_images(std::vector<ALLEGRO_BITMAP *> &menu)
{
    menu.push_back(al_load_bitmap("assets/menucont.png"));
    menu.push_back(al_load_bitmap("assets/menunewg.png"));
    menu.push_back(al_load_bitmap("assets/menuopt.png"));
    menu.push_back(al_load_bitmap("assets/menucred.png"));
    menu.push_back(al_load_bitmap("assets/menuquit.png"));
    menu.push_back(al_load_bitmap("assets/menutext.png"));
    menu.push_back(al_load_bitmap("assets/menutextcont.png"));
}

void options_images(std::vector<ALLEGRO_BITMAP *> &option)
{
    option.push_back(al_load_bitmap("assets/optionsmenu.png"));
    option.push_back(al_load_bitmap("assets/resarrow.png"));
    option.push_back(al_load_bitmap("assets/soundarrow.png"));
    option.push_back(al_load_bitmap("assets/restrans.png"));
    option.push_back(al_load_bitmap("assets/arrow2160.png"));
    option.push_back(al_load_bitmap("assets/arrow1080.png"));
    option.push_back(al_load_bitmap("assets/arrow720.png"));
    option.push_back(al_load_bitmap("assets/arrow480.png"));
    option.push_back(al_load_bitmap("assets/res2160.png"));
    option.push_back(al_load_bitmap("assets/res1080.png"));
    option.push_back(al_load_bitmap("assets/res720.png"));
    option.push_back(al_load_bitmap("assets/res480.png"));
    option.push_back(al_load_bitmap("assets/soundtrans.png"));
    option.push_back(al_load_bitmap("assets/arrowsoundon.png"));
    option.push_back(al_load_bitmap("assets/arrowsoundoff.png"));
    option.push_back(al_load_bitmap("assets/soundon.png"));
    option.push_back(al_load_bitmap("assets/soundoff.png"));   // LOADING IMAGES PROBLEMS
    option.push_back(al_load_bitmap("assets/backarrow.png"));
    option.push_back(al_load_bitmap("assets/arrowfull.png"));
    option.push_back(al_load_bitmap("assets/arrowwind.png"));
}

The problem happens with the images after the comment: an error pops up saying "Read write violation, dest was nullptr". The program works if I comment out the bottom four images, and if I swap the file paths of the working images with the failing ones, those paths load fine.

When I try to load the images into the actual game cpp, I use the following (menustate is an enum value, multiplyer is a float used when resizing, and sound is a bool):

std::vector<ALLEGRO_BITMAP *> menu;
menu_images(menu);
std::vector<ALLEGRO_BITMAP *> option;
options_images(option);

switch (menustate)
{
case OPT_SOUND_OFF:
    al_draw_scaled_bitmap(option[12], 0, 0, al_get_bitmap_width(option[12]), al_get_bitmap_height(option[12]), 0, 0, (int)(1920 * multiplyer), (int)(1080 * multiplyer), 0);
    al_draw_scaled_bitmap(option[14], 0, 0, al_get_bitmap_width(option[14]), al_get_bitmap_height(option[14]), 0, 0, (int)(1920 * multiplyer), (int)(1080 * multiplyer), 0);
    if (sound)
    {
        al_draw_scaled_bitmap(option[15], 0, 0, al_get_bitmap_width(option[15]), al_get_bitmap_height(option[15]), 0, 0, (int)(1920 * multiplyer), (int)(1080 * multiplyer), 0);
    }
    else
    {
        al_draw_scaled_bitmap(option[16], 0, 0, al_get_bitmap_width(option[16]), al_get_bitmap_height(option[16]), 0, 0, (int)(1920 * multiplyer), (int)(1080 * multiplyer), 0);
    }
    break;

case OPT_BACK:
    al_draw_scaled_bitmap(option[17], 0, 0, al_get_bitmap_width(option[17]), al_get_bitmap_height(option[17]), 0, 0, (int)(1920 * multiplyer), (int)(1080 * multiplyer), 0);
    break;

case OPT_FULL:
    al_draw_scaled_bitmap(option[3], 0, 0, al_get_bitmap_width(option[3]), al_get_bitmap_height(option[3]), 0, 0, (int)(1920 * multiplyer), (int)(1080 * multiplyer), 0);
    al_draw_scaled_bitmap(option[18], 0, 0, al_get_bitmap_width(option[18]), al_get_bitmap_height(option[18]), 0, 0, (int)(1920 * multiplyer), (int)(1080 * multiplyer), 0);
    break;

case OPT_WIND:
    al_draw_scaled_bitmap(option[3], 0, 0, al_get_bitmap_width(option[3]), al_get_bitmap_height(option[3]), 0, 0, (int)(1920 * multiplyer), (int)(1080 * multiplyer), 0);
    al_draw_scaled_bitmap(option[19], 0, 0, al_get_bitmap_width(option[19]), al_get_bitmap_height(option[19]), 0, 0, (int)(1920 * multiplyer), (int)(1080 * multiplyer), 0);
    break;
}

I apologize if the error is obvious, as I am a relatively novice programmer, especially when it comes to Allegro. I appreciate the help!

Kris Asick

Double check to make sure "soundoff.png" actually has that filename and that it doesn't have a typo in it. :P

Also, routinely switching the active texture you're drawing from on the GPU side of things will kill your framerate. The better thing to do is to group as many graphics as you can onto a single bitmap and then divide it into sub-bitmaps after loading. It's less convenient, but your framerate will be a LOT better. ;)

Also also, it looks as though you're loading assets for multiple screen resolutions all at once. You'd probably have an easier time of everything if you only loaded the pertinent ones. When the player switches screen resolutions you unload what's in memory, do the switch, then load up the new ones.


In terms of the filenames, here's a link to a screenshot of the assets folder.
All four of the bottom loads (soundoff.png, backarrow.png, arrowfull.png, arrowwind.png) will not load, because the program thinks the vector is only size 16 when it should really be 20. If I swap one of their file paths with one of the images above them, that image loads in fine.

The problem I'm having relates to reading and writing. If I wanted to add more images to the game, they wouldn't be able to load in, so it's not an issue with framerate.

The images that I think you are thinking of for different resolutions are a part of the options (res2160.png, res1080.png, res720.png, res480.png). Those images are just text that tells the player about the different resolutions.

RPG Hacker

What happens if you take just that line and store its result in a temporary local ALLEGRO_BITMAP* variable before adding it to the vector? Will that variable be nullptr or actually something else?

Edgar Reynaldo

There shouldn't be any kind of problem simply pushing back pointers onto a vector. Memory use is almost negligible. And it shouldn't matter if the bitmap didn't load properly, as it should just push back a null pointer. Something else is going on. How often do you call the function that creates those two vectors? Once every frame? That would be a massive memory leak if you did that, without freeing the ALLEGRO_BITMAP*s.

Something else: you should be checking for NULL values returned by al_load_bitmap. Instead of hard-coding each load, try using an array.


const int NUM_MENU_IMAGES = 7;

const char* menu_image_paths[NUM_MENU_IMAGES] = {
    "assets/menucont.png",
    "assets/menunewg.png",
    "assets/menuopt.png",
    "assets/menucred.png",
    "assets/menuquit.png",
    "assets/menutext.png",
    "assets/menutextcont.png"
};

for (int i = 0; i < NUM_MENU_IMAGES; ++i) {
    ALLEGRO_BITMAP* bmp = al_load_bitmap(menu_image_paths[i]);
    if (!bmp) {
        printf("Failed to load %s\n", menu_image_paths[i]);
    }
    my_menu_vector.push_back(bmp);
}

Show more code. Show the rest of the game loop.

Kris Asick

Actually, you may be running out of video memory... :o

Video cards don't have an infinite amount of RAM, and things can take up a lot more of it than you might expect. If all of your graphics are sized at 3840x2160, then loading all 27 of them into video memory at once would require just over 854 MB of video RAM, provided your video card can handle non-power-of-2 texture sizes. On top of that, Allegro 5 defaults to storing copies of bitmaps in system RAM so it can refresh the video RAM after a task switch, so that's 854 MB of system RAM you'll need as well. If you have an older video card, or an integrated chipset that shares video memory with system memory, you're likely exceeding your system's limits.

If your video card does NOT support non-power-of-2 textures, then the smallest texture size those could be stored in would be 4096x4096, 27 of which would require 1,728 MB of video RAM.

I know, the file sizes on disk are much smaller, but that's because image files compress their contents in some way. Once loaded into video memory, in order to be usable, they're stored UNCOMPRESSED. (DirectX does have a means for doing compression on the fly, but because it's platform-specific it's not directly supported by Allegro, at least as far as I'm aware.)

So, instead, you need to consider alternate means of doing what you're doing, especially if you're going to support 4K.

First of all, you can't keep using full screens of graphics the way you're doing; it's completely impractical. Instead, draw the individual pieces of what you have at the proper screen coordinates. For instance, for the tabs that show up next to your menu selections, you only need one little tab stored in memory, which you can then draw twice when a menu item is selected: once normally on one side, and again rotated 180 degrees on the other side. You'll probably also want to use Allegro's font-loading and rendering capabilities directly with the font you're using. The logo for the game could go on its own texture. Even that huge coloured background can be optimized by making it white, slicing off the edges as their own individual pieces, and drawing a filled rectangle inside the rest; you can then use blending functions to colour it.

If you want to keep doing things the way you are, you're absolutely going to have to remove 4K support and halve the resolution of all of your assets, because it's simply a massive memory hog.

Lastly, the points I made about the framerate were for future reference. I know you're not suffering a framerate issue, but switching which ALLEGRO_BITMAP object you're drawing from more than a handful of times per frame WILL kill the framerate in the long run.

I know it may not seem like it but EVERY person who's ever made a hardware-accelerated 2D game using a standard programming language has to deal with all of this, and then some. There'll be even more optimization issues you'll run into and hardware configurations to consider down the road.

Chris Katko

Speaking of which, are MIPMAPS being generated? Wouldn't that swell the RAM size?

Kris Asick

Speaking of which, are MIPMAPS being generated? Wouldn't that swell the RAM size?

I'm pretty sure Allegro does not do this by default, but if Echogames IS enabling mipmapping prior to loading his assets then yes, this would increase the amount of memory needed by 33%. :P

I also neglected to mention that the OS uses some of the video memory as well for its own purposes, so a 1 GB video card is likely going to completely run out of memory trying to load all of those assets, if they are indeed 4K sized as I'm predicting since the program specifically has a 2160p resolution setting. An integrated chipset may not even support non-power-of-2 textures and thus that could be part of the problem too.

The simplest way to test all of this would be to just temporarily modify what order the assets are being loaded in and seeing if the crash still happens on the same files or not.

RPG Hacker

Well, lucky I won't run out of video RAM on my home system so quickly with my 24 GB of video RAM! ;D


Well, lucky I won't run out of video RAM on my home system so quickly with my 24 GB of video RAM!

24GB of VRAM? Do you have some kind of professional card?
I've got an 8GB graphics card and afaik that's the highest available for consumer cards.

You're not using an iGPU and dedicated all your ram to it right? Because that would be ridiculous.

RPG Hacker

Nah, I have two graphics cards, actually. Two NVIDIA GeForce TITAN X, each having 12 GB. They're running in SLI mode, so does that count? If not, even 12 GB is still a lot! ;D

If you wonder why the hell someone would waste that much money on two overpowered graphics cards (about $1000 each) that use way too much energy and will become obsolete within the next few years... well... I have no idea! ;D

I suppose the reason is that, for the past 15 or more years, I had to deal with real crap PCs at home, so when the time arrived to buy a new PC last year, I pretty much just said to myself "Well, fuck this! I'm just going to buy the most powerful, overpriced and ridiculous PC this shop has to offer, even if it makes me broke!" and did just that. Don't regret anything! ;D


Well, technically you do have 24 GB then, but with the way SLI works, resources have to be copied to each card, so current applications can only use ~12 GB. With DX12 and Vulkan the developer does get enough control to avoid this, but I think for technical reasons it won't be done much.

RPG Hacker

I'm currently working on a D3D12 engine and actually planning to make at least some use of multi-GPU-support. Though in what way, I'm not sure yet. I want it to be intuitive and easy to use, while still giving enough control over what you do. Can't say I have too much experience with D3D12 or Vulkan yet, so I'm not even familiar with core multi-GPU concepts. I'm sorta learning everything as I'm doing it. Considering how complex those APIs are (and how limited my time is working on my engine), you can imagine how long it'll take before I actually have a good understanding of modern graphics APIs.


Thank you to all those who helped me solve this. As I said before, I am a relatively novice programmer, so I do appreciate learning proper techniques from those who are far more educated in them.

The code works beautifully now, and I only have to use 4 images to create the menus as opposed to the 27 prior. :D (The border, the logo, the arrow, and the semi-transparent arrow.)

I tried too hard to take shortcuts, and this is what I deserved for it :P

P.S. I'll keep this topic open as it seems there is a nice discussion going on. In case you're curious about my specs: an NVIDIA GTX 960 4 GB graphics card, an Intel i7 processor at 3.4 GHz, 16 GB of RAM, and a 1 TB hard drive.

RPG Hacker

I wonder: are there any good tools for monitoring the VRAM consumption of a graphics card? I know that some IDEs, like Visual Studio, have fairly advanced graphics debuggers (at least for D3D10 and newer; for D3D9 you'll need to stick to PIX or third-party debuggers). Those debuggers let you do a lot of advanced stuff, like looking at all textures currently in VRAM or debugging shaders. However, I'm not sure if any of them actually monitor how much VRAM a card has used so far. That would be quite useful!

Kris Asick
echogames said:

In case you're curious about my specs: an NVIDIA GTX 960 4 GB graphics card, an Intel i7 processor at 3.4 GHz, 16 GB of RAM, and a 1 TB hard drive.

OK, a graphics card like that shouldn't've been running out of video RAM. Then again, I was only guessing at the sizes of your assets based on what your limits seemed to be. If they were even LARGER, then there still would've been a possibility of running out of video RAM.

Still, for the kinds of games you're likely to be making in the near future, you should assume the smallest video RAM size an end user will have will be 1 GB, so try to limit yourself to using that much with your assets. You can tell how much video RAM something will use at the minimum, in bytes, by simply doing: Width * Height * 4

The actual amount of video RAM used will be slightly more as the video card is going to store extra details about each texture loaded in, plus some will be used by default by the OS.

I wonder: Are there any good tools for monitoring VRAM consumption of a graphics card?

I think some of the same programs used to monitor the temperature and fan speeds and such of GPUs are also able to monitor how much video RAM is in use.

Thread #616250.