sizeof() for BITMAP data
Kikaru

How should I get the total size of the memory occupied by the BITMAP pixels themselves? Can I do it with one call to sizeof(), or should I do something like this:

sizeof(/*pixel size*/)*WIDTH*HEIGHT;

Anyway, thanks in advance!

orz

sizeof(pixel_type)*bmp->w*bmp->h sounds good to me.

If you want something different, you could try
bmp->h * abs((int)(bmp->line[1] - bmp->line[0])), assuming that your bitmap has a height of at least 2.

Of course, there's also the memory for the BITMAP struct itself, and conceivably for other metadata for some BITMAPs on some platforms.
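Both calculations can be sketched in a few lines. This uses a hypothetical `FakeBitmap` stand-in struct with the same `w`, `h`, and `line` fields as Allegro's BITMAP (not the real header), just to show the arithmetic; the stride-based version also accounts for any per-line padding:

```c
#include <stddef.h>

/* Hypothetical stand-in for Allegro's BITMAP: only the fields used here. */
typedef struct {
    int w, h;
    unsigned char **line;  /* line[y] points at the start of row y */
} FakeBitmap;

/* Size assuming tightly packed rows: bytes-per-pixel * w * h. */
static size_t packed_size(const FakeBitmap *bmp, size_t bpp)
{
    return bpp * (size_t)bmp->w * (size_t)bmp->h;
}

/* Size from the row stride (line[1] - line[0]); needs h >= 2.
 * Unlike packed_size(), this includes any padding between lines. */
static size_t stride_size(const FakeBitmap *bmp)
{
    ptrdiff_t stride = bmp->line[1] - bmp->line[0];
    return (size_t)(stride < 0 ? -stride : stride) * (size_t)bmp->h;
}
```

For a bitmap with contiguous, unpadded rows the two results agree; if the library pads each line, only the stride version gives the real footprint.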

Peter Wang

You can use BYTES_PER_PIXEL(bpp) from aintern.h. But you'd have trouble getting an exact value as there could be padding between lines. We currently don't do that for memory bitmaps, although there is a single byte of padding at the end for 24-bit bitmaps.

Kikaru

This is for copying: using memcpy() to transfer one BITMAP to another when both are the same size. It should be really fast. :)

orz

I would recommend that you only use that approach on ordinary memory bitmaps, never sub-bitmaps, video bitmaps, system bitmaps, etc. Also, on further thought, my recommended method for determining the size for that purpose is:
((int)(bmp->line[bmp->h-1] - bmp->line[0]) + bmp->w * sizeof(/*pixel size*/))

Really, I'd suggest just blitting. It's about the same speed unless your bitmaps are 24bpp.
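The whole-buffer copy described above can be sketched like this, again with a hypothetical stand-in struct rather than the real BITMAP, and assuming both are plain memory bitmaps of identical dimensions and color depth with contiguous rows:

```c
#include <stddef.h>
#include <string.h>

/* Hypothetical stand-in for Allegro's BITMAP: only the fields used here. */
typedef struct {
    int w, h;
    unsigned char **line;  /* line[y] points at the start of row y */
} FakeBitmap;

/* Copy all pixel data in a single memcpy(). The size is the distance from
 * the first to the last row, plus one full row of w * bpp bytes, so any
 * inter-line padding is covered without copying past the last pixel. */
static void copy_pixels(FakeBitmap *dest, const FakeBitmap *src, size_t bpp)
{
    size_t size = (size_t)(src->line[src->h - 1] - src->line[0])
                  + (size_t)src->w * bpp;
    memcpy(dest->line[0], src->line[0], size);
}
```

As noted in the thread, this only makes sense for ordinary memory bitmaps; sub-bitmaps, video bitmaps, and system bitmaps don't guarantee this layout.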

Audric

Did you check the Allegro sources to make sure the "lines" are contiguous in memory?

Bob

What's wrong with blit() again?

Audric

Well, blit() is probably implemented as:

for (int y = start_y; y < end_y; y++)
   for (int x = start_x; x < end_x; x++)
      putpixel(dest, x + off_x, y + off_y, getpixel(orig, x, y));

Ohh wait I forgot the clipping check (redundant) in the loop!
and acquire_bitmap() and release_bitmap()!

orz

blit() (between memory bitmaps of the same color depth) is generally a lot faster than that. It's comparable to a memcpy per (horizontal) line being overwritten. A single rectangle clip is done at the start of the function, not per-pixel or anything stupid like that. Checks are also done to make sure the destination and source bitmaps are the same color depth, but again, those checks are per-blit, not per-pixel.

In sum, blits between same-color-depth memory bitmaps are comparable to memcopy in speed unless you're doing many extremely tiny blits or your bitmaps are 24bpp.

X-G

I like this "is-probably-implemented-as" approach to figuring out whether the existing solution is good or not.

Evert
Quote:

Well, blit() is probably implemented as:

Do you have any idea how incredibly slow the code you posted would be?
Had you actually checked the source, you would have found that blit() is special cased for each colour depth, for each bitmap type and for each blit between different colour depths and image formats (this sounds worse than it actually is, since a lot of code is similar and put into macros).
In fact, the normal blit() function workhorse is implemented as

void FUNC_LINEAR_BLIT(BITMAP *src, BITMAP *dst, int sx, int sy,
                      int dx, int dy, int w, int h)
{
   int y;

   ASSERT(src);
   ASSERT(dst);

   for (y = 0; y < h; y++) {
      PIXEL_PTR s = OFFSET_PIXEL_PTR(bmp_read_line(src, sy + y), sx);
      PIXEL_PTR d = OFFSET_PIXEL_PTR(bmp_write_line(dst, dy + y), dx);

      memmove(d, s, w * sizeof(*s) * PTR_PER_PIXEL);
   }

   bmp_unwrite_line(src);
   bmp_unwrite_line(dst);
}

(OFFSET_PIXEL_PTR and PTR_PER_PIXEL are helper macros that are different for each colour depth), although hardware accelerated blits are again special cased.

Audric

Aww, my apologies... When posting, I considered putting <SARCASTIC> tags, but I thought it was obvious enough from the multiple levels of inefficiency in the three lines of pseudocode :-/
I mean, acquire_bitmap() per pixel! Double-buffering a 1024x768 screen would mean 786432 lockings and releasings of the screen... crazy!

I'm impressed by the patience of whoever thought I talked seriously !

Jonatan Hedborg

The sudden onslaught of übernoobs has dulled our sarcasm-senses.

Dustin Dettmer

Ya, what's up with that? These last few months it's been non-stop.

Evert

I think it may be a sign for us dinosaurs to go the way of the dodo. Figuratively speaking.
;)

Audric

So the n00bs will inherit the Earth ?

Thomas Fjellstrom

We were all n00bs at some point.

Matthew Leverton

But Al Gore hadn't invented the Internet yet, so we couldn't advertise our noobiness as readily, so more likely than not, we actually learned on our own how to do things.

Thomas Fjellstrom

Point taken ;)

though I didn't start this computer stuff till 1997 or so (could have been as late as 1998-1999, can't rightly recall). I thought that was after Al Gore invented the web :o

LennyLen
Quote:

though I didn't start this computer stuff till 1997 or so

n00b. :P

Kikaru

Well, I got it working. All memory go! :D
It works about the same speed as blit().
Thanks! ;)

Evert
Quote:

It works about the same speed as blit().

That's hardly surprising. Did you look at the source I posted above?

Thomas Fjellstrom
Quote:

n00b. :P

I know eh? But I think I did all right ;)

Actually, I had used some computers previously in school and whatnot, but I didn't get a "decent" computer till 1997 or so. P1 233 with windows 95 OSR2, which got ditched for linux after a year and a half of trying to learn how to program on windows with no budget, a dialup connection, and somewhat of a conscience.

LennyLen
Quote:

Quote:

n00b.

I know eh? But I think I did all right

I think so too.

I got credit just for calling you a n00b. :D

Thread #589600. Printed from Allegro.cc