Allegro.cc - Online Community

Allegro.cc Forums » Programming Questions » Getting portions of number back

This thread is locked; no one can reply to it.
Getting portions of number back
Don Freeman
Member #5,110
October 2004

Still learning bit manipulation, so sorry for not understanding... My question is how do I get my "packed" data back when I'm using something like:

enum CardColor
{
    CardColor_Red,
    CardColor_Blue,
    CardColor_Green,
    CardColor_Yellow,
};
enum CardType
{
    CardType_Zero,
    CardType_One,
    CardType_Two,
    CardType_Three,
    CardType_Four,
    CardType_Five,
    CardType_Six,
    CardType_Seven,
    CardType_Eight,
    CardType_Nine,
};
///////////////////////////////////////////////////////////////////////////////
int main( void )
{
    int card = 0;
    char cardColor = CardColor_Blue;
    char cardType = CardType_Seven;
    char showFace = 1;
    char reserved = 0;
    card = ((cardColor<<24)|(cardType<<16)|(showFace<<8)|(reserved));
    // now how do I extract the cardColor, cardType, and showFace values back from card?! :(
    return 0;
}
///////////////////////////////////////////////////////////////////////////////

My question is now how do I extract the cardColor, cardType, showFace, and reserved values back from the variable card?! :(

--
"Everyone tells me I should forget about you, you don’t deserve me. They’re right, you don’t deserve me, but I deserve you."
"It’s so simple to be wise. Just think of something stupid to say and then don’t say it."

Arthur Kalliokoski
Second in Command
February 2005

cardface = (card >> 8) & 0xFF;
cardcolor = (card >> 24) & 0xFF;

wouldn't that work?

They all watch too much MSNBC... they get ideas.

Don Freeman
Member #5,110
October 2004

Thanks for the fast response! I got it to work with:

  char reserved = (card >> 0)  & 0xFF;
  char showFace = (card >> 8)  & 0xFF;
  char value    = (card >> 16) & 0xFF;
  char color    = (card >> 24) & 0xFF;

If you don't mind me asking, what is the & 0xFF for? I don't know why this stuff confuses me at times. :(

--
"Everyone tells me I should forget about you, you don’t deserve me. They’re right, you don’t deserve me, but I deserve you."
"It’s so simple to be wise. Just think of something stupid to say and then don’t say it."

Arthur Kalliokoski
Second in Command
February 2005

The & 0xFF masks off the stuff "above" the desired value, i.e. cardface = (card >> 8) would still have cardcolor and cardtype in it.

They all watch too much MSNBC... they get ideas.

Don Freeman
Member #5,110
October 2004

Oh, okay! Thanks a million! :D

--
"Everyone tells me I should forget about you, you don’t deserve me. They’re right, you don’t deserve me, but I deserve you."
"It’s so simple to be wise. Just think of something stupid to say and then don’t say it."

gnolam
Member #2,030
March 2002

If you don't mind me asking, what is the & 0xFF for? I don't know why this stuff confuses me at times. :(

& 0xFF => "Truncate to only the last 8 bits".

1 AND 1 == 1
1 AND 0 == 0
0 AND 1 == 0
0 AND 0 == 0

So say you have 0x5559DF59 & 0xFF. The AND is done bit-for-bit, so

01010101010110011101111101011001
(0x5559DF59)
&
00000000000000000000000011111111
(0xFF)
=
00000000000000000000000001011001
(0x00000059)

--
Move to the Democratic People's Republic of Vivendi Universal (formerly known as Sweden) - officially democracy- and privacy-free since 2008-06-18!

weapon_S
Member #7,859
October 2006

Is it considered clean to use an enum like that? I mean, it's practically deprecated in C++. Perhaps any of you code gods or monkeys could enlighten me please.

Thomas Fjellstrom
Member #476
June 2000

It might not be entirely clean, but it is handy. And enums will always cast down to ints.

In one of my projects I use them for bit flags, which means I can't actually use the enum type to store the final values, so any place the enum is used, I just pass ints around.

--
Thomas Fjellstrom - [website] - [email] - [Allegro Wiki] - [Allegro TODO]
"If you can't think of a better solution, don't try to make a better solution." -- weapon_S
"The less evidence we have for what we believe is certain, the more violently we defend beliefs against those who don't agree" -- https://twitter.com/neiltyson/status/592870205409353730

BAF
Member #2,981
December 2002

Enums are deprecated in C++? WTF?

ImLeftFooted
Member #3,935
October 2003

Only on Tuesdays. Since us indie developers code on weekends and take Tuesday off, we never heard about it.

weapon_S
Member #7,859
October 2006

Tobias Dammers
Member #2,604
August 2002

Not to my knowledge.
It is true though that enums and ints are different types in C++, and converting between them is subject to the same rules as every other cast.
Since an enum's underlying type is int, every possible value of any given enum can be converted to int without loss of precision. The reverse is not true, though, because most enums do not define values for the entire possible range of integers (and in fact, they never can, because the size of an integer may vary between platforms and is usually larger than the minimum size mandated by the language standard).
Hence, casting from enum to int is a widening cast, which may be done implicitly, but casting from int to enum is narrowing and needs to be explicit. It can also fail, and you must be prepared to handle (or prevent) this.
Consequently, the following operations are all OK:
- assigning enum to int
- comparing enum vs. int
- using enum as array index
- passing enum as int argument
- using enum as case label in a switch
- performing bit-wise logic with enums, as long as the result is assigned to an int, not an enum
Which covers most, but not all, cases that may arise when you use an enum to define integer constants.

An enum is still tremendously handy for a list of integer constants, mainly because:

  • it is evident at a glance that the constants belong together

  • by using the enum type for function arguments, you can force only valid values to be passed, even if you later assign them to an int internally

  • you don't need to keep track of which values are used for what (although you can), because enums auto-number by default

---
Me make music: Triofobie
---
"We need Tobias and his awesome trombone, too." - Johan Halmén

bamccaig
Member #7,536
July 2006

It can also fail, and you must be prepared to handle (or prevent) this.

  • by using the enum type for function arguments, you can force only valid values to be passed, even if you later assign them to an int internally

???

bamccaig@castopulence:~/src$ cat main.cpp
#include <iostream>

enum Example
{
    ZERO,
    ONE,
    TWO
};

void printExample(Example);

int main(int argc, char * argv[])
{
    Example e;
    unsigned long long l = -1;

    std::cout << "Example{"
              << ZERO << ", "
              << ONE << ", "
              << TWO
              << "}"
              << std::endl;

    std::cout << "l=" << l << std::endl;

    e = (Example)5;

    printExample(e);

    e = (Example)l;

    printExample(e);

    printExample((Example)0xdeadbeef);

    return 0;
}

void printExample(Example e)
{
    std::cout << "e=" << e << std::endl;
}

bamccaig@castopulence:~/src$ g++ -Wall main.cpp
bamccaig@castopulence:~/src$ ./a.out
Example{0, 1, 2}
l=18446744073709551615
e=5
e=-1
e=-559038737
bamccaig@castopulence:~/src$

Tobias Dammers
Member #2,604
August 2002

Hmmm... OK, I suppose that first point is moot.
The second point I should probably rephrase as "by using the enum type for function arguments, you can strongly suggest a range of valid values to the using code, even if you later assign them to an int internally". While you can pass integer values into a function through an enum argument, you need to use an explicit cast to do so, and, assuming you have at least half a brain, this should flash an alarm bell and tell you that it's probably a very good idea to first check if the cast is actually valid, or better yet, if you shouldn't be storing the value as the correct enum type in the first place.

---
Me make music: Triofobie
---
"We need Tobias and his awesome trombone, too." - Johan Halmén
