Allegro.cc Forums » Programming Questions » C++ Error: void value not ignored as it ought to be

Credits go to monkeyCode for helping out!
This thread is locked; no one can reply to it.
C++ Error: void value not ignored as it ought to be
northdakota91
Member #10,342
October 2008

Hi all, I get this error when I try to compile my source:

[CODE]
In file included from main.cpp:13:
Object.h: In constructor `Object::Object(const char*, bool)':
Object.h:31: error: void value not ignored as it ought to be

Object.h: In member function `void Object::load_bitmap(const char*, bool)':
Object.h:54: error: void value not ignored as it ought to be

make.exe: *** [main.o] Error 1
[/CODE]
The methods that seem to cause the error are these:

[CODE]
Object::Object( const char* extern_file, bool animated )
{
    sharp = load_bitmap( extern_file, pal );
    x = y = 0;
    if(animated)
    {
        animate = true;
    } else {
        animate = false;
    }
}
[/CODE]

and this:

[CODE]
void Object::load_bitmap( const char* file, bool animated )
{
    sharp = load_bitmap( file, pal );
    if(animated)
    {
        animate = true;
    } else {
        animate = false;
    }
}
[/CODE]

I get an error whenever I try to use load_bitmap(). This is the class:
[CODE]
class Object {
private:
    PALETTE pal;
    BITMAP *sharp;
    int x, y;
    bool animate;
    unsigned frames;
public:
    Object( int width, int height );
    Object( const char* extern_file, bool animated );
    ~Object( void );
    void render( void );
    void setxy( int newx, int newy );
    void load_bitmap( const char* file, bool animated );
    void draw( void );
};
[/CODE]

Why do I get that error?

monkeyCode
Member #7,140
April 2006

unsigned frames;
Unsigned what?

Other than that: fix the code tags and I'll actually read the code :)

northdakota91
Member #10,342
October 2008

I've found the error! I gave a class method the same name as an Allegro function (load_bitmap()). Sorry, that was my stupid fault... ;D;D

Anyway, what's the tag to post code? (sorry if I'm off topic ;D)

monkeyCode
Member #7,140
April 2006

Mockup Help

Another thing.

[CODE]
if (state) {
    result = true;
} else {
    result = false;
}

// That's way too redundant, use assignments! :-)

result = state;
result = !state; // inverse.
[/CODE]

Also

[CODE]
// GCC does default to int; can't remember if it's a part of the specification or just a GCC niche (I'm in the C# world nowadays).
// Anyway, even from just a readability standpoint, unsigned int is so much clearer.
unsigned value;
[/CODE]

Speedo
Member #9,783
May 2008

Quote:

// GCC does default to int, can't remember if it's a part of the specification or just a GCC niche

C (before C99) allows implicit int; C99 and C++ don't.

Tobias Dammers
Member #2,604
August 2002
avatar

http://www.cppreference.com/wiki/data_types said:

Several of these types can be modified using the keywords signed, unsigned, short, and long. When one of these type modifiers is used by itself, a data type of int is assumed.

Using unsigned by itself is perfectly OK.

---
Me make music: Triofobie
---
"We need Tobias and his awesome trombone, too." - Johan Halmén

bamccaig
Member #7,536
July 2006
avatar

Tobias Dammers said:

Using unsigned by itself is perfectly syntactically correct.

Fixed. :P

Tobias Dammers
Member #2,604
August 2002
avatar

It is syntactically correct, yes, which means it is the shortest way of specifying an unsigned int. Since the 'int' keyword is optional, and everyone with some semi-decent knowledge of C++ knows that it's implicit, I don't see why one needs to add it. Just like you don't write 'signed short int' but rather just 'short', or 'long' instead of 'signed long int'. I consider it good practice not to clutter the code with stuff that doesn't carry any added information.
C++ integer types are somewhat messed up anyway; they're halfway between the older type kit from languages like B, where you could basically build your own integer type by specifying the number of bits and the signedness, and newer systems, where each type is well-defined and has its own dedicated name (e.g. C#).

---
Me make music: Triofobie
---
"We need Tobias and his awesome trombone, too." - Johan Halmén

Go to: