Arrow keys cause unichar field to be non zero in OSX
jmasterx
Member #11,410
October 2009

I've just noticed that in OSX my textbox started writing those square [] looking characters when I pressed the arrow keys. The unichar field is therefore non-zero for these keys, which is not consistent with Windows.

Up : 63232
Down : 63233
F1 : 63236

Docs say: This may be zero or negative if the event was generated for a non-visible "character", such as an arrow key.

Those numbers are neither zero nor negative.

Since it happens in KeyChar, I have no way of filtering out these keys other than checking each one individually, which seems wrong.
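
For reference, here is a minimal sketch, assuming a standard Allegro 5 setup, that logs what KEY_CHAR reports for each keypress (this is how the values above were obtained):

#include <stdio.h>
#include <allegro5/allegro.h>

int main(void)
{
    al_init();
    al_install_keyboard();
    ALLEGRO_DISPLAY *display = al_create_display(320, 200);
    ALLEGRO_EVENT_QUEUE *queue = al_create_event_queue();
    al_register_event_source(queue, al_get_keyboard_event_source());

    for (;;) {
        ALLEGRO_EVENT event;
        al_wait_for_event(queue, &event);
        if (event.type == ALLEGRO_EVENT_KEY_CHAR) {
            /* On OSX the arrow keys show up here as 63232-63235
               instead of 0. */
            printf("keycode=%d unichar=%d\n",
                   event.keyboard.keycode, event.keyboard.unichar);
            if (event.keyboard.keycode == ALLEGRO_KEY_ESCAPE)
                break;
        }
    }
    al_destroy_event_queue(queue);
    al_destroy_display(display);
    return 0;
}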

Thanks

Peter Wang
Member #23
April 2000

Apparently this might depend on the OS X version:

http://sourceforge.net/tracker/index.php?func=detail&aid=3194187&group_id=5665&atid=105665

Someone should try to fix it, otherwise it makes it hard to write code that works consistently across platforms.

jmasterx
Member #11,410
October 2009

Could you possibly filter out Apple's special Unicode range and return 0 when you hit a key in that range?

ex:

if(unicode > xxxxx && unicode < yyyyy)
{
   //Apple being *special*
   unicode = 0;
}

Elias
Member #358
May 2000

What should we do about these keys?

backspace key -> ascii 8
tab key -> ascii 9
return key -> ascii 13
esc key -> ascii 27

They seem to work on other platforms besides OSX right now.

--
"Either help out or stop whining" - Evert

jmasterx
Member #11,410
October 2009

What do these give for you on OSX? This is why in my GUI I do if(unichar >= ' '), and that works on Linux and Windows.

Peter Wang
Member #23
April 2000

Elias said:

What should we do about these keys?

They should produce the ascii codes you have written there. The documentation should be clarified.

jmasterx
Member #11,410
October 2009

I've attached a patch that fixes my problem and should make Allegro consistent with Windows and Linux: it basically filters out Apple's specialness.
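
The patch itself is attached rather than inlined, but the core idea is a check along these lines (0xF700-0xF8FF is the range Apple reserves for function keys in the Unicode private use area; the actual patch lives in the OSX keyboard driver, so take this as a sketch of the idea, not the patch itself):

if (unicode >= 0xF700 && unicode <= 0xF8FF)
{
   /* Apple function-key range (arrows, F keys, etc.):
      report 0 like the other platforms do */
   unicode = 0;
}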

Peter Wang
Member #23
April 2000

I guess so. I applied a modified version of your patch to the 5.1 branch. Please test. 5.0.2 should be coming soon.

Elias
Member #358
May 2000

Well, in OSX backspace gives me 127 instead of 8, it seems, so filtering out values <= 32 would not catch it either.

--
"Either help out or stop whining" - Evert

Audric
Member #907
January 2001

The SDL library has similar behaviour for backspace on OSX.
The person who ported my SDL program from Linux/Windows to OSX had to re-convert the backspace (unicode == 127) to ASCII code 8.
It's especially confusing that SDL has the key identifier SDLK_DELETE = 127, and normally all numbers <= 127 are expected to be ASCII key codes.

Peter Wang
Member #23
April 2000

I think it would be nice to have Backspace return 8 on OS X as well.

On X we return 127 for Delete. On Windows (wine) we return 0.

jmasterx
Member #11,410
October 2009

I hadn't noticed the backspace/delete issue, because I have a case for both of these in the form of an Allegro keycode, but yes, I do think an if(unicode == 127) { unicode = 8; } would be nice. But if backspace shows up as 127, does that mean delete shows up as ASCII 8?

Also, I think I may know why Apple does this. On my MacBook Pro, in OSX, my key which says Delete on it actually deletes while the same key in Windows does backspace (for the whole OS). So I think this is intended by Apple and may not actually be a problem.

Windows is not emulated or anything; I run 7 natively with Apple-provided Boot Camp drivers.

Also, your modified version works as expected for me.

Audric
Member #907
January 2001

Oww, after some very quick googling I find that the key labeled 'delete' on a MacBook is the one that performs a backspace: to erase the current character you'd do fn+Delete (*).
No wonder it gets confusing for cross-platform libraries...
Should Allegro report which key the user pressed ('delete'), or what he wanted to do?

(*) Does fn+Delete cause an event with unicode = 'the character for delete', different from 'the character for backspace'?

jmasterx
Member #11,410
October 2009

@Audric

For me, in OSX, on my MBP:

Delete: 127
Fn + Delete: 8

In Windows, it is the inverse.

Since it is inverted in every other application, I say let's not touch it; the patch Peter Wang applied makes things consistent with everything else. I do not think we should invert Apple's inversion, since all software on OSX already behaves this way.

In OSX, the delete key does the opposite of what it does in Windows, and I'm fine with that, since all other software acts like this.

Elias
Member #358
May 2000

On my mac mini (using a normal, non-mac keyboard plugged into the USB port) the Backspace key produces 127 and the DEL key produces 63272. Both keys work as expected in native OSX applications. If we document Allegro to produce 8 for backspace and 127 for delete it seems we'd have to adjust for that.

An alternative solution would be to get rid of control codes and return 0 for backspace and delete, as well as tab, return and esc. It seems that to have things working on Macs you have to check the key code anyway, so allowing those ASCII codes under Windows and Linux would just mean more broken keyboard input code.

--
"Either help out or stop whining" - Evert

jmasterx
Member #11,410
October 2009

I agree. I think it would be a lot less confusing if, across all platforms, all non-renderable characters (arrow, fn, insert, del, backspace, etc.) just produced a unichar of 0, which would indicate to the user to check the keycode.

Matthew Leverton
Supreme Loser
January 1999

So the keycode is always correct and the unichar is bogus?

I think the only thing that makes sense in 5.0 is to either a) document it, or b) convert OS X unichars to something sensible.

Elias
Member #358
May 2000

The idea would be that someone writing a text entry box would follow this logic only:

if (event.keyboard.unichar != 0) {
    al_ustr_insert_chr(editbox, pos, event.keyboard.unichar);
}
else {
    /* check event.keyboard.keycode for Delete, Backspace,
       the cursor keys, Tab, Esc and Return */
}

Right now you can also sometimes use ASCII codes... if we always returned 0, you couldn't.

--
"Either help out or stop whining" - Evert

Matthew Leverton
Supreme Loser
January 1999

This is how I have always been doing it:

if (keycode == ALLEGRO_KEY_BACKSPACE)
{
   /* erase the character before the caret */
}
else if (unichar >= 32)
{
   /* insert the (presumably printable) character */
}

I realize not every unichar >= 32 is going to be a true printable character, but that doesn't bother me. The point is, checking for non-zero isn't very correct anyway.

I don't think it should be assumed that Allegro translates or modifies the unichar in any way whatsoever. And I really don't think we should go about changing 5.0's behavior.

For 5.2 if the consensus is to zero out the unichar, then go for it. But I don't see why it's absolutely necessary. To me the field should simply be what the OS reports as the printable unichar. I don't consider LEFT/RIGHT/DELETE printable characters, so why would I ever think to use unichar for them?

To me, this is primarily a documentation issue.

jmasterx
Member #11,410
October 2009

Yeah, I do it Matthew's way too, but it's up to the Allegro team; either one works for me.

Elias
Member #358
May 2000

Matthew Leverton said:

I don't consider LEFT/RIGHT/DELETE printable characters, so why would I ever think to use unichar for them?

There are possibly more keys; maybe the F keys produce non-standard unicode as well under OSX.

--
"Either help out or stop whining" - Evert

jmasterx
Member #11,410
October 2009

Yes: F keys, arrows, Insert, basically all the ones my patch blocked out.

Matthew Leverton
Supreme Loser
January 1999

Elias said:

There's possibly more keys, maybe F keys produce non-standard unicode as well under OSX.

I don't really care... Unless I'm checking every unichar code to see if it maps to some printable character in the font I'm using at the time, whether or not we mask some keys is irrelevant.

And if I do that (validate that the unichar key is in the font), then it doesn't matter if Apple uses strange characters. They wouldn't exist in the font, and they would be ignored.

To me unichar == something to be printed. The only time I would use unichar is if I want to print a string to the screen. If the user presses F1 when entering his name and that produces some strange symbol on OS X, I couldn't care less. If I'm looking for function keys or anything else that maps to a physical key, I'm just using the keycode constants.

Elias
Member #358
May 2000

Yes, but what I'm saying is we have these possibilities:

  • A) .unichar is always 0 for non-printable characters

  • B) .unichar reports whatever the OS reports

  • C) .unichar returns 8, 9, 13, 27 and 127, but blocks out other non-printable characters

C is what is currently in SVN after the patch to set .unichar to 0 for those OSX symbols.

A is what I'm proposing - the only change to SVN would be that the five extra control codes we still put into .unichar are also set to 0.

B is how it worked before.

And both A and B seem better than C to me.
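
To make the difference concrete, the three policies would look roughly like this (hypothetical helper names, purely for illustration, not actual Allegro code):

/* A) always 0 for non-printable characters */
int policy_a(int c)
{
    if (c < 32 || c == 127 || (c >= 0xF700 && c <= 0xF8FF))
        return 0;
    return c;
}

/* B) report whatever the OS reports, unmodified */
int policy_b(int c)
{
    return c;
}

/* C) keep 8, 9, 13, 27 and 127; block the OSX function-key range */
int policy_c(int c)
{
    if (c >= 0xF700 && c <= 0xF8FF)
        return 0;
    return c;
}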

--
"Either help out or stop whining" - Evert

Peter Wang
Member #23
April 2000

Backspace, Tab, Enter, Escape, etc. have ASCII codes associated with them, so to me, that's what they should produce if you "type" them. Ctrl-A also produces 1, Ctrl-B produces 2, etc.
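
The Ctrl behaviour follows the classic ASCII convention, where Ctrl masks the letter down to its low five bits; a quick sketch with a hypothetical helper:

/* Ctrl-A..Ctrl-Z map to control codes 1..26. */
int ctrl_code(int letter)
{
    return letter & 0x1f;   /* ctrl_code('A') == 1, ctrl_code('B') == 2 */
}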

I can see the reasoning for all three choices, but my preferences would be: C, B, A.
