I've just noticed that my textbox on OSX started displaying those square [] looking characters when I pressed the arrow keys. This means the unichar field was non-zero, which is inconsistent with Windows.
Up : 63232
Down : 63233
F1 : 63236
Docs say: This may be zero or negative if the event was generated for a non-visible "character", such as an arrow key.
Those numbers are neither zero nor negative.
Since it happens in KeyChar, I have no way of filtering out these keys other than checking each one, which seems wrong.
Thanks
Apparently this might depend on the OS X version:
http://sourceforge.net/tracker/index.php?func=detail&aid=3194187&group_id=5665&atid=105665
Someone should try to fix it, otherwise it makes it hard to write code that works consistently across platforms.
Could you possibly filter out Apple's special Unicode range and return 0 when you hit a key in that range?
ex:
    if (unicode > xxxxx && unicode < yyyyy) {
        /* Apple being *special* */
        unicode = 0;
    }
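For reference, the range in question is documented by Apple: Cocoa delivers function keys (arrows, F keys, forward delete, and so on) as codepoints in the Unicode private use area 0xF700-0xF8FF, which matches the values seen above (NSUpArrowFunctionKey = 0xF700 = 63232). A minimal sketch of such a filter, assuming those bounds:

```c
#include <assert.h>

/* Cocoa reports function keys as codepoints in the private use range
 * 0xF700-0xF8FF (NSUpArrowFunctionKey = 0xF700 = 63232, NSF1FunctionKey
 * = 0xF704 = 63236, NSDeleteFunctionKey = 0xF728 = 63272). Zeroing the
 * whole range makes unichar behave like it does on Windows/Linux. */
static int filter_apple_function_keys(int unichar)
{
    if (unichar >= 0xF700 && unichar <= 0xF8FF)
        return 0;
    return unichar;
}
```

Ordinary characters pass through unchanged, so only Apple's special codepoints are masked.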
What should we do about these keys?
backspace key -> ascii 8
tab key -> ascii 9
return key -> ascii 13
esc key -> ascii 27
They seem to work on other platforms besides OSX right now.
What do these give for you on OSX? This is why in my GUI I do if (unichar >= ' '), and that works on Linux and Windows.
What should we do about these keys?
They should produce the ascii codes you have written there. The documentation should be clarified.
I've attached a patch that fixes my problem and should make Allegro consistent with Windows and Linux; I basically filter out Apple's specialness.
I guess so. I applied a modified version of your patch to the 5.1 branch. Please test. 5.0.2 should be coming soon.
Well, on OSX backspace gives me 127 instead of 8, it seems, so filtering codes <= 32 would not catch it either.
The SDL library has similar behaviour for backspace on OSX.
The person who ported my SDL program from linux/windows to OSX had to re-convert the backspace (unicode==127) to ascii code 8.
It's especially confusing that SDL has the key identifier SDLK_DELETE = 127, while normally all numbers <= 127 are expected to be ASCII key codes.
I think it would be nice to have Backspace return 8 on OS X as well.
On X we return 127 for Delete. On Windows (wine) we return 0.
I hadn't noticed the backspace/delete issue because I have a case for both of these in the form of Allegro keycodes, but yes, I do think an if (unicode == 127) { unicode = 8; } would be nice. But if backspace shows up as that, does that mean delete shows up as ASCII 8?
Also, I think I may know why Apple does this. On my MacBook Pro, in OSX, my key which says Delete on it actually deletes while the same key in Windows does backspace (for the whole OS). So I think this is intended by Apple and may not actually be a problem.
Windows is not emulated or anything; I run Windows 7 natively with Apple-provided Boot Camp drivers.
Also, your modified version works as expected for me.
Oww, after some quick googling I find that the key labeled 'delete' on a MacBook is the one that performs a backspace: to erase the current character you'd do fn+Delete (*).
No wonder it gets confusing for cross-platform libraries...
Should Allegro report which key the user pressed ('delete') or what he wanted to do?
(*) Does fn+Delete cause an event with unicode = 'the character for delete', different from 'the character for backspace' ?
@Audric
For me, in OSX, on my MBP:
Delete: 127
Fn + Delete: 8
Windows:
Inverse
Since it is inverted for every other application, I say let's not touch it; the patch Peter Wang uploaded makes it work consistently with everything else. I do not think we should invert Apple's inversion, since all software on OSX is already inverted.
In OSX, the delete key does the opposite of what it does in Windows, and I'm fine with that since all other software acts like this.
On my mac mini (using a normal, non-mac keyboard plugged into the USB port) the Backspace key produces 127 and the DEL key produces 63272. Both keys work as expected in native OSX applications. If we document Allegro to produce 8 for backspace and 127 for delete it seems we'd have to adjust for that.
An alternative solution would be to get rid of control codes and return 0 for backspace and delete as well as tab, return and esc. It seems that to have things working on Macs you have to check the keycode anyway, so allowing those ASCII codes under Windows and Linux would just mean more broken keyboard input code.
I agree; I think it would be a lot less confusing if, across all platforms, all non-renderable characters (arrows, Fn, Insert, Del, Backspace, etc.) just produced a unichar of 0, which would tell the user to check the keycode.
So the keycode is always correct and the unichar is bogus?
I think the only thing that makes sense in 5.0 is to either a) document it, or b) convert OS X unichars to something sensible.
The idea would be that someone writing a text entry box would follow this logic only:
    if (event.unichar != 0)
        al_ustr_insert_chr(editbox, pos, event.unichar);
    else {
        /* check event.key for DEL/Backspace/Cursor/Tab/Esc/Return */
    }
Right now you can also sometimes use ascii codes... if we return always 0 you couldn't.
This is how I have always been doing it:
if (keycode == ALLEGRO_KEY_BACKSPACE) { } else if (unichar >= 32) { }
I realize not all unichar >= 32 are going to be true printable characters, but that doesn't bother me. The point is, checking for non-zero isn't very correct anyway.
I don't think it should be assumed that Allegro translates or modifies the unichar in any way whatsoever. And I really don't think we should go about changing 5.0's behavior.
For 5.2 if the consensus is to zero out the unichar, then go for it. But I don't see why it's absolutely necessary. To me the field should simply be what the OS reports as the printable unichar. I don't consider LEFT/RIGHT/DELETE printable characters, so why would I ever think to use unichar for them?
To me, this is primarily a documentation issue.
Yea, I do it Matthew's way too, but it's up to the Allegro team; either one works for me.
I don't consider LEFT/RIGHT/DELETE printable characters, so why would I ever think to use unichar for them?
There's possibly more keys, maybe F keys produce non-standard unicode as well under OSX.
Yes: F keys, arrows, Insert, basically all the ones my patch blocked out.
There's possibly more keys, maybe F keys produce non-standard unicode as well under OSX.
I don't really care... Unless I'm checking every unichar code to see if it maps to some printable character in the font I'm using at the time, whether or not we mask some keys is irrelevant.
And if I do that (validate that the unichar key is in the font), then it doesn't matter if Apple uses strange characters. They wouldn't exist in the font, and they would be ignored.
To me unichar == something to be printed. The only time I would use unichar is if I want to print a string to the screen. If the user presses F1 when entering his name and that produces some strange symbol on OS X, I couldn't care less. If I'm looking for function keys or anything else that maps to a physical key, I'm just using the keycode constants.
Yes, but what I'm saying is we have these possibilities:
A) .unichar is always 0 for non-printable characters
B) .unichar reports whatever the OS reports
C) .unichar returns 8,9,13,27,127 but blocks out other non-printable characters
C is what is currently in SVN after the patch to set .unichar to 0 for those OSX symbols.
A is what I'm proposing - the only change to SVN would be that the five extra control codes we still put into .unichar are also set to 0.
B is how it worked before.
And both A and B seem better than C to me.
Backspace, Tab, Enter, Escape, etc. have ASCII codes associated with them, so to me, that's what they should produce if you "type" them. Ctrl-A also produces 1, Ctrl-B produces 2, etc.
I can see the reasoning for all three choices, but my preferences would be: C, B, A.
I wondered about "non-printable", and in the context of text input in an allegro game or utility, I guess the question always resolves to ML's point:
If the unicode value has a character in the font (shipped with the game), it's printable.
If it doesn't, it doesn't make sense to "accept" it as user input, as he won't see what he's just typed.
One big advantage of this approach is that if you provide a rudimentary font, say only part of the Latin character set, the user can still swap in a bigger font to support more characters or a different set, without recompiling or anything.
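The font-validation idea above can be sketched like this. Note that font_has_glyph is a hypothetical lookup standing in for whatever glyph query the font API actually provides, and the stub here is purely illustrative:

```c
#include <assert.h>

/* Hypothetical stand-in for a real glyph lookup; this stub "knows" only
 * the printable ASCII range. */
static int stub_font_has_glyph(int codepoint)
{
    return codepoint >= 32 && codepoint < 127;
}

/* Accept a typed character only if the current font can render it. Apple's
 * private-use codepoints (or any exotic symbol) are then dropped
 * automatically, and swapping in a larger font widens the accepted set. */
static int accept_input(int unichar, int (*font_has_glyph)(int))
{
    return unichar != 0 && font_has_glyph(unichar);
}
```

With this approach it does not matter what the OS puts in unichar for special keys; anything the font cannot draw is simply ignored.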
Ctrl-A also produces 1, Ctrl-B produces 2, etc.
Ah, I forgot about those. So in my A5 text box, by simply checking for ASCII 8 instead of KEY_BACKSPACE, Ctrl-H will work as backspace. I'm sold on option C now.
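The Ctrl-letter codes mentioned here follow directly from ASCII: Ctrl-A through Ctrl-Z are control codes 1 through 26, i.e. the letter with everything but the low five bits masked off. A quick sketch:

```c
#include <assert.h>
#include <ctype.h>

/* Ctrl-A..Ctrl-Z map to ASCII control codes 1..26 (the letter & 0x1F).
 * Ctrl-H therefore yields 8, the same code the Backspace key produces
 * under option C, which is why checking for ASCII 8 handles both. */
static int ctrl_code(char letter)
{
    return toupper((unsigned char)letter) & 0x1F;
}
```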
The EVENT_KEY_CHAR .unichar documentation should have something like this added:
Some special keys will set the .unichar field to their standard ASCII code: Backspace=8, Tab=9, Return=13, Escape=27, Delete=127. In addition, if you press the Control key together with A to Z, the .unichar field will have the values 1 to 26. For example, Ctrl-A will set .unichar to 1, and both the Backspace key and Ctrl-H will set it to 8.