I used this little program
to produce this output:
29 screenmodes
all mode refresh rates
mode 0 has a refresh rate of 50
mode 1 has a refresh rate of 51
mode 2 has a refresh rate of 52
mode 3 has a refresh rate of 53
mode 4 has a refresh rate of 54
mode 5 has a refresh rate of 55
mode 6 has a refresh rate of 56
mode 7 has a refresh rate of 57
mode 8 has a refresh rate of 58
mode 9 has a refresh rate of 59
mode 10 has a refresh rate of 60
mode 11 has a refresh rate of 61
mode 12 has a refresh rate of 62
mode 13 has a refresh rate of 63
mode 14 has a refresh rate of 64
mode 15 has a refresh rate of 65
mode 16 has a refresh rate of 66
mode 17 has a refresh rate of 67
mode 18 has a refresh rate of 68
mode 19 has a refresh rate of 69
mode 20 has a refresh rate of 70
mode 21 has a refresh rate of 71
mode 22 has a refresh rate of 72
mode 23 has a refresh rate of 73
mode 24 has a refresh rate of 74
mode 25 has a refresh rate of 75
mode 26 has a refresh rate of 76
mode 27 has a refresh rate of 77
mode 28 has a refresh rate of 78
I tried looking at the source and saw where the refresh rates are calculated from dotclock/pixels, but that AL_INLINE stuff still whooshes right over both my brain cells.
I didn't know this either:
https://wiki.archlinux.org/index.php/NVIDIA#Refresh_rate_not_detected_properly_by_XRandR_dependant_utilities
You could replace the binary nvidia drivers with nouveau I guess.
> You could replace the binary nvidia drivers with nouveau I guess.
I'd posted a couple of months ago about the nouveau drivers not doing ex_gldepth correctly.
[attached image: 956181f2a4816479000cccded84c63b3.png, 651x505]
I have some stuff that'll get the correct refresh rate but I'd have to get a lot of extraneous junk out first.
[EDIT]
This little program
gave this output
29 modes
Mode 0 has width 1024, height 768, refresh rate 85
Mode 1 has width 1024, height 768, refresh rate 43
Mode 2 has width 1024, height 768, refresh rate 75
Mode 3 has width 1024, height 768, refresh rate 70
Mode 4 has width 1024, height 768, refresh rate 60
Mode 5 has width 832, height 624, refresh rate 75
Mode 6 has width 800, height 600, refresh rate 85
Mode 7 has width 800, height 600, refresh rate 85
Mode 8 has width 800, height 600, refresh rate 75
Mode 9 has width 800, height 600, refresh rate 72
Mode 10 has width 800, height 600, refresh rate 60
Mode 11 has width 800, height 600, refresh rate 56
Mode 12 has width 720, height 400, refresh rate 85
Mode 13 has width 700, height 525, refresh rate 120
Mode 14 has width 640, height 480, refresh rate 85
Mode 15 has width 640, height 480, refresh rate 75
Mode 16 has width 640, height 480, refresh rate 73
Mode 17 has width 640, height 480, refresh rate 73
Mode 18 has width 640, height 480, refresh rate 60
Mode 19 has width 640, height 480, refresh rate 60
Mode 20 has width 640, height 400, refresh rate 85
Mode 21 has width 640, height 350, refresh rate 85
Mode 22 has width 512, height 384, refresh rate 140
Mode 23 has width 512, height 384, refresh rate 87
Mode 24 has width 512, height 384, refresh rate 120
Mode 25 has width 400, height 300, refresh rate 144
Mode 26 has width 320, height 240, refresh rate 146
Mode 27 has width 320, height 240, refresh rate 120
Mode 28 has width 320, height 175, refresh rate 171
Libraries were -lGL -lGLU -lXxf86vm
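For reference, a mode lister along those lines would look roughly like this — my guess at the shape of such a program, not the original, and untested here since it needs a live X server:

```c
/* Untested sketch; compile with: gcc modes.c -o modes -lX11 -lXxf86vm */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/xf86vmode.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    XF86VidModeModeInfo **modes;
    int count, i;

    if (!dpy)
        return 1;
    if (!XF86VidModeGetAllModeLines(dpy, DefaultScreen(dpy), &count, &modes))
        return 1;

    printf("%d modes\n", count);
    for (i = 0; i < count; i++) {
        XF86VidModeModeInfo *m = modes[i];
        /* dotclock is in kHz; htotal * vtotal is the full frame in pixels */
        int hz = (int)(m->dotclock * 1000.0 / (m->htotal * m->vtotal) + 0.5);
        printf("Mode %d has width %d, height %d, refresh rate %d\n",
               i, m->hdisplay, m->vdisplay, hz);
    }
    XFree(modes);
    XCloseDisplay(dpy);
    return 0;
}
```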
If you disable dynamic TwinView (whatever that is) the driver will stop reporting fake refresh rates:
Option "DynamicTwinView" "False"
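For context, that option goes in the Device section of xorg.conf; a minimal fragment would look something like this (the Identifier value is just a placeholder):

```
Section "Device"
    Identifier "Card0"
    Driver     "nvidia"
    Option     "DynamicTwinView" "False"
EndSection
```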
I'll make the xrandr backend for Allegro zero out all the refresh rates if the fake ones are detected.
We already have the option of using xf86vm, but randr is preferred. You can disable it at build time with cmake -DWANT_X11_XRANDR=off.
> You can disable it at build time with cmake -DWANT_X11_XRANDR=off.
Now I get this output (from the A5 proglet)
1 screenmodes
all mode refresh rates
mode 0 has a refresh rate of 0
Sounds like cmake didn't find the xf86vidmode dev libraries.
This was in CMakeCache.txt:
//Have library Xxf86vm
CAN_XF86VIDMODE:INTERNAL=1
Hm, I don't suppose you could build a debug version and post the allegro.log?
> build a debug version
Well, I can't seem to do that either. I tried
-DGRADE_DEBUG=on
-DDEBUGMODE=on
-DDEBUG=on
as well as using 1 instead of 'on', and all I get are warnings like this:
CMake Warning: The variable, 'GRADE_DEBUG', specified manually, was not used during the generation.
Oh, yeah, I tried an environment var too.
set | grep CMAKE
CMAKE_C_FLAGS_DEBUG=CMAKE_C_FLAGS_DEBUG
[EDIT]
> Option "DynamicTwinView" "False"
If that refers to something in xorg.conf, it's not in mine.
Or I'm supposed to enter it? I'll try it.
I only have one CRT hooked up.
[EDIT2]
I put that line in the "Devices" section of xorg.conf, did a "make clean" in
build, followed by make, make install, and that first program still says 1 screen mode with a refresh rate of 0.
> Well I can't seem to do that either, I tried
The easiest way is to use 'ccmake' rather than 'cmake'. Otherwise pass '-DCMAKE_BUILD_TYPE=Debug'.
> I put that line in the "Devices" section of xorg.conf, did a "make clean" in
> build, followed by make, make install, and that first program still says 1 screen mode with a refresh rate of 0.
Changes in xorg.conf need an X restart.
OK, I got the debug version, and recompiled the test proggie, but I can't find(1) allegro.log anywhere. It still says one screen mode with refresh 0.
Also, I did restart X, twice, the second time to check for errors on tty1.
[EDIT]
I tried tracing into the program with gdb, and several interesting bits merely said "Value optimized out".
Never mind. I'll go back to X11 and OpenGL. Sorry.
> OK, I got the debug version, and recompiled the test proggie, but I can't find(1) allegro.log anywhere. It still says one screen mode with refresh 0.
> I tried tracing into the program with gdb, and several interesting bits merely said "Value optimized out".
Then you didn't get the debug version. You'd get an allegro.log in 'cwd', and it wouldn't be optimizing anything out.
> you didn't get the debug version.
I forgot to specify the debug library
system I xsystem.c:220 xglx_initialize [ 0.00006] XGLX driver connected to X11 (The X.Org Foundation 10905000).
system I xsystem.c:222 xglx_initialize [ 0.00011] X11 protocol version 11.0.
system I xsystem.c:232 xglx_initialize [ 0.00023] events thread spawned.
system I system.c:270 al_install_system [ 0.00024] Allegro version: 5.0.4
display I xfullscreen.c:184 xinerama_init [ 0.00043] Xinerama version: 1.1
display I xfullscreen.c:196 xinerama_init [ 0.00051] Xinerama is active
display D xfullscreen.c:941 _al_xglx_get_default_adapter [ 0.00052] get default adapter
display D xfullscreen.c:941 _al_xglx_get_default_adapter [ 0.00054] get default adapter
system I xsystem.c:243 xglx_shutdown_system [ 0.00055] shutting down.
display D xfullscreen.c:213 xinerama_exit [ 0.10086] xfullscreen: xinerama exit.
system D xsystem.c:270 xglx_shutdown_system [ 0.10105] xsys: close x11display.
Anyway, it's all too clever by at least a factor of two.
Off topic: I think we just had a mini-earthquake!
> Anyway, it's all too clever by at least a factor of two.
It didn't even initialize the XVidMode code; are you running that same code as above? You're sure allegro has XVidMode enabled? What happens if you create an ALLEGRO_DISPLAY first? (This shouldn't be necessary.)
If by too clever you mean how it lazily inits the vidmode/xrandr/xinerama stuff, that's because xrandr and xvidmode can take some time to query. When it was called on startup, it would pause every allegro app for half a second, even ones that never use a fullscreen mode or need monitor modes or resolutions.
> are you running that same code as above?
Yes.
> You're sure allegro has XVidMode enabled?
I'm googling what that means now.
> What happens if you create an ALLEGRO_DISPLAY first?
Same thing (250Kb log attached in paperclip)
By "too clever" I mean it's too complicated for me to understand; I might as well do my own stuff instead of understanding Allegro all the way down. Evils of premature optimization and all that.
[EDIT]
As far as I can tell, xvidmode is a command line app to change screen modes? I've only done that with shift-alt-keypad_plus or something. What's wrong with xrandr stuff?
> I'm googling what that means now.
The log you posted in the thread earlier doesn't show it initializing the XVidMode extension, or fetching any modes, so something there is wrong. Did you make absolutely sure Allegro was compiled with XVidMode support?
As the edit above shows (while you were posting), I have no idea what xvidmode is; apropos doesn't say anything about it. Googling more.
"XVidMode" is the name of the X11 extension that provides the "XF86VidModeGetAllModeLines" function.
> What's wrong with xrandr stuff?
Nothing if your driver supports it properly (it doesn't if you use Nvidia's shitty TwinView extension). If you can get either XVidMode or XRANDR to work, you should be golden. Just make sure you have the -dev packages for both before running cmake for allegro.
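Two quick checks could narrow it down (a sketch; it assumes a running X server and that Allegro was configured in a ./build directory):

```
# Does the running X server actually export RANDR / XFree86-VidModeExtension?
xdpyinfo | grep -iE 'randr|vidmode'

# Did cmake detect the xf86vm dev headers when Allegro was configured?
grep -i xf86 build/CMakeCache.txt
```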
I've hooked up two monitors, things are... interesting. I'm going to fiddle with it a few days, hoping my table doesn't collapse in the meantime. Or maybe I should grab an old computer and stack some books on it to support the back edge of the table.
I should still have my nvidia xorg.conf files lying about if you need some help with that.
It seems to work as far as it goes, but my own OpenGL program is bombing out somewhere in keyboard events when using separate X screens; working on that first.
A5 seems to work except for ex_fs_resize, and ex_fs_window always opens its window on the primary screen; I seem to recall reading about that before. Oh yeah, ex_display_options won't do fullscreen with TwinView or separate X screens. I don't remember how well these programs worked with one monitor.
I doubt I'll continue using twin monitors after I'm done experimenting, it seems too weird.
[EDIT]
If I comment this out then the program can accept and use keypresses, but you can't close the window except with Control-C in the parent terminal.
case ClientMessage:
    if (*XGetAtomName(gwin.dpy, event.xclient.message_type) == *"WM_PROTOCOLS") {
        quit = 1;
    }
    break;
[EDIT3]
I seemed to fix it with
if(event.xclient.message_type == 292) quit = 1; break;
but that seems mighty hackish.
It shows
X Error of failed request: BadAtom (invalid Atom parameter)
  Major opcode of failed request: 17 (X_GetAtomName)
  Atom id in failed request: 0x299
  Serial number of failed request: 65
  Current serial number in output stream: 65
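The usual non-hacky fix is to intern WM_DELETE_WINDOW once and compare it against data.l[0] of the ClientMessage, instead of calling XGetAtomName on arbitrary atoms (which is what triggers that BadAtom error). A sketch, assuming gwin.win is the window handle alongside the existing gwin.dpy:

```c
/* At window-creation time: tell the WM we want WM_DELETE_WINDOW messages. */
Atom wm_delete = XInternAtom(gwin.dpy, "WM_DELETE_WINDOW", False);
XSetWMProtocols(gwin.dpy, gwin.win, &wm_delete, 1);

/* In the event loop: the atom arrives in data.l[0], not in message_type
   (message_type is WM_PROTOCOLS; the payload says which protocol fired). */
case ClientMessage:
    if ((Atom)event.xclient.data.l[0] == wm_delete)
        quit = 1;
    break;
```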
[EDIT2]
If you minimize something on the secondary monitor, how do you restore it?
[SOLVED] just alt-tab to it
> my own OpenGL program is bombing out somewhere in keyboard events when using separate X screens; working on that first.
Mesa really hates that. It won't work with the open source drivers. Trying to open a second gl context on a second X Screen will cause the program to lock up.
I think it'll work with TwinView if you have Fake Xinerama enabled. I think. That way allegro can still detect both monitors, but you're stuck with one mode per screen. There are a lot of comments in the xfullscreen.c code about the fun I had getting things to work as well as they do (I did fairly extensive testing, as have others, so it works in many cases, just probably not all).
> I doubt I'll continue using twin monitors after I'm done experimenting, it seems too weird.
When I'm hunkering down to do some serious programming work, I love a dual display setup. Sometimes I'll even set up my old LCD next to my laptop for dual display.
> Trying to open a second gl context on a second X Screen will cause the program to lock up.
I just meant getting one window to work with the driver settings specifying separate X screens instead of one wide TwinView screen. I haven't gotten to having more than one window open yet; I'd have to convert lots of stuff from globals assuming a single GL window to structs in malloc'ed blocks.