ex_prim crashes on my laptop with DirectX: al_create_vertex_buffer fails because _al_create_vertex_buffer_directx fails, which in turn is because is_legacy_card() returns true.
It would be nice to at least get a warning, like "Your laptop's shitty integrated graphics card sucks. Fail, hoser."
I thought ex_prim was working for me before, so I quickly hacked it to use OpenGL and it works again.
I'm running Windows Vista with an ATI Radeon X1270 integrated graphics card.
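The quick hack, for reference, was just forcing the OpenGL backend before the display gets created; roughly this (a sketch, not my exact change):

#include <allegro5/allegro.h>

/* Force the OpenGL backend instead of Direct3D before creating the display. */
al_set_new_display_flags(ALLEGRO_OPENGL);
ALLEGRO_DISPLAY *display = al_create_display(640, 480);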
> It would be nice to at least get a warning, like "Your laptop's shitty integrated graphics card sucks. Fail, hoser."
You realize that these sample programs were hacked together as examples of how to use the features, not as shining, be-all-and-end-all user programs with every loose end tied up, right? If you don't like it, rewrite it and submit a patch.
Well, of course I don't expect every single line of code to be perfect. I'm just saying, I shouldn't have to debug ex_prim just to find out that some obscure function named is_legacy_card() is returning true on my laptop. If my laptop isn't good enough or doesn't support something, there should be some way to tell, right?
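Even just checking the return value would do; something like this in ex_prim (a sketch against the 5.1 API; vtx and num_vtx stand in for whatever the example actually passes, and abort_example is the examples' own helper):

ALLEGRO_VERTEX_BUFFER *vb =
   al_create_vertex_buffer(NULL, vtx, num_vtx, ALLEGRO_PRIM_BUFFER_STATIC);
if (!vb) {
   /* NULL is currently the only clue that the card can't do vertex
    * buffers under D3D - report it instead of crashing later. */
   abort_example("al_create_vertex_buffer failed - unsupported card?\n");
}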
So fix it and send in a patch or the entire new ex_prim.c. Get your name in lights! I bet you could fix the lack of messages for minimizing and maximizing windows too. Just think how impressive it will be when people google your name and find out you've contributed to open source software! OTOH, you'd have to use your real name for that to work well.
But my dear friend Arthur, "There's just no point".
Edit:
Also, my point is not that ex_prim crashes; it is that al_create_vertex_buffer will never succeed on my laptop, and if I hadn't debugged it I would never have known why.
"There's just no point".
What? ex_dualies (?) returns an error if you don't have two monitors hooked up.
I was quoting the comment in the _al_create_vertex_buffer_directx function where it says there is just no point in trying with a 'legacy' card. Apparently my Pixel Shader version is too low or something, judging from is_legacy_card(). OpenGL can do it, though?
Apparently this answers it:
/*
* In the context of this file, legacy cards pretty much refer to older Intel cards.
* They are distinguished by three misfeatures:
* 1. They don't support shaders
* 2. They don't support custom vertices
* 3. DrawIndexedPrimitiveUP is broken
*
* Since shaders are used 100% of the time, this means that for these cards
* the incoming vertices are first converted into the vertex type that these cards
* can handle.
*/
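From looking at is_legacy_card(), it seems to boil down to a device-caps check along these lines (my rough reconstruction, not the actual Allegro source):

#include <stdbool.h>
#include <d3d9.h>

static bool is_legacy_card_sketch(IDirect3DDevice9 *device)
{
   D3DCAPS9 caps;
   if (FAILED(IDirect3DDevice9_GetDeviceCaps(device, &caps)))
      return true; /* assume the worst if the caps can't even be queried */
   /* Cards reporting less than pixel shader 2.0 can't run the addon's shaders. */
   return caps.PixelShaderVersion < D3DPS_VERSION(2, 0);
}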
I meant hack the main program to put up an error dialog box with the message "Your laptop's shitty integrated graphics card sucks. Fail, hoser." As for why OGL can do it, maybe the root cause of the problem is the DX capability bits; sometimes two mutually exclusive abilities of a card can conflict with those bits.
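With the native dialogs addon that's only a few lines (a sketch; assumes ex_prim's display is at hand and the vertex buffer came back NULL):

#include <allegro5/allegro.h>
#include <allegro5/allegro_native_dialog.h>

/* ...after al_create_vertex_buffer() returns NULL... */
al_show_native_message_box(display, "ex_prim", "Vertex buffers unavailable",
   "Your laptop's shitty integrated graphics card sucks. Fail, hoser.",
   NULL, ALLEGRO_MESSAGEBOX_ERROR);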
So according to that comment, those legacy cards should still work in ex_prim, just have their vertex types converted?
> So according to that comment, those legacy cards should still work in ex_prim, just have their vertex types converted?
Yes. But conversion means that you have to download the vertex buffer data back to the host system, convert the vertices and send the new vertices to the GPU. Legacy cards don't support shaders, which means that in the second step you can't use vertex buffers... which means there just was no point in making a vertex buffer: they are meant to be an optimization and these workarounds make them worse than just using al_draw_prim directly.
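So the sensible pattern in user code is just to fall back to client-memory drawing when the buffer can't be made; a sketch (reusing hypothetical vtx/num_vtx arrays of ALLEGRO_VERTEX):

if (vb) {
   al_draw_vertex_buffer(vb, NULL, 0, num_vtx, ALLEGRO_PRIM_TRIANGLE_LIST);
}
else {
   /* No buffer: draw straight from client memory - on a legacy card this
    * is all the driver-side workaround could have done anyway. */
   al_draw_prim(vtx, NULL, NULL, 0, num_vtx, ALLEGRO_PRIM_TRIANGLE_LIST);
}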
ex_prim shouldn't crash though, that's an oversight.
> Legacy cards don't support shaders,
Does that mean I can't run any shader code on my laptop? Or just DirectX shaders, since OpenGL seemed to implement it?
Nice fix for ex_prim btw.
> Does that mean I can't run any shader code on my laptop? Or just DirectX shaders, since OpenGL seemed to implement it?
The primitives addon does not require shaders under OpenGL, as OpenGL is more flexible than D3D, so at least you'll be able to use the addon.
Now... in terms of shaders... I'm not sure. Seems unlikely for D3D, but maybe it's possible for OpenGL? Since you appear to have A5.1 installed, I'd just try and see if some of the shader examples (ex_shader, ex_prim_shader, ex_shader_multitex) run under OpenGL.
(using 5.1.8) ex_prim_shader fails with this message:
{"name":"608234","src":"\/\/djungxnpq2nug.cloudfront.net\/image\/cache\/d\/d\/dd8149736548c0dc849d07c30f0c0eca.png","w":493,"h":253,"tn":"\/\/djungxnpq2nug.cloudfront.net\/image\/cache\/d\/d\/dd8149736548c0dc849d07c30f0c0eca"}
and then crashes.
ex_shader shows a dark red screen.
ex_shader_multitex flashes a window; then I have to restore its real window, and it shows a black screen.
ex_shader_target shows a black screen.
What are these supposed to do?
> What are these supposed to do?
Well, not that XD.
I meant to run them under OpenGL (you need to plop an allegro5.cfg file with [graphics] driver = opengl into the compiled examples directory).
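i.e. an allegro5.cfg next to the example binaries containing:

[graphics]
driver = opengl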
Ah, okay.
Using OpenGL:
ex_prim_shader ran properly and showed the light source following the mouse.
ex_shader, ex_shader_multitex, and ex_shader_target all crash.
ex_shader backtrace:
ogl_draw.c:366 is:
glGenVertexArrays(1, &o->vao);
ex_shader_multitex crashes in the same spot.
ex_shader_target crashes in the same spot, ogl_flush_vertex_cache.
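Looks like the OpenGL backend calls glGenVertexArrays unconditionally; if the driver doesn't expose GL_ARB_vertex_array_object, that entry point is NULL and the call crashes. A guard along these lines (my sketch, not a tested patch) would avoid it:

/* Only take the VAO path when the extension actually exists; otherwise
 * fall back to plain vertex attribute setup. */
if (al_have_opengl_extension("GL_ARB_vertex_array_object")) {
   glGenVertexArrays(1, &o->vao);
   glBindVertexArray(o->vao);
}
else {
   /* non-VAO path */
}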