|
OpenGL vertex arrays for dummies |
gnolam
Member #2,030
March 2002
|
... or rather, one dummy (me). I just resurrected my OpenGL project, and I thought I'd do it right this time. From what I gather of previous topics on this board, vertex arrays are the way to go when rendering stuff in OpenGL. Unfortunately, I have no idea how to use them. So my questions are: -- |
X-G
Member #856
December 2000
|
Good luck - I've asked this before, and all anyone (*cough*Korval*cough*) ever tells me is to read the manual. -- |
gillius
Member #119
April 2000
|
I could tell you how to do it in D3D :\. Seriously though, did you try display lists? I think in many OpenGL drivers, using display lists gives you the same benefits as vertex arrays for static data. If you want to use vertex arrays with dynamic data for the purposes of AGP transfer and parallel processing between CPU and GPU, I can't help you there, but if you find out, let me know. I've heard it's "very easy" from a few OpenGL-using friends. If I catch one of them online I'll ask what functions to use, to point you in the right direction. I will say this... I'm pretty sure that real (meaning AGP transfer or stored in VRAM) vertex arrays are extensions specific to NVIDIA and ATI -- each video card is different. There is a brand-new extension called VBO, so new that support for it is only in the very latest drivers, and it's supposed to be OpenGL's effort to unify them. In D3D, this has the benefit that the arrays work the same on each card... I just wish it wasn't so low-level, but D3DX is awesome. Gillius |
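For the static-data case gillius describes, the display-list approach is only a few calls: record the geometry once at load time, then replay it each frame. A minimal sketch, assuming a GL context is already current (build_terrain_list and draw_terrain are made-up names):

```c
#include <GL/gl.h>

static GLuint terrain_list;  /* handle returned by glGenLists() */

/* Build once, at load time: record the draw calls into a display list.
 * The driver is then free to keep the compiled data on the card. */
void build_terrain_list(void)
{
    terrain_list = glGenLists(1);
    glNewList(terrain_list, GL_COMPILE);
    /* ... ordinary glBegin()/glVertex*()/glEnd() calls go here ... */
    glEndList();
}

/* Replay every frame. */
void draw_terrain(void)
{
    glCallList(terrain_list);
}
```

The catch is the one gillius implies: a display list is immutable once compiled, so this only pays off for static data.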
23yrold3yrold
Member #1,134
March 2001
|
I'll write a tutorial sometime; my OpenGL book covers this pretty well. Failing that ... try NeHe? At least ask on his forums; if there's a tutorial, they'll know ... -- |
X-G
Member #856
December 2000
|
Quote: I've heard it's "very easy" No doubt it is, but that doesn't help much when no one wants to tell you how. -- |
Goodbytes
Member #448
June 2000
|
I figured it out myself by reading a few online sources, which I will now share with you. First, the write-up on the ARB_vertex_buffer_object GL extension. VBOs are sort of like a common interface to the blindingly fast versions of vertex arrays that are stored in resident memory on the video card. Not only that, but to use them, all you have to do is take a program that uses vertex arrays and add a few lines of VBO code without changing anything else. The spec includes examples of how to use normal vertex arrays as a comparison, alongside an example of how to use the VBO extension. The page is at the OpenGL Extension Registry at http://oss.sgi.com/projects/ogl-sample/registry/, more specifically, under ARB Extensions By Number #28. Scroll down about 7/8 of the way to reach the code. Also, I found more information on the functions that matter at the OpenGL Reference at http://tc1.chemie.uni-bielefeld.de/doc/OpenGL/hp/Reference.html. Look under gl{Enable|Disable}ClientState, gl{Vertex|Color|TexCoord|EdgeFlag|Normal}Pointer, glDrawArrays, glDrawElements, and glArrayElement. Finally, back at the extension registry, for more useful ways to draw your vertex arrays or VBOs (which have the same drawing interface), look at non-ARB extension number 148, GL_EXT_multi_draw_arrays. I know that this isn't what you were looking for, so here are the important lines from a project of mine that uses vertex arrays:

glEnableClientState(GL_VERTEX_ARRAY);          // Enable the vertex array.
glVertexPointer(3, GL_FLOAT, 0, vertex_data);  // Point GL at the vertex data.
glDrawArrays(GL_QUADS, 0, number_of_vertices); // Draw the vertex array.
glDisableClientState(GL_VERTEX_ARRAY);         // Disable the vertex array.

That's really what it boils down to. Of course, if you want to use the color array or the normal array or the texcoord array (etc...) you'll need to add the appropriate duplicates of lines 1, 2, and 4.
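The "few lines of VBO code" added on top of a vertex-array program look roughly like this. This is a sketch, not taken from the thread: it assumes the ARB_vertex_buffer_object entry points are available (on Windows of that era they must be fetched with wglGetProcAddress() before use), and vertex_data/number_of_vertices are as in the snippet above:

```c
#include <GL/gl.h>
#include <GL/glext.h>  /* tokens and typedefs for ARB_vertex_buffer_object */

/* Once, at load time: create a buffer object and upload the vertices
 * into it, hinting that the data will not change (GL_STATIC_DRAW_ARB). */
void upload_vertices(const GLfloat *vertex_data, int number_of_vertices,
                     GLuint *vbo)
{
    glGenBuffersARB(1, vbo);
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, *vbo);
    glBufferDataARB(GL_ARRAY_BUFFER_ARB,
                    number_of_vertices * 3 * sizeof(GLfloat),
                    vertex_data, GL_STATIC_DRAW_ARB);
}

/* Each frame: the same vertex-array calls as before, but while a buffer
 * is bound the pointer argument of glVertexPointer is an offset into
 * that buffer, not a client memory address. */
void draw_vertices(GLuint vbo, int number_of_vertices)
{
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, vbo);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, (const GLvoid *)0);
    glDrawArrays(GL_QUADS, 0, number_of_vertices);
    glDisableClientState(GL_VERTEX_ARRAY);
}
```

That is the whole point of the extension: the draw path is unchanged, only where the data lives changes.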
Also, instead of glDrawArrays you may want to use something from GL_EXT_multi_draw_arrays to allow for added efficiency through stripification, whereby you convert your face data from triangle soup to a series of triangle strips. (Or quad soup to quad strips, triangle soup to triangle fans, etc.) I hope that this helps. |
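The stripification idea is independent of GL itself: for a regular grid of quads you can replace the soup with one triangle strip per row. A sketch of the index generation only (the row-major vertex layout and function name are assumptions for illustration):

```c
#include <stddef.h>

/* For a (w+1) x (h+1) grid of vertices stored row-major, fill `indices`
 * with one triangle strip covering row `row` of quads: the strip
 * zig-zags between two adjacent vertex rows.
 * Returns the number of indices written: 2*(w+1), which describes
 * w quads (2*w triangles) instead of 4*w quad-soup indices. */
size_t grid_row_strip(unsigned *indices, unsigned w, unsigned row)
{
    size_t n = 0;
    for (unsigned col = 0; col <= w; ++col) {
        indices[n++] = row * (w + 1) + col;        /* vertex on this row */
        indices[n++] = (row + 1) * (w + 1) + col;  /* vertex on next row */
    }
    return n;
}
```

Each strip can then be handed to glDrawElements (or all rows at once to glMultiDrawElementsEXT) with GL_TRIANGLE_STRIP.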
Tobias Dammers
Member #2,604
August 2002
|
Quote: I think in many OpenGL drivers using display lists gives you the same benefits as vertex arrays for static data.
Almost true. --- |
gnolam
Member #2,030
March 2002
|
Thanks! My program still runs slow as hell, but at least now it's using vertex arrays. -- |
Tobias Dammers
Member #2,604
August 2002
|
Quote: Thanks! My program still runs slow as hell,
Hm, maybe the problem is somewhere else? Texturing is a very common source of performance problems if not done wisely. In my current project, I got something like a 400% speed boost by changing the texture format from 32 bpp (GL_RGBA8) to 16/8 bpp (GL_RGBA4 or GL_ALPHA8 or whatever it is). --- |
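The reason a format change can matter that much is plain arithmetic: halving the bits per texel halves the memory and bandwidth a texture eats. A sketch of the calculation (ignores mipmaps and any driver padding):

```c
#include <stddef.h>

/* Bytes of texture memory for a w x h texture at a given bits-per-texel,
 * e.g. 32 for GL_RGBA8, 16 for GL_RGBA4, 8 for GL_ALPHA8. */
size_t texture_bytes(size_t w, size_t h, size_t bits_per_texel)
{
    return w * h * bits_per_texel / 8;
}
```

For a 256x256 texture that works out to 256 KB at GL_RGBA8, 128 KB at GL_RGBA4, and 64 KB at GL_ALPHA8.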
gnolam
Member #2,030
March 2002
|
Vertex arrays were something I thought I'd learn for future use, not just for this program. But you were right about the textures; there's something really weird going on - if I don't bind the 256x256 terrain texture before drawing, my FPS jumps from 2 to 200! [EDIT] -- |
Korval
Member #1,538
September 2001
|
Quote: Whoa, even AllegroGL's 'tex' example program runs slow! And that only has 4 textured faces! It sounds like you're not getting hardware acceleration. If untextured polys are ridiculously faster than textured ones, then you aren't getting acceleration. |
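One quick way to confirm whether you've fallen back to the software path: once a context is current, ask GL who the renderer is. Microsoft's generic software implementation identifies itself as "GDI Generic". A sketch (call it right after setting the GL mode):

```c
#include <GL/gl.h>
#include <stdio.h>
#include <string.h>

/* Print the vendor/renderer strings; the Windows software fallback
 * reports GL_RENDERER as "GDI Generic". Requires a current context. */
void report_renderer(void)
{
    const char *vendor   = (const char *)glGetString(GL_VENDOR);
    const char *renderer = (const char *)glGetString(GL_RENDERER);
    printf("GL_VENDOR:   %s\nGL_RENDERER: %s\n", vendor, renderer);
    if (renderer && strstr(renderer, "GDI Generic"))
        printf("Software rendering -- no hardware acceleration!\n");
}
```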
Zaphos
Member #1,468
August 2001
|
Gnolam: Perhaps you should state your 3D card so we can figure out if it's one of the notorious ones which thinks that OpenGL is some irrelevant API that need not be supported (ATI Rage, anyone?).
|
Thomas Fjellstrom
Member #476
June 2000
|
I'm dumb!.. Good luck with ANY decent 3D on a Rage. Though it wasn't TOO bad, I still couldn't play any of the games from the last several years on my Rage 128 Pro, while a TNT2 was a bit better, and a GeForce 1 was light years ahead. -- |
gnolam
Member #2,030
March 2002
|
Nope, GeForce2 MX under Win98SE, so OpenGL support shouldn't be a problem. Normal OpenGL games (e.g. IL-2: FB, The Specialists (Half-Life), RTCW:ET) work fine, so it seems like only AllegroGL programs are slow. I think I'll recompile AllegroGL and see what happens... I mean, it did work a few months ago, so I've probably just done something stupid with it. [EDIT] -- |
Kitty Cat
Member #2,815
October 2002
|
I seem to have a problem with AllegroGL under WinXP Home with a GeForce 4 MX, and under Win2k with a Radeon 7000. Whenever I ran an AGL game fullscreen, I'd get absolutely no hardware acceleration, and the entire screen would be tinted green, although running it windowed worked fine. I didn't have a problem with the Radeon 7000 under Win98SE, oddly. Dunno if it's related or not, though. -- |
gnolam
Member #2,030
March 2002
|
Reinstalled MinGW and then recompiled Allegro and AllegroGL. Didn't work. Damn! Guess I'll have to go Win32. -- |
Bob
Free Market Evangelist
September 2000
|
What drivers are you using? If you play with the app's color depth, does it help? -- |
gnolam
Member #2,030
March 2002
|
Yay! It turned out to be a driver bug! Still strange how it only affected AllegroGL and nothing else, though. -- |
|