Allegro.cc - Online Community


This thread is locked; no one can reply to it.
[DirectX 10]Is anyone following it?
Archon
Member #4,195
January 2004
avatar

Is anyone interested in DirectX 10? I came across the beta for it while looking for the DirectX 9 SDK.

It is (or will be) Vista-only, but I've read that it's been redesigned from the ground up (as opposed to being simply an 'improved DirectX').

Secondly, I've read in the DirectX SDK that you cannot use DirectX in conjunction with a non-Windows platform - is that statement void (due to anti-trust laws or something)?

Thomas Fjellstrom
Member #476
June 2000
avatar

I'll bet Shawn has known about it for a while ;) I'll just bet that DX10 is just XNA's new framework.

--
Thomas Fjellstrom - [website] - [email] - [Allegro Wiki] - [Allegro TODO]
"If you can't think of a better solution, don't try to make a better solution." -- weapon_S
"The less evidence we have for what we believe is certain, the more violently we defend beliefs against those who don't agree" -- https://twitter.com/neiltyson/status/592870205409353730

Archon
Member #4,195
January 2004
avatar

Quote:

I'll just bet that DX10 is just XNA's new framework.

Possible - I don't know too much about XNA (CP just mentioned it to me).

Are DirectX 9 and below going to be scrapped in future Windows versions?

Goalie Ca
Member #2,579
July 2002
avatar

The whole reason to use a library is to make things simpler and easier. OpenGL does that to some extent, SDL does it, Allegro does it, and VTK does it. DirectX might as well be obfuscated assembly - not the easiest thing to learn or use quickly.

There are a few critical features it offers, though - but honestly, I haven't a clue whether there are any useful changes, even in 3D land. Maybe someone could enlighten me as to why I'd want to go out and learn it - especially version 10 versus 9 and 8!?

If I were to draw the analogy, I would say that as many people care about 10.0 as care about Vista, and probably for the same reasons.

-------------
Bah weep granah weep nini bong!

Archon
Member #4,195
January 2004
avatar

The reason I'm interested in DirectX 9 right now is for part of a university project. I want to make a wrapper (like I did for DirectX 7 and 8 in VB6).

And DirectX 7 and 8 helped me understand the concepts of Allegro and OpenGL respectively.

Bob
Free Market Evangelist
September 2000
avatar

Quote:

Are DirectX 9 and below going to be scrapped in future Windows versions?

No, they'll be converted to DX10 in real-time, by the runtime provided by Microsoft.

--
- Bob
[ -- All my signature links are 404 -- ]

Thomas Fjellstrom
Member #476
June 2000
avatar

Yay, more layers, more indirection, and more emulation! Just what the Wintel empire needs to make more people upgrade more times!

--
Thomas Fjellstrom - [website] - [email] - [Allegro Wiki] - [Allegro TODO]
"If you can't think of a better solution, don't try to make a better solution." -- weapon_S
"The less evidence we have for what we believe is certain, the more violently we defend beliefs against those who don't agree" -- https://twitter.com/neiltyson/status/592870205409353730

Bob
Free Market Evangelist
September 2000
avatar

Oddly enough, DX9 and under are likely to run faster layered on top of DX10 than they do natively.
See Wloka's law for why DX9 (and under) is much slower than you think.

DX10 fixes a large portion of this (not quite reaching OpenGL yet, though), enough to make most DX9 applications run faster when emulated on top of DX10.

--
- Bob
[ -- All my signature links are 404 -- ]

Archon
Member #4,195
January 2004
avatar

Quote:

not quite reaching OpenGL yet, though

I'd thought that Direct3D would be more feature-rich than OpenGL...

Bob
Free Market Evangelist
September 2000
avatar

Quote:

I'd thought that Direct3D would be more feature-rich than OpenGL...

I was talking about performance, not features. See the posts above.

--
- Bob
[ -- All my signature links are 404 -- ]

Shawn Hargreaves
The Progenitor
April 2000
avatar

DX10 is going to coexist with DX9, so games can pick either API.

Why?

Because DX10 has a really aggressive set of requirements. Not only does it require Vista, it also needs a DX10 capable GPU with shader 4.0, virtual memory management, geometry shaders, etc. DX10 doesn't have any GPU caps at all: for a card to support any of DX10, it has to support all of it.

Hence, DX9 also has to stay around in case people want to make games that run on older hardware.

Realistically, I suspect most games will stick with DX9 for the next few years. But eventually everyone will have DX10 cards, and the great thing is that, finally, games will have a totally fixed feature set they can assume is always available on every machine. This should make life a lot easier for game devs 5 years from now.

In terms of significant new features, the biggest thing in DX10 is actually behind the scenes: the driver model has been totally redesigned to give some big performance improvements. They've moved to a model where more things have to be bound up front, allowing drivers to do aggressive optimisation at resource creation time and hopefully not have to do any fixup work at all during the actual rendering. These changes affect many parts of the API, for instance renderstates are now only set via a handful of state objects (no more SetRenderState), but the overall goal is robustness and performance rather than actual new features.

Also, DX10 introduces virtual memory on the GPU. This isn't so immediately interesting for games, other than it means alt+tab will finally work automatically, but it has big implications for multiple windowed applications cooperating to share the GPU (kind of important when the OS itself wants to use GPU effects for menus and so on).

Sirocco
Member #88
April 2000
avatar

Thanks Bob, Shawn. That clears up a lot of nagging concerns I was having moving into the Vista era.

-->
Graphic file formats used to fascinate me, but now I find them rather satanic.

Matt Smith
Member #783
November 2000

"This should make life a lot easier for game devs 5 years from now." this carries the strong assumption that nothing amazing will appear on new cards. 5 years is an awful long time in computers.

Shawn Hargreaves
The Progenitor
April 2000
avatar

Of course hardware will go on improving and new capabilities will arrive: nothing is ever going to stop that.

The benefit of DX10 will be a constant lowest common denominator sitting underneath the newer features.

Once we someday pass the barrier of saying "my game requires a DX10 card", that massively reduces the compatibility testing burden at the low end. As long as your game contains a code path that runs on a stock DX10 card, you can be sure that will always work on every machine in the world.

Then you can go off and use more advanced higher level features where they are supported, but where they are not, you always have a known feature set to fall back on.

That's a big big timesaving compared to the world today, where there are over 200 independent capability bits, plus a huge number of card oddities that can't even be expressed in the caps!

Plus, the DX10 feature set is "good enough" for a ridiculously wide range of rendering techniques. There is a very strong case to be made that now we have a fully general programmable GPU, most of the future advances will be on performance (longer shaders that run faster with bigger textures and better AA) rather than actual new capabilities. It has become more like CPU improvements: they get faster all the time, but you still program them with the same basic set of instructions.

Also, this lowest common denominator has huge implications for things other than high end 3D games. Today, how many UI apps use D3D to do alpha blending? Almost none, because that isn't consistently available and they didn't want to take on the compatibility testing burden. Imagine a universe where every silly little web card game can confidently rely on a decent set of GPU capabilities...

Jakub Wasilewski
Member #3,653
June 2003
avatar

Sure, it'd be cool and all, but GPU developers tend to quickly diverge from the lowest common denominator.

Look at OpenGL. Successive versions of the API support more and more functionality that could become the LCD you mention, but instead we have the whole extension system, which is basically what you claim DX10 will avoid.

Once 90% of the cards in people's computers support DX10, the GPU industry will be far ahead of that. And no self-respecting commercial game studio will release a game that doesn't use all the capabilities of your card, regardless of the additional testing costs. A game that is not as beautiful as its competitors just doesn't sell as well - that is the common viewpoint in the gaming industry.

So, I don't think DX10 will solve those problems. Instead we will get multiple extensions (vendor specific, standard or even Microsoft-designed) built on top of DX10, which will be implemented or not on the GPUs, and sometimes even misimplemented.

I'm not saying that DX10 is bad. I'm just not very eager to believe that it will become the remedy for GPU compatibility problems.

[edit]

Shawn said:

It has become more like CPU improvements: they get faster all the time, but you still program them with the same basic set of instructions.

Sure, the basic instructions haven't changed, but almost every generation introduced its own very useful extensions - conditional moves, MMX, SSE, 3DNow!. Of course, not every application benefits greatly from those, but some do, and they aren't compiled for the "lowest common denominator" 386 instruction set.

---------------------------
[ ChristmasHack! | My games ] :::: One CSS to style them all, One Javascript to script them, / One HTML to bring them all and in the browser bind them / In the Land of Fantasy where Standards mean something.

Shawn Hargreaves
The Progenitor
April 2000
avatar

Sure, at the high end extensions will be important and people will use them.

But the high end isn't where the compatibility pain happens. It is the older hardware, the cards where your game doesn't look any good at all, but still has to run in some shape or form, that cause the huge testing burden. A modern 3D game spends probably 90% of its testing budget supporting cards from 3 or more years ago. Those represent a tiny percentage of the eventual sales, but a ridiculous amount of the dev cost. And you can't just not support them, because people are stupid and will still buy your game even if the box says it doesn't support your card, then they phone up tech support when it doesn't work, and every tech support call costs the publisher the profit from 5 to 10 sales! So you have to waste loads of time making it work on these crappy old cards.

That's where DX10 helps. The high end is still for you to solve, but that's ok: this is where game engine programmers can differentiate and try to sell their product, so it's a good place to spend their time. The point is that DX10 (will someday) give a constant platform to remove all the irritating crap around the low end support.

You should also consider the huge number of apps which aren't games and have no interest in chasing the high end. Today they use things like GDI, Flash, or DirectDraw, because those are constant and universally available. In the future they will be able to use DX10. I think that's pretty significant: it shifts the universal lowest common denominator of graphics rendering, which is currently just a 2D framebuffer containing pixels, into a world of universal hardware rendering, high quality filtering, blending, and sophisticated programmable shaders.

relpatseht
Member #5,034
September 2004
avatar

From what I've read in this thread, it seems that DirectX 10 will just be raising the current lowest common denominator to a much higher point, and though I know graphics can only get so good, I still wouldn't say we have come anywhere near graphical perfection.

This means that in a few years, DirectX 10 will be what OpenGL 1.1 is now (I don't know what the DirectX equivalent would be): something that all cards fully support, but that is never used without additional extensions in any modern game. Of course, by then Microsoft will probably be on about DirectX 15, and as long as they continue what they started with DirectX 10 - continually raising the lowest common denominator - this wouldn't be much of a problem, unless you consider it troublesome to buy a new, top-of-the-line video card with every release of DirectX.

Furthermore, stupid people will still buy games whose box says they require DirectX 10 when their card doesn't support it, and they will still contact technical support. The only difference is that when the big flashing box appears saying their computer does not support DirectX 10, they will ask tech support why their card doesn't support DirectX 10, what DirectX 10 is, and what they have to do to get it, rather than asking why the game doesn't work. Your average person is completely ignorant of computers, so this will undoubtedly happen.

[edit]

Don't get me wrong, it definitely is a good thing that soon all modern games will require DirectX 10, and thus require everyone to have what would currently be called a decent video card, I just don't think that requiring full support or none at all for DirectX 10 will be a solution for anything for more than a few years.

Shawn Hargreaves
The Progenitor
April 2000
avatar

I half agree with you, but also half don't.

There has been a huge sea change in GPU hardware over the last year. Finally, programmability has come far enough that there aren't really any nasty fixed function warts left: it's all just shader microcode. That means future improvements aren't going to be new features, they're just going to be faster ways of running the existing feature set (which allows people to write longer shaders and use more advanced techniques, but ultimately it is all just HLSL shader code).

Think Turing complete: once you have that, it's all just implementation details to make things run faster.

(I'm simplifying grossly here, because there are still way too many fixed function warts like limited numbers of interpolators, or the framebuffer blend silicon. But the theory is broadly true. And you can already see this in action. Most modern games that use ps 3.0 aren't actually caring about any of the new ps 3.0 features: all they want is ps 2.0 capabilities with longer shaders and more fillrate...)

Marcello
Member #1,860
January 2002
avatar

Quote:

But the high end isn't where the compatibility pain happens. It is the older hardware, the cards where your game doesn't look any good at all, but still has to run in some shape or form, that cause the huge testing burden. A modern 3D game spends probably 90% of its testing budget supporting cards from 3 or more years ago. Those represent a tiny percentage of the eventual sales, but a ridiculous amount of the dev cost. And you can't just not support them, because people are stupid and will still buy your game even if the box says it doesn't support your card, then they phone up tech support when it doesn't work, and every tech support call costs the publisher the profit from 5 to 10 sales! So you have to waste loads of time making it work on these crappy old cards.

If I understand what you're saying correctly, the exact same thing will still happen. The box will say "Needs a DX10 card" which is no different from saying "doesn't work with your card" or "requires opengl 2.0" or whatever. People are still going to try to play the game on a card that isn't DX10 compatible and they are still going to call up tech support when it doesn't run.

I'm failing to see how DX10 will change that.

Marcello

relpatseht
Member #5,034
September 2004
avatar

Yes, there have been massive changes in GPU hardware over the last few years, and people are currently mostly concerned with bigger, better shaders, but I am still quite hesitant to say there won't be even bigger changes in the years to come. If there is one thing I have been taught the hard way too many times, it's that as soon as I say something will not or cannot be done, someone does it anyway and then punches me in the face with it.

Shawn Hargreaves
The Progenitor
April 2000
avatar

One day everyone will have DX10.

The trouble is today there is no such simple thing you can say. Publishers try things like "Requires DX8", which is kind of reasonable: the number of people who don't have a DX8 card today is low enough to be commercially acceptable.

But DX8 isn't a clear enough requirement. You end up with things like "Requires DX8 with pixel shader 1.1, or failing that, at least 3 multitexture blends, and must have hardware skinning for at least 2 bones, and must support D3DPTEXTURECAPS_NONPOW2CONDITIONAL, and UBYTE vertex formats".

That is a crappy consumer experience, and causes loads of pain for the poor game devs in the trenches.

relpatseht
Member #5,034
September 2004
avatar

Yeah, I agree that the lowest common denominator needs to be increased, but DirectX 10 is only a temporary solution. Eventually, DirectX 10 will be what OpenGL 1.1 is now (once again, I don't know the DirectX equivalent), everyone will support it, but all games will require extensions, and say things on the box such as, "Requires DirectX 10 with SUPER-MEGA-HYPER-ULTRA-REAL-GFX-EXTENSION version 2.5."

Shawn Hargreaves
The Progenitor
April 2000
avatar

Sure, but then 5 or 10 years from now there can be a DX12 or DX15 or whatever that provides a new common base.

The point is a philosophical change: rather than just exposing whatever the hardware caps are and having bits to report this, for the first time DX is taking a strong stance on standardising a fixed set of functionality that all hardware can be expected to support.

I'm personally very excited about that.

The last time we had this level of universal standardisation was when GDI mandated that all screens had to consist of pixels, and it was no longer ok to use a text mode display!

Marcello
Member #1,860
January 2002
avatar

Although it's not all that standard if you're limiting yourself to just Windows Vista. I'd much rather have a cross-platform library. Otherwise I'm better off developing for a console, where none of this is an issue anyway.

What I want to know is why they didn't call it DirectXX.

Marcello

relpatseht
Member #5,034
September 2004
avatar

Quote:

Sure, but then 5 or 10 years from now there can be a DX12 or DX15 or whatever that provides a new common base.

The problem with every release of DirectX providing a new common base is that no one wants to go out and buy a new top-of-the-line, $500 video card every year with the new release of DirectX.

Quote:

The last time we had this level of universal standardisation was when GDI mandated that all screens had to consist of pixels, and it was no longer ok to use a text mode display!

I bet there were still people who had a text mode display for some years afterward. DirectX 10 isn't providing any new standardization; it is simply forcing the current standard (whatever the DirectX equivalent of OpenGL 2.0 is) upon consumers.

Once again, this really isn't a bad thing - the lowest common denominator definitely needs to be raised. I just don't think that keeping up the practice of providing a new common base with every release of DirectX is fair to the consumer, and not providing one with every release means that none of the current problems will be solved for very long.
