
An interesting insight into the issue of draw calls on the PC

Apparently consoles run on some sort of magical software that is leagues ahead of what PCs have, which makes them inherently better at managing draw calls.

What a silly article.

It's the lack of software layers that's letting it happen. If we bypassed DirectX and Windows and just let games use the hardware directly, it would be a lot better for PC.
 
There is an elephant in the room.

If console manufacturers leave hardware manufacturers behind, e.g. if Xbox decided to use Intel and Nvidia components.

You are left with AMD and ATI, who will most likely want to grow their market in PC gaming... They will themselves make the effort to get rid of the "layers" that prevent high draw call counts on PCs, to match the consoles and sell more PC games, as the PC will still be more versatile.
 
Haha, Codemasters, wtf do they know? They couldn't make a good game if they had all the money in the world and the most talented programmers.

Seriously, what a joke.

I think we as gamers are getting to the point now where we need to ask ourselves this question: do we really need better graphics? I think we have hit the wall where, if it gets much better, performance suffers, and naturally the companies that rely on selling the "next gen" stuff don't like that idea. Honestly, what do we want as gamers: more bricks, or more FPS?
 
I have good faith that, in good time, Windows will get a sort of shell for gaming through which it can access hardware much more directly, eliminating the disadvantage PCs currently have.

There's no reason why they shouldn't; it is still the largest individual platform in game sales. Too bad they never release the actual numbers for digital distribution sales.
 
I've got a question: if the PC/console tech race is slowing, then why is BF3 for consoles so cut back and lacking, yet PC users get the full all-singing, all-dancing version?

The article already answered this question: PCs currently have more and faster RAM and higher clock speeds on each CPU core.

If the next gen of consoles closes this gap (which is very likely), then software architecture and the way hardware and software interact will leave PCs with little real advantage over consoles in terms of power to run games.
 
Haha, Codemasters, wtf do they know? They couldn't make a good game if they had all the money in the world and the most talented programmers.

Seriously, what a joke.

[...]

Just because you don't like their games doesn't mean they have no idea what's going on. As a matter of fact, it's totally irrelevant which programmer or artist of whatever company this comes from. Every programmer has to work around the limitations of their platform to get the most out of their project.
 
I have good faith that, in good time, Windows will get a sort of shell for gaming through which it can access hardware much more directly, eliminating the disadvantage PCs currently have.

There's no reason why they shouldn't; it is still the largest individual platform in game sales. Too bad they never release the actual numbers for digital distribution sales.


If you gain access to the hardware directly, wouldn't optimization become a huge task to bear, considering all the different architectures on the PC? I think it will be hard to get rid of drivers, DirectX and the like.

Hopefully AMD will significantly increase its budget for advancing their drivers beyond DX11, and try to talk to Microsoft about how to sort out DX11 on the Windows 7 platform. I am sure it would be of interest to them all if DX11 and the AMD graphics hardware worked optimally, and it would somewhat help out Xbox 3 performance in the long run too.

id Software is currently pulling all the strings it can to make this happen. Hopefully we will see some results.
 
So, if I understand this a bit correctly:

In order for a game on a PC to better handle draw calls overall and get more out of newer hardware, it would be better if it were designed for at least DX11, rather than designing (any) game around DX9? I am not commenting on the PC vs console aspect of the discussion, just on what would currently further PC gaming a bit.
 
Yes, dropping DX9 altogether would bring PC gaming forward; just look at BF3. I hope more and more developers will follow the example of DICE.

Every new graphics card sold in the past 5 years supports at least DirectX 10 (the GeForce 8800 GTX, the first DX10 card, was released in 2006), and according to statcounter.com Win 7 has overtaken Win XP usage, so now is the perfect time to drop that ancient API.
 
Reise, please read, mate. It's not magic.

Of course it isn't.

Take any PC with current hardware and run a console port on it. If you don't blast away at 60+ frames per second at beyond-maximum settings (mods, forced post-processing effects, etc.), there's either something wrong with your PC or the game's port is simply poorly done.

I'd bet you can't exactly do the same thing bringing a PC game to consoles now, can you? Draw calls don't really seem to be the issue here, do they?

And to assume drivers and DX/whatever architecture won't advance to that same level of outperformance when new consoles roll around is absurd. As always, next-gen consoles will be at least ON PAR with PCs once again for a short while after their release. But that equality won't last; it never has and never will, for the same reason any piece of hardware never stays top of the line for long.

All this aside, I don't really see how an old PC vs. console argument about draw calls relates to RO2's issues with them. But at least the "oh, but if we could only interface with the hardware directly!" comments are a little funny.
 
With current hardware, it's not feasible to "let the CPU access the GPU directly" for a few reasons:

1. There are approximately a zillion different GPU chips in widespread use, and the whole point of drivers is to avoid having to rewrite the renderers in games for each GPU family and all the variants. Effectively this means that game developers can never hit the metal directly on PC, because there are simply too many chips that would have to be supported. That work would have to be repeated for every single game (engine) developed, which would be a colossal waste of manpower. This is why we have drivers.

2. Letting applications (games) access hardware directly would open up a giant security hole that hackers and other deviants would try to exploit to take over a machine. This would not go down well with business users.

Some sort of virtualisation scheme might one day make it possible, but you won't see it any time soon.
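
To put point 1 another way, the driver boundary looks roughly like this (a toy sketch with invented names, not any real driver API): the game codes against one stable interface, and each vendor implements it once per GPU family, instead of every game re-implementing every chip.

[CODE]
// Toy illustration only -- all names here are invented, not a real API.

struct DrawCall { /* shaders, buffers, render state... */ };

// The stable boundary a driver provides to all applications.
class IGpuDevice {
public:
    virtual ~IGpuDevice() = default;
    virtual void submit(const DrawCall& dc) = 0; // translated to chip-specific commands
};

// Written once per GPU family by the vendor, reused by every game.
class VendorChipADevice : public IGpuDevice {
public:
    void submit(const DrawCall&) override { /* emit chip A command stream */ }
};

class VendorChipBDevice : public IGpuDevice {
public:
    void submit(const DrawCall&) override { /* emit chip B command stream */ }
};
[/CODE]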
 
Apparently consoles run on some sort of magical software that is leagues ahead of what PCs have, which makes them inherently better at managing draw calls.

What a silly article.

Fact: DX9 doesn't handle multithreaded draw calls.

That's probably the reason why this game is considered CPU intensive. But even this is way too much of a generalization. For a little more info and some bickering, read this:
http://www.gamedev.net/topic/601836-multithreaded-rendering/
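
For anyone curious what that multithreaded rendering actually looks like, here is a minimal sketch of D3D11 deferred contexts, the mechanism DX9 lacks. Error handling is omitted, and 'device' and 'immediateCtx' are assumed to come from normal D3D11 initialisation.

[CODE]
#include <d3d11.h>

// Worker thread: record draw calls into a command list without
// touching the immediate context.
void RecordOnWorkerThread(ID3D11Device* device, ID3D11CommandList** outList)
{
    ID3D11DeviceContext* deferredCtx = nullptr;
    device->CreateDeferredContext(0, &deferredCtx);

    // ... set state and issue draws here exactly as on the immediate
    // context, e.g. deferredCtx->DrawIndexed(indexCount, 0, 0); ...

    deferredCtx->FinishCommandList(FALSE, outList); // bake the recorded work
    deferredCtx->Release();
}

// Main thread: replay the pre-recorded work in one cheap call.
void SubmitOnMainThread(ID3D11DeviceContext* immediateCtx, ID3D11CommandList* list)
{
    immediateCtx->ExecuteCommandList(list, FALSE);
    list->Release();
}
[/CODE]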

With current hardware, it's not feasible to "let the CPU access the GPU directly" for a few reasons:

[...]

Great post. This is what I've been thinking as well.
 
50,000 is A LOT of draw calls. It's been known for a very long time that draw calls should be minimized as much as possible with batching. Most games will issue far, far fewer than that (around 10% of it at most).

Overall his post was interesting, but I'm skeptical of his long-term prediction that this alone will cause the death of PC gaming. Considering all the other issues the business of PC gaming faces, I think this one is very, very far down the list.
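
To make the batching point concrete, here's an illustrative sketch (invented types, not from any particular engine) of the classic sort-by-state approach: group objects that share a material so the expensive work is paid once per batch rather than once per object.

[CODE]
#include <algorithm>
#include <vector>

struct Renderable { int materialId; /* mesh, transform... */ };

void DrawScene(std::vector<Renderable>& scene)
{
    // Put objects that share expensive state next to each other.
    std::sort(scene.begin(), scene.end(),
              [](const Renderable& a, const Renderable& b) {
                  return a.materialId < b.materialId;
              });

    int boundMaterial = -1;
    for (const Renderable& r : scene) {
        if (r.materialId != boundMaterial) {
            boundMaterial = r.materialId;
            // BindMaterial(boundMaterial); // paid once per batch, not per object
        }
        // Draw(r); // in a real renderer, merge or instance these into
        //          // one draw call per batch instead of one per object
    }
}
[/CODE]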
 
This has simply happened because developers (no doubt due to financial pressure) have become reliant on DirectX and turned away from direct-to-metal coding (oh, the good ol' days of "Sound Blaster compatible" menus).

I believe most console development uses this method now to get the absolute maximum performance (upwards of 20,000 draw calls, compared to around 2,000-3,000 on a PC), so why don't PCs? All about the money. The variety of GPUs means a hell of a lot more development time and quality testing... so they sacrifice performance and innovation to save pennies.

Hopefully the newer APIs will get us back to this, opening up low-level programming options... and the first games that do this will look light-years ahead.

Quoting some post that stuck in my head: "The best graphics driver is NO graphics driver."
 
Okay, this is all well and good, but why can't you make the game use my damn CPU more?

I'm on a 4.8 GHz i7 2600K and can't get more than 40% CPU usage while the game is running like a slug and using 30% of the GPU.

If it was choking on draw calls, wouldn't the CPU be maxed? If the game is only able to max out one thread of my CPU, it's pretty useless.
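
As a rough back-of-the-envelope (assuming Windows reports usage averaged over all logical cores): a 2600K exposes 8 hardware threads, so one fully saturated thread reads as only 100% / 8 ≈ 12.5% total CPU. A 40% reading is therefore consistent with roughly three busy threads while a single submission thread sits pegged, which is exactly how a draw call bottleneck on one render thread tends to show up.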
 