Are PCs holding back the console experience? (Witcher3 spawn)

  • Thread starter Deleted member 11852
Draw calls definitely aren't a problem on PC in this game; it tops 70fps even on a 3GHz dual core:

Interesting. I'm not sure how to conclude from this graph that the CPUs are not draw-call bound. That's not to say I don't believe you; I know what you're saying, but the minimum and average frames per second are less than 10 fps apart. That means the maximum is likely about 20 frames above the minimum, in theory. Overall this is really tight: for every setup in that diagram the minimum frame rate is very close to the average. We'd need more data to know more, but without some sort of CPU and GPU utilization graph to go along with it, I'm not sure how to proceed.
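For what it's worth, here's a minimal sketch of the arithmetic I'm doing (my own illustration with made-up frame times, not data from that benchmark): take per-frame times, derive min and average fps, and, assuming the spread is roughly symmetric, the max should sit about as far above the average as the min sits below it.

Code:
// Minimal sketch: derive min/avg fps from per-frame times (ms) and estimate
// the max under a rough "symmetric spread" assumption. The frame times are
// made-up sample data, not real benchmark output.
#include <algorithm>
#include <iostream>
#include <vector>

int main() {
    std::vector<double> frame_ms = {13.9, 14.2, 13.5, 15.1, 14.0, 13.7, 14.8};

    double worst_ms = *std::max_element(frame_ms.begin(), frame_ms.end());
    double total_ms = 0.0;
    for (double ms : frame_ms) total_ms += ms;
    double avg_ms = total_ms / frame_ms.size();

    double min_fps = 1000.0 / worst_ms;  // slowest frame -> minimum fps
    double avg_fps = 1000.0 / avg_ms;

    // If min and avg are <10 fps apart and the distribution is roughly
    // symmetric, the max should sit about the same distance above the average.
    double implied_max_fps = avg_fps + (avg_fps - min_fps);

    std::cout << "min fps: " << min_fps << "\n"
              << "avg fps: " << avg_fps << "\n"
              << "implied max fps (symmetric-spread guess): "
              << implied_max_fps << "\n";
    return 0;
}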
 

I'd assume that if the game were draw-call limited (which in effect means CPU limited) then it would be performing much worse on slower CPUs. I see a game like Assassin's Creed Unity being draw-call limited (according to AMD), where almost no system, regardless of the CPU, can even break 60fps. For example, even at the lowest settings and 720p I can only hit around 50fps average on my 2500K (stock).
 
Right, but the game should be optimized to not be draw-call bound before release, right? So I'm expecting a lot of batched draw calls as you approach the final product if you're over your draw-call budget.
But that may not have been the case with the initial reveal, whose lighting and detail is much more in line with AC:U, no?
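To be concrete about what I mean by batching: instancing is one common way to batch, where identical objects get collapsed into a single draw call so the CPU-side call count stops scaling with object count. This is just a generic sketch of the idea, nothing to do with CDPR's actual engine code; it assumes an OpenGL 3.3+ context via GLEW with a VAO and shader already bound, and the uploadInstanceTransforms helper is hypothetical.

Code:
// Sketch only: per-object draw calls vs. a single instanced call.
#include <GL/glew.h>
#include <vector>

struct Instance { float transform[16]; };  // per-object model matrix

// Naive path: one draw call per object, so CPU cost grows with object count.
void drawNaive(GLsizei vertexCount, const std::vector<Instance>& objects) {
    for (const Instance& obj : objects) {
        (void)obj;  // per-object uniform updates (transform etc.) would go here
        glDrawArrays(GL_TRIANGLES, 0, vertexCount);  // one call per object
    }
}

// Batched path: one instanced draw call covers every identical object.
// Per-instance transforms live in a buffer the vertex shader reads.
void drawInstanced(GLsizei vertexCount, const std::vector<Instance>& objects) {
    // uploadInstanceTransforms(objects);  // hypothetical helper, not shown
    glDrawArraysInstanced(GL_TRIANGLES, 0, vertexCount,
                          static_cast<GLsizei>(objects.size()));
}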
 
I see a game like Assassin's Creed Unity being draw-call limited (according to AMD), where almost no system, regardless of the CPU, can even break 60fps. For example, even at the lowest settings and 720p I can only hit around 50fps average on my 2500K (stock).
lol the things I've heard... :X

But that may not have been the case with the initial reveal

Yeah the LOD seemed significantly higher in a number of scenes.

The GeForce guide does show a marked drop in framerate with the tweaked foliage/grass distance (still fugly), and that would be further compounded by increasing the shadow quality/distance/resolution/cascades (less fugly) -> more geometry setup/fill rate/memory & bandwidth, etc.

Would be curious to test the tweaked ini LOD under varying CPUs.
 
I move to have the thread renamed "Are PCs holding back the console experience!" This interview confirms what I've always believed: that they made engineering choices so they could deliver the game. Lots of engineering solutions work on a smaller scale (like the world built for the VGX trailer) but not on a larger scale (the world in the finished game).

For a question that was about PC only? That somehow implies that consoles could have done it better? It's also entirely possible that consoles would have dealt with it significantly worse.

Regards,
SB
 
Console gamers definitely benefit from the presence of core PC gaming, which has created a marketplace driving advancement and innovation in GPU development for the past 20+ years.

The last non-AMD/Nvidia GPU in a console was the PS2's. Sony, Microsoft, and Nintendo going with AMD/Nvidia definitely brings down the cost and raises the performance metrics for consoles. There is absolutely no way Nvidia/AMD or any competitor's (e.g. PowerVR) console GPUs would be as advanced at as low a price point as they are if those companies only made workstation cards and there were no core PC gaming market to pump years and years of R&D into. GDDR5 prices would be higher due to lost economies of scale. You might not be able to afford even 4GB of GDDR5 RAM at that all-important $399 mass-market price.

Console gamers have PC gaming to thank for Microsoft's entrance into console gaming. No DirectX, no DirectX Box, no gaming division, and little expertise within their ranks to go up against the Japanese consoles.

PC gaming was the birthplace of the fps and refined the genre throughout the '90s.
PC gaming was the birthplace of BioWare and Bethesda.
No Black Isle Fallout and no Bethesda Fallout, and probably no CD Projekt Red and The Witcher.
 
Console gamers definitely benefit from the presence of core PC gaming, which has created a marketplace driving advancement and innovation in GPU development for the past 20+ years.

Every gaming technology ecosystem benefits from developments in the others. None of the ecosystems are isolated.

PC gaming was the birthplace of the fps and refined the genre throughout the '90s.
Every Atari 2600 game ran at 60fps.
PC gaming was the birthplace of BioWare and Bethesda.
Consoles were the birth of modern video games. Period.

These comments are nonsense.
 
Actually, many game types, including the fps, started with researchers messing around on expensive mainframes in the '60s and '70s. There was a 16-player MIDI Maze game on the Atari ST (basically a PC), a 3D maze shooter that was my first multiplayer experience of that kind; if you look at the wiki, it shows it was based on some mainframe experiment, IIRC. Anyway, that was definitely an fps ;)
 
Consoles were the birth of modern video games. Period.

Lies!
Oscilloscopes and fixed-function discrete components were the real birth of modern video games. Period.

 
I think DSoup used 'modern' as a distinct qualifier. Oscilloscopes were the birth of video games, not modern ones. Not sure what his distinction is though. If he meant the Atari 2600 etc, you'd be right - video games existed before the 70s consoles.
 
Every gaming technology ecosystem benefits from developments in the others. None of the ecosystems are isolated.
Exactly, and I never said otherwise, so you aren't disproving my statements. I have posts in other threads where I go into the benefits the console gaming market has had on PC gaming. There's no use in pretending I said something I didn't.

Every Atari 2600 game ran at 60fps.
fps as in first-person shooter; not sure how you confused the term with frames per second. It's quite easy to see that 'frames per second' makes no sense in the context of the sentences you quoted.
Consoles were the birth of modern video games. Period.
Absolutely. There is a symbiotic relationship in which benefits flow both ways.
 
So DSoup's argument is that consoles invented the frames per second?
I'd say the "frames per second" concept was invented for the first motion pictures in the 19th century.

Modern video games. You play a lot of video games on oscilloscopes? :nope:
How do you define modern?
If we assume that videogames are a form of art, then 50 years would be considered modern for any historian.
If we define modern videogames as interactive entertainment played on a cathode-ray tube screen, then I'm pretty sure the first computers did it before consoles.


The only thing I can attribute to consoles is spreading the audience and largely increasing the investment for gaming hardware, but that's a rather indirect contribution.
 
How do you define modern?
I like the dictionary definition, there's no need to go around redefining words :nope:

Oscilloscopes are not playing modern games like first- and third-person shooters, or RPGs.
 
I like the dictionary definition, there's no need to go around redefining words :nope:
I see no dates or numbers in the dictionary definition:
http://www.merriam-webster.com/dictionary/modern


Oscilloscopes are not playing modern games like first- and third-person shooters, or RPGs.
And neither are the black-and-white TVs that the first consoles connected to.
BTW, do the current-gen consoles even have SCART or S-Video connectors?


What you claim is clearly defined (modern videogames) isn't defined at all. Videogames are very recent as a form of art, so any videogame could be considered modern.
One person could consider anything not made in 3D as not modern; another would say anything below DX9-level shader effects is ancient.
When I went to New York's Museum of Modern Art (MoMA), they had the 21-year-old SimCity 2000 as an example of a videogame. Is that modern enough?
 
Isn't it a relative descriptor since the scope is the timeline of video games, not the art medium as a whole?
 
I agree. 'Modern video games' doesn't define a clear category to me, and I'd have no idea where to draw the line. I also couldn't find a line that places games consoles as leading the way. PCs had FPSes long before consoles. Consoles were shoot-'em-ups, beat-'em-ups, and platformers while PC was first-person gaming (shooters, flight sims). Not saying DSoup's wrong, but he'd have to explain his definition and compare consoles and PCs at that juncture to show consoles were/are the birthplace of modern gaming.

Unless he means consoles are where the current ('modern' as in 'right now') games are born?

:???:
 