The Next-gen Situation discussion *spawn

And if, at least according to that presentation mentioned above, not even a GTX 680 appears to be able to manage full 1080p at 30 fps in a demo of a "next-gen" engine, then how "next-gen" would an HD 7770 really be?
PC Gamer reported that the Star Wars 1313 demo [modified UE3] was running at 1080p/30 fps with three GTX 680 cards in SLI mode. ;)

The following article appears to state something else:

pocket-lint.com/news/46051/star-wars-1313-pc-specs said:
http://www.pocket-lint.com/news/46051/star-wars-1313-pc-specs

Star Wars 1313: The PC behind the E3 demo

[...]

Inside the PC that ran the game at E3 was an Nvidia GeForce GTX 680. This is a brand new card from Nvidia and is the quickest single-core GPU the company has built. Featuring 2GB of video memory and a 1GHz base clock, it is second only in speed to the GTX690 which features a pair of GPUs.

[...]

Next up was the demo computer's RAM, all 16GB of it. On top of that it featured a top-spec latest-gen Intel Core i7 processor, more than enough to handle most of the goings on in the Star Wars 1313 demo. Other than that, just plenty of big cooling fans and not a lot else. Naturally the whole setup is powered by SSDs, keeping loading times speedy. Total cost of getting 1313 looking that sweet, around about £1500.

What's particularly exciting is if we go on the proviso that new consoles on their release usually match up with high-spec gaming rigs, it is entirely possible that if 1313 were to launch on next-gen hardware, it would look just like that.

[...]


;)

We need to start demanding 200W of GPU power in nextgen consoles. :)

Finally someone agreeing :mrgreen:;).

It also needs to be quiet.

Fully agreeing ;).
 
Enough about GPU power, what about the CPU?

I remember Capcom saying in an interview that Xenos is about as fast as a 3 GHz Pentium dual core, so a modern quad-core Intel CPU should be packing 4x+ the power?

What would developers want? An Intel quad core with high IPC or a 6-8 core AMD CPU with lower IPC?
 
I don't think it is about IPC but about overall performance.
With Haswell close to release the answer is Intel, but it won't happen.
 
An HD 7770 appears to manage an [irony]astonishing[/irony] 33 fps at 1920x1200 in "Battlefield 3" and an even more [irony]astonishing[/irony] 20 fps at 1920x1200 in "Crysis: Warhead" :rolleyes:?

Seriously, do you want a PS3.5 / Xbox 540 or do you want a PS4 / Xbox 720 (or whatever they are going to be called) :rolleyes::eek::mrgreen:?

You do realize that the PC version of BF3 is doing several times more work? It has twice the pixels and considerably better IQ per pixel. The HD 7770 is ~6 times faster than Xenos in raw numbers, and much, much more efficient. All in all, it would be an upgrade in the same ballpark as XGPU -> Xenos.

People seem to seriously over-estimate the speed of console GPUs because "BF3 runs at 30fps". Console BF3 is not the same game as PC BF3. PC budget hardware is now much faster than consoles. PC *entry* level GPUs (Intel Integrated) are now as fast as the consoles.

I don't think it will be like the 7770. Their primary constraint is power and output pins, so they will probably want a relatively large chip with lower clock speeds. I'm personally predicting something in the ballpark of a ~7850. I would be very surprised if it was as powerful as a 7970.

However, console devs will be able to take advantage of shared memory space, a really short distance between CPU and GPU, less OS overhead, and the benefits of optimizing against a single target to ship a better experience than a PC at the same raw specs. I would not be surprised if it took a 7970 to get similar IQ and frame rates on a PC. (But that's only if they don't skimp on the memory.)
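For reference, the "~6 times faster than Xenos in raw numbers" claim can be sanity-checked with a back-of-the-envelope peak-FLOPS calculation. A rough sketch in Python; the ALU counts, clocks, and FLOPs-per-clock figures below are commonly cited peak specs, not measurements from this thread:

```python
# Rough peak single-precision FLOPS comparison: Xenos (Xbox 360) vs. HD 7770.
# All figures are theoretical peaks, not measured game performance.

def peak_gflops(alus, flops_per_alu_per_clock, clock_mhz):
    """Peak GFLOPS = ALUs * FLOPs per ALU per clock * clock in GHz."""
    return alus * flops_per_alu_per_clock * clock_mhz / 1000.0

# Xenos: 48 unified shaders, each co-issuing a vec4 + scalar MADD
# ((4 + 1) * 2 = 10 FLOPs per clock), at 500 MHz.
xenos = peak_gflops(48, 10, 500)        # 240 GFLOPS

# HD 7770 (GCN): 640 stream processors, 1 FMA (2 FLOPs) per clock, 1000 MHz.
hd7770 = peak_gflops(640, 2, 1000)      # 1280 GFLOPS

print(f"Xenos:   {xenos:.0f} GFLOPS")
print(f"HD 7770: {hd7770:.0f} GFLOPS")
print(f"Ratio:   {hd7770 / xenos:.1f}x")
```

On these figures the paper ratio is ~5.3x; the "~6x" in the post presumably also credits GCN's better per-FLOP utilization versus the 2005-era architecture.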
 
Finally someone agreeing :mrgreen:;).

It's not about agreeing; heck, I'd love a giant 24" x 24" x 7" monster high-end AV-style box, with 600 watts of power dissipation and top-end hardware: a 4 TFLOPS GPU, a 4 GHz quad-core Intel CPU and loads of RAM. It would probably cost $1500+, about what you'd spend on a good PC gaming rig. However, perhaps ~1% of the market would even buy it, and because no one buys it, no one would make games for it.

It's about what's realistic, and sadly, what's realistic is pretty low in terms of specs right now. Any GPU over 100 watts isn't going to happen. That puts the ceiling at maybe ~1.5 TFLOPS or so, in the range of a 7770 GHz Edition, maybe a little faster. Those are the optimistic estimates. Unless they delay and push back into 2015.
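The "~100 W buys ~1.5 TFLOPS" estimate can be reproduced by scaling linearly from the 7770 GHz Edition's public specs. A quick sketch, assuming ~1.28 TFLOPS at an ~80 W board TDP (and noting that perf/W is not actually linear across clocks and die sizes):

```python
# Sanity check on the "~100 W GPU budget ~= ~1.5 TFLOPS" claim by scaling
# linearly from HD 7770 GHz Edition specs. Linear scaling is a simplification.

hd7770_tflops = 1.28   # 640 SPs * 2 FLOPs * 1.0 GHz
hd7770_tdp_w = 80.0    # commonly cited board TDP

perf_per_watt = hd7770_tflops / hd7770_tdp_w   # TFLOPS per watt
budget_w = 100.0
est_tflops = perf_per_watt * budget_w

print(f"Estimated ceiling: {est_tflops:.2f} TFLOPS at {budget_w:.0f} W")
```

That lands at ~1.6 TFLOPS, consistent with "in the range of a 7770 GHz Edition, maybe a little faster".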
 
1 TFLOPS, then 2.8 TFLOPS, then a GTX 680 (because Square, Epic, Ubisoft etc. show games using that GPU), now 1.5 TFLOPS (optimistic)...

WTF??? Lol, it is a roller coaster :LOL:

1.5 TFLOPS doesn't look like what lherre said.
Next will be over 9000 tflops
 
Next will be over 9000 tflops

[image: over9000.jpg]


:oops::LOL:
 
As an aside, I could imagine a scenario where one console had a clear 2x advantage in performance, and still couldn't differentiate itself significantly from the competition.

I'm not so sure; look how much the Cell hype helped the PS3. Six years later we still have threads stealth-trolling 360 games, or posters holding up Killzone and Uncharted as examples of the PS3's mythical power, unattainable on a 360.
A lot of people drank Sony's Kool-Aid this gen, despite the parity between the systems, so I don't see how a console with a clear 2x performance advantage is going to have issues on this front.
 
I'm not so sure; look how much the Cell hype helped the PS3. Six years later we still have threads stealth-trolling 360 games, or posters holding up Killzone and Uncharted as examples of the PS3's mythical power, unattainable on a 360.
A lot of people drank Sony's Kool-Aid this gen, despite the parity between the systems, so I don't see how a console with a clear 2x performance advantage is going to have issues on this front.

Sure, but if you're on a video game forum you're already part of the 1%.
I think you could have a significant power deficit and I don't know if the majority of people would notice.
There is a point below which the disparity is obvious and people vote with their wallets, but I think given where we are with polygon counts and shaders, that disparity may have to be pretty large.
I actually don't think there will be a big disparity; I suspect both sides have similar power constraints, and that will equate to similar overall performance. Though it's still possible we may see very different trade-offs.
 
You do realize that the PC version of BF3 is doing several times more work? It has twice the pixels and considerably better IQ per pixel. The HD 7770 is ~6 times faster than Xenos in raw numbers, and much, much more efficient. All in all, it would be an upgrade in the same ballpark as XGPU -> Xenos.

People seem to seriously over-estimate the speed of console GPUs because "BF3 runs at 30fps". Console BF3 is not the same game as PC BF3. PC budget hardware is now much faster than consoles. PC *entry* level GPUs (Intel Integrated) are now as fast as the consoles.

You would already be satisfied if "next-gen" consoles were able to manage BF3 PC Ultra visuals at 1920x1080 and 30 fps :???:?

Quite frugal expectations :???:?

Don't you think that would then be more like PS3.5 / Xbox 540 rather than PS4 / Xbox 720 (or whatever they are going to be called) ;)?
 
You would already be satisfied if "next-gen" consoles were able to manage BF3 PC Ultra visuals at 1920x1080 and 30 fps :???:?

Quite frugal expectations :???:?

Don't you think that would then be more like PS3.5 / Xbox 540 rather than PS4 / Xbox 720 (or whatever they are going to be called) ;)?

What's the specs of your PC?

http://www.gamepur.com/files/images/2011/bf3_beta_pc_vs_ps3_screen_3.jpg
http://1place4tech.com/wp-content/uploads/2012/06/bf3_beta_pc_vs_ps3_screen_2.jpg

The PC version is not even on Ultra and it looks miles ahead of the PS3.

Most high-end console games are running at 720p or less, so if developers target 1080p next gen (which they'd better) then you're looking at a 2-2.5x performance increase alone just to hit 1080p.
 

With hardware available today (2012, almost 2013), it should be nothing special at all to make something look "miles better" than what's possible with consoles featuring 2005/2006 hardware, shouldn't it?

But just because something looks better than what was possible with 2005/2006 hardware doesn't necessarily mean it automatically looks "next-gen" good, does it?

Take a look at the following video for example:

gamefront.com said:
MediaInfo said:
File size: 1.80 GiB
Duration: 7mn 7s
Bit rate: 36.1 Mbps
Width: 1920 pixels
Height: 1080 pixels
Frame rate: 29.970 fps

:???:

If that were all "next-gen" had to offer, it would be quite disappointing, wouldn't it :???:?
 
With hardware available today (2012, almost 2013), it should be nothing special at all to make something look "miles better" than what's possible with consoles featuring 2005/2006 hardware, shouldn't it?

But just because something looks better than what was possible with 2005/2006 hardware doesn't necessarily mean it automatically looks "next-gen" good, does it?

Take a look at the following video for example:
:???:

If that were all "next-gen" had to offer, it would be quite disappointing, wouldn't it :???:?

Do you own a gaming PC?

And you still don't get it, do you? Games on PC run everything higher... even with the same settings, the PC version will be running things at higher precision than the console version, thus requiring more power to get the same frame rate.

So you cannot directly compare them.
 
You're underestimating the performance cost of the OS, API and drivers, none of which are negligible on the PC, most unfortunately.
 
Games on PC run everything higher... even with the same settings, the PC version will be running things at higher precision than the console version, thus requiring more power to get the same frame rate.

So you cannot directly compare them.
How true is that really?

Just yesterday you posted the following:

1900XT Crossfire = 2x Xenos

360 has 1x Xenos

?

Then why do those Source Engine based games, for example, apparently "run" at 1280x720 at 30 fps (sometimes even below that) on current consoles, while, at least according to the following review, for example:

http://www.guru3d.com/articles_pages/ati_radeon_x1900_xtx_review,18.html

the PC appears to have no problem at all achieving 60 fps (and more) with the same or very similar hardware?
 
How true is that really?

Just yesterday you posted the following. Then why do those Source Engine based games, for example, apparently "run" at 1280x720 at 30 fps (sometimes even below that) on current consoles, while, at least according to the following review, for example:

http://www.guru3d.com/articles_pages/ati_radeon_x1900_xtx_review,18.html

the PC appears to have no problem at all achieving 60 fps (and more) with the same or very similar hardware?

:rolleyes:

That's the second time you've not answered my question about owning a gaming PC, so I'll assume that means you don't, so all of your views and opinions mean squat.

And if you knew anything about games you would know that the console version of HL2: Orange Box runs on the updated HL2: Episode 2 version of the Source engine, which adds dynamic soft shadows, HDR, motion blur and other effects that were missing from the game's original release in 2006 and thus were not present in the review that you linked to.

Here's a review of HL2: Episode 2. Look how much of a drop the cards have now sustained. The RSX in the PS3 is not that much faster than a 9600 GT due to having lower bandwidth and local memory.

[benchmark charts from the HL2: Episode 2 review]


Now do you understand why the consoles only run at 1280x720 at 30fps?
 
:rolleyes:

That's the second time you've not answered my question about owning a gaming PC
Because it's completely irrelevant for this discussion?

so I'll assume that means you don't, so all of your views and opinions mean squat.

How plausible... :rolleyes:

And regarding ignoring questions/answers, a little bit more on topic:

In a closed box console you could get the same 1080p 30fps on less powerful hardware because of the differences in coding efficiency.

You still haven't posted how much less powerful a GPU in a "closed box" could be than a GTX 680 while still achieving the same visuals/framerate/resolution as in that demo mentioned over there.
 
Because it's completely irrelevant for this discussion?



How plausible... :rolleyes:

And regarding ignoring questions/answers, a little bit more on topic:



You still haven't posted how much less powerful a GPU in a "closed box" could be than a GTX 680 while still achieving the same visuals/framerate/resolution as in that demo mentioned over there.

Well, from my own testing with an old Nvidia 9800 GT 512MB running Metro 2033:

The consoles use the PC's 'Low' setting at 1280x720

The 9800 GT that I used was a low-power, energy-efficient model with slightly lower clocks than a reference 9800 GT.

Anyway, the 9800 GT could manage Metro 2033 on the consoles' 'Low' settings at native 1080p with a frame rate ranging from 20-40 fps depending on the scene.

If you dropped the resolution down to 720p, the card could manage medium settings with roughly the same frame rate.

An 8800 GT is easily 3x+ faster than a 7800 GTX, but it's not giving 3x the performance of the consoles in Metro 2033.

In PC terms you're looking at at least a 9600 GT in order to run console ports at console settings or slightly higher.

You could say that you can get at least double the performance from dropping any GPU into a console.

That UE demo that was running on a GTX 680 could possibly be done on a GTX 660 if the latter GPU were in a closed-box system.
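The GTX 660 suggestion can be sketched numerically by combining peak-FLOPS ratios with the "at least double the performance in a closed box" rule of thumb from above. A rough sketch; the core counts and base clocks are public spec figures, and the 2x closed-box factor is the thread's own estimate, not a measured value:

```python
# Rough check of "GTX 680 demo on a closed-box GTX 660", using peak
# single-precision FLOPS (cores * 2 FLOPs per FMA * base clock in GHz).

gtx680_tflops = 1536 * 2 * 1.006 / 1000  # ~3.09 TFLOPS (1536 cores, 1006 MHz)
gtx660_tflops = 960 * 2 * 0.980 / 1000   # ~1.88 TFLOPS (960 cores, 980 MHz)

closed_box_factor = 2.0  # claimed efficiency gain of fixed, single-target hardware
effective_660 = gtx660_tflops * closed_box_factor

print(f"GTX 680:             {gtx680_tflops:.2f} TFLOPS")
print(f"GTX 660 (closed box): {effective_660:.2f} effective TFLOPS")
print("enough on paper" if effective_660 >= gtx680_tflops else "short on paper")
```

If the 2x factor holds, a closed-box 660 comes out ahead of a PC 680 on paper; with a smaller factor (say 1.5x) it would be roughly a wash.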
 
You still haven't posted how much less powerful a GPU in a "closed box" could be than a GTX 680 while still achieving the same visuals/framerate/resolution as in that demo mentioned over there.

There isn't going to be a hard and fast answer to that.
 
Just hoping that "next-gen" consoles will be able to offer much better visuals than just "Frostbite 2" PC Ultra visuals.

Although it, for whatever reason, appears to require quite a lot of hardware power to render games like "Battlefield 3" and "Need for Speed: The Run" with fully maxed-out visuals at 1080p60 on PC, they do not necessarily have a "next-gen" feel to them, not even fully maxed out, don't you think?
 