Next gen games running on GTX 680...?

Until we have official confirmation, or a very well-supported rumour, that AMD (or someone else) have secured all three GPUs next gen, nVidia are still technically not out of next gen yet. They are an option. At the moment, though, there's zero evidence pointing to an nV GPU in any console.

The same can be said for PowerVR as well. They are in Vita, so there's still a chance that Sony could use a high-end PVR part for PS4, though in my eyes this is highly unlikely and AMD is probably the obvious choice. If PS4 is going to be a SoC and AMD is going to be the CPU side of it, I see no reason for them not to have the GPU side of it too. It's reasonable to assume AMD has locked up all three consoles (2 next-gen, 1 current-gen extender) with their tech.
 
The same can be said for PowerVR as well. They are in Vita, so there's still a chance that Sony could use a high-end PVR part for PS4, though in my eyes this is highly unlikely and AMD is probably the obvious choice. If PS4 is going to be a SoC and AMD is going to be the CPU side of it, I see no reason for them not to have the GPU side of it too. It's reasonable to assume AMD has locked up all three consoles (2 next-gen, 1 current-gen extender) with their tech.

The Wii U is using a more modern part than the current generation of HD consoles.

Also of note, all rumours have pointed toward AMD securing all of the platform GPUs, as Shifty Geezer has mentioned.

Second-hand knowledge also says the same. ERP's assessment is the most likely case here.
 
Didn't someone at Epic say at GDC that the GTX 680 was kind of a GPU target for the MS and Sony consoles? Or is my memory really flawed?
 
What makes you think so?

You make a lot of assertions about future consoles, mainly concerning nV and Cell. What exactly do you have to back those assertions up?

Maybe I live in a country where the most graphically intensive console game ever (at least up until 2009 / release) was developed. And maybe people who work there have been friends of mine since primary school. Maybe they, as a first-party developer, already have knowledge of the supposed specs. Maybe they were even complaining about a 2 GB memory limit, with regard to the current memory footprint. :rolleyes:

(Hint: don't count Nvidia out)
 
What they get is something that can be quite a bit more valuable: direct access to nV's developer support program...

I remember a quote from someone at nV saying "we will port your physics code to PhysX for free".

PS: I would love Humus to do a "week in the life of a devrel" article for B3D; he's been very quiet on the subject.
 
Maybe they, as a first-party developer, already have knowledge of the supposed specs.

Maybe those guys will also kick your ass a few times for leaking NDAed information all over the internet.

Although if that were really the case, you'd already have been asked to delete your posts...
 
Maybe I live in a country where the most graphically intensive console game ever (at least up until 2009 / release) was developed. And maybe people who work there have been friends of mine since primary school. Maybe they, as a first-party developer, already have knowledge of the supposed specs. Maybe they were even complaining about a 2 GB memory limit, with regard to the current memory footprint. :rolleyes:

(Hint: don't count Nvidia out)

So, you have friends that work at Guerrilla Games. They told you this stuff, and now you're putting it on the Internet without being very subtle about how you got the information?
 
Guys, if people want to leak info, I dunno, I'd rather read it than try to get it hidden :p

Maybe I live in a country where the most graphically intensive console game ever (at least up until 2009 / release) was developed. And maybe people who work there have been friends of mine since primary school. Maybe they, as a first-party developer, already have knowledge of the supposed specs. Maybe they were even complaining about a 2 GB memory limit, with regard to the current memory footprint.

(Hint: don't count Nvidia out)

I don't understand what Nvidia has to do with the amount of RAM here.

I'd also think it's too late to switch GPUs (no matter which vendor) for Orbis now (if that's what you're hinting is possible).
 
GK104 is a midrange GPU in all ways except for selling price and performance. It's smaller, consumes less power, dissipates less heat, runs on a narrower bus, and costs less to manufacture than the 7970. It's close to the older midrange GF114 in all those respects. It isn't so unreasonable to expect something that looks like it to make it into the next-gen consoles.

If, however, the next-gen consoles fall short of its performance, consumers should at least be able to pick up something affordable that can handle what was demoed at this year's E3 by the time the consoles launch.
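
For what it's worth, a quick side-by-side of the commonly cited launch figures backs that up. A minimal sketch, using approximate numbers from public reviews rather than official measurements:

```python
# Back-of-the-envelope comparison of commonly cited launch specs.
# All figures are approximate values from public reviews.
chips = {
    "GK104 (GTX 680)":    (294, 256, 195),  # die mm^2, bus width bits, TDP W
    "Tahiti (HD 7970)":   (365, 384, 250),
    "GF114 (GTX 560 Ti)": (332, 256, 170),
}

for name, (die_mm2, bus_bits, tdp_w) in chips.items():
    print(f"{name:21} die ~{die_mm2} mm^2, {bus_bits}-bit bus, ~{tdp_w} W TDP")
```

On die area, bus width, and power, GK104 sits much closer to GF114 than to Tahiti.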
 
GK104 is a midrange GPU in all ways except for selling price and performance. It's smaller, consumes less power, dissipates less heat, runs on a narrower bus, and costs less to manufacture than the 7970. It's close to the older midrange GF114 in all those respects. It isn't so unreasonable to expect something that looks like it to make it into the next-gen consoles.

If, however, the next-gen consoles fall short of its performance, consumers should at least be able to pick up something affordable that can handle what was demoed at this year's E3 by the time the consoles launch.

Unfortunately, I'm not so sure that's going to make much of a difference. People can already build rather cheap PCs that kill the current-gen consoles, but they still play on the old hardware (me being one of them :p). As long as there is a noticeable jump, I'm sure the average consumer will be happy, even if it falls short of our expectations.
 
GK104 is a midrange GPU in all ways except for selling price and performance. It's smaller, consumes less power, dissipates less heat, runs on a narrower bus, and costs less to manufacture than the 7970. It's close to the older midrange GF114 in all those respects. It isn't so unreasonable to expect something that looks like it to make it into the next-gen consoles.

It still draws almost 3x as much power as RSX/Xenos though.
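
A quick sanity check on that ratio; Nvidia's 195 W board TDP is official, but the console-side numbers below are rough community estimates, since neither Sony nor MS ever published per-chip power figures:

```python
# Rough power-ratio check. The RSX/Xenos figures are assumed
# community estimates, not published numbers.
gtx680_tdp_w = 195.0  # Nvidia's official board TDP
rsx_est_w    = 80.0   # assumed estimate for RSX at launch
xenos_est_w  = 70.0   # assumed estimate for Xenos at launch

print(f"vs RSX:   {gtx680_tdp_w / rsx_est_w:.1f}x")    # ~2.4x
print(f"vs Xenos: {gtx680_tdp_w / xenos_est_w:.1f}x")  # ~2.8x
```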
 
So the new rumor is that Xbox 3 is using an Nvidia GPU... What if the old rumors of AMD working on Xbox 3 were only about a Xenos SoC die shrink, and the real winner is Jen-Hsun Huang again? Were there any next-gen demos running on Tahiti?
 
Considering the figures in the presentation linked below, for example (especially the highlighted (bolded/underlined) parts of the quote):

From "The Technology Behind the Elemental Demo" (http://www.unrealengine.com/files/misc/The_Technology_Behind_the_Elemental_Demo_16x9_(2).pdf):

Elemental demo
  • GDC 2012 demo behind closed doors
  • Demonstrate and drive development of Unreal® Engine 4
  • NVIDIA® Kepler GK104 (GTX 680)
  • Direct3D® 11
  • No preprocessing
  • Real-time
    • 30 fps
    • FXAA
    • 1080p at 90%


anything below GTX 680 performance/specs would be quite disappointing, wouldn't it?
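
As an aside, the "1080p at 90%" line is worth unpacking. Assuming the 90% is a per-axis scale factor (the slide doesn't specify; 90% of total pixel count is another possible reading), the demo was only pushing about 81% of a full 1080p frame:

```python
# What "1080p at 90%" works out to, assuming a per-axis scale factor.
# (The slide doesn't specify; 90% of pixel count is also possible.)
full_w, full_h = 1920, 1080
scale = 0.90

render_w, render_h = int(full_w * scale), int(full_h * scale)
pixel_fraction = (render_w * render_h) / (full_w * full_h)
print(f"render target:  {render_w}x{render_h}")  # 1728x972
print(f"pixel fraction: {pixel_fraction:.0%}")   # 81%
```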
 
Considering the figures in the presentation linked below, for example (especially the highlighted (bolded/underlined) parts of the quote):

anything below GTX 680 performance/specs would be quite disappointing, wouldn't it?

How would it? You're talking about 1080p 30fps on the PC platform; that alone makes all the difference.

In a closed box console you could get the same 1080p 30fps on less powerful hardware because of the differences in coding efficiency.

You just can't compare them the way you are.
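
To put rough numbers on the closed-box argument (the efficiency multipliers here are assumptions for illustration, not measurements): if a fixed-spec console really extracts, say, 1.5-2x the effective throughput a PC gets from the same silicon, the raw GPU needed to match the demo shrinks accordingly:

```python
# Hand-wavy sketch of the closed-box efficiency argument.
# 3.09 TFLOPS is the GTX 680's peak single-precision rate; the
# efficiency multipliers are assumptions, not measured values.
GTX680_PEAK_TFLOPS = 3.09

for efficiency_gain in (1.5, 2.0):
    needed = GTX680_PEAK_TFLOPS / efficiency_gain
    print(f"{efficiency_gain:.1f}x console efficiency -> "
          f"~{needed:.2f} TFLOPS would match")
```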
 
Quit nitpicking...
May I remind you that you were the one who, just a few minutes ago, wrote the following post in another thread:

I do not want to see any sub-HD rendering whatsoever next generation, and I expect every decent game to be native 1080p

?

:rolleyes:

I'm sure you completely understood the point of the post.

Well:

In a closed box console you could get the same 1080p 30fps on less powerful hardware because of the differences in coding efficiency.

How much less powerful ;)?
 