Next gen games running on GTX 680...?

gongo

Watch Dogs from Ubisoft
Star Wars 1313 from LucasArts
Agni's Philosophy from Square Enix
Unreal Engine 4 from Epic

They were all using GTX 680 systems... strange, or coincidental?
Strange when you think that AMD got the contracts for the next gen systems...
Strange when you think that the 7970 Tahiti looks more future-proof, on paper: 384-bit bandwidth, 3GB VRAM, 2K shaders, 4 TFlops compute, wide availability...
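Those paper specs can be sanity-checked with simple arithmetic. A back-of-envelope sketch; the 925 MHz core clock and 5.5 Gbps effective GDDR5 rate are Tahiti reference values I'm assuming, not figures from the post:

```python
# Back-of-envelope check of the quoted 7970 (Tahiti) numbers.
# Assumed reference clocks: 925 MHz core, 5.5 Gbps effective GDDR5 rate.

def peak_tflops(shaders: int, core_mhz: float) -> float:
    # Each shader can issue one FMA (2 flops) per cycle.
    return shaders * 2 * core_mhz / 1e6

def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    # Bus width in bits / 8 gives bytes moved per effective transfer.
    return bus_bits / 8 * gbps_per_pin

print(peak_tflops(2048, 925))    # ~3.79 TFLOPS, i.e. the "4 TFlops compute" quoted
print(bandwidth_gbs(384, 5.5))   # 264.0 GB/s from the 384-bit bus
```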

What is turning devs away from AMD GPUs?
 
Most devs seem to have an affinity for Nvidia. Better support, closer relationships with devs, and drivers, I assume. Nvidia just seems more intertwined with developers.
 
ERP pretty much addressed this when Al mentioned all these games running on Nvidia hardware:

Again, you read way too much into this stuff.

Not really. The 680 is the fastest single GPU card available, and it's E3, home of the hacks.
If I were a 3rd party developing a next gen title right now, I'd be focused on techniques and tools; I'd worry about getting it running on real hardware when I had it.
I can develop those techniques whether ATI or NV graphics hardware is being used. At E3 I want to show something cool, not necessarily how my current code might run on some non-existent, unannounced hardware. I'd bet the boxes were running Intel CPUs as well, for much the same reason.
 
Just like top athletes mostly using Gatorade. It clearly has nothing to do with money changing hands. /sarcasm
 
I talk about this in the "Hardware utilization: PC vs console question" thread. Every "next gen" demo was running on a GTX 680...

RPGSite pushed a little to find out what graphics card was powering Square Enix's demo, and although Hashimoto didn't reveal its name, he said that 'what I can say is that what we're using is about the equivalent as what is being used by any other companies for their tech demos.'

The equivalent of what is being used by other companies, huh? Well, we do know that Epic Games demonstrated Unreal Engine 4 on a single GTX 680. And we do know that Crytek used a GTX 680 for their CryEngine 3 tech demos. Gearbox has also used Nvidia's GTX 680 cards to showcase the PC, PhysX-accelerated version of Borderlands 2. It's also no secret that Nvidia's GTX 6xx series was heavily used at this year's E3, and we also know that the freshly released GTX 690 was not used by any company to showcase their tech demos.

Put these things together, and you get the card that powered the Agni's Philosophy tech demo. In other words, yes: Agni's Philosophy was running on a single GTX 680. In addition, the build that was demonstrated was not optimized at all, meaning that Square Enix could actually produce these graphics in real time (when all physics, AI, and animations are added to the mix).

Now guess what Star Wars 1313 was running on? Yes, a GTX 680. Not hard to connect the dots here, guys...
http://www.dsogaming.com/news/the-i...phy-tech-demo-was-running-on-a-single-gtx680/

Epic is saying everyone is showing Sony and MS what they can do with this power. Maybe someone said "here's the target GPU, make what you can"... and the target GPU was the 680.
"In determining what the next consoles will be, I'm positive that [Sony & Microsoft are] talking to lots and lots of developers and lots of middleware companies to try and shape what it is. We've certainly been talking with them and we've been creating demonstrations to show what we think.

"And obviously the Elemental demo, same thing. We're certainly showing capability if they give us that kind of power, but so is everybody else."

Epic even says that if they can't do that today, then delay the consoles another year.

http://www.videogamer.com/xbox360/g...ive_leap_in_next-gen_console_performance.html

Like you said, all the next gen systems are running AMD, so this gives people a way out by saying it's running on Nvidia hardware. If it were running on AMD, then people would start saying a 7970 or whatever was the target for next gen consoles, just like people are saying about the GTX 680 now. Pretty smart move...
 
Now guess what Star Wars 1313 was running on? Yes, a GTX 680. Not hard to connect the dots here, guys...
How is it not as ERP says? You want to show a demo. You don't want it stuttering or looking rough. You don't know what the console performance will be. You can run it on the fastest GPU currently available. Why not do that? It's comparable to showing prerendered concept games as indicative of what you hope to be able to do, only less extreme. They could alternatively have picked some far less capable GPU and shown something less exciting, which might be more representative of the console we'll end up getting, but they don't have to at E3, and none of their rivals will be doing that, so why make life unnecessarily harder for themselves?

I agree with ERP. People seeing connections in this are reading far too much into it.
 
How is it not as ERP says? You want to show a demo. You don't want it stuttering or looking rough. You don't know what the console performance will be. You can run it on the fastest GPU currently available. Why not do that? It's comparable to showing prerendered concept games as indicative of what you hope to be able to do, only less extreme. They could alternatively have picked some far less capable GPU and shown something less exciting, which might be more representative of the console we'll end up getting, but they don't have to at E3, and none of their rivals will be doing that, so why make life unnecessarily harder for themselves?

I agree with ERP. People seeing connections in this are reading far too much into it.

Like the article said, it's not the fastest GPU out; they had the GTX 690.

Epic came right out and said they made these demos to show MS/Sony what they could do given the power, and added that so is everyone else...
 
Like the article said, it's not the fastest GPU out; they had the GTX 690.
It's the fastest single GPU, and it does the job. If they bought 680s when they came out and they were running their engine as intended (they will have had some hardware target in mind), then there's little point in spending hundreds of dollars a month later to update all the developer PCs.

ERP explains the mindset of a developer heading for E3. It has a lot to do with showcasing your wares, and not much to do with console development. There's no reason to associate the early game development hardware with the final console hardware appearing a year+ later, especially when PC ports are pretty much a given these days.
 
Like the article said, it's not the fastest GPU out; they had the GTX 690.
The 690 is not a faster GPU; it's a pair of 680 GPUs (which both share the same PCIe slot, and are clocked lower, I believe) running in SLI mode. Anyhow, SLI requires extra tinkering to work and run correctly with some advanced rendering methods, and is probably more hassle than it's worth for demonstration software at a press convention.
 
As for the "fastest" debate, it's not really faster than the 7970 when both are overclocked to the max, nor generally faster than the new 7970 GHz Edition, especially under extreme conditions, thanks to the 7970's 384-bit bus and extra VRAM.

Although I guess maybe it was the fastest at the time, simply because the 7970's clocks were lower. Not by a whole lot; it was arguable even then, as you could always make the case that the 7970 was better at the highest resolutions/settings due to its greater bandwidth.
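The bandwidth gap being argued here is easy to quantify. A rough sketch using reference memory specs I'm assuming (256-bit bus at 6.0 Gbps for the 680, 384-bit at 5.5 Gbps for the 7970), not figures stated in the thread:

```python
# Peak memory bandwidth from bus width and per-pin data rate.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    # bits / 8 = bytes per effective transfer; multiply by rate in Gbps.
    return bus_bits / 8 * gbps_per_pin

gtx_680 = bandwidth_gbs(256, 6.0)   # 192.0 GB/s
hd_7970 = bandwidth_gbs(384, 5.5)   # 264.0 GB/s
print(hd_7970 / gtx_680)            # 1.375 -> ~37% more bandwidth for the 7970
```

The 680's faster memory clock claws back some of the deficit, but the wider bus still wins, which is the basis of the high-resolution argument above.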
 
Game demos were running on X1950s and X850s, inside Apple G5s, back before the Xbox 360 came out. Pre-release dev kits, AFAIR. They just aren't making these games for the current consoles.
 
Just like top athletes mostly using Gatorade. It clearly has nothing to do with money changing hands. /sarcasm

Seriously, the typical TWIMTBP title does not get money from nVidia. I was told this by a guy who was involved in a TWIMTBP title. What they get is something that can be quite a bit more valuable: direct access to nVidia's developer support program, where they get to talk to and ask help from engineers who are really good at GPU programming and know their way around every bit of the APIs. Their help in debugging and optimizing can make the difference between shipping on time and being months late, so devs tend to like them. AMD has nothing comparable.
 
Maybe RSX2 is a low-power GTX 680 with slightly lower bandwidth?
Nvidia is definitely not out of the next generation game yet.
 
Until we have official confirmation, or a very well-supported rumour, that AMD (or someone else) has secured all three next gen GPUs, nVidia is still technically not out of next gen. They are an option. At the moment, though, there's zero evidence pointing to an nV GPU in any console.
 