user542745831 said:
7970 (presumably too powerful)
How could it be too powerful?
What do you mean by "powerful"? Are you referring to the power consumption or to the performance?
7970 (presumably too powerful)
Probably both.
From http://www.unrealengine.com/files/misc/The_Technology_Behind_the_Elemental_Demo_16x9_(2).pdf:
Elemental demo
- GDC 2012 demo behind closed doors
- Demonstrate and drive development of Unreal® Engine 4
- NVIDIA® Kepler GK104 (GTX 680)
- Direct3D® 11
- No preprocessing
- Real-time
- 30 fps
- FXAA
- 1080p at 90%
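As a side note, here is a quick sketch of what "1080p at 90%" works out to in pixels, assuming the 90% is a per-axis render scale (that per-axis reading is an assumption; if it's a fraction of total pixels, the saving is smaller):

```python
# Pixel cost of "1080p at 90%".
# Assumption: 90% is a per-axis render scale (screen percentage),
# not a fraction of the total pixel count.
full_w, full_h = 1920, 1080
scale = 0.90

render_w, render_h = int(full_w * scale), int(full_h * scale)
full_pixels = full_w * full_h
scaled_pixels = render_w * render_h

print(f"Native 1080p: {full_pixels:,} pixels")
print(f"90% scale:    {render_w}x{render_h} = {scaled_pixels:,} pixels")
print(f"Shading cost: {scaled_pixels / full_pixels:.0%} of native")
```

So the demo was shading roughly 1.68 million pixels per frame, about 81% of native 1080p, before FXAA and upscaling.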
Then how would an HD 6870 or HD 6950 cope with it?
Most people buying a console are buying a years-out-of-date GPU. And the final part could be a modern architecture, with the current devkit part just being selected for equivalent performance but not features. Considering some of the rumours regarding targeting a lower performance bracket, a decent GPU is satisfying. Anyone wanting more should be gaming on PC.
I haven't seen target specs, but if I were to guess, I would say they put "1+ TFLOP" so that the exact target doesn't leak.
Putting an R6870 or R6950 in an alpha dev kit that was dispatched early this year would mean they took cards that would give developers an idea of the GPU they would get in the retail unit, and a year ago those GPUs were about as powerful as you could get, so I wouldn't put emphasis on exact performance based on the chips in the dev kit.
It should also be noted that those cards are manufactured on a 40 nm process; the final silicon should be 28 nm.
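For a rough sense of scale, here's a hedged back-of-the-envelope comparing the cards in play. Peak single-precision throughput is shaders × clock × 2 (one multiply-add per shader per cycle); the numbers are the public reference specs, and peak FLOPS is only loosely related to game performance:

```python
# Peak single-precision throughput = shaders * clock * 2
# (one fused multiply-add per shader per cycle).
# Public reference specs; peak FLOPS != game performance.
cards = {
    "HD 6870 (Barts, 40 nm)":  (1120, 0.900e9),  # (shaders, clock in Hz)
    "HD 6950 (Cayman, 40 nm)": (1408, 0.800e9),
    "GTX 680 (GK104, 28 nm)":  (1536, 1.006e9),  # the Elemental demo GPU
}
for name, (shaders, clock) in cards.items():
    print(f"{name}: {shaders * clock * 2 / 1e12:.2f} TFLOPS")

# Ideal-case density gain of a 40 nm -> 28 nm shrink: transistor
# area scales roughly with the square of the feature size.
print(f"Ideal 40 -> 28 nm shrink: {(40 / 28) ** 2:.2f}x denser")
```

Which also shows how loose a "1+ TFLOP" target figure is: both dev kit cards already sit at 2 TFLOPS or more.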
Actually, BG said his source relayed another guy's guesstimate for GPU performance, so it's very vague. From what BG says, there aren't really any on-paper GPU target specs for Durango, just like there aren't for Wii U (but there are for PS4). And in both cases we seem to be having a really hard time getting GPU info...
So any figures are presumably developer approximations?
Also, the 6970 was more powerful. If they were limited to the 6000 series and wanted the most powerful single card they could get, the 6970 was it. Or, if they were really going big, CrossFire.
You don't want to give developers more power in the dev kits than they have in the final console as that means they could end up using techniques that run too slow on final hardware and need to be refactored. If the development hardware runs the same speed or slower, no changes will have to be made.

More performance would be more "future-proof", wouldn't it?
If you want the latest GPU (user542745831's complaint), you need a PC.

Dunno, consoles get a significant advantage with draw calls, on top of games targeting the hardware. If there's a very high-bandwidth side memory attached to the GPU, then we could have an incredible amount of foliage and litter on the consoles with no concern for the costs.
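To put the draw-call point in rough numbers, here's a cost-model sketch. The per-call CPU costs below are illustrative assumptions for the sake of the example, not measurements of any real API:

```python
# Illustrative draw-call budget at 30 fps (~33 ms per frame).
# Per-call CPU costs are assumed figures, not measurements.
frame_budget_s = 1.0 / 30.0
cpu_share = 0.5  # assume half the frame's CPU time goes to submission

cost_per_call_s = {
    "PC D3D11 (thick driver, assumed 30 us/call)": 30e-6,
    "Console (thin API, assumed 3 us/call)":       3e-6,
}
for platform, cost in cost_per_call_s.items():
    max_calls = int(frame_budget_s * cpu_share / cost)
    print(f"{platform}: ~{max_calls:,} draw calls per frame")
```

An order of magnitude more submissions per frame is exactly the kind of headroom that makes lots of small unique meshes (foliage, litter) viable on a console, where a PC port would have to batch or instance them away.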
You don't want to give developers more power in the dev kits than they have in the final console as that means they could end up using techniques that run too slow on final hardware and need to be refactored. If the development hardware runs the same speed or slower, no changes will have to be made.
If you want the latest GPU (user542745831's complaint), you need a PC.
The Xbox 360 had dev kits with a Radeon X800 and didn't end up with one.
So I guess we are getting either something better than those two or at least comparable, if MS thinks they don't need any more power and would be better off with the savings due to the 28 nm manufacturing process.
Non-final hardware devkits are only indicative. As others say, look at what was in the XB360 devkits prior to the final Xenos silicon. A 6870 in the devkit doesn't mean that GPU will feature in the final console, nor that the final console GPU will be old tech (although it might be).

What was meant was this:
If development were going on on an HD 6870 or HD 6950, and that kind of GPU ended up in a 2013 console, for example, then that GPU would be around three years old by launch, wouldn't it?
More modern tech means a higher price. As I've mentioned before, this gen could be executed as starting in 2011 with $600 consoles and bleeding-edge tech, which drop in price to a $400 console in 2013, only without the hardware actually being released until 2013. If the hardware had been released in 2011, it would be modern, but people buying in 2013+ would be getting old hardware, yet still be happy with it. There's only 12 months or so of actually having the latest, greatest hardware in a console, and the rest of the time it's outdated, so there's no real reason to go with just the latest, greatest hardware other than for 12 months of bragging rights (which are rendered moot against a PC). You have to balance out the potential gains against the potential losses and pick the appropriate hardware. A GPU that's a few years old and shrunk small and cheap and cool and suitably capable isn't a bad choice in itself.

A GPU that would already be three years old, for a product that would probably be supposed to last several years again?
Would it be so far-fetched to expect current technology in a new console?
so there's no real reason to go with just the latest, greatest hardware other than for 12 months of bragging rights
So, then what are the odds for an HD 8970 or HD 9970 (or whatever they are going to be called)?
I'd say pretty close to zero. I'm guessing the performance of the final silicon is going to be around 6870-6950 or slightly faster, but with clearly lower power consumption. I think it's going to be a 7850/8770/8850 or something like that. A 200 W GPU is not going to happen.
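A hedged sanity check on that guess, using public reference-board figures (peak TFLOPS and board power for the desktop cards; any console part would be a custom design, and peak FLOPS understate GCN's real-world gain over the older VLIW parts):

```python
# Rough perf-per-watt comparison of public reference desktop boards.
# Values are (peak single-precision TFLOPS, board power in watts).
gpus = {
    "HD 6870 (40 nm, VLIW5)": (2.02, 151),
    "HD 6950 (40 nm, VLIW4)": (2.25, 200),
    "HD 7850 (28 nm, GCN)":   (1.76, 130),
}
for name, (tflops, watts) in gpus.items():
    print(f"{name}: {tflops:.2f} TFLOPS, {watts} W, "
          f"{tflops / watts * 1000:.0f} GFLOPS/W")
```

Roughly 6870-class throughput at 130 W instead of 150-200 W is the shape of "similar performance, clearly lower power" that a 28 nm part buys you.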
There has been one long-cycle generation. Next gen might see a refresh in 3 years for all we know.

How can you say something like that, especially considering how long console cycles apparently are these days?
How do you know MS are targeting the 'hardcore'? Maybe they'll leave the PC to target the hardcore and have their console as the family machine, where cost is more important?

A "hardcore" console coming out at the end of 2013, for example, containing a GPU with the specs and/or performance of an HD 6870 or HD 6950, would be kind of "ridiculous", wouldn't it?
Five-year-old GPUs can sell just fine. No-one buys a console based on its innards, but on what they see on screen. Most console gamers will see a marked improvement from an old GPU.

Just think about it:
Two years after the release of such a console, for example, the GPU in it would already be around five years old, wouldn't it?
It's all about business decisions, for which there is a proper thread for this discussion. That's why early rumours of low-power parts were a valid possibility and a concern for plenty of core console gamers. Now it's looking like the machine won't be as low-end as that, which has people breathing a sigh of relief. The notion of very expensive loss-leading consoles running the latest, high-end hardware is looking increasingly unsustainable. There are lots of other business models and opportunities to be sought, so it's wrong to expect the consoles to launch with the latest tech. Maybe they will, maybe they won't, but attempts to predict the next-gen consoles can't rely on that assumption.

Something like an HD 8970 or HD 9970 (or whatever they are going to be called) would probably be more appropriate at that time, wouldn't it?