Old 05-Mar-2012, 15:13   #10276
AlNets
A bit netty
 
Join Date: Feb 2004
Location: warp
Posts: 14,814
Default

Quote:
Originally Posted by Acert93 View Post
Some of these modern game engines are using a *lot* of buffers for deferred rendering. What are we looking at in terms of buffers with 4xMSAA with a G-Buffer, full resolution buffer for transparencies, etc?
Well, the framebuffer size balloons once you factor in 64-bit render targets. Though who knows if devs will just choose other 32-bpp HDR formats like RGB9E5 (DX10+), or if Microsoft comes up with a new format a la 7E3/FP10.

There's really no reason to use full res alpha if they fix up the edge cases.

At most you're probably looking at no larger than 160MB for the G-buffer with 4xAA and 32bpp @1080p (the BF3 golden target).

I'm actually quite curious to see if devs will consider using >4MRTs considering hardware support for 8 has been there since DX10, though the memory cost will obviously be ridiculous.
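
Back-of-envelope, assuming a fairly typical layout of four 32bpp targets plus 32-bit depth/stencil at 1080p with 4xMSAA (the exact mix is obviously engine-dependent):

Code:
# Rough G-buffer memory estimate -- illustrative only, real layouts vary per engine.
width, height = 1920, 1080
msaa = 4                       # 4x multisampling stores 4 samples per pixel
num_targets = 4                # e.g. albedo, normals, spec/roughness, misc
bytes_per_target = 4           # 32 bits per pixel per target
depth_bytes = 4                # 32-bit depth/stencil

samples = width * height * msaa
total_bytes = samples * (num_targets * bytes_per_target + depth_bytes)
print(total_bytes / 2**20)     # ~158 MB, i.e. right around that 160MB ballpark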
__________________
"You keep using that word. I do not think it means what you think it means."
Never scale-up, never sub-render!
"UC3's story had more platforming than the gameplay."
AlNets is offline  
Old 05-Mar-2012, 16:20   #10277
bgassassin
Member
 
Join Date: Aug 2011
Posts: 507
Default

Quote:
Originally Posted by Dr Evil View Post
It was just a small brain fart from him. I'm 99.9% certain that he meant that it could run on only one of those 580s with proper optimisations. I don't know if that is really the case, but that was the point he wanted to make. Imo you're taking your angle a bit too far with your assumptions. Also, the point Rangers was making about needing 1.5 x 580GTX was obviously just to have another chip that provides that sort of performance, not that you could saw off .56 of a card and throw it in there.
Yes, I was obviously assuming he meant sawing another card in half, even though he himself said he wasn't. Based on this post, you weren't reading my responses or his. He talked about "1.5 580s" when Epic's own formula doesn't split neatly like that. 2.5 TFLOPs is not "1.5 580s", it's actually more, but I stuck to how he was viewing it. And there aren't any single GPUs out there that equal that level of performance, which is what I'm getting at. At the same time, what I expect to be in a console wouldn't achieve that on its own either; I assumed I didn't have to spell that out. That's why, in reality, you still need two 580s to cover that extra 67%. Is that more than you need? Yes, but there's nothing available as a single GPU that can handle it.
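
Just to lay out the rough math I'm working from (peak spec-sheet numbers only, and peak FLOPs across vendors isn't really apples-to-apples):

Code:
# Peak single-precision throughput, spec-sheet numbers only (illustrative, not a benchmark).
gtx580_tflops = 512 * 1.544e9 * 2 / 1e12   # 512 CUDA cores x 1544 MHz shader clock x 2 ops/clk ~= 1.58 TFLOPs
samaritan_tflops = 2.5                     # Epic's quoted figure for Samaritan
print(samaritan_tflops / gtx580_tflops)    # ~1.58 -> more than "1.5 580s"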

And I think you're assuming too much in putting it down to "just a small brain fart" when he chose a 10+ year old card. You don't brain fart like that, especially someone of Rein's apparent standing. Show me where he indicated that he meant one 580 and I'll change my view. Like I said, that was my first time seeing the actual quote, but I don't see how anyone can take a reference to an old card as a simple mistake. And if that's the only proof, then I don't think Rein is at fault for other people expecting something that Epic never indicated would happen.

Quote:
Originally Posted by Rangers View Post
We also know the demo was made with just a few people, so that also suggests little optimization.

Now that 7870 benches are out we see it's not running far behind a 580. I think it can do Samaritan, I guess we have to agree to disagree.

Also if we drop to 720p, then there's no debate at all...
The latter is something we wholeheartedly agree on. 720p should be easily attainable with what I expect from MS and Sony.

I think what we would agree to disagree on is the performance gain from being in a closed environment. Personally I expect, at best, a GPU between a 7850 and a 7870, and I don't see the closed-platform gain being large enough to equal 1.5x to 1.67x a single 580. I could be wrong, but I think that's expecting a lot.

And I definitely acknowledge they originally made it with just a few people. My issue is that, according to Epic's official info, it's almost a year later and nothing has changed based on their own numbers. All they did was reduce the resolution.
bgassassin is offline  
Old 05-Mar-2012, 19:23   #10278
Dr Evil
Anas platyrhynchos
 
Join Date: Jul 2004
Location: Finland
Posts: 4,695
Default

Quote:
Originally Posted by bgassassin View Post
And I think you're assuming too much in putting it down to "just a small brain fart" when he chose a 10+ year old card. You don't brain fart like that, especially someone of Rein's apparent standing. Show me where he indicated that he meant one 580 and I'll change my view. Like I said, that was my first time seeing the actual quote, but I don't see how anyone can take a reference to an old card as a simple mistake. And if that's the only proof, then I don't think Rein is at fault for other people expecting something that Epic never indicated would happen.
Imo something like that is a fairly typical example of a brain fart. Just a little misfire. Even with the large time gap between those two cards, they were both top-end nVidia GPUs that trigger similar associations, and the focus wasn't on the model but on the number of cards; that's exactly the type of moment when errors like that happen.

http://www.geforce.com/News/articles...re-of-graphics

Not Rein, but Martin Mittring: Senior Graphics Architect at Epic Games.

Quote:
As already mentioned, the demonstration ran in real-time on a 3-Way SLI GeForce GTX 580 system, but even with the raw power that configuration affords, technological boundaries were still an issue, and for that reason, Daniel Wright, a Graphics Programmer at Epic, felt that "having access to the amazingly talented engineers at NVIDIA’s development assistance centre helped Epic push further into the intricacies of what NVIDIA’s graphics cards could do and get the best performance possible out of them." Being a tightly controlled demo, Samaritan doesn’t include artificial intelligence and other overheads of an actual, on-market game, but with enough time and effort, could the Samaritan demo run on just one graphics card, the most common configuration in gaming computers? Epic’s Mittring believes so, but "with Samaritan, we wanted to explore what we could do with DirectX 11, so using SLI saved time
Dr Evil is offline  
Old 05-Mar-2012, 21:12   #10279
Acert93
Artist formerly known as Acert93
 
Join Date: Dec 2004
Location: Seattle
Posts: 7,806
Default

http://www.tomshardware.com/news/pat...ope,14878.html

MS was granted a patent, filed in 2006, for a 3D mouse.

I had posited a long time ago, prior to the Wii U, that MS/Sony would either have a screen on the controller (not a wild prediction considering the Dreamcast kind of did this years ago), or that we could see a quasi Move-Classic controller where the wands "break out": essentially a normal pronged controller that could be separated into 3D wands. I am pretty curious at this point what MS and Sony will come up with. Personally, a Kinect-like camera with a traditional/breakout Move controller would pretty much cover a huge array of input scenarios.
__________________
"In games I don't like, there is no such thing as "tradeoffs," only "downgrades" or "lazy devs" or "bugs" or "design failures." Neither do tradeoffs exist in games I'm a rabid fan of, and just shut up if you're going to point them out." -- fearsomepirate
Acert93 is offline  
Old 05-Mar-2012, 21:32   #10280
MrFox
Senior Member
 
Join Date: Jan 2012
Posts: 1,510
Default

Quote:
Originally Posted by Acert93 View Post
http://www.tomshardware.com/news/pat...ope,14878.html

MS was granted a patent, filed in 2006, for a 3D mouse.

I had posited a long time ago, prior to the Wii U, that MS/Sony would either have a screen on the controller (not a wild prediction considering the Dreamcast kind of did this years ago), or that we could see a quasi Move-Classic controller where the wands "break out": essentially a normal pronged controller that could be separated into 3D wands. I am pretty curious at this point what MS and Sony will come up with. Personally, a Kinect-like camera with a traditional/breakout Move controller would pretty much cover a huge array of input scenarios.
That's interesting, Sony has a patent on adding a depth channel to their EyeToy. If both plans happen, it means third-party devs would be able to make motion-based games multiplatform... to a certain extent.
MrFox is offline  
Old 05-Mar-2012, 21:32   #10281
bgassassin
Member
 
Join Date: Aug 2011
Posts: 507
Default

Quote:
Originally Posted by Dr Evil View Post
Imo something like that is a fairly typical example of a brain fart. Just a little misfire. Even with the large time gap between those two cards, they were both top-end nVidia GPUs that trigger similar associations, and the focus wasn't on the model but on the number of cards; that's exactly the type of moment when errors like that happen.

http://www.geforce.com/News/articles...re-of-graphics

Not Rein, but Martin Mittring: Senior Graphics Architect at Epic Games.
Honestly, I would rather see that kind of info come from someone like Mittring. But it's like they pointed out: "Samaritan doesn't include artificial intelligence and other overheads of an actual, on-market game". Don't get me wrong, I think next-gen consoles will have very nice looking games, but I see that being achieved more through developer creativity than through raw console hardware power.
bgassassin is offline  
Old 05-Mar-2012, 21:48   #10282
TheChefO
Naughty Boy!
 
Join Date: Jul 2005
Location: Tampa, FL
Posts: 4,656
Default

Quote:
Originally Posted by bgassassin View Post
...I think next-gen consoles will have very nice looking games, but I see that being achieved more through developer creativity than through raw console hardware power.
Those two go hand in hand. Without better hardware, there's only so much one can do creatively, and at this point I think devs have gotten all they can out of this gen. Going into next gen with roughly the same spec will lead to ... a lack of creativity.

____________

Interesting coincidence: with the Pitcairn info finally in the wild, we see a competent chip weighing in at 212mm2, 100-130W, and 2.8b transistors ... Where did I hear that number before?

Quote:
Assuming 32/28nm launch in 2012 would yield 8x trans count, this would amount to a budget of roughly 4billion (497m x8 = 3,976m) if we are to assume equal budget/process node.

This leads to some pretty interesting potential hardware:

With that budget, MS could extend the xb360 architecture to the following:

10MB EDRam (100m) => 60MB EDRam (600m) Enough for a full 1080p frame buffer with 4xaa

3 core xcpu (165m) => 9 core xcpu (495m) - or an upgraded 6 core PPE with OoOe and larger cache along with an ARM core (13m trans)

This leaves a hefty 2.8b trans available for xgpu...
http://forum.beyond3d.com/showpost.p...postcount=8292
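
Plugging those same assumptions into numbers (all hypothetical, just scaling the 360's budget by 8x; nothing here is a leak):

Code:
# Hypothetical budget: scale the Xbox 360's ~497m transistors by 8x for a 32/28nm-class part.
xb360_total = 497            # millions of transistors (CPU + GPU + eDRAM)
budget = xb360_total * 8     # ~3976m total
edram  = 100 * 6             # 10MB eDRAM (~100m) scaled to 60MB
cpu    = 165 * 3             # 3-core Xenon (~165m) scaled to 9 cores (or a fatter 6-core + ARM)
gpu    = budget - edram - cpu
print(gpu)                   # ~2881m left for the GPU -- right in Pitcairn territory (~2.8b)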

I just hope someone sacks up and produces a decent console. Sony, MS, Nintendo, whoever.
__________________
"...the first five million are going to buy it, whatever it is, even if it didn't have games."
"I don't think we're arrogant"

...it seems laughable, laughable I tell you, that early 2012 technology that is under the 2005 budgets for the consoles cannot fit into a next gen box.
- Acert93
TheChefO is offline  
Old 05-Mar-2012, 22:37   #10283
babybumb
Member
 
Join Date: Dec 2011
Posts: 537
Default

I think there is no chance in hell of Epic investing tons of money to develop fancy new tech for UE4 if the specs of the Sony/MS consoles were not up to the task.

Even if they don't have devkits, they know the ballpark.
babybumb is offline  
Old 05-Mar-2012, 22:43   #10284
Squilliam
Beyond3d isn't defined yet
 
Join Date: Jan 2008
Location: New Zealand
Posts: 3,146
Default

An AMD 78xx-derivative GPU would likely be an excellent choice for a higher-power console releasing in 2013. All they'd likely have to do is cut the bus down to 128/192 bits, to allow for future shrinking and to reduce the number of memory chips required, and they're good to go. 4-6 4Gb RAM chips ought to give them a very strong 2-3GB of system RAM, which ought to be good enough for a next-generation console. They could push it to 8-12 chips if they were to go with a clamshell design, and that'd give them an obvious cost saving when 8Gb chips are released, as well as double the potential memory on the same bus size.
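
The capacity and bus-width math works out roughly like this (assuming standard 32-bit-wide GDDR5 parts; purely illustrative):

Code:
# Capacity / bus-width back-of-envelope for x32 GDDR5 chips -- illustrative only.
def config(chips, gbit_per_chip, clamshell=False):
    capacity_gb = chips * gbit_per_chip / 8                # Gbit -> GB
    bus_bits = (chips // 2 if clamshell else chips) * 32   # clamshell pairs two chips per 32-bit channel
    return capacity_gb, bus_bits

print(config(6, 4))                    # (3.0, 192)  -> 6x 4Gb chips, 192-bit bus, 3GB
print(config(12, 4, clamshell=True))   # (6.0, 192)  -> clamshell doubles capacity on the same bus
print(config(12, 8, clamshell=True))   # (12.0, 192) -> 8Gb parts double it again later on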
__________________
It all makes sense now: Gay marriage legalized on the same day as marijuana makes perfect biblical sense.
Leviticus 20:13 "A man who lays with another man should be stoned". Our interpretation has been wrong all these years!
Squilliam is offline  
Old 05-Mar-2012, 22:48   #10285
TheChefO
Naughty Boy!
 
Join Date: Jul 2005
Location: Tampa, FL
Posts: 4,656
Default

Quote:
Originally Posted by babybumb View Post
I think there is no chance in hell of Epic investing tons of money to develop fancy new tech for UE4 if the specs of the Sony/MS consoles were not up to the task.

Even if they don't have devkits, they know the ballpark.
Eh ... agree and disagree.

Just because Sony/MS/Nintendo may not be pushing high spec doesn't mean there aren't others that may be.

We've heard rumblings of a Valve console, and an Apple console (I know), and frankly, there are likely to be quite a few gamers such as myself who will say: "if all you're offering is a <125mm2 GPU in your next-gen box, then I'm going to PC gaming".

So even though UE4 might be gimped by the hardware from MS/Sony/N, that doesn't mean they won't be demoing the top end on PCs and aiming for PC customers ... or using the Corvette (BF3) model: show and hype the uber experience, then sell the gimped one.
__________________
"...the first five million are going to buy it, whatever it is, even if it didn't have games."
"I don't think we're arrogant"

...it seems laughable, laughable I tell you, that early 2012 technology that is under the 2005 budgets for the consoles cannot fit into a next gen box.
- Acert93
TheChefO is offline  
Old 05-Mar-2012, 22:52   #10286
TheChefO
Naughty Boy!
 
Join Date: Jul 2005
Location: Tampa, FL
Posts: 4,656
Default

Quote:
Originally Posted by Squilliam View Post
An AMD 78xx-derivative GPU would likely be an excellent choice for a higher-power console releasing in 2013. All they'd likely have to do is cut the bus down to 128/192 bits, to allow for future shrinking and to reduce the number of memory chips required, and they're good to go. 4-6 4Gb RAM chips ought to give them a very strong 2-3GB of system RAM, which ought to be good enough for a next-generation console. They could push it to 8-12 chips if they were to go with a clamshell design, and that'd give them an obvious cost saving when 8Gb chips are released, as well as double the potential memory on the same bus size.
Either that or go with XDR2 ...
__________________
"...the first five million are going to buy it, whatever it is, even if it didn't have games."
"I don't think we're arrogant"

...it seems laughable, laughable I tell you, that early 2012 technology that is under the 2005 budgets for the consoles cannot fit into a next gen box.
- Acert93
TheChefO is offline  
Old 05-Mar-2012, 23:02   #10287
Squilliam
Beyond3d isn't defined yet
 
Join Date: Jan 2008
Location: New Zealand
Posts: 3,146
Default

Quote:
Originally Posted by TheChefO View Post
Either that or go with XDR2 ...
XDR2 doesn't actually exist anywhere but in people's imaginations... There aren't even samples of it as far as I am aware. If anything, it'll be some kind of DDR or GDDR variant in the next-generation consoles due to economies of scale, and if they can take advantage of chip stacking, I doubt that'd apply to XDR2 nearly to the same extent that it'd apply to, say, DDR4.
__________________
It all makes sense now: Gay marriage legalized on the same day as marijuana makes perfect biblical sense.
Leviticus 20:13 "A man who lays with another man should be stoned". Our interpretation has been wrong all these years!
Squilliam is offline  
Old 06-Mar-2012, 00:34   #10288
AlNets
A bit netty
 
Join Date: Feb 2004
Location: warp
Posts: 14,814
Default

It is pretty odd there are no plans mentioned yet for a GDDR derivative of DDR4 (16-bit prefetch). Not even a hint. Maybe they're running into a lot of problems.
__________________
"You keep using that word. I do not think it means what you think it means."
Never scale-up, never sub-render!
"UC3's story had more platforming than the gameplay."
AlNets is offline  
Old 06-Mar-2012, 00:36   #10289
Acert93
Artist formerly known as Acert93
 
Join Date: Dec 2004
Location: Seattle
Posts: 7,806
Default

Could it just be that the graphics companies are resigned to the fate of stacked memory and interposers? The fact that AMD had some sort of sample they showed SA in late 2011 indicates they are seriously looking at addressing memory issues outside of the traditional format.
__________________
"In games I don't like, there is no such thing as "tradeoffs," only "downgrades" or "lazy devs" or "bugs" or "design failures." Neither do tradeoffs exist in games I'm a rabid fan of, and just shut up if you're going to point them out." -- fearsomepirate
Acert93 is offline  
Old 06-Mar-2012, 00:40   #10290
RudeCurve
Senior Member
 
Join Date: Jun 2008
Posts: 2,612
Default

Quote:
Originally Posted by bgassassin View Post
Honestly, I would rather see that kind of info come from someone like Mittring. But it's like they pointed out: "Samaritan doesn't include artificial intelligence and other overheads of an actual, on-market game". Don't get me wrong, I think next-gen consoles will have very nice looking games, but I see that being achieved more through developer creativity than through raw console hardware power.
CPU says hello...
__________________
I'd rather have 1680x1050 at 48fps...than 1920x1080 at 30fps...
RudeCurve is offline  
Old 06-Mar-2012, 02:47   #10291
TheD
Member
 
Join Date: Nov 2008
Posts: 214
Default

Quote:
Originally Posted by Rangers View Post
Haha, I don't think you understand optimization at all.

Regardless, it does not say "only nvidia flops" on Epic's slide, period.


You are the one that does not understand.
The simple fact is that AMD GPU performance is not being held back by the PC environment any more than Nvidia's is!

Internally, AMD GPUs cannot make use of their high peak floating-point speed due to a number of factors; Nvidia's GPUs, on the other hand, have a lower peak but better sustained performance.

If AMD could have pushed the floating-point performance up without killing performance elsewhere, they would have!

To think that AMD's GPUs have a huge reserve of FP power just waiting to be unlocked by use in a console is naive.

Last edited by TheD; 06-Mar-2012 at 03:09.
TheD is offline  
Old 06-Mar-2012, 03:06   #10292
bgassassin
Member
 
Join Date: Aug 2011
Posts: 507
Default

Quote:
Originally Posted by TheChefO View Post
Those two go hand in hand. Without better hardware, there's only so much creatively one can do and at this point, I think devs have got all they can from this gen. Going into nextgen with roughly the same spec will lead to ... lack of creativity.
I agree. To reach "that level", though, I expect developer "tricks" to fill the void that the power can't cover on its own.

Quote:
Originally Posted by RudeCurve View Post
CPU says hello...
I had a small brain fart.

Anyway, just like I said I'd need to see 1080p Samaritan on one AMD GPU to believe it, I'd apply the same to Epic getting it down to one 580 before I believe they could do it. And as I said, it's almost a year later and that hasn't happened.
bgassassin is offline  
Old 06-Mar-2012, 03:37   #10293
Acert93
Artist formerly known as Acert93
 
Join Date: Dec 2004
Location: Seattle
Posts: 7,806
Default

Quote:
Originally Posted by TheD View Post
Internally AMD GPUs can not make use of the high peak floating point speed due to a number of factors, Nvidias GPUs on the on the other hand have a lower peak but the sustained performance is better.
Workload conditions (compute versus rendering) have a lot to say about this. The needs of many compute problems are quite different from those of a raster problem. A VLIW4 design at, say, 28nm and 200mm^2 versus a GCN/Fermi-style design on the same process and die size would see quite different performance: the former is probably better for most games up to this point, as it sustains high utilization on that kind of work, while the latter is better for future workloads where compute becomes an important aspect and where previous GPU designs were not robust enough to maintain performance in these specialized programs.
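
A crude way to frame it (the utilization numbers below are invented purely for illustration, not measurements):

Code:
# Toy model: what matters is peak throughput x achieved utilization, and the two designs
# trade these off differently. All figures below are made up for illustration only.
def effective_tflops(peak_tflops, utilization):
    return peak_tflops * utilization

vliw4_raster  = effective_tflops(2.7, 0.80)   # VLIW4 packs well on typical shader code
vliw4_compute = effective_tflops(2.7, 0.50)   # divergent, scalar-heavy compute leaves lanes idle
gcn_raster    = effective_tflops(2.6, 0.80)   # hypothetical GCN/Fermi-style part of similar area
gcn_compute   = effective_tflops(2.6, 0.75)   # scalar SIMD holds utilization better on compute
print(vliw4_compute, gcn_compute)             # the lower-peak design can win once utilization matters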
__________________
"In games I don't like, there is no such thing as "tradeoffs," only "downgrades" or "lazy devs" or "bugs" or "design failures." Neither do tradeoffs exist in games I'm a rabid fan of, and just shut up if you're going to point them out." -- fearsomepirate
Acert93 is offline  
Old 06-Mar-2012, 03:49   #10294
Jedi2016
Member
 
Join Date: Aug 2005
Posts: 937
Default

Quote:
Originally Posted by RudeCurve View Post
CPU says hello...
Hehe.. good point.
__________________
On the Soap Box - My online ranting spot.
Jedi2016 is offline  
Old 06-Mar-2012, 08:19   #10295
TheWretched
Member
 
Join Date: Oct 2008
Posts: 669
Default

Bitcoin mining is a nice "comparison" between the flops of Nvidia and AMD... at least it was half a year ago, as I am not sure of today's numbers. Back then, even the highest-end Nvidia GPUs couldn't hold a candle to the meager 5650 in my laptop. Mind you, the calculations needed for Bitcoin mining probably do favor AMD by a large margin, hence they are much faster at it.

Does this translate to better gaming performance? Hell no, but neither is a motorcycle a good choice for a Stock Car Race...
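
For the curious, the mining workload is basically SHA-256, which leans heavily on 32-bit rotates like the one below; as I understand it, the pre-GCN AMD parts can do a rotate in a single bit-align instruction, while the GeForce cards of that era need a shift/shift/or sequence, which (on top of the raw ALU count) is a big part of the gap.

Code:
# 32-bit rotate-right, the workhorse primitive in SHA-256 -- shown in plain Python.
def rotr(x, n):
    return ((x >> n) | (x << (32 - n))) & 0xFFFFFFFF

# One of SHA-256's sigma functions is just three rotates XORed together:
def big_sigma0(x):
    return rotr(x, 2) ^ rotr(x, 13) ^ rotr(x, 22)

print(hex(big_sigma0(0x12345678)))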
TheWretched is offline  
Old 06-Mar-2012, 08:22   #10296
Dr Evil
Anas platyrhynchos
 
Join Date: Jul 2004
Location: Finland
Posts: 4,695
Default

Quote:
Originally Posted by TheWretched View Post
Bitcoin mining is a nice "comparison" between the flops of Nvidia and AMD... at least it was half a year ago, as I am not sure of today's numbers. Back then, even the highest-end Nvidia GPUs couldn't hold a candle to the meager 5650 in my laptop. Mind you, the calculations needed for Bitcoin mining probably do favor AMD by a large margin, hence they are much faster at it.
Was that due to speed or performance/watt/$?
Dr Evil is offline  
Old 06-Mar-2012, 09:02   #10297
hoho
Senior Member
 
Join Date: Aug 2007
Location: Estonia
Posts: 1,218
Default

Quote:
Originally Posted by TheWretched View Post
Bitcoin mining is a nice "comparison" between the flops of Nvidia and AMD.
And what about folding@home? IIRC NV was miles ahead of AMD there
hoho is offline  
Old 06-Mar-2012, 09:43   #10298
Prophecy2k
Senior Member
 
Join Date: Dec 2007
Location: London
Posts: 1,533
Default

Quote:
Originally Posted by RudeCurve View Post
CPU says hello...
What? Didn't you get the memo? Next-gen GPUs can do GPGPU mang! No need for a CPU

Of course I'm being facetious
Prophecy2k is offline  
Old 06-Mar-2012, 09:46   #10299
Prophecy2k
Senior Member
 
Join Date: Dec 2007
Location: London
Posts: 1,533
Default

Quote:
Originally Posted by Acert93 View Post
http://www.tomshardware.com/news/pat...ope,14878.html

MS was granted a patent, filed in 2006, for a 3D mouse.

I had posited a long time ago, prior to the Wii U, that MS/Sony would either have a screen on the controller (not a wild prediction considering the Dreamcast kind of did this years ago), or that we could see a quasi Move-Classic controller where the wands "break out": essentially a normal pronged controller that could be separated into 3D wands. I am pretty curious at this point what MS and Sony will come up with. Personally, a Kinect-like camera with a traditional/breakout Move controller would pretty much cover a huge array of input scenarios.
This is what I've been going on about for eons!

It'd cover all the traditional and motion control bases, and would basically enable more inventive gameplay out of the box simply by being included with every console.

Sony... MS... make it so number one!

:Jean-Lucpickardface:
Prophecy2k is offline  
Old 06-Mar-2012, 11:19   #10300
itsmydamnation
Member
 
Join Date: Apr 2007
Location: Australia
Posts: 796
Default

Quote:
Originally Posted by hoho View Post
And what about folding@home? IIRC NV was miles ahead of AMD there
Last time I checked (quite a while ago), that was no one's fault but F@H's.
itsmydamnation is offline  
