AMD: Speculation, Rumors, and Discussion (Archive)

Probably because it provides a solid implementation of DX12 to demonstrate future performance. TW:Warhammer will probably get used as well once the DX12 version is available. All indications were that it performed similarly to AOTS though. Same situation with Doom and the Vulkan backend.
So far Vulkan works well on NV too; better than on AMD, even.
 
We've definitely seen the low-end ($50-$100) boards shrink or disappear.
I'm still waiting for a fanless $50 card capable of decoding H.265 video. Low-end Intel Skylake can do that, but low-end AMD and Nvidia cards cannot. It's shameful.
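For what it's worth, here is a minimal sketch of how you could query that capability yourself, assuming a Windows box and the D3D11 video API (on Linux you'd ask VA-API/VDPAU instead). Error handling is trimmed, and the Main-profile/NV12 combination is just the common 8-bit case:

```cpp
// Sketch: ask the GPU driver whether fixed-function HEVC (H.265) Main
// profile decode is supported. Windows / D3D11 video API; minimal checks.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")
#pragma comment(lib, "dxguid.lib")

int main() {
    ID3D11Device* device = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, nullptr)))
        return 1;

    ID3D11VideoDevice* video = nullptr;
    BOOL supported = FALSE;
    if (SUCCEEDED(device->QueryInterface(__uuidof(ID3D11VideoDevice),
                                         (void**)&video))) {
        // HEVC Main profile, NV12 output -- the common 8-bit decode case.
        video->CheckVideoDecoderFormat(&D3D11_DECODER_PROFILE_HEVC_VLD_MAIN,
                                       DXGI_FORMAT_NV12, &supported);
        video->Release();
    }
    std::printf("Hardware HEVC decode: %s\n", supported ? "yes" : "no");
    device->Release();
    return 0;
}
```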
 

Not a good reference, as the guru3d review doesn't say what the graphics score was, and their overall score is skewed by the fact that they use a 4.4 GHz 5960X eight-core processor.

I did a run with my Nano at near Fury X speed and a 4770K at 4.2 GHz, similar to the setup used in your supposed RX480 score.

http://www.3dmark.com/3dm11/11314896

The graphics score is almost the same as in your link. I'm actually beaten there by a tiny margin, but there's definitely some margin of error involved. Regardless, the RX480 score isn't looking too bad.

edit:

Had to do another run just to squeeze slightly on top; I also noticed my RAM speed had dropped at some point, so I clocked it back to 2133 MHz...

http://www.3dmark.com/3dm11/11314931

I also want to point out that my GPU isn't throttling there to any meaningful degree. I had the fan at 85% and power limit at +50% with a minor undervolt.
 
They don't punch above what we can do with PC graphics. Hell no. You sound like a guy I was talking to at a bar about two years ago: the topic somehow came around to gaming, and he thought the PS3 had better graphics than computer games, citing the sweat on Kobe Bryant's forehead as proof. At that point I started ignoring him and got back to the reason I'd sat at that specific part of the bar: to talk to the girl on my left.

The landscape of game development changed. PC was once the pinnacle, pushing graphics and games forward, but as the Xbox and later the PS3 came about, and piracy increased, PC games became a shell of themselves in terms of potential profits, so developers and publishers switched to console games as their primary focus. This is what you are seeing. In the short term, whichever IHV has the console contracts does have some advantage, but usually the other IHV adapts within a generation and negates it.

Everything comes down to the almighty $.

I said "Consoles punch above their weight"; don't talk nonsense.
 
I said "Consoles punch above their weight"; don't talk nonsense.

Really? Are there a lot of 300-350 USD PCs that come even close to matching what a console can do? If so, I'd love to know where I could buy one.

Regards,
SB


That's the problem: they DON'T punch above their weight, lol. When developers are focused on the minimum level of graphics performance the consoles provide, the software is built around those limitations. When the Xbox One and PS4 were released, there were PC graphics cards on the market with 4 times the graphics horsepower; console graphics performance was low-end mainstream. Then came the next generation, which was 6 times more, and now another generation that is 8 times more, lol.

But games are targeted at consoles, so they won't use all that extra graphics horsepower unless developers make a ton of changes to their engines and games, and that won't happen unless they spend $. Now, this is where the whole "consoles give you a PC marketshare advantage" idea comes from, but it doesn't materialize: the IHV that has the consoles doesn't have much PC marketshare, so the games that do come out on PC are focused on optimizations for nV hardware, without the increase in fidelity that all that extra horsepower could have delivered if consoles weren't the main focus of development (which, as explained earlier, is about $).
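For a rough sense of scale, approximate public FP32 throughput figures put those multiples in the right ballpark, if a little generous (FLOPS are a crude proxy for gaming performance, and every number below is ballpark, not gospel):

```cpp
// Back-of-envelope check of the "4x / 6x / 8x" claims using approximate
// public FP32 throughput figures in TFLOPS.
#include <cstdio>

int main() {
    const double xbox_one = 1.31, ps4 = 1.84;      // 2013 consoles
    const double r9_290x  = 5.6;                   // 2013 high-end PC GPU
    const double gtx_1080 = 8.9;                   // two generations later

    std::printf("290X vs Xbox One: %.1fx\n", r9_290x  / xbox_one); // ~4.3x
    std::printf("290X vs PS4:      %.1fx\n", r9_290x  / ps4);      // ~3.0x
    std::printf("1080 vs Xbox One: %.1fx\n", gtx_1080 / xbox_one); // ~6.8x
    std::printf("1080 vs PS4:      %.1fx\n", gtx_1080 / ps4);      // ~4.8x
    return 0;
}
```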
 
That's the problem: they DON'T punch above their weight, lol. When developers are focused on the minimum level of graphics performance the consoles provide, the software is built around those limitations. When the Xbox One and PS4 were released, there were PC graphics cards on the market with 4 times the graphics horsepower; console graphics performance was low-end mainstream. Then came the next generation, which was 6 times more, and now another generation that is 8 times more, lol.

But games are targeted at consoles, so they won't use all that extra graphics horsepower unless developers make a ton of changes to their engines and games, and that won't happen unless they spend $.

They perform significantly better than a 300-350 USD PC. That's the entire definition of punching above their weight. This was true even at launch when they were 399 and 499 USD respectively.

Everything else is just excuses.

Yes, you can get a faster graphics card, but you can't game on a graphics card alone.

Regards,
SB
 
They perform significantly better than a 300-350 USD PC. That's the entire definition of punching above their weight.

Everything else is just excuses.

Regards,
SB


Well, you just don't know how development is done; no big deal, I just don't see why you're in this forum then.
You take a very myopic end-user view; you aren't looking at the business side of it, at why what has happened happened, and at why the future won't change by leaps and bounds in one direction or another.
 
Well, you just don't know how development is done; no big deal, I just don't see why you're in this forum then.

So basically what you're saying is this:

All these past years Nvidia has had inferior hardware to AMD, right? After all, PC development focused on Nvidia hardware, hence it was only faster due to development focus? Nonsense.

And even assuming equal development focus on console and PC, a PC at the same price point as a console will never match it in performance.

Regards,
SB
 
So basically what you're saying is this:

All these past years Nvidia has had inferior hardware to AMD, right? After all, PC development focused on Nvidia hardware, hence it was only faster due to development focus? Nonsense.

And even assuming equal development focus on console and PC, a PC at the same price point as a console will never match it in performance.

Regards,
SB


BS. I suggest you look at why Crysis and games like it never made it to consoles: because they would never be able to run those types of games.

It's the same damn argument people used to claim the PS1 and PS2 had better graphics than PC games. They never were better, and never will be unless the GPU used in consoles is just as capable as a high-end PC GPU. If games targeted high-end cards, no console would be able to run them.
 
BS. I suggest you look at why Crysis and games like it never made it to consoles: because they would never be able to run those types of games.

You should watch the BS you're spewing.

Could Crysis ever run even remotely well on a 300-350 USD PC? Hmmm? I'm waiting.

Regards,
SB
 
You should watch the BS you're spewing.

Could Crysis ever run even remotely well on a 300-350 USD PC? Hmmm? I'm waiting.

Regards,
SB


Hell yeah, why couldn't it? I was running it on an 8800 GTX at the time it came out, at 1600x1200. Yeah, I had to turn some of the effects down, but it ran, and if I dropped the resolution I could increase some of the graphics settings. Six months later it would have run easily on high-midrange cards at the same settings, and with a lower resolution it would have run on an 8800 GTS.
 
Thanks for the heads-up; I have all but forgotten many things about Cayman. But this doesn't look much like asynchronous compute, since it talks specifically about executing multiple compute kernels without a context switch, not about mixing graphics and compute, which is what today's rage is all about. Or am I getting it wrong again?
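As a point of reference for what "mixing graphics and compute" means at the API level today, here is a minimal D3D12 sketch (not Cayman-specific; error handling omitted). Modern async compute is about feeding one device from a direct queue and a compute queue at once, which is precisely the part that goes beyond running several compute kernels back to back:

```cpp
// Sketch: the D3D12 expression of graphics+compute concurrency -- one
// direct (graphics) queue and one dedicated compute queue on the same
// device. Whether the two actually overlap is up to hardware and driver.
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")
using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;   // graphics+compute+copy

    D3D12_COMMAND_QUEUE_DESC cmpDesc = {};
    cmpDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // compute+copy only

    ComPtr<ID3D12CommandQueue> gfxQueue, cmpQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));
    device->CreateCommandQueue(&cmpDesc, IID_PPV_ARGS(&cmpQueue));
    // Command lists submitted to cmpQueue may execute alongside gfxQueue's
    // draws -- the concurrency the Cayman material above doesn't describe.
    return 0;
}
```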
 
Hell yeah, why couldn't it? I was running it on an 8800 GTX at the time it came out, at 1600x1200. Yeah, I had to turn some of the effects down, but it ran, and if I dropped the resolution I could increase some of the graphics settings. Six months later it would have run easily on high-midrange cards at the same settings, and with a lower resolution it would have run on an 8800 GTS.

Ummm, an 8800 GTS alone was 350 USD. That's not even counting the other components you would need to actually run Crysis.

A 220-230 USD 8800 GT could run it somewhat well, but again, you still need the rest of the computer.
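To put rough numbers on it: a hypothetical late-2007 budget build (every non-GPU price below is an illustrative guess, not a quoted figure) lands far above the 300-350 USD bracket even with the cheaper card:

```cpp
// Hypothetical late-2007 build cost. Every non-GPU number is an
// illustrative guess; the point is only that the total lands well above
// a 300-350 USD target even with the cheaper card.
#include <cstdio>

int main() {
    const double gpu_8800gt = 225;   // 220-230 USD per the post above
    const double cpu = 120, mobo = 80, ram = 60, hdd = 60, psu_case = 70;
    const double rest = cpu + mobo + ram + hdd + psu_case;
    std::printf("Rest of system (guessed): %.0f USD\n", rest);          // 390
    std::printf("Total with 8800 GT:       %.0f USD\n", rest + gpu_8800gt); // ~615
    return 0;
}
```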

http://www.anandtech.com/show/2396

Yeah, no.

Regards,
SB
 
Honestly, I absolutely don't understand why Nvidia hasn't already done it. I was pretty sure it would already happen with Pascal, but since it's still not there, I'll now bet on Volta. It's a logical evolution; it was perfectly logical when AMD introduced it, and it would be a logical fit for the Nvidia architecture.
Changing direction like that would take time; Volta would most likely be the target for such a change, as the pipeline we have seen from nV since Kepler has been evolutionary.
 
Ummm, an 8800 GTS alone was 350 USD. That's not even counting the other components you would need to actually run Crysis.

A 220-230 USD 8800 GT could run it somewhat well, but again, you still need the rest of the computer.

http://www.anandtech.com/show/2396

Yeah, no.

Regards,
SB


OK, I see what you are saying: the complete cost of a console. Yeah, consoles are a "better" buy from a hardware perspective, but that is because the console manufacturers aren't taking much profit (and have taken losses in the past). They can accept that because they make it up from game sales; they are looking at the total potential sales of the entire ecosystem.

The graphics card market can't sustain that type of model, for obvious reasons.
 
I don't think price is the main factor when people say "punching above their weight". It's usually about whether the console can outperform a PC that has a similar class of GPU in it.

The consoles punch above their weight in either type of comparison, so you have to wonder why that argument was even attempted in the first place.
 
I also wonder what happens with vertex attributes on GCN. Those are stored in shared memory, since attribute interpolation was moved to the ALUs in Cypress; wouldn't vertex attributes have to be spilled to memory before a CU could proceed with a compute shader? And if they have to be spilled first, how long does that take?
You realise that NVidia was doing attribute interpolation in shader code from barycentrics stored in shared memory way before AMD, don't you?

Don't you?

You've been paying attention this past, erm, 10 years or so?

Anyway, vertex attributes consume a pitiful amount of the shared memory of each CU.
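For reference, the ALU work in question is tiny. A minimal sketch of what shader-based interpolation boils down to per scalar attribute (perspective correction omitted; all names here are mine, for illustration):

```cpp
// Sketch of shader-ALU attribute interpolation from barycentrics: the
// per-pixel cost that replaced fixed-function interpolators on
// Cypress-and-later AMD and on NVidia before that.
struct Vec2 { float x, y; };

// a0/a1/a2: the attribute at the triangle's three vertices (held in LDS);
// bary: this pixel's barycentric coordinates (i, j) relative to vertex 0.
float interpolate(float a0, float a1, float a2, Vec2 bary) {
    // attr = a0 + i*(a1 - a0) + j*(a2 - a0): two MADs per scalar attribute.
    return a0 + bary.x * (a1 - a0) + bary.y * (a2 - a0);
}
```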

Oh hang on, I've just seen Razor1 is posting about how he talked to a girl ... I'm in the wrong thread.
 