AMD: Speculation, Rumors, and Discussion (Archive)

Status
Not open for further replies.
Usually the cut-down parts are far better value these days.
I know, but all I can think of is those disabled units just sitting there, not contributing. It irks my hardware-obsessive-compulsive nature. I'd rather wait longer between upgrades and get a full card, then use it for a good while and be comforted that it's firing on all its cylinders! :p

GTX970 vs R290: it isn't a difference of ~150W if you live in a hot place. Your AC must also compensate for the added heat.
Not sure how it is for continental European homes, but the vast majority of Scandinavian homes don't have AC at all (including my own, incidentally...). Usually our summers aren't hot enough to need it, but add a firing-on-all-its-cylinders R9 390X to an already very warm summer week and it becomes quite uncomfortable inside after a prolonged gaming session. It's not for nothing that I shut down Folding@home over the summer...
 
In the end less power consumption is good, as it makes the card easier to cool silently, but the extra operating costs aren't much of a factor unless you run the cards 24/7 folding or crunching.
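For a sense of scale on those operating costs, here's a quick back-of-the-envelope sketch; the wattage delta, hours per day, and electricity price are all illustrative assumptions, not measured figures:

```python
# Rough estimate of the extra yearly electricity cost from a GPU power delta.
# All inputs (delta watts, hours/day, price per kWh) are illustrative.

def extra_cost_per_year(delta_watts, hours_per_day, price_per_kwh):
    """Return (kWh/year, cost/year) for a given power-draw difference."""
    kwh_per_year = delta_watts / 1000 * hours_per_day * 365
    return kwh_per_year, kwh_per_year * price_per_kwh

# ~75 W difference, 4 h of gaming per day, 0.30 EUR/kWh
kwh, cost = extra_cost_per_year(75, 4, 0.30)
print(f"{kwh:.1f} kWh/year, ~{cost:.2f} EUR/year")  # 109.5 kWh/year, ~32.85 EUR/year
```

A few euros per month at typical gaming hours, which backs up the point that it only adds up for 24/7 folding or crunching loads.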
 
Do people not understand how important conservative rasterization is?
(...)
Anyhow, I am glad NVIDIA isn't on a 16nm Fermi; price/performance (taking the heat in my room into account) being equal, I reward them with my dollars.

So back when nvidia refused to adopt DX10.1 and 1st-gen tessellation, and had much lower efficiency and price/performance on Tesla vs. RV7xx cards, you definitely rewarded AMD with your dollars, right?

Also something people never seem to consider: GTX970 vs R290: it isn't a difference of ~150W
Lol the difference is close to 75W, not 150W.
 
DX10.1 had nothing to do with tessellation.

I meant "and 1st-gen tessellation". Corrected.
Scratch that, it was 2nd-gen tessellation, because Xenos had AMD's 1st gen.
 
Let's get back on topic... will Vega be performant enough for the enthusiast market when it launches in 1H 2017, given that Volta may appear on 16nm FinFET around the same time for Power9 servers? What performance level should AMD be targeting with Vega: 1080, Titan XP, or Volta?

Well, if we compare with Nvidia's latest Tesla and deep-learning releases: Nvidia presented the Pascal P100 in April, with some "specific order" parts for particular supercomputer contracts; in June 2016 they announced the Tesla P100 with availability in Q4 2016 (so still not available); and the Titan "XP" based on GP102 launched a few weeks ago (and is only available through Nvidia's shop for the moment). In between, of course, they launched GP104, GP102, etc.

So far GP100-based Teslas have Q4 2016 - 2017 availability, and we know Nvidia doesn't like to sell a GPU for only 3-6 months. Knowing that only IBM's Power9 servers will include NVLink 2.0, and knowing how Nvidia likes to present a GPU 8-12 months before availability... can we really expect Volta Tesla and Quadro parts to be available in 2017, or before Q4 2017, outside of specific contracts for servers using IBM Power9?

Just to say: IBM Power9 availability doesn't necessarily mean you'll see Volta-based GeForce, Quadro, and Tesla products released the day after.

As for the gaming parts: a replacement for the 1080, 1070, and maybe a Titan (though at this point, does GP102 exist outside the Titan?). I think AMD knows Nvidia refreshes its lineup every year, but when those parts will be released is a bit unknown.
 
So back when nvidia refused to adopt DX10.1 and 1st-gen tessellation, and had much lower efficiency and price/performance on Tesla vs. RV7xx cards, you definitely rewarded AMD with your dollars, right?

Yes, the RV7xx cards gained marketshare; it wasn't until the Fermi 2.0 (5xx) series came out that nV's marketshare slide stopped and nV started gaining share back. The slide could have been due to Fermi's apparent delay and Tesla 3.0's bad perf/watt, but it continued after Fermi's release too.

Forgot to add: the 2xx series was also the first time nV posted a loss since the FX series.
 
I meant "and 1st-gen tessellation". Corrected.
Scratch that, it was 2nd-gen tessellation, because Xenos had AMD's 1st gen.
The stretch gets longer and longer and longer.

1st gen tessellation arguably was at R200/NV20 timeframe, with AMD's version dubbed TruForm even showing up in some non-niche games like CoD and Wolfenstein. But after that, I am not aware of any pre-DX11-tessellation that was accessible by game developers via open-standard APIs.
 
I meant "and 1st-gen tessellation". Corrected.
Scratch that, it was 2nd-gen tessellation, because Xenos had AMD's 1st gen.
Why are you even bringing tessellation into the debate? That was a DX11 feature, not DX10(.1), which had nothing to do with tessellation (well, there are geometry shaders, but that's not really it). And counting generations doesn't help you either, as RV770 wasn't DX11-capable.
But given how you are bringing it in, I assume D3D10 strap-ons are completely fine, in which case I feel obliged to point out that NV had its own D3D10 strap-on that exposed the majority of the actually useful features from D3D10.1. So which features are you missing?
 
The stretch gets longer and longer and longer.
Care to explain?

1st gen tessellation arguably was at R200/NV20 timeframe, with AMD's version dubbed TruForm even showing up in some non-niche games like CoD and Wolfenstein.
http://www.anandtech.com/show/2231/12

It's no secret that R600 is AMD's second generation unified shader architecture. The Xbox 360 houses their first attempt at a unified architecture, and the R600 evolved from this. It isn't surprising to learn that some of the non-traditional hardware from the Xenos (the Xbox 360 GPU) found its way into R600.

AFAIK, the tessellator unit didn't change until TeraScale 2 (DX11), because AMD's previous tessellator only supported up to a 16x geometry increase.
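As an aside, a tiny sketch (my own illustration, not from any driver documentation) of why tessellation limits matter: under uniform integer tessellation, a factor of N splits each triangle edge into N segments and yields N² sub-triangles, so the cap determines the maximum geometry amplification per patch. If we read the "16x" limit as a maximum factor (an assumption on my part), the gap to DX11's mandated 64x maximum is large:

```python
# Illustration only: uniform integer tessellation of a single triangle.
# A factor of N splits each edge into N segments, giving N*N sub-triangles.
# (Real tessellators also support fractional factors and other partitioning modes.)

def uniform_tri_count(factor):
    """Sub-triangle count for a uniform integer tessellation factor."""
    return factor * factor

for f in (2, 16, 64):  # 64 is the DX11-mandated maximum factor
    print(f"factor {f:2d} -> {uniform_tri_count(f)} triangles")
```

So factor 16 yields 256 triangles per patch versus 4096 at factor 64, a 16x difference in maximum amplification.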




Why are you even bringing tessellation into the debate?

It was a feature that nvidia cards didn't support.
Why do you care so much that I brought it up?



But given how you are bringing it in, I assume D3D10 strap-ons are completely fine, in which case I feel obliged to point out that NV had its own D3D10 strap-on that exposed the majority of the actually useful features from D3D10.1. So which features are you missing?

B-but nvidia had.. like.. strap-ons for almost all DX10.1 features that I personally find the most important...
Okay, good to know.






I wonder why a question I asked homerdog turned into all these knee-jerk reactions from third parties. I guess I touched a nerve.
It's totally logical to buy nvidia hardware if their hardware is superior in features (arguably) and efficiency, even if doing so contributes to progressively plunging the GPU market into a very dark monopoly era, as I showed in a previous post (though I completely understand why a consumer would never take that into account).

But to do so with AMD's hardware when the positions were reversed? Then we feel obliged to mention those strap-ons and be super picky about semantics!
 
Talking about marketshare: you want to know how AMD is not getting it back? Selling the 480 for more than $300 ($270 is the cheapest I've found) when you can get a better-performing 1060. Yes, in newer/future games the 480 may be better, and yes, maybe it's the resellers who actually sell most of the cards at those high prices (though some cards carry that official price), but for end users none of that exists. People don't care about resellers, process problems, or "better for the future"; most people don't think about that kind of stuff. And to be honest, right now there is no reason to buy a 480, or even a $260 MSI 4GB 470, when you can get a $250 6GB 1060 that outperforms them in most current games.

Again, if there is any AMD employee reading this, I have to tell you: that's how you don't get marketshare.
 
The stretch gets longer and longer and longer.

1st gen tessellation arguably was at R200/NV20 timeframe, with AMD's version dubbed TruForm even showing up in some non-niche games like CoD and Wolfenstein. But after that, I am not aware of any pre-DX11-tessellation that was accessible by game developers via open-standard APIs.

Don't forget Half-Life 2 (which went even further than CoD or Wolfenstein): http://www.ign.com/articles/2004/10/09/half-life-2-optimization-guide?page=4

But was TruForm really based on tessellation? It was even available on the ATI 8500 and the 9500/9700/9800 (usable with Half-Life 1, the only game I know of where a 2001 settings change could enable it).
That said, as far as I'm aware tessellation was only really shown off in ATI's DX demos (Ruby, mountains) for the HD 2000 series (2007).

Now, this had little to do with DX.

As for DX10.1, I think the additions were mainly Shader Model 4.1, shader performance improvements, and global illumination (DiRT).

I found an article about it (HD 3800): http://www.tomshardware.co.uk/amd-hd-3800-dx-10,review-29720-3.html

But I don't know what it's doing in this topic.
 
AMD's biggest problem seems to be not having enough product to sell.

Fictional RRPs don't matter too much when 4GB 470s are going for more than the RRP of the supply-constrained 8GB 480s.
 
Care to explain?
See below.
From your link:
"This is something that Microsoft is planning on adding to future versions of DirectX as well, but in the meantime developers will need to take special steps to utilize the hardware."

As I said, the count is pretty much off, and as a matter of fact there was no open standard (in fact no standard at all) before DX11. Which...
AFAIK, the tessellator unit didn't change until TeraScale 2 (DX11), because AMD's previous tessellator only supported up to a 16x geometry increase.
... surprise surprise, was dubbed 6th-gen tessellation, emphasizing how much experience AMD already had with tessellation hardware.


So, in the end it is of absolutely no importance whether or not AMD already supported/carried over some kind of tessellator in their R600 generation, since it was apparently not accessible from the outside, or at least no game developer cared enough to try to make use of it outside an industry standard.

Tessellation was there before, as I said, in R200 and NV20 already, so no innovations here for R600/RV7xx.

Don't forget Half-Life 2 (which went even further than CoD or Wolfenstein): http://www.ign.com/articles/2004/10/09/half-life-2-optimization-guide?page=4

But was TruForm really based on tessellation? It was even available on the ATI 8500 and the 9500/9700/9800 (usable with Half-Life 1, the only game I know of where a 2001 settings change could enable it).

Oh, of course! But it was Half-Life 1 with a patch, not HL2. TruForm became a software-based solution after R200, due to the little success it had (and the even smaller success of Nvidia's competing solution). And Serious Sam had bubble guns too, IIRC.
 
Talking about marketshare: you want to know how AMD is not getting it back? Selling the 480 for more than $300 ($270 is the cheapest I've found) when you can get a better-performing 1060. Yes, in newer/future games the 480 may be better, and yes, maybe it's the resellers who actually sell most of the cards at those high prices (though some cards carry that official price), but for end users none of that exists. People don't care about resellers, process problems, or "better for the future"; most people don't think about that kind of stuff. And to be honest, right now there is no reason to buy a 480, or even a $260 MSI 4GB 470, when you can get a $250 6GB 1060 that outperforms them in most current games.

Again, if there is any AMD employee reading this, I have to tell you: that's how you don't get marketshare.


Agreed with all that... but the problem with pretty much all FinFET graphics cards in general is that they're flying off the shelves at whatever prices the resellers are asking.
As soon as the supply and demand for Polaris 10 gets stable enough, you'll get the MSRP they announced.
Until then, all you can do is either wait or buy a 1060...

So, in the end it is of absolutely no importance whether or not AMD already supported/carried over some kind of tessellator in their R600 generation, since it was apparently not accessible from the outside, or at least no game developer cared enough to try to make use of it outside an industry standard.
I'm pretty sure it was used in X360 games. Why it was never ported to the PC, along with an ATi/AMD-specific path, is a whole other question...
Regardless, OK, it was mostly useless in TeraScale 1.
 
Agreed with all that... but the problem with pretty much all FinFET graphics cards in general is that they're flying off the shelves at whatever prices the resellers are asking.
As soon as the supply and demand for Polaris 10 gets stable enough, you'll get the MSRP they announced.
Until then, all you can do is either wait or buy a 1060...
Sadly, my friend, by the time that happens almost everyone who wanted a new card in that price range will have already bought a 1060, or the 1060 3GB that's selling for $199...
 