AMD: Speculation, Rumors, and Discussion (Archive)

Discussion in 'Architecture and Products' started by iMacmatician, Mar 30, 2015.

Thread Status:
Not open for further replies.
  1. CSI PC

    Veteran Newcomer

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
    Are you sure?
Because from all the documentation out there, TSMC's 16FF+ appears in general to be a higher-power/"performance" design compared to the Samsung process, and was designed to replace their 28HP/HPM technology.
    Probably makes sense for there to be a thread soon combining all the documentation relating to both Samsung and TSMC 16nm options.

    Cheers
     
  2. Otto Dafe

    Regular

    Joined:
    Aug 11, 2005
    Messages:
    400
    Likes Received:
    59
It's unclear to me. Cadence is certainly marketing 16FF+ as the successor to TSMC's 28HP, but I don't know if it's a high-performance process in the same sense. Certainly we can say that it's HP compared to TSMC 16FFC, but is it compared to Samsung/GF 14nm? I don't know. From reading the various foundries' descriptions I'm getting the sense that maybe from now on everything is low-power FinFET and that's that.

On the other hand, didn't Nvidia and AMD pass on 20nm because it was LP, and wouldn't they have known that forthcoming nodes would also be strictly LP?
     
  3. gamervivek

    Regular Newcomer

    Joined:
    Sep 13, 2008
    Messages:
    715
    Likes Received:
    220
    Location:
    india
    Now a 280X is like 20% faster than a 770 at 1080p while it was like 2% faster in your 'Now' review.

    http://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_Waterforce/23.html

The difference remains similar between the 970 and 7970 though, so it seems the 2GB is hitting its limits, besides of course the
*cough* Kepler gimping *cough*
     
  4. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,183
    Likes Received:
    1,840
    Location:
    Finland
20nm had a multitude of other issues too, and according to Cadence, 20SoC was a "high performance family" as well. They do mention that TSMC has two variations of 16FF+, but AFAIK they're still both low-power processes, one just more area-optimized than the other. For what it's worth, the Apple A9 is, according to some reports, built on 16FF+ too, the supposed "high performance process".
     
  5. CSI PC

    Veteran Newcomer

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
Well, TSMC imply 16FF+ is related to 28HPM, and that the lower-power option is, as mentioned, 16FFC.
They do compare 16FF+ to 20SoC.
However, where 20SoC may be deemed similar to the HP family is the work I thought was done to create high-performance FPGAs for Xilinx on that technology. But yeah, I agree with you, it is all questionable.
Remember there were technical issues in developing the 20nm technology that skewed what was released and its real-world limitations (which may differ from what could have been achievable with more cash thrown at it and, critically, more time, which is rarely an option when a project is already over-running).

We know the GPU-related process from Samsung is coming from their LPP, as GF mention they have had success in adapting it (though to what extent is not publicly stated).
I get the feeling this is going to turn into a semantic debate about TSMC's terminology: "than its 28HPM technology" versus "comparing with 20SoC technology" :)
For now, going by what has been released, I would tend to think it is related to 28HPM, which ties in a bit with the presentation they gave at the 2015 Technology Symposium in San Jose.

But yeah, the only place where we have a true and full picture is the 28nm ecosystem: http://www.tsmc.com/english/dedicatedFoundry/technology/28nm.htm
Some of the variables each of us mentioned were touched on earlier in the year on the SemiAccurate forum: http://semiaccurate.com/forums/showthread.php?t=8647
It will be interesting when more details come out on both technologies from Samsung and TSMC.
    Cheers
     
  6. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
I don't think so. But I haven't seen hard quantitative numbers to know for sure.
     
  7. Frenetic Pony

    Regular Newcomer

    Joined:
    Nov 12, 2011
    Messages:
    331
    Likes Received:
    85
    Anandtech had a good piece on this and concluded that you couldn't conclude much unless you had a large amount of samples: http://www.anandtech.com/show/9708/analyzing-apple-statement-for-tsmc-and-samsung-a9

E.g., they're close enough that it probably won't matter that much. Samsung actually started producing chips first, but TSMC has had better yields so far. But by next year? Who knows.
     
  8. FriendlyNeighbour

    Newcomer

    Joined:
    Sep 18, 2013
    Messages:
    21
    Likes Received:
    8

If someone got the 7970 GHz Edition, they're now looking at a GPU roughly 30% faster than the 670. The 280X is the 7970 GHz Edition. The 960 sits at more or less the same performance level as the 670.


AMD GPUs have aged far better than their NV competitors. But that's beside the point, because how many people keep their GPUs for over 3 years if they are serious gamers? Unless you're gaming mostly on indie games or old DX9 titles, not many.
     
  9. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,583
    Likes Received:
    703
    Location:
    Guess...
You have to consider that not everyone is always playing the latest benchmarking suites. While a 280X may be much faster in, say, GTA V, things are likely still much more even in older and perhaps just less high-profile games.
     
  10. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    10,873
    Likes Received:
    767
    Location:
    London
    My 1GHz 7970 has been going strong since summer 2012. Admittedly World of Tanks is most of my gaming. The Japanese heavy tanks at tiers 5 and 6 are very entertaining :lol2:

    Another way of looking at it might be which card has the longer list of games that are unplayable at "max settings". I think it could be argued that at 1080p that list isn't very different between 670 and 7970.
     
  11. gamervivek

    Regular Newcomer

    Joined:
    Sep 13, 2008
    Messages:
    715
    Likes Received:
    220
    Location:
    india
There is like a 20-30% or even greater discrepancy, which doesn't require a large number of samples to verify; 3-4 reviewers ought to be enough. From what more I've read of it, it seems to be confined to the Geekbench battery test for now.
     
  12. Frenetic Pony

    Regular Newcomer

    Joined:
    Nov 12, 2011
    Messages:
    331
    Likes Received:
    85
That's not the point the article is trying to make. The point is that you need a minimum N (sample size) under controlled conditions to actually get anything useful; a test of two single phones isn't nearly enough. You'd need a minimum of about 20 each of the Samsung and TSMC samples (depending on the acceptable range for each chip) to get anything useful. 20-30% could be fully within the expected voltage/leakage variance for Apple's SoCs. Thus the entire difference could come from one chip simply coming out much better than the other, rather than from a distinct process advantage.
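    The minimum-N argument can be sketched numerically with the standard two-sample power formula. The figures below are hypothetical, just to show the shape of the calculation: assume a ~5% chip-to-chip standard deviation in battery life and that we want to resolve a true 3% process difference at 95% confidence and 80% power.

    ```python
    from math import ceil
    from statistics import NormalDist

    def min_samples_per_group(sigma, delta, alpha=0.05, power=0.8):
        """Minimum N per group (two-sample test) to detect a true mean
        difference `delta` when the per-unit standard deviation is `sigma`."""
        z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
        z_b = NormalDist().inv_cdf(power)          # desired power
        return ceil(2 * ((z_a + z_b) * sigma / delta) ** 2)

    # Hypothetical numbers: 5% per-chip spread, 3% true process gap.
    print(min_samples_per_group(sigma=5.0, delta=3.0))  # → 44 phones per group
    ```

    Under those assumptions you would need dozens of phones per foundry, which is roughly in line with the "a test of two phones isn't nearly enough" point; the exact N shifts with whatever spread and gap you plug in.
    
    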
     
  13. Rikimaru

    Veteran Newcomer

    Joined:
    Mar 18, 2015
    Messages:
    1,014
    Likes Received:
    395
    Do I have to change my Radeon 7870 to R9 370X to be a serious gamer? :lol:
     
  14. FriendlyNeighbour

    Newcomer

    Joined:
    Sep 18, 2013
    Messages:
    21
    Likes Received:
    8
    I never said serious gamers don't keep their GPUs for over 3 years, just that it's not very often ;)

    And yes, you probably should. AMD needs all the help in the world right now. Do your duty!
     
    Kaarlisk and pjbliverpool like this.
  15. flopper

    Newcomer

    Joined:
    Nov 10, 2006
    Messages:
    150
    Likes Received:
    6
Huh?
Most gamers play old games.
DX9 engines are still used, and new games are still created on them.
Loads of people don't upgrade their cards because they play at 1920x1080 or less, for example in CS:GO or similar e-sports games.
You don't need new cards for 200 fps in those games.

I bought a new game yesterday, and I get over 200 fps at 1440p at max settings with a 390.
For what it costs, this 390, a 290-series card that is soon 3 years old, does just fine.

If AMD can deliver, people will buy more of their cards, but they often choose a more technical approach than what the customer actually wanted.
The guy who works there who called the Fury an OC dream must have been drunk; he was talking smoke out of his ass.
Such comments will hurt a company for decades.
That's the issue with engineers: they can't understand their own market.
     
  16. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
DX9 games are going to be here for a while, and it's not the age of the game that does that either. Most engines were built with DX9 in mind and then updated to newer DX versions. But keep this in mind: since the new consoles were released, there have been quite a few games that required DX11-and-up hardware even though the engine they were built on was an updated DX9 engine. So yeah, developers are taking the easy road and dropping support for more than two generations of DX. And this is mainly due to console development.

Upgrade cycles are personal preferences, and I think most people who spend $300+ have expendable cash and will upgrade when they feel the upgrade is worth it. This release they didn't show that the upgrade was worth it, because nV's cards were there. AMD's market share loss shows that too. I'm sure AMD has lost most of its market share at the high end, with a little less loss in the mid-range brackets (they lost at the low end too, but the breakdown there is probably less severe).

All this stuff doesn't matter if AMD cards are "aging" better. Get this shit out right if you want sales, because once something is launched and reviewed, there are no take-backs.
     
    #336 Razor1, Dec 28, 2015
    Last edited: Dec 28, 2015
    pharma and homerdog like this.
  17. gamervivek

    Regular Newcomer

    Joined:
    Sep 13, 2008
    Messages:
    715
    Likes Received:
    220
    Location:
    india
    And how do you think this minimum N is calculated? ;)

They lament at the end that they only had TSMC chips, but they could have easily used them to check how much variance there is between chips from TSMC.
Apple claim 2-3% variance for their phones, so a 20-30% difference is too far out on the curve to need that much testing.
     
  18. sebbbi

    Veteran

    Joined:
    Nov 14, 2007
    Messages:
    2,924
    Likes Received:
    5,288
    Location:
    Helsinki, Finland
The Radeon 7970 is a surprisingly potent GPU. It sucks at tessellation and triangle throughput, but it even beats the GeForce 780 in many common compute tasks. DX12 async compute improves GCN performance even further. With proper next-gen engines based on compute shaders (and optimized for the GCN architecture), the old AMD GCN GPUs are going to remain competitive for quite a long time.

GCN also has tier 3 resource binding and many other DX12 features not supported by the GeForce 600/700 series. It looks like AMD's architecture was too forward-looking, focusing too much on excellent compute performance, compute flexibility and DP. Now that developers can finally drop support for last-gen consoles (and DX 9/10 PCs), new engines can be fully designed around GPU compute. This is unfortunately a little bit too late for AMD, as they have already lost a lot of their GPU market share. Nvidia's focus on geometry performance, ROP delta compression and other rendering-related improvements was the right call for games designed around old console hardware limitations.
     
    #338 sebbbi, Dec 28, 2015
    Last edited: Dec 28, 2015
  19. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
I think that has always been AMD/ATi's problem: they pretty much "over"-engineer their chips, or assume software is going in a certain direction quickly, without taking into consideration the time to release of the software that will use those features. If those features can't be shown as useful right off the bat, they are pretty much "nice to have" but not essential for marketability.

    PS happy holidays everyone!
     
    pharma likes this.
  20. dogen

    Regular Newcomer

    Joined:
    Oct 27, 2014
    Messages:
    335
    Likes Received:
    259
    Well, wasn't GCN made with consoles in mind? If so, it makes sense that it would be designed for the future, in a sense, given it would need to last for 5+ years.
     
  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.