AMD: Sea Islands R1100 (8*** series) Speculation/Rumour Thread

silentguy is right. The market is really hot for anyone with experience in graphics or low power. I'm sure other areas are hot as well.
It doesn't have to be that specific, if only because everybody should be low-power oriented by now. ;) Video, fabrics, CPU architecture, caches, signal theory, overall SoC expertise, etc. I see tons of recruiter emails, old colleagues leaving, candidates coming in for interviews, and people joining, including folks fresh from college, and not even first tier(!). Lots of people changing chairs, and chairs being added.
 
everybody should be low-power oriented by now. ;) Video, fabrics, CPU architecture, caches, signal theory, overall SoC expertise, etc.

What is the benefit/profit? :LOL: Workstations with the performance of a phone or tablet? :LOL:
This is ridiculous: first of all, we will lose the beauty of high-power beasts; second, you will need several generations to catch up to the old performance; and finally, there will always be people who need maximum performance at any cost.
Low power = less work performed in any given period of time, or pure waste.
 
Qualcomm in particular is in a good position to pick up AMD employees. They have a Markham office not a Tim Horton's away from AMD's, with plenty of open positions. Meanwhile, it was recently pointed out to me that they also have an Orlando office with open positions in graphics (AMD's Florida location houses the ArtX guys).

Qualcomm and Samsung are hiring a lot from AMD in Austin.
 
My group currently has no budget, but if there was someone available who was truly "elite", we'd hire them.
Elite people rarely change jobs, and even the best teams have too few of them.

I agree. I work for a very large corporation, and we do not let go of our elite people; the few who do decide to leave are replaced with very good young talent with fresh ideas.

There has been a recession going on in North America for seven years (maybe more); your comment about not having a budget reflects what a lot of companies are dealing with.
When you sit back and think about what AMD attempted to do, taking on Intel and winning market share (which they did) while simultaneously competing with Nvidia (sometimes beating them on performance, sometimes losing), it is pretty impressive. But I think it was too aggressive for them, taking on both giants at the same time.
If AMD had good forecasting, they could have had a mobile chip and beaten Intel and Nvidia to market. Their current business model focusing on servers is sound, as all these handhelds do need to talk to something, but AMD does have expertise in CPU/GPU development and could have been a major player.
 
What is the benefit/profit? :LOL: Workstations with the performance of a phone or tablet? :LOL:
This is ridiculous: first of all, we will lose the beauty of high-power beasts; second, you will need several generations to catch up to the old performance; and finally, there will always be people who need maximum performance at any cost.
Low power = less work performed in any given period of time, or pure waste.

He didn't mean that we should give up on 100W chips. He meant everybody should be focused on power efficiency.
 
What is the benefit/profit? :LOL: Workstations with the performance of a phone or tablet? :LOL:
This is ridiculous: first of all, we will lose the beauty of high-power beasts; second, you will need several generations to catch up to the old performance; and finally, there will always be people who need maximum performance at any cost.
Low power = less work performed in any given period of time, or pure waste.
If you don't use low-power design techniques on a chip that has a 100W budget, your competitor will make the same chip with a 50W budget.
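
To make that concrete: dynamic CMOS power goes roughly as P ~ alpha * C * V^2 * f, so modest voltage and frequency reductions compound. A toy sketch in Python (every constant below is illustrative; only the scaling matters):

Code:
def dynamic_power(alpha, cap_farads, volts, freq_hz):
    """Classic switching-power approximation: P = a*C*V^2*f."""
    return alpha * cap_farads * volts**2 * freq_hz

# Illustrative constants only, not real chip numbers.
baseline = dynamic_power(alpha=0.2, cap_farads=4.1e-7, volts=1.1, freq_hz=1.00e9)
tuned    = dynamic_power(alpha=0.2, cap_farads=4.1e-7, volts=0.9, freq_hz=0.85e9)

print(f"baseline: {baseline:.0f} W")
print(f"tuned:    {tuned:.0f} W ({tuned / baseline:.0%} of baseline)")
# ~18% less voltage and ~15% less clock cut dynamic power to ~57%,
# which is how "the same chip with half the budget" happens.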

Thanks for playing, though.
 
So, what is happening, guys? Will GCN be the last great desktop architecture from AMD? Do we blame Apple and the smartphone/tablet revolution? What about Nvidia? For us graphics freaks this all sounds a little depressing... above all because people with the latest iPhone have no clue about the GPU driving their little fashion device, whereas we GPU lovers have all tried to understand the transistors moving the polys in our devices since the Voodoo era... Will all be lost like tears in the rain!?
 
GPU architectures will continue to evolve. The only change will be that the product cycles will slow down and new processes won't be used right away.
 
You know, with AMD doing so badly and everybody jumping ship, you really have to wonder what is so wrong with Nvidia that they can basically only get a tie on performance while being months late.
 
You know, with AMD doing so badly and everybody jumping ship, you really have to wonder what is so wrong with Nvidia that they can basically only get a tie on performance while being months late.
This is absurd.

The one time Nvidia doesn't make a massive monolithic die of doom, they get jumped on for "only tying" AMD. Let me also point out that most of the "abandon ship" has been happening well after the release of the 7000 series.

In reality, Nvidia's position has improved relative to AMD's. Power efficiency is much better, perf/mm^2 is better, their microstutter is drastically reduced...

But no, because their top chip is smaller than AMD's, there's "something wrong" with them.

In reality, the only thing that's wrong here is the argument you've just made.
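
For what it's worth, here's the back-of-envelope behind the perf/mm^2 point. The die areas are the commonly reported figures (GK104 ~294 mm^2, Tahiti ~352 mm^2); the performance index of 1.0 for both just encodes "call it a tie", which is this thread's own framing, not benchmark data:

Code:
# Back-of-envelope perf/mm^2. Die areas are the commonly reported
# figures; perf_index = 1.0 for both encodes "a tie" and is an
# assumption, not measured data.
dies = {
    "GTX 680 (GK104)":  {"area_mm2": 294, "perf_index": 1.0},
    "HD 7970 (Tahiti)": {"area_mm2": 352, "perf_index": 1.0},
}

for name, d in dies.items():
    density = d["perf_index"] / d["area_mm2"]
    print(f"{name:17s} {density * 1000:.2f} perf per 1000 mm^2")
# The same performance from roughly 17% less silicon is the whole
# perf/mm^2 argument in a nutshell.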
 
This is absurd.

The one time Nvidia doesn't make a massive monolithic die of doom, they get jumped on for "only tying" AMD. Let me also point out that most of the "abandon ship" has been happening well after the release of the 7000 series.

In reality, Nvidia's position has improved relative to AMD's. Power efficiency is much better, perf/mm^2 is better, their microstutter is drastically reduced...

But no, because their top chip is smaller than AMD's, there's "something wrong" with them.

In reality, the only thing that's wrong here is the argument you've just made.

Where is power efficiency better? As usual "enthusiasts" are stuck looking at the top cards and ignoring the rest of them. Care to show me how Nvidia is winning the efficiency stakes at the low end or midrange?

So Kepler wins in perf/watt at the enthusiast end when AMD saddles their GPU with a crapload of compute and a memory bus + memory that is 50% bigger than it needs to be? Jeez, I'm so impressed!

AMD could barely be in a worse state yet they are still ahead of Nvidia's "great" Kepler architecture where it matters. That should make you and rpg wonder just how godawful bad Nvidia is.
 
They could've made a GK110 consumer card if they absolutely had to.

On the other hand, I don't feel AMD's situation is as grave as it's made out to be; they have the next gen in the oven already, I'm sure. And it's not like critical engineers just walk one day and everything they touched gets delayed while their replacement gets up to speed.
 
Where is power efficiency better? As usual "enthusiasts" are stuck looking at the top cards and ignoring the rest of them. Care to show me how Nvidia is winning the efficiency stakes at the low end or midrange?

I think GK107 is great and GK106 not terrible either.
Of course AMD does well too. They both make efficient GPUs, and there's not much flame material unless you want to nitpick here and there.

I think we can order the low-to-midrange products this way, and that it holds for both power use and performance:
GTX 650 < 7770 < GTX 650 Ti < 7850

If one vendor sucked and the other didn't, we'd see a card that is both more power hungry and slower than the competition, as has happened in the past.
(I happen to think the GTX 650 and 650 Ti are appealingly power efficient :p)
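
To put rough numbers behind that ordering: the board TDPs line up the same way (64/80/110/130 W are the commonly quoted figures). The perf_index column in this sketch is a made-up monotone placeholder, not benchmark data:

Code:
# Sanity check of the ordering above, sorted by vendor board TDP.
# TDPs are the commonly quoted figures; perf_index is a made-up
# monotone placeholder, not measured performance.
cards = [
    ("GTX 650",    64,  1.00),
    ("HD 7770",    80,  1.15),
    ("GTX 650 Ti", 110, 1.50),
    ("HD 7850",    130, 1.85),
]

for name, tdp_w, perf_index in sorted(cards, key=lambda c: c[1]):
    print(f"{name:11s} {tdp_w:4d} W  perf index ~{perf_index:.2f} (placeholder)")
# Both columns rise together: GTX 650 < 7770 < 650 Ti < 7850,
# which is why neither vendor looks obviously broken here.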
 
I think GK107 is great and GK106 not terrible either.
Of course AMD does well too. They both make efficient GPUs, and there's not much flame material unless you want to nitpick here and there.

I think we can order the low-to-midrange products this way, and that it holds for both power use and performance:
GTX 650 < 7770 < GTX 650 Ti < 7850

If one vendor sucked and the other didn't, we'd see a card that is both more power hungry and slower than the competition, as has happened in the past.
(I happen to think the GTX 650 and 650 Ti are appealingly power efficient :p)

Yes, I agree; however, that's what I said. Nvidia basically ties while being months late, and while Nvidia fanboys are happily putting the boot in, they should remind themselves that the best their team can muster while AMD is imploding is a draw. I see no reason for anyone to be smug right now.
 
Whatever helps you sleep at night. Meanwhile, in the real world, Nvidia's raking in the dough because AMD flopped terribly with OEM wins.

Where is power efficiency better? As usual "enthusiasts" are stuck looking at the top cards and ignoring the rest of them. Care to show me how Nvidia is winning the efficiency stakes at the low end or midrange?

So Kepler wins in perf/watt at the enthusiast end when AMD saddles their GPU with a crapload of compute and a memory bus + memory that is 50% bigger than it needs to be? Jeez, I'm so impressed!

AMD could barely be in a worse state yet they are still ahead of Nvidia's "great" Kepler architecture where it matters. That should make you and rpg wonder just how godawful bad Nvidia is.
You aren't thinking rationally. Your post is saturated with confirmation bias, and you have no business partaking in rational discourse until you can demonstrate an ability to think critically.

Why are you unimpressed? Because you want to be. Because you want AMD to win, and because you want Nvidia to lose. So you come here and post a ridiculous argument with no factual backing in order to paint Nvidia as the loser.

This isn't the Reddit hardware hivemind. This is Beyond3D. Your drivel does not belong here, and the majority of us around here are just going to roll our eyes at you.
 
Whatever helps you sleep at night. Meanwhile, in the real world, Nvidia's raking in the dough because AMD flopped terribly with OEM wins.

Yeah I remember running the numbers on it. Nvidia has made a profit of about $10m a quarter on average for the past 5 years. Great stuff, really raking in the dough. :LOL:
 
You aren't thinking rationally. Your post is saturated with confirmation bias, and you have no business partaking in rational discourse until you can demonstrate an ability to think critically.

Why are you unimpressed? Because you want to be. Because you want AMD to win, and because you want Nvidia to lose. So you come here and post a ridiculous argument with no factual backing in order to paint Nvidia as the loser.

This isn't the Reddit hardware hivemind. This is Beyond3D. Your drivel does not belong here, and the majority of us around here are just going to roll our eyes at you.

:LOL: Taxi for "Homeles"
 