AMD: R8xx Speculation

How soon will Nvidia respond with GT300 to the upcoming ATI RV870 lineup of GPUs?

  • Within 1 or 2 weeks

    Votes: 1 0.6%
  • Within a month

    Votes: 5 3.2%
  • Within a couple of months

    Votes: 28 18.1%
  • Very late this year

    Votes: 52 33.5%
  • Not until next year

    Votes: 69 44.5%

  • Total voters
    155
  • Poll closed.
That area appears to be divided into 32 blocks or tiles on each side.

I would guess that the little row of units above those tiles would be the ROPs.
How many does it appear to be from that image?

The number of SIMD rows also looks high.
If still the same number of units per block, the ALU count should be much higher.

Something about this appears a touch high or there is an insane amount of redundancy, given the numbers already circulating.
 
Was the talk of multicore just a trick on Nvidia's part, or is there some kind of multicore on a die à la Phenom?

OK, OK, I know that a GPU is already multicore, but you understand what I'm trying to say :p
 
All I'm saying is that it's funny to see people praising DX11 and bashing PhysX at the same time. To me it all comes down to graphics in the games I play. And for now I see more potential in PhysX than in DX11 for delivering better graphics. _For now_.

I'm fairly excited for DX11 while PhysX leaves me only so-so. I just did a clean install of Sacred 2, patched, and fired it up with PhysX enabled. Game crashed every 15 minutes or so, and moreover the frame rate was horrible on my GTX 285 (Core i7 765, 6GB of memory, game loaded on an Intel SSD). Turn PhysX off and the frame rate greatly improved and the crashes stopped. I've yet to see PhysX do anything but add a little bit of visual enhancement(s) to a game, which really isn't what I envisioned when first hearing about hardware accelerated physics.
 
Those benchmark graphs. I wonder what happened in LP Colonies, as performance tanked vs. the 295 on Ultra High settings.
 
Something about this appears a touch high or there is an insane amount of redundancy, given the numbers already circulating.
You're not wrong. It's either a fake, they're being extremely conservative, or it's a snowjob... 128 KB of L2 per memory controller = 1 MB total for a 256-bit bus. The area somehow looks too big.
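The cache arithmetic in the post above can be checked with a quick sketch. The channel split is an assumption on my part (eight 32-bit channels is what makes the quoted 128 KB-per-controller figure total 1 MB; the post doesn't say how the 256-bit bus is partitioned):

```python
# Sketch of the L2-per-memory-controller arithmetic from the post above.
# Assumption: the 256-bit bus is split into eight 32-bit channels,
# each paired with one 128 KB L2 slice. The channel width is a guess,
# not something stated in the post.
bus_width_bits = 256
channel_width_bits = 32          # assumed channel granularity
l2_per_channel_kb = 128          # figure quoted in the post

channels = bus_width_bits // channel_width_bits   # 8 channels
total_l2_kb = channels * l2_per_channel_kb        # 1024 KB = 1 MB

print(f"{channels} channels, {total_l2_kb} KB L2 total")
```

With four 64-bit controllers instead, the same 128 KB slice would only total 512 KB, so the 1 MB claim implicitly assumes the finer split.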
 
AMD next generation RV870 key specs revealed?

It is said that AMD's next-generation graphics core may be named RV870 and, per TSMC's technology, will use a 40nm or 45nm process. The core area of RV870 will be about 140mm², much smaller than RV770's 260mm². As far as we know, it will have 192 ALUs. In RV770 each ALU maps to 5 SPs, so RV870 would have 960 SPs. To keep the core area in check, the memory bus stays at 256-bit. We believe RV870 will be about 1.2 times RV770 in performance, but this will be decided by RV870's clock.

It is also said that AMD's next-generation R800 will use a new design. The Radeon HD 3870 X2 and the coming Radeon HD 4870 X2 both used a single-PCB, dual-GPU design, while R800 may use a true dual-core design. If so, AMD's next-generation flagship R800 would be the first dual-core GPU, with specs double those of RV870.

An advanced 45nm (40nm?) process will give RV870 a smaller core area. The current RV770 performs well, but its temperatures are really terrible. If RV870 can solve this problem and further improve performance, that would be really exciting for us, and it could be the first real dual-core GPU. http://en.hardspell.com/doc/showcont.asp?news_id=3768
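The unit counts in the quoted rumor follow from simple arithmetic, sketched below. Everything here is taken from the article's own figures (192 ALUs, 5 SPs per ALU as on RV770, 140mm² vs. 260mm², R800 doubling RV870):

```python
# Arithmetic behind the quoted RV870 rumor.
alus = 192
sps_per_alu = 5                    # VLIW5 layout, as on RV770
total_sps = alus * sps_per_alu     # 960 SPs

# Die-area comparison quoted in the article (mm^2).
rv870_area = 140
rv770_area = 260
shrink = rv870_area / rv770_area   # ~0.54, i.e. roughly half the area

# R800 is rumored to simply double RV870's unit counts.
r800_sps = 2 * total_sps           # 1920 SPs

print(total_sps, r800_sps, round(shrink, 2))
```

Note the tension the thread picks up on: 960 SPs in 140mm² would mean far higher density than RV770's 800 SPs in 260mm², which is part of why posters below call the numbers into question.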
Excuse me but how and why did anyone even consider this piece of news for a minute? 5 minute penalty on Shtal for posting such a thing, you cannot post or speculate for the next 5 minutes. Here is your dunce cap. :p :p
 
Here, this should be to scale now, 1mm = 20px
rv770-870scale.png
 
Breathe deeply...

09091421276f23be6536170.jpg


I think that with some more effort, there is room for this thing to house at least a 384-bit memory interface, judging by the perimeter features.
Fake.

I made it to show how to make a good fake (Maybe too good?) -> French Forums

It seems that someone from bbs.pczilla.net has visited the French Hardware.fr forums recently, and it's possible that he did not see the word "fake" above the image. I have never posted this image on another forum.
 
No it doesn't. Nvidia has been posting losses quarter after quarter since the HD 4xxx launch.

They have lost money the last 3 quarters, one of which (the last) was because of a one-time charge. They always made money before that and are going to make money for the next few quarters, going by the estimates.

They still added cash to the bank last quarter, since the charge is a whole different beast.

They have 1.47 billion in cash and no debt. They are financially strong.

I'll grant you that part of NVDA's downswing was because of the success of ATI's GPUs... because they kicked ass. That doesn't discount the fact that AMD has lost massive amounts of money every quarter for many years in a row and carries a huge debt load.
 

NVDA income past 12 months: -392.60 Million

They may be recovering now, but for the past year they've been losing money.

Note - this obviously isn't to say that AMD isn't losing money, which they are. But that is almost entirely due to the CPU side. As you noted before, the graphics division only lost 12 million last quarter, which is a drop in the bucket compared to what the CPU side is bleeding. And the 12 million loss is still better than how Nvidia did.

Anyway, this is the WRONG THREAD for this. If you want to continue, I suggest we move it to either the AMD or Nvidia doom and gloom thread.

Regards,
SB
 