The New and Improved "G80 Rumours Thread" *DailyTech specs at #802*

Discussion in 'Pre-release GPU Speculation' started by Geo, Sep 11, 2006.

Thread Status:
Not open for further replies.
  1. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    This is basically what nVidia's been saying all along.
     
  2. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Likes Received:
    30
    Location:
    Io, lava pit number 12
  3. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
You're giving them a lot more credit than I am :) I'm just speculating as to where these numbers could have possibly come from. A whisper about a 1.5GHz memory clock can easily turn into a rumour about a 1.5GHz core clock as the whisper is passed along.

     
  4. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
If a current part is doing 1.0GHz I don't think 1.5GHz is too unbelievable for the maximum speed of a next-generation GDDR4 memory controller. G8x is going to last quite a while.

    It's not really a jump either for GDDR4 itself - 1.25GHz was done and dusted last year and 1.6GHz is on the way. http://www.xbitlabs.com/news/memory/display/20060214062714.html
     
    #424 trinibwoy, Sep 20, 2006
    Last edited by a moderator: Sep 20, 2006
    Jawed likes this.
  5. Arun

    Arun Unknown.
    Legend

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    302
    Location:
    UK
    In other news, a reliable source told me the ATI lineup will be the X2800XTX/XT/GT, the X2600XT/XL and the X2300XT/Pro.
    Look at me, my info is reliable!!1!1!!oneoneone...... sigh.

    Uttar
     
  6. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Likes Received:
    30
    Location:
    Io, lava pit number 12
    :lol:
     
  7. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    Wrong thread! :evil:


    :wink:
     
  8. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,511
    Likes Received:
    224
    Location:
    Chania
I was actually thinking about the first high-end release of the G8x family; that 1.25GHz you're mentioning equals roughly >50% more bandwidth than on G71. Cost is always a consideration, as is keeping reserves for refreshes for when prices drop even more.

Currently the 64GB/sec of the R580+ plays out mostly in ultra-high-end resolutions (something like 2048 and beyond), and I doubt that for the next 6-8 months more than 80GB/s of bandwidth is actually an absolute necessity. Personally I'd be glad if a G80/R600 can play coming games like UT2k7 with all bells and whistles on at 16*12; if anything should keep those from reaching playability in higher resolutions, I have severe doubts that the primary bottleneck will be bandwidth.
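    The bandwidth figures being thrown around here follow from simple arithmetic. A minimal sketch, assuming 256-bit buses and double-data-rate signalling; the 800MHz G71 memory clock is my assumption, not stated in the post:

    ```python
    def peak_bandwidth_gb_s(mem_clock_ghz: float, bus_width_bits: int) -> float:
        """Peak memory bandwidth in GB/s: DDR memories transfer on both clock
        edges (2x the clock), and bus_width_bits / 8 bytes move per transfer."""
        return mem_clock_ghz * 2 * bus_width_bits / 8

    # R580+ (X1950 XTX): 1.0GHz GDDR4 on a 256-bit bus -> the 64GB/s cited above.
    r580 = peak_bandwidth_gb_s(1.0, 256)   # 64.0

    # G71 (7900 GTX): 800MHz GDDR3 on a 256-bit bus (clock is my assumption).
    g71 = peak_bandwidth_gb_s(0.8, 256)    # 51.2

    # Hypothetical 1.25GHz GDDR4 on the same 256-bit bus:
    g8x = peak_bandwidth_gb_s(1.25, 256)   # 80.0

    print(f"{g8x / g71 - 1:.0%} more than G71")  # ~56%, i.e. "roughly >50% more"
    ```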
     
  9. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,716
    Likes Received:
    2,137
    Location:
    London
    Yep.

    Jawed
     
  10. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
Sure, 64-80GB/s is a very good possibility for the first G80 SKUs. Which is even more reason to believe that the memory controller would happily handle speeds higher than 1.25GHz. And we don't know what's coming in terms of AA - maybe there are some cool things there that can soak up any extra bandwidth if things are shader bound. However, I'd still wager a lot on things continuing to be bandwidth bound next generation, with HDR+AA and other fancy framebuffer stuff going on.
     
  11. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Well, considering that nVidia will probably be making use of the G8x architecture for the next 2-3 years, 1.5GHz isn't that outlandish an upper limit on clock speeds for the architecture. But it won't be seen any time soon, so it seems odd to have that number in any list of specs.
     
  12. nutball

    Veteran Subscriber

    Joined:
    Jan 10, 2003
    Messages:
    2,492
    Likes Received:
    979
    Location:
    en.gb.uk
Depends on how you look at it. Intel made a big fuss about how Netbust was supposed to scale to 10GHz. ATI made a big fuss about how the memory controller in R5xx was supposed to scale to GDDR4. PR PR PR.

    Anyway, what we've seen are hardly specs, more like a bunch of rumours / made-up-stuff flying in close formation.
     
  13. SugarCoat

    Veteran

    Joined:
    Jul 17, 2005
    Messages:
    2,091
    Likes Received:
    52
    Location:
    State of Illusionism

The memory controller does and did scale to GDDR4... The damn thing was BUILT for GDDR4, because they thought that's what the cards would be using at the time the R520 was planned.

I suppose what you should have said is that ATI touted some hidden performance increase when the memory controller was configured for GDDR4, which, frankly, the X1950 completely failed to show. It's given some nice perks, namely the OpenGL ones, but no difference attributable to the memory can be seen.
     
    #433 SugarCoat, Sep 20, 2006
    Last edited by a moderator: Sep 20, 2006
  14. Sunrise

    Regular

    Joined:
    Aug 18, 2002
    Messages:
    306
    Likes Received:
    21
It would, and I really don't see anything that could hinder it from scaling even higher.

First, there isn't any reason why the memory controller would suddenly just stop working if you fitted the board with GDDR4 memory chips that are >1.5GHz: as long as the specs are met, you just have to provide sufficient voltage and a PCB that can cope with it. Second (they were referring to core frequency, after all), there is simply no way to estimate clock improvements over time, factoring in process maturity, new process nodes, maybe SOI etc., so that number can only be derived from some obscure misunderstanding, or it was just put in there to make the list look genuine - which is kind of funny, because it achieves the exact opposite.

Concerning bandwidth, R600 will have >80GB/s and I highly doubt that NV will be far below that, even on the parts based on the first production run of G80. They will only get even more ambitious, and G80 was designed with >2 years in mind.
     
    #434 Sunrise, Sep 20, 2006
    Last edited by a moderator: Sep 20, 2006
  15. bdotobdot2

    Newcomer

    Joined:
    Jun 7, 2004
    Messages:
    56
    Likes Received:
    1
    Location:
    Tonawanda, NY
    Nvidia GeForce 8800 Specs Posted by Web-Site - http://www.xbitlabs.com/web/display/20060919075610.html

    ...
    The whole Nvidia G80 specifications list by VR-Zone looks as follows:
    * Unified Shader Architecture;
    * Support FP16 HDR+MSAA;
* Support GDDR4 memory;
    * Close to 700M transistors (G71 - 278M / G70 - 302M);
    * New AA mode: VCAA;
    * Core clock scalable up to 1.5GHz;
* Shader performance: 2x Pixel/12x Vertex over G71;
    * 8 TCPs & 128 stream processors;
    * Much more efficient than traditional architecture;
    * 384-bit memory interface (256-bit+128-bit);
    * 768MB memory size (512MB+256MB)
    ...
    * GeForce 8800GTX: 7 TCPs chip, 384-bit memory interface, hybrid water/fan cooler, water cooling for overclocking. $649;
    * GeForce 8800GT: 6 TCPs chip, 320-bit memory interface, fan cooler. US$449-499;
    ...
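    As a rough illustration of what the rumoured 384-bit (256-bit + 128-bit) interface would mean for bandwidth - a sketch assuming double-data-rate GDDR4; the 1.25GHz clock below is a hypothetical figure from elsewhere in the thread, not part of the leak:

    ```python
    def peak_bandwidth_gb_s(mem_clock_ghz: float, bus_width_bits: int) -> float:
        """Peak bandwidth in GB/s for a double-data-rate memory bus."""
        return mem_clock_ghz * 2 * bus_width_bits / 8

    # The rumoured interface is two partitions, 256-bit + 128-bit = 384-bit,
    # with the memory split the same way: 512MB + 256MB = 768MB.
    assert 256 + 128 == 384 and 512 + 256 == 768

    # At a hypothetical 1.25GHz GDDR4 clock (a thread assumption, not a spec):
    print(peak_bandwidth_gb_s(1.25, 384))  # 120.0 GB/s
    print(peak_bandwidth_gb_s(1.25, 256))  # 80.0 GB/s on a plain 256-bit bus
    ```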
     
  16. nutball

    Veteran Subscriber

    Joined:
    Jan 10, 2003
    Messages:
    2,492
    Likes Received:
    979
    Location:
    en.gb.uk
Well, what I meant was that ATI made a big play of their forward-looking design, their future-proofing of their memory controller, and so on. This was positive spin on a design feature which, at the time it was announced, was totally useless as far as the people buying the boards then were concerned, because it wasn't exploited. But it was good PR, because it made ATI look like a company with "vision"; it made the best of a bad situation, maybe.

So you could judge "scales to 1.5GHz" either as a specification (which it clearly isn't), or as the same sort of positive PR spin, or as complete nonsense.
     
  17. Pete

    Pete Moderate Nuisance
    Moderator Legend

    Joined:
    Feb 7, 2002
    Messages:
    5,777
    Likes Received:
    1,814
    I can't believe even you're bungling supar sekret info, U. That last "oneone" should obviously be "eleventy" for anyone who knows anything about 3D. Hang your head!

    I guess bdotobdot2's unnec'y repetition is as good a way to bookend this craziness as any. Now, let's all take Rys' recommended cold salt shower and move on ... to the leaked R600 specs! Like salt, they also contain zero calories and are likely to lead to hypertension: good stuff. Oh, right, geo, wrong thread. ;)
     
  18. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
Did these guys piece together that Xbox 360? :lol:
     
  19. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    Talk about it in the R600 thread instead, or papa spank. Danke.
     
    _xxx_ likes this.
  20. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
Actually, from what I've seen, Nvidia's pricing is surprisingly UN-aggressive.

Basically what they're doing is being competitive in price/performance at every segment, but only that. In fact, a case can be made that ATI leads in performance in many price segments - the X1900GT vs the 7950 GT, for example, or considering that the X1900XT can be had for only about $300 these days.

    But Nvidia is competitive in every segment with MUCH smaller dies, ~80% smaller.

So what that tells me is that, unless ATI is somehow getting hugely better yields (which is highly unlikely), what Nvidia is really doing is maximizing profit rather than market share (although their market share is certainly not suffering either). They are selling tiny dies at the same prices at which ATI is selling big dies. They are pushing that 40% margin upwards, while ATI is at what, 20-30%?

The one exception to this price/performance analysis is the 7600GT: while ATI owns performance in the $200+ segments, it has absolutely no reasonable answer for that card.

Also, I would note that the latest INQ G80 article states in an end footnote that Fuad thinks the leaked G80 specs are unreasonable for this process generation. A quick thought exercise proves him wrong, imo: 700 million transistors is just above double G71, while ATI is already selling ~80% larger dies at willy-nilly sub-$300 prices! Sure, G80 would be the largest and most expensive chip yet, but it's NOT unreasonably larger than R580.

Although I just realized he may also have meant the clock speed; ah well, I'll leave my comment.
     
    #440 Rangers, Sep 21, 2006
    Last edited by a moderator: Sep 21, 2006
    bloodbob likes this.
