NGGP: NextGen Garbage Pile (aka: No one reads the topics or stays on topic) *spawn*

Discussion in 'Console Technology' started by Barso, Aug 30, 2012.

Thread Status:
Not open for further replies.
  1. Ruskie

    Veteran

    Joined:
    Mar 7, 2010
    Messages:
    1,291
    Likes Received:
    1
    That's why I said not to listen to "imaginary insiders". Suddenly everyone has a source. The only guys (besides Iherre and bkillian) who knew something and actually said it were Sweetvar26 and AndyH. Sweetvar26 said they dropped Steamroller for Jaguar and that the clocks are 1.6GHz. He also said both GPUs are based on the 7850/7770 and that they are doing something "special" for Durango.

    All the other talk seems to be based on the legit rumor of a downclocked Pitcairn and 2GB of GDDR5 that will likely go to 4GB if densities allow. You spray a bit of imagination on it, post a "wink", and people think you know something. Suddenly you start to enjoy it, and the next thing you know, you've trolled people for a few months without knowing shit.
     
  2. trainplane

    Newcomer

    Joined:
    Jan 16, 2013
    Messages:
    151
    Likes Received:
    54
    Are you referring to Durango or Orbis? Orbis is already strongly rumored to have 4GB of shared GDDR5, while Durango is rumored to have none, only DDR3 or maybe DDR4. Durango having 8GB of main RAM plus 2GB of GDDR5 VRAM hasn't been floated in the mainstream rumors.
     
  3. Acert93

    Acert93 Artist formerly known as Acert93
    Legend

    Joined:
    Dec 9, 2004
    Messages:
    7,782
    Likes Received:
    162
    Location:
    Seattle
    No, he is referring to the fact that half a year ago Orbis (PS4) was a 4-core / 2GB GDDR5 design, and now the rumors say it's an 8-core / 4GB GDDR5 design.

    Now the "in the know" insiders will say, "Those were just revisions," wink wink. i.e. "The rumors weren't wrong! The design just changed." But if you look at the various leaks and how the various CPU architectures changed, this is just code for "I was leaking accurate information; only the target changed." Which, you know, really shouldn't change much 18 months out from launch. People get very old design-stage info that is years out of date, or just plain bad information--or intentional disinformation--and it is always reconciled away.

    Seriously, I don't see anyone eating crow over getting the Orbis CPU core count wrong by 2x! Instead I see a lot of crowing over "look at me, I got new information!" and a lot of posturing about how insiders "have real info." Really? Then how is it that something as basic as the core count could be wrong right up until 10 months before product launch?

    What else do they have wrong? It isn't like Sony magically decided in December to double the CPU core count. So what other information do insiders have flat out wrong?
     
  4. trainplane

    Newcomer

    Joined:
    Jan 16, 2013
    Messages:
    151
    Likes Received:
    54
    Okay, thanks. He was using the future tense, so that threw me off.

    So are you implying that self-proclaimed insiders didn't really have info, and that even now they could be wrong about the Orbis and Durango details? I mean, since all of this is supposed to be a guarded secret, it would make sense that anything coming out to the public is half-truths with numbers getting crossed. The next Xbox having only DDR3 and eDRAM doesn't make sense to me, but I wouldn't put it past them. Also, for Orbis, Pitcairn has 154 GB/s, so I don't know where insiders are getting 192 GB/s unless it is a highly modified Pitcairn.
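    For what it's worth, the gap between 154 and 192 GB/s falls straight out of bus width times per-pin data rate. A quick sketch (the 256-bit bus and 4.8 Gbps figure are the publicly listed retail Pitcairn numbers; the 6.0 Gbps bin is an assumption, not anything from the leaks):

```python
# Peak GDDR5 bandwidth = (bus width in bits / 8) * effective per-pin data rate.
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

# Retail Pitcairn (HD 7870): 256-bit bus, 4.8 Gbps GDDR5 -> 153.6 GB/s.
print(peak_bandwidth_gbs(256, 4.8))
# The rumored 192 GB/s needs nothing exotic: 6.0 Gbps chips on the same 256-bit bus.
print(peak_bandwidth_gbs(256, 6.0))
```

    So "highly modified" isn't required; a memory-clock bump alone would close the gap.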
     
  5. Bagel seed

    Veteran

    Joined:
    Jul 23, 2005
    Messages:
    1,533
    Likes Received:
    16
    Potentially everything. People are hanging on every letter and syllable insiders and devs post and treating it like gospel, from now until Orbis and Durango are discontinued. Things can change quickly between now and launch, so I'm keeping an open mind and not going around waving a big 'this is it' flag.
     
  6. Heinrich4

    Regular

    Joined:
    Aug 11, 2005
    Messages:
    596
    Likes Received:
    9
    Location:
    Rio de Janeiro,Brazil
    Sorry all if this was discussed before, but it's something I don't remember reading here... it seems Durango SDKs have about 12GB of RAM (not sure if DDR3 or GDDR5), but what about the Orbis?
     
  7. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,106
    Likes Received:
    16,898
    Location:
    Under my bridge
    It makes perfect sense! There's no point in combining fast eDRAM with fast main RAM. If you're going to use eDRAM, you'll use it as an economical way to add significant bandwidth, just as the 360 did. And with the 360's design, use of that eDRAM can be extremely efficient. IMO the only economically sound reason to go with GDDR5 on a wide bus, like Sony is rumoured to be doing, is if you have an eye on a stacked RAM module in a future iteration. If that doesn't happen, the minimum cost of your machine is going to stay much higher than that of a machine offering similar BW to the whole system via eDRAM.

    There's a whole thread discussing eDRAM's value (there's probably a whole thread here discussing each aspect of the consoles if someone cared to go look), and it's a suitably complex argument with no clear 'best option'. Personally, if the choice is 190 GB/s unified RAM or 60 GB/s system RAM + 100 GB/s Xenos-style RAM (ROPs having their own BW), I'd prefer the latter. The former is easier to work with but more limited. The ROPs having their own BW is going to save a massive amount of that system BW. That of course depends on eDRAM capacity, as too small a pool could be a pain to work with. Orbis's unified RAM is the simplest option for devs to use.
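    The trade-off above can be put in rough numbers. A purely illustrative sketch using the two rumored layouts from the post, with an assumed (hypothetical) ROP color/depth load of 90 GB/s:

```python
# Illustrative figures from the post: 190 GB/s unified vs. 60 GB/s system + 100 GB/s eDRAM.
UNIFIED_GBS = 190
SYSTEM_GBS, EDRAM_GBS = 60, 100

# Assume (hypothetically) that ROP color/depth traffic averages 90 GB/s.
rop_gbs = 90

# Unified design: ROP traffic and everything else share one bus.
unified_left = UNIFIED_GBS - rop_gbs

# Split design: the eDRAM absorbs the ROP traffic (it stays under the
# 100 GB/s eDRAM bus), so the whole system bus is free for CPU + textures.
split_left = SYSTEM_GBS

print(unified_left, split_left)
```

    At this assumed load the unified design still leaves more headroom (100 vs. 60 GB/s); the split design only comes out ahead once ROP traffic climbs past 130 GB/s. Which side of that line real workloads fall on is exactly why there's no clear 'best option'.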
     
  8. trainplane

    Newcomer

    Joined:
    Jan 16, 2013
    Messages:
    151
    Likes Received:
    54
    8 GB of GDDR5 is not feasible, but what about using DDR3 + GDDR5 and not using eDRAM at all? 6 GB of DDR3 main RAM and 1-2 GB of GDDR5 VRAM, like a PC. My understanding is that the eDRAM in the 360 can't fit 720p with AA, so devs needed to use a lower res or tiling. Is 32MB enough for 1080p with AA? Having a PC-like architecture means there is little to almost no difference between creating the PC version and the next Xbox version of a game. Using a different memory arch throws a wrench in that. I'm sure there are size and cost considerations with a traditional PC model vs unified GDDR5 (Orbis) or DDR3 + eDRAM (Durango), but that model has worked for PCs for ages.
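    Whether 32MB covers 1080p is just arithmetic on buffer sizes. A back-of-the-envelope sketch, assuming a 32-bit color target plus a 32-bit depth/stencil buffer and no render-target compression:

```python
def framebuffer_mib(width: int, height: int, msaa: int = 1,
                    color_bytes: int = 4, depth_bytes: int = 4) -> float:
    """Rough size of color + depth/stencil render targets in MiB.
    MSAA multiplies both buffers by the sample count (no compression assumed)."""
    return width * height * msaa * (color_bytes + depth_bytes) / 2**20

print(framebuffer_mib(1280, 720))           # ~7.0 MiB: fits the 360's 10 MiB eDRAM
print(framebuffer_mib(1280, 720, msaa=4))   # ~28.1 MiB: hence tiling on the 360
print(framebuffer_mib(1920, 1080))          # ~15.8 MiB: fits in 32 MiB
print(framebuffer_mib(1920, 1080, msaa=4))  # ~63.3 MiB: too big for 32 MiB
```

    So by this naive count 32 MiB handles 1080p without MSAA but not with 4x, unless compression or tiling changes the picture.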
     
  9. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    22,146
    Likes Received:
    8,533
    Location:
    ಠ_ಠ
  10. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,106
    Likes Received:
    16,898
    Location:
    Under my bridge
    You'll be stuck with a fat bus to GDDR5 for the whole life of your console, limiting cost reduction. DRAM chip prices will plummet over the life of the console, as it's a commodity part with strong competition, so it's not the chip price that's the issue. eDRAM is a far more cost-effective solution, but at the compromise of capacity.
     
  11. Acert93

    Acert93 Artist formerly known as Acert93
    Legend

    Joined:
    Dec 9, 2004
    Messages:
    7,782
    Likes Received:
    162
    Location:
    Seattle
    I will take a cert. 93 of them maybe. Could help with my bad breath :razz:
     
  12. ultragpu

    Banned

    Joined:
    Apr 21, 2004
    Messages:
    6,242
    Likes Received:
    2,306
    Location:
    Australia
    So, using the current rumored spec of Orbis, assuming it's programmed to the metal and fully optimized, utilizing every bit of the closed-box console advantages, can it perform on the same level as a GTX 680 setup? What kind of real-world difference are we looking at?
     
  13. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,470
    Likes Received:
    1,686
    Location:
    Treading Water
    I doubt you'll get a 100% benefit from coding to the metal, considering the rumor is essentially an off-the-shelf part. The real advantage vs the PC is that no one making a game for PC has the 680 as the base target. The PC version will probably get most/all of the effects, with the advantages of greater resolution and AA if you have the performance.
     
  14. Xenus

    Veteran

    Joined:
    Nov 2, 2004
    Messages:
    1,316
    Likes Received:
    6
    Location:
    Ohio
    Yeah, but the question is where the crossover point is. Is the difference in total space bigger than the bandwidth difference, or vice versa?
     
  15. Docwiz

    Newcomer

    Joined:
    May 29, 2002
    Messages:
    162
    Likes Received:
    3
    According to the old information that I have, it uses DDR3 + ESRAM, which is > 32MB. So I would guess 64MB of ESRAM, which would work just fine.

    Plus it has two GPUs: a System GPU (probably an APU), which is probably 1.2 TF, and an Application GPU, which is much higher.
     
  16. (((interference)))

    Veteran

    Joined:
    Sep 10, 2009
    Messages:
    2,499
    Likes Received:
    70
    Yes, two GPUs indeed... why not also add a raytracing chip and a flops capacitor while you're at it...
     
  17. Docwiz

    Newcomer

    Joined:
    May 29, 2002
    Messages:
    162
    Likes Received:
    3
    It's in the original documentation from September 2010. Raytracing is unrealistic; two GPUs aren't (one low-end and low-powered, the other much higher powered, maybe in the range of 1.8 to 2 TF).

    There is a System part and an Application part. The Application part is for gaming and apps. The System part is for transcoding, NUI offload, and the always-on mode (which will probably need a low-powered mobile GPU). It includes 2 CPU cores with a GPU (again, probably mobile and low-powered, and thus only around 1 TF).

    The Application part includes 6 CPU cores with a GPU (much higher powered, for state-of-the-art gaming); this is for gaming and applications. This is the GPU we know nothing about yet, and it makes sense.

    Some of the information from that September 2010 document is still correct. It says nothing about raytracing, but it does show 8 cores total with 2 GPUs total.

    The Xbox 720 is two devices in one: a mid-end gaming machine and also a transcoding/NUI/apps/cloud gaming/recording device.
    This is what I really expect it to be, not the gimped machine everyone expects.

    Even I have two GPUs in my system (one inside the processor as part of an APU), and then I have a dedicated GTX 670; it's not rocket science.
     
    #217 Docwiz, Jan 20, 2013
    Last edited by a moderator: Jan 20, 2013
  18. Proxy

    Newcomer

    Joined:
    Jan 16, 2013
    Messages:
    13
    Likes Received:
    0
    I'm going to be honest: some people are coming off like desperate fanboys, vainly looking for any way their favorite company could beat out the other.
     
  19. Docwiz

    Newcomer

    Joined:
    May 29, 2002
    Messages:
    162
    Likes Received:
    3
    This is highly speculative, but it could have this...

    Application (high-powered, not always on; for gaming and other high-end applications)
    6-8 specialized Jaguar cores for gaming (it could possibly be 8 here; we will see)
    1 mid-end GPU, 1.8 - 2 TF

    System (low-powered and always on)
    2 ARM cores (as in Windows RT)
    1 low-end GPU (low-powered, maybe a mobile variant), 1 TF (this is the GPU we hear about in the rumors)

    ARM is traditionally used in low-powered situations and would fit perfectly into the low-powered, always-on mode of the system for stuff such as the OS and transcoding. A heavily NUI-modified version of Windows RT could fit here. It needs very little RAM for the OS alone (512 MB of RAM for the console version) and then 1 GB for the apps (so a total of 1.5 GB taken up by this). That leaves around 6 GB for games.

    DDR3 would work fine along with 64 MB of ESRAM. If I am correct (correct me if I am wrong here), you could do 2x 1080p for 3D.

    All of this makes a lot more sense than raytracing or the gimped Xbox 720 that people think it is.
     
    #219 Docwiz, Jan 20, 2013
    Last edited by a moderator: Jan 20, 2013
  20. Docwiz

    Newcomer

    Joined:
    May 29, 2002
    Messages:
    162
    Likes Received:
    3
    I hope you were not talking about me here.

    I am not trying to be a desperate fanboy. I am trying to look at a document, figure out the direction Microsoft is heading, and show it. Microsoft wants gaming and transcoding with NUI. The document that I have from September 2010 shows that.

    It shows TWO GPUs in the diagram; it's not that hard, FFS. I don't have a document on what Sony is doing, so I don't know about them. I do have the old September 2010 PDF document, and it might be old, but it still might show something they are working on.

    What I find funnier is that we ignore evidence (even if it's old) and act like the console is gimped.

    Here is the page that I am talking about (this doesn't mean that everything in that document is correct, but it does show what they are shooting for).

    http://i1048.photobucket.com/albums/s372/Rotmm/Slide9.jpg

    WOW. LOOK AT THAT, IT HAS TWO GPUs!

    Maybe you should want to learn something before calling other people fanboys, idiot.
     
    #220 Docwiz, Jan 20, 2013
    Last edited by a moderator: Jan 20, 2013
