NGGP: NextGen Garbage Pile (aka: No one reads the topics or stays on topic) *spawn*

Discussion in 'Console Technology' started by Barso, Aug 30, 2012.

Thread Status:
Not open for further replies.
  1. Lucid_Dreamer

    Veteran

    Joined:
    Mar 28, 2008
    Messages:
    1,210
    Likes Received:
    3
    102 GB/s...continuously or in bursts?
     
  2. Acert93

    Acert93 Artist formerly known as Acert93
    Legend

    Joined:
    Dec 9, 2004
    Messages:
    7,782
    Likes Received:
    162
    Location:
    Seattle
    LOL now you are just screwing with people. Do you enjoy this? :twisted:

    So, are you going to stand behind such big-talk rhetoric? Because if not, your post should be deleted. If it comes true, you can look like a bastion of common sense; but if you are proven wrong and slink away from such talk, you should be banned, because there is no place for that here on B3D.

    1080p may be the most popular resolution but 680/7970 class GPUs are not the most popular.

    Going back to Epic, why would Epic target their next game at hardware less than 5% of the discrete PC gaming market has? It is pretty certain that Epic would not want to abandon the NV 4xx, 5xx, and low end 6xx hardware; ditto AMD. There is just far too much of the market share.

    That being said, I do believe quality-level check boxes will probably drown even the 680/7970 hardware--set UE4 to 11 (insane texture resolution, or whatever they choose to call the new IQ level) and it will drop all HW to its knees. I would even go further--these new engines often ship with corner-case features that KILL performance, even on the best hardware. So in these corner cases, going 1080p, all settings Ultra High, 4xMSAA with post-process AA, and every feature check box enabled (3D, etc.), yeah, watch the best GPUs drop well below 60 fps.

    Those conditions aside, take their new engine/game, set it at 1080p, High settings, adjust AA based on bottlenecks, and don't enable the inefficient whiz-bang new/unoptimized lens flare effect, and I would guess a GTX 680 sings.

    Epic had better hope so, or they didn't learn the lessons of Crysis 1. And from a practical standpoint, Epic probably isn't in a financial position to be adding features that require unique asset creation and testing on PCs when the consoles are fairly close to each other and hit a baseline of performance well above the core PC demographic. You really have to wonder why Epic would create/test major game features that <5% of the PC market could use at 1080p30. But don't worry: there will be some nice compute check boxes and such that kill all PCs (though the same effects will have much more efficient implementations down the road anyway, so all they will really tell us is that Epic is throwing PC gamers a bone).
     
  3. DJ12

    Veteran

    Joined:
    Oct 20, 2006
    Messages:
    3,105
    Likes Received:
    198
    So I take it you are going to ban yourself when Orbis has nothing reserved for the OS except the memory we already know about?

    I have said many things in this thread that the likes of you have laughed at, yet slowly these things started to appear closer to the truth than the fantasies being peddled.

    I may not have your technical knowledge, but I do not make things up and repeat them once a day because I want them to be true.
     
  4. Lucid_Dreamer

    Veteran

    Joined:
    Mar 28, 2008
    Messages:
    1,210
    Likes Received:
    3
    In that case, one would not comment on it being a "super computer", right? That would show interest in itself, and would require more than a low-level, isolated look. You couldn't verify something as low-level as the electrical signalling of an IO and then say "super computer", right? Anyone at a high enough level to say "super computer" should know what their equipment is for and be able to quantify that, right?
     
  5. upnorthsox

    Veteran

    Joined:
    May 7, 2008
    Messages:
    2,106
    Likes Received:
    380
    Wow, so Durango will be doing 480p then? That's a lot worse than I thought.
     
  6. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,470
    Likes Received:
    1,686
    Location:
    Treading Water
    I think you're trying way too hard to read anything into it. The relationship to super computer could mean a lot of things.
     
  7. Averagejoe

    Regular

    Joined:
    Jan 20, 2013
    Messages:
    328
    Likes Received:
    0
    The thing is how much faster Durango's Jaguar is than Orbis's Jaguar, because Orbis's Jaguar is no vanilla part either; Durango's Jaguar being 100% stronger or faster than vanilla doesn't translate into the same advantage over Orbis's Jaguar.

    Because vanilla Jaguar is 2 to 4 cores, not 8 like Orbis.
     
  8. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,579
    Likes Received:
    4,799
    Location:
    Well within 3d
    People comment on things they don't really know much about all the time.
    Look at me...

    Anyone can say the words "super" and "computer", and comparing what the rumors are coalescing towards with an actual supercomputer, I get a stronger impression that this source doesn't really know enough about the topic for a turn of phrase to be taken as gospel.
     
  9. french toast

    Veteran

    Joined:
    Jan 5, 2012
    Messages:
    1,667
    Likes Received:
    9
    Location:
    Leicestershire - England
    Come on! All this negative talk of sub-720p... does anyone here actually believe we will be seeing that scenario, really?

    Current-gen consoles play the same games as high-end PCs do. Obviously PCs look substantially better, but they are not worlds apart in the eyes of Joe Bloggs.

    Durango looks to be the weakest, but it will still be something like 10x PS360 in real-world terms, maybe more once optimised, so how anybody thinks we are staying at 720p next gen is beyond me.
     
  10. Acert93

    Acert93 Artist formerly known as Acert93
    Legend

    Joined:
    Dec 9, 2004
    Messages:
    7,782
    Likes Received:
    162
    Location:
    Seattle
    It would be a curious console if Orbis had no CPU or GPU resources utilized, magical even.

    Actually the problem with your quote is this part, "So too xbox fans mention twice the flops and the rest of you grasp it like it's gospel, it reeks of desperation."

    I have no clue (nor have I made a value judgment about) whether the rumor is true. That said, bgass. seems to back it up; seeing as some of his other info has been correct, why not? We will wait and see. But the problem is your aggressive and downright trolling comments like "it reeks of desperation".

    The difference between you and me is that I didn't tell others, when they interpreted the 14+4 rumors as indicating only a "small" impact on graphics performance, that "it reeks of desperation". I have repeated over and over that these are unverified leaks without context/architecture; many leaks have ambiguous wording. So why attack and insult others like you always do?

    Sure, when the issue of the 14+4 CUs was discussed I mentioned I thought 3dil.'s theory of QoS (something MS has as well in their PDF leak) sounded like it jibed (I also think the 2-CU Jaguar blocks have potential to explain the remarks in the leak), but I have not been evangelizing this position as a factual leak.

    And I surely have not told those who have a different take that they "reek of desperation". So why would I need a ban?

    Then again, telling people who disagree that they "reek of desperation" because they are taking a rumor seriously really just tells us what we always knew: you have never liked it when Sony's competitors look good. Hence your aggressive posturing and insults.

    The funny part is the 2x Durango Jag flops may not even be true--it may be a misunderstanding :lol: But it was sure fun watching you attack the posters who took a double-"confirmed" leak seriously. Ahhh, the good old days!
     
  12. Aeoniss

    Regular

    Joined:
    Mar 23, 2007
    Messages:
    557
    Likes Received:
    0
    Location:
    Nebraska
    I kind of doubt that the consoles will be targeting anything but 1080p. Or, to put it another way, most games will be at 720p+ resolutions. Whether or not we see games that aren't quite 1920x1080 is another story, though I am willing to bet we will see a good portion that aren't quite there.

    But I think both MS and Sony will apply pressure for devs to target higher resolutions.

    Didn't MS, back in the day, have some kind of requirement when they certified a game for the 360 that it be 720p? Obviously a lot of games came out that weren't, but were those special exceptions, or was there no resolution requirement for the platform?


    I am very excited for the next gen. In particular Orbis, as I am a Playstation guy. So this coming Wed should be awesome!

    I'm not a hardware guru like a lot of chaps here, but I am quite certain when I say that it really doesn't matter a whole ton what current top-of-the-line PC innards are like vis-a-vis the rumored Orbis/Durango specs. Compare the PS3 and 360 to the best PCs of their time and you will see a largely similar theoretical performance disparity.

    The difference, as has been pointed out at length in this and many other threads, is that consoles will always hit a far greater level of efficiency than any PC title can dream of. Every component in a console is placed there with due consideration, and net performance of the whole unit is the target for the engineers (within a budget).

    So yes, it won't be as powerful as a $400 GPU, but it will output games that look mind-blowing and insane all the same. Proof? God of War 3/Ascension, Killzone 2/3, and the Uncharted trilogy. GT5.

    For 360 it'd be the Gears of War trilogy and Halo 3, 4, and Reach. Forza.
     
  13. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,237
    Likes Received:
    4,260
    Location:
    Guess...
    No. When the 360 launched it was arguably 90% as powerful as the fastest PC GPU of the day while being more advanced (feature wise). This generation, Orbis will be level (or even behind) feature wise and roughly 35% as powerful. That's not the same situation at all.

    No one has ever ignored or tried to wash over this fact. But the reality is that the best-case performance increase you could gain overall in the current generation (around 2x) would not be close to enough this coming generation to reach performance parity. And of course APIs have got thinner and more efficient on the PC since the last generation, so that 2x advantage is likely a fair bit lower now.
     
  14. Ketto

    Newcomer

    Joined:
    Jul 30, 2012
    Messages:
    39
    Likes Received:
    0
    Location:
    Winter Park, Florida; and London UK.
    The TCR (Technical Certification Requirement) was 720p with 4xMSAA as a standard, but it was dropped as quickly as it was mandated.
     
  15. Aeoniss

    Regular

    Joined:
    Mar 23, 2007
    Messages:
    557
    Likes Received:
    0
    Location:
    Nebraska

    What? The fastest being like the GTX 690 or the AMD equivalent? Those monster cards whose power consumption and heat generation are through the roof and which cost bucketloads of money?

    Was there anything even like that kind of tier back then? The super-enthusiast models of graphics card?

    I don't know about you, but I don't consider that to be a fair comparison. How about a more reasonable target: how does the Orbis or Durango GPU compare on paper to a GTX 680 or the flagship single card from AMD (I don't really keep up with their lines; I'm an Nvidia guy)?

    Unless this is what you were comparing it to the whole time.

    I know that Orbis will certainly not be 1:1 with either of those cards, but is its GPU really only barely a third as powerful as a GTX 680? To be thorough: at the time of the 360's launch, what was the best single graphics card available? How much did it cost, and what were its power consumption, heat output, and specs?

    From what I have read, the Orbis GPU does not appear to be only a third as powerful as this card, the HD 7950, which is second to the top of the line (HD 7970). It seems to be a lot more competitive than that.


    http://www.amd.com/us/products/desktop/graphics/7000/7950/Pages/radeon-7950.aspx#3


    Hell, for that matter (from a raw flops perspective), Orbis is about half as powerful as the HD 7970, which clocks in at 3.79 teraflops while Orbis is in the 1.80-teraflops ballpark, right?

    Here's the 7970

    http://www.amd.com/us/products/desktop/graphics/7000/7970/Pages/radeon-7970.aspx#3
     
    #2455 Aeoniss, Feb 18, 2013
    Last edited by a moderator: Feb 18, 2013
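    For what it's worth, the "about half" estimate does check out on the quoted numbers, bearing in mind this is peak flops only and ignores bandwidth, ROPs, and clocks:

    ```python
    # Figures as quoted in the post above, in peak single-precision TFLOPS.
    ORBIS_TF = 1.80   # rumoured Orbis ballpark
    HD7970_TF = 3.79  # stock HD 7970

    # Orbis as a fraction of the stock 7970's peak throughput.
    print(f"Orbis / HD 7970: {ORBIS_TF / HD7970_TF:.0%}")  # ~47%, i.e. about half
    ```
    
    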
  16. itsmydamnation

    Veteran

    Joined:
    Apr 29, 2007
    Messages:
    1,349
    Likes Received:
    470
    Location:
    Australia
    Well, the Opteron 3380 is 8 cores at a 2.6 GHz base clock with a 65 W TDP.

    The Opteron 6328 is 8 cores at a 3.2 GHz base clock with a 115 W TDP. So there is no way we are going to see a 3.2 GHz SR without massive power reductions.
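    A rough way to see the point: the two parts differ in more than clock speed (different sockets and platforms), but even taking the quoted numbers at face value, TDP grows much faster than frequency, because higher clocks also demand higher voltage:

    ```python
    # TDP and base clock for the two 8-core Opterons cited above.
    parts = {
        "Opteron 3380": {"ghz": 2.6, "tdp_w": 65},
        "Opteron 6328": {"ghz": 3.2, "tdp_w": 115},
    }

    freq_gain = parts["Opteron 6328"]["ghz"] / parts["Opteron 3380"]["ghz"]
    tdp_gain = parts["Opteron 6328"]["tdp_w"] / parts["Opteron 3380"]["tdp_w"]

    # A ~23% clock bump costs ~77% more TDP on these two parts, which is why
    # a 3.2 GHz CPU is hard to fit inside a console power budget.
    print(f"Clock: +{freq_gain - 1:.0%}, TDP: +{tdp_gain - 1:.0%}")
    ```
    
    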
     
  17. kalelovil

    Regular

    Joined:
    Sep 8, 2011
    Messages:
    568
    Likes Received:
    104
    An overly simplistic GPU flops comparison:

    PS3: 367Gflops
    7800 GTX 512: 400Gflops (fastest card available a year before release of the console), US$649 MSRP (~$700 retail), ~120W TDP.

    PS4: 1.84Tflops
    HD 7970 Ghz Edition: 4.3Tflops (fastest card available around a year before release of the console), US$499 MSRP (~$430 retail), 250W TDP.
     
    #2457 kalelovil, Feb 18, 2013
    Last edited by a moderator: Feb 18, 2013
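    A quick check of the arithmetic above, using only the figures quoted in the post (peak vendor numbers, not measured performance):

    ```python
    # Peak shader throughput as quoted in the post, in GFLOPS.
    GEN7 = {"console": 367, "flagship": 400}    # PS3 vs 7800 GTX 512
    GEN8 = {"console": 1840, "flagship": 4300}  # PS4 vs HD 7970 GHz Edition

    def ratio(gen):
        """Console peak flops as a fraction of the contemporary flagship's."""
        return gen["console"] / gen["flagship"]

    print(f"PS3 vs 7800 GTX 512: {ratio(GEN7):.0%}")  # ~92%
    print(f"PS4 vs 7970 GHz Ed.: {ratio(GEN8):.0%}")  # ~43%
    ```

    On these numbers the PS3 launched at roughly nine-tenths of the flagship's peak flops, while the PS4 figure is well under half, which is the asymmetry the comparison is driving at.
    
    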
  18. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,237
    Likes Received:
    4,260
    Location:
    Guess...
    I'm only comparing to single GPU cards, not dual GPU like the GTX 690.

    I didn't compare to the 680; I compared to the fastest GPU at the launch time of the consoles, which won't be the 680. The 680 to Orbis/Durango is more like the 6800 Ultra to the PS360. When we first heard reliable rumours about the 360, they talked about a console at least as powerful as 6800 Ultras in SLI. Clearly no one is even entertaining the new consoles being as powerful as 680s in SLI.

    Within a few weeks, the GTX Titan will be available. Before the launch of the consoles there's a very good chance both the 8xxx and 7xx series GPUs will also be available.

    The current fastest GPU on the same architecture as Orbis (Tahiti) sports 4.3 TFLOPS of shader performance. That's 2.3x Orbis and is also in line with 680 performance. Titan, available in a few weeks, should sport at least 40% more real-world performance; that's 3.3x Orbis. I would expect the high-end 8xxx series launching at the end of this year from AMD to sport similar performance.

    It was the 7800 GTX 512MB which was ridiculously expensive and probably had a pretty large TDP for the day, certainly larger than either console (which were actually manufactured on a smaller process). Its specs in relation to the 360 were 165% the texturing, 220% the fill rate, 55% the geometry setup, 118% the total shader throughput (but only 86% when you include texture addressing), and 21% the bandwidth of the eDRAM, or 240% the bandwidth of the main memory. So as you can see, my generalisation of the 360 being 90% of this card was a very rough average.

    But that's not the highest end GPU of today, nevermind 8 months from now.
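    The multiples in this post follow directly from the quoted peak-flops figures; here is the arithmetic, with the Titan uplift treated as the post's ~40% estimate rather than a measured number:

    ```python
    ORBIS_TF = 1.84       # rumoured Orbis peak shader TFLOPS
    TAHITI_TF = 4.3       # HD 7970 GHz Edition peak TFLOPS
    TITAN_UPLIFT = 1.40   # the post's assumed ~40% real-world uplift

    tahiti_ratio = TAHITI_TF / ORBIS_TF        # ~2.3x, as stated
    titan_ratio = tahiti_ratio * TITAN_UPLIFT  # ~3.3x, as stated

    print(f"Tahiti vs Orbis: {tahiti_ratio:.1f}x")
    print(f"Titan estimate:  {titan_ratio:.1f}x")
    ```
    
    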
     
  19. 3dcgi

    Veteran Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    2,493
    Likes Received:
    474
    If you read between the lines bkilian is saying Durango doesn't support patches which are required for tessellation. So Durango doesn't support tessellation. :twisted:

    Honesty on the internet. I thought it was a myth.
     
  20. Brad Grenz

    Brad Grenz Philosopher & Poet
    Veteran

    Joined:
    Mar 3, 2005
    Messages:
    2,531
    Likes Received:
    2
    Location:
    Oregon
    The problem is utilization isn't the only measure of efficiency. An efficient design is as much about avoiding stalls, bottlenecks, and saturation as it is about high utilization. If the city planners in your town designed the sewer system to be at a high level of "utilization" on an average day, that might seem very efficient, but only until the first time it rained, every toilet in the city backed up at the same time, and millions of gallons of raw sewage were dumped into the local waterways. In that case, building excess capacity to cope with peak loads and worst-case scenarios is more efficient, which was my point about the ROPs.

    The other problem is that when people say the Durango design is targeting efficiency, they assume the goal, AND the result, is higher efficiency than the Orbis design can provide. But that is unfounded and utterly without support. I agree that the added hardware in Durango makes it efficient. But it is being compared to a design that was also designed to be very efficient; the choices made in the Orbis design simply do not require specialized hardware to increase efficiency. The reasons each company made those choices are immaterial to the analysis of either.

    The fact that Microsoft's innovations in hardware can bring Durango in line with the inherent efficiency of an Orbis style architecture is a remarkable achievement. What we don't have is any evidence or indication that they catapult Durango to higher efficiency in anything but the most specialized, latency sensitive situations. Meanwhile there will certainly be other situations where Orbis is more efficient. My argument has always been that on average they offer very similar levels of efficiency and as a consequence that is not an avenue by which Durango can close the gap in power that exists on paper.
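    The sewer analogy has a standard queueing-theory counterpart, offered here purely as an illustration rather than anything derived from the actual hardware: in an M/M/1 queue, mean waiting time scales as rho/(1-rho), so delay explodes as utilization approaches 100%, which is why provisioning headroom for peak loads can be the more "efficient" design:

    ```python
    def mm1_wait_factor(utilization):
        """Mean queueing delay of an M/M/1 queue, in units of service time:
        rho / (1 - rho). Grows without bound as utilization approaches 1."""
        rho = utilization
        if not 0 <= rho < 1:
            raise ValueError("utilization must be in [0, 1)")
        return rho / (1 - rho)

    # Delay stays modest until utilization gets high, then blows up.
    for u in (0.5, 0.8, 0.95, 0.99):
        print(f"{u:.0%} utilized -> waits ~{mm1_wait_factor(u):.1f}x service time")
    ```
    
    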
     