Working backwards: Overcoming technical challenges for the next generation.

Discussion in 'Console Technology' started by Squilliam, Dec 4, 2008.

  1. assen

    Veteran

    Joined:
    May 21, 2003
    Messages:
    1,377
    Likes Received:
    19
    Location:
    Skirts of Vitosha
All you need to support compression is a good out-of-order CPU.

    But it's not very local, unlike DXTn; you need to touch tens, maybe hundreds of bytes to decompress a single pixel; and with geometry densities climbing ever higher, it's getting harder to count on good hit rates in the texture caches.
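The locality difference can be made concrete with a sketch. This is Python purely for illustration; the offset math and the opaque-mode palette follow the standard DXT1 layout, but it is a simplified sketch of one pixel's decode, not a full decoder:

```python
import struct

def dxt1_block_offset(x, y, width):
    # DXT1 packs each 4x4 pixel block into 8 bytes, with blocks stored
    # row-major -- so any pixel maps to one small, computable fetch.
    blocks_per_row = width // 4
    return ((y // 4) * blocks_per_row + (x // 4)) * 8

def decode_dxt1_pixel(block, px, py):
    # Decode one pixel from an 8-byte DXT1 block (opaque c0 > c1 mode):
    # two RGB565 endpoints, two interpolated colours, 2-bit indices.
    c0, c1, bits = struct.unpack("<HHI", block)
    def rgb565(c):
        return (((c >> 11) & 31) << 3, ((c >> 5) & 63) << 2, (c & 31) << 3)
    e0, e1 = rgb565(c0), rgb565(c1)
    palette = [
        e0,
        e1,
        tuple((2 * a + b) // 3 for a, b in zip(e0, e1)),
        tuple((a + 2 * b) // 3 for a, b in zip(e0, e1)),
    ]
    idx = (bits >> (2 * (py * 4 + px))) & 3
    return palette[idx]
```

Decoding any pixel touches exactly 8 bytes at a directly computable offset, which is what makes DXTn so cache- and hardware-friendly compared with general-purpose codecs.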
     
  2. Fafalada

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    2,773
    Likes Received:
    49
That's a bit like shooting mosquitoes with cannons, isn't it?
The PS2 had a small piece of silicon almost 9 years ago that, at 150 MHz, could do DCT macroblock decoding faster than multi-GHz OOOe CPUs at 100% utilization.
And even with resources being abundant, how does CPU decompression fit into the texture fetch pipeline that you refer to in the second part of your post?

Mind you, I agree that we don't need more silicon dedicated to hw compression - but you need to accept compression schemes that work on temporal locality rather than on-demand.

Until the market starts expecting it.
     
  3. Silenti

    Regular

    Joined:
    May 25, 2005
    Messages:
    711
    Likes Received:
    423
I think the market is already looking that way - at least the hardcore RPG crowd. It would be really nice if procedural generation could step in here in a big way; not just for pure looks, but for randomly fleshing out objects in large cities. It would at least add more to the atmosphere.
     
  4. assen

    Veteran

    Joined:
    May 21, 2003
    Messages:
    1,377
    Likes Received:
    19
    Location:
    Skirts of Vitosha
    Yeah, you're right about "media" type of compression; I was thinking more about the LZ- and PPM-style lossless codecs.

    If we believe the Larrabee team, it doesn't fit well :)

    Sorry, I didn't understand that part.
     
  5. Fafalada

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    2,773
    Likes Received:
    49
Ah ok, those are a different story, but then again we've been using LZW derivatives on disc data since PS1 days, so by the very definition of lossless compression you can't expect any significant advancements on that front, no matter how much processing power you can throw at it.
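The sequential nature of LZ-family codecs - and why they don't offer the random access of DXTn - can be shown with a minimal decoder. The token format here is invented for illustration (a literal byte, or a (distance, length) back-reference):

```python
def lz77_decompress(tokens):
    # Minimal LZ77-style decoder. Back-references copy from the
    # already-decoded output, which is why the stream must be decoded
    # front to back: byte N cannot be recovered without all bytes
    # before it.
    out = bytearray()
    for tok in tokens:
        if isinstance(tok, int):
            out.append(tok)          # literal byte
        else:
            dist, length = tok       # copy from `dist` bytes back
            for _ in range(length):
                out.append(out[-dist])
    return bytes(out)
```

Great ratios on disc data, but useless as an "instant decompress on demand" texture format.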

Well, I just meant we'll be doing more and more fancy realtime decompression on programmable hardware/CPUs, but it will continue to be centered around temporally-coherent data structures, plus maybe heuristics to help decide what data will be needed in the near future (eg. 'megatexture' and similar), rather than an improved 'DXTC' (instant decompress on demand).
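A toy version of such a lookahead heuristic, with the tile grid, function name and parameters all invented for illustration: predict where the camera will be shortly, and request the surrounding texture tiles so they can be decompressed before the frame that samples them.

```python
def tiles_to_prefetch(cam_x, cam_y, velocity, radius=2, lookahead=1.0):
    # Extrapolate the camera position `lookahead` seconds ahead, then
    # request every tile within `radius` of the predicted position.
    px = cam_x + velocity[0] * lookahead
    py = cam_y + velocity[1] * lookahead
    cx, cy = int(px), int(py)
    return {(cx + dx, cy + dy)
            for dx in range(-radius, radius + 1)
            for dy in range(-radius, radius + 1)}
```

The decompression cost is paid ahead of time, spread across frames, instead of on demand in the middle of a texture fetch.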
     
  6. DieH@rd

    Legend

    Joined:
    Sep 20, 2006
    Messages:
    6,387
    Likes Received:
    2,411
Just wondering, is there any chance that the PS3 will get a 4x Blu-ray drive? Or will it be forever stuck at 2x?
     
  7. assen

    Veteran

    Joined:
    May 21, 2003
    Messages:
    1,377
    Likes Received:
    19
    Location:
    Skirts of Vitosha
Why would they willingly introduce a more expensive, more complicated and noisier drive when none of the functions of the console - movie playback and game playback - can use it? One day, when 2x drives aren't even produced anymore, they will stick in a 4x or 8x or whatever the cheapest model is, but will limit it to 2x.
     
  8. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,106
    Likes Received:
    16,898
    Location:
    Under my bridge
    And in addition to assen's points, what historical precedent is there? How many PS1's came with 20x CD drives? Or PS2's with 16x DVD drives? They never got an upgrade, because it serves the machine no purpose.
     
  9. FirewalkR

    Regular

    Joined:
    Jul 13, 2007
    Messages:
    259
    Likes Received:
    0
Isn't the real problem with the PS3's drive the access times, though? Perhaps they could improve those... and improve the streaming experience for some. Devs would have to allow for installs on older hardware, though. I guess we'll wait for the PS4 for a better drive - Blu-ray or something else.
     
  10. Npl

    Npl
    Veteran

    Joined:
    Dec 19, 2004
    Messages:
    1,905
    Likes Received:
    7
Well, the PSP-2000 doubled the memory from 32 to 64 MB to allow for more caching and thus reduced load times (apart from Skype apparently needing the additional memory). So while I don't expect Sony to do anything other than reduce cost for quite a while, it's not impossible that we'll see improvements later on.
     
  11. Nano

    Regular

    Joined:
    Dec 7, 2007
    Messages:
    288
    Likes Received:
    0
    Location:
    London, England
    Unsurprisingly, memory will be an issue. Rather than going for the fastest, newest RAM tech I think the hardware companies should go for something cheaper but with significantly more capacity.
     
  12. eastmen

    Legend Subscriber

    Joined:
    Mar 17, 2008
    Messages:
    13,878
    Likes Received:
    4,727
Surely when installing to the hard drive it's not a question of capping the transfer rate? That way, if one day a 4x drive is as cheap as a 2x drive (which may be a time approaching quickly), they could add it in there and install times would be reduced drastically.
     
  13. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    22,146
    Likes Received:
    8,533
    Location:
    ಠ_ಠ
    How would you rate the reliability of a faster spinning drive?
     
  14. eastmen

    Legend Subscriber

    Joined:
    Mar 17, 2008
    Messages:
    13,878
    Likes Received:
    4,727
That's a good question. It really all depends on the build quality. If the reliability is within the same range as the 2x drive, I don't see the problem.

If there's a big enough difference between the two, many people may buy new systems. Especially if the noise isn't increased when playing games due to the limits of programming.
     
  15. V3

    V3
    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    3,304
    Likes Received:
    5
    NeoGeo CD went from 1x to 2x if I remember correctly.
     
  16. KyoDash

    Newcomer

    Joined:
    Mar 20, 2007
    Messages:
    32
    Likes Received:
    0
Nope, the CDZ was still a 1x speed drive. All SNK did was add an extra megabyte of RAM used just for loading; when that 1 MB was full, the data was transferred into the 7 MB of video memory. Or something like that, as I'm sure they used it more efficiently in later games. Though of course, with such large amounts of graphics data being moved, we never really saw much benefit.
     
  17. V3

    V3
    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    3,304
    Likes Received:
    5
I see. I wonder if they can add cheap RAM to the BR drive to achieve the same effect?
     
  18. KyoDash

    Newcomer

    Joined:
    Mar 20, 2007
    Messages:
    32
    Likes Received:
    0
That would be possible, and you could use the RAM as a buffer to stream graphics data into video memory, just like on the Neo CDZ. This would work better than streaming textures from the HDD, and you could load the next lot of data into the buffer RAM while you play the section already loaded into VRAM.

Not sure how fast the RAM (a very cheap option, like SD?) would need to be for streaming textures to be efficient, though for reducing load times it would be great.
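The double-buffering scheme described above can be sketched roughly like this (the function names and chunk model are hypothetical, standing in for the drive read and the VRAM upload):

```python
def stream_with_double_buffer(read_chunk, upload_to_vram, num_chunks):
    # Two buffers: while the game consumes the chunk already uploaded
    # to VRAM, the drive fills the other buffer with the next chunk,
    # so the slow optical read overlaps with playback.
    buffers = [None, None]
    buffers[0] = read_chunk(0)               # prime the first buffer
    for i in range(num_chunks):
        upload_to_vram(buffers[i % 2])       # consume current chunk
        if i + 1 < num_chunks:
            buffers[(i + 1) % 2] = read_chunk(i + 1)  # refill the other
    # In a real console the read would run asynchronously; here the
    # overlap is only implied by the buffer ping-pong.
```

As long as a chunk is consumed no faster than the drive can refill the other buffer, the drive's speed stops being the visible bottleneck.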
     