Alternative distribution to optical disks: SSD, cards, and download

Discussion in 'Console Technology' started by Cheezdoodles, May 26, 2008.

  1. manux

    Veteran Regular

    Joined:
    Sep 7, 2002
    Messages:
    1,604
    Likes Received:
    426
    Location:
    Earth
    The point I'm arguing is that an 8x blu-ray drive next gen might be comparatively just as fast as, or even faster than, current gen optical media, provided binary sizes grow less than drive speed does. I think if assets grow to be more than 3x bigger than this gen, then next gen's blu-ray will start to give worse performance than this gen's optical media. So if we take the assumption of an 8GB game size, you could triple it to 24GB next gen and see the same performance from a next gen console as what we are seeing today. Incidentally, this would mean single layer BD is enough for next gen. You could of course use dual layer BD and add more data duplication to avoid seek times...
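    To put rough numbers on that, here is a back-of-envelope sketch in Python (the ~4.5 MB/s per 1x BD figure and both drive speeds are assumptions for illustration, not anything confirmed in this thread):

        # Does an 8x BD drive keep up with 3x bigger games?
        BD_1X_MBPS = 4.5                   # assumed sustained read per 1x BD
        current_drive = 2 * BD_1X_MBPS     # PS3-class 2x drive, ~9 MB/s
        next_drive = 8 * BD_1X_MBPS        # hypothetical 8x drive, ~36 MB/s

        def stream_time_s(game_gb, mbps):
            return game_gb * 1024 / mbps   # seconds to read the whole game once

        print(stream_time_s(8, current_drive))   # ~910 s for an 8GB game today
        print(stream_time_s(24, next_drive))     # ~683 s for a 24GB game next gen

    With 4x the throughput and 3x the data, the next gen drive actually comes out slightly ahead, which is the break-even point being argued.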

    Think of flash memory as the cheap base SKU, a la the Arcade Xbox 360. Of course you would add the optional HDD for power users. The big installs go to the HDD, and flash is still used as cache. On the cheap base unit, flash is only used as cache + game saves + storage for smaller PSN/Xbox Live games. If you want more, you buy the optional HDD or the more expensive SKU which comes with an HDD.
     
  2. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    41,623
    Likes Received:
    12,627
    Location:
    Under my bridge
    I agree, but I don't see savings coming from compression schemes. Even if you're right about texture compression, what other savings can be made? Movies and audio are already lossily compressed. Models can't be. Some assets won't cost more to produce, as they are already created in higher fidelity and scaled down; the creation will thus cost the same. The difference will be where we need more content, and that will probably have to look to what was supposed to be one of the magics of this gen: procedural content. Games can be populated with characters and levels created by piecing together various assets. Think EA Sports' player customisation done procedurally for every NPC except those the artists want particular control over, and you'll see that content variation can be handled via computation rather than prebuilding. There just hasn't been the juice to do that this gen, or so it seems, with lots of games repeating character models etc.
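    As a minimal sketch of the idea (the asset names are hypothetical, nothing EA-specific), procedural NPC variation can be as simple as deriving appearance parameters from a seed, so a whole crowd costs almost nothing on disc:

        import random

        HEADS = ["head_a", "head_b", "head_c"]          # a few shared base pieces
        BODIES = ["body_slim", "body_avg", "body_heavy"]

        def make_npc(seed):
            rng = random.Random(seed)   # same seed -> same NPC every time, no disc storage
            return {
                "head": rng.choice(HEADS),
                "body": rng.choice(BODIES),
                "height_m": rng.uniform(1.55, 1.95),
                "skin_tint": rng.random(),              # parameter fed to the shader
            }

        crowd = [make_npc(i) for i in range(1000)]      # 1000 distinct NPCs from a handful of base assets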
     
  3. manux

    Veteran Regular

    Joined:
    Sep 7, 2002
    Messages:
    1,604
    Likes Received:
    426
    Location:
    Earth
    I would hazard a guess that textures are where the biggest gain can be made. Samples (sound) might have modest gains to be had. I wonder if object models (vertex data) are already compressed, or if that's something that could be fitted into the pipeline.

    Another thing that might affect the memory equation is the different buffers and data sets used for graphics rendering, AI and so on. Perhaps game engines will move to more complex rendering models requiring more interim buffers, eating into usable RAM. Raising resolution or traditional AA levels also eats more memory. Perhaps the AI and game logic become more complex and the data structures used to feed the engine grow. I think the Xbox 360 gets some memory advantage here this gen due to eDRAM and tiling, compared to a regular GPU.

    If we have enough GPU power, perhaps some of the lower-res buffers used this gen get updated to full resolution next gen (think shadows, smoke, particles and so on). Some buffers might double in size by using higher-precision datatypes such as FP16/FP32. Displacement mapping can magnify the amount of geometry in memory substantially compared to what was loaded from disc.
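    The buffer arithmetic is easy to sketch (the resolutions and formats here are illustrative assumptions):

        def buffer_mb(width, height, bytes_per_pixel):
            return width * height * bytes_per_pixel / (1024 * 1024)

        print(buffer_mb(1280, 720, 4))    # ~3.5 MB: a 720p RGBA8 buffer this gen
        print(buffer_mb(1920, 1080, 8))   # ~15.8 MB: same buffer at 1080p FP16, ~4.5x larger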

    What I expect is that most games next gen will try to be digital-distribution friendly, and hence binary sizes will be somewhat limited. Higher-end games might push dual layer BD, but then the DD option starts to be rather painful unless assets are scaled down for the DD version.
     
    #1023 manux, Mar 14, 2011
    Last edited by a moderator: Mar 14, 2011
  4. Dregun

    Newcomer

    Joined:
    Nov 9, 2005
    Messages:
    244
    Likes Received:
    7
    I could really use a quick education (anyone offering?).

    Could someone tell me how we would fill a system with 2GB of usable RAM (not VRAM)? What I mean is, I don't really understand what you guys are referring to with "binary data". I could assume it's the basic code that tells the game how to run (combine textures x, y, z into model A, etc.) but I really don't know.

    What is filling up the RAM today, and what kind of scale would be present in the future? Master Chief/Drake: would doubling the assets make an improvement in these characters, and how would those assets increase?

    How much room does a model take, and does the size of the model increase at the same rate as its complexity (2x the number of vertices = 2x the file size)?

    Are we currently limited in the textures being used by the amount of RAM available, or is it more about processing speed? If it is RAM, what kind of textures are we looking at for the next generation, and what size are they going to be?

    Would the increase in processing power mean we would potentially have more objects on screen? If so, those would require textures too, right?

    I'm going on the presumption that the RAM is being filled up with textures, audio, models, game code and... what else? That is where I'm getting stuck, not knowing much about game design.


    Thanks!
     
  5. manux

    Veteran Regular

    Joined:
    Sep 7, 2002
    Messages:
    1,604
    Likes Received:
    426
    Location:
    Earth
    The very layman explanation would be an analogy with zip packages and text documents: the documents compress to a small size but must be decompressed before they can be used. The same applies to most game data (objects, textures, game logic, scripts and so on).
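    In Python terms the analogy is literally a couple of lines (zlib standing in for whatever packer a game would actually use):

        import zlib

        asset = b"repetitive level data " * 1000   # stand-in for a game asset, 22000 bytes
        on_disc = zlib.compress(asset)             # the dense form that ships on disc
        in_memory = zlib.decompress(on_disc)       # the form the game actually uses

        print(len(asset), len(on_disc))            # 22000 vs a few hundred bytes at most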

    For textures, GPUs use a very specific kind of compression allowing easy and fast access to any pixel inside the compressed image. This GPU-friendly compression is not as dense as formats like JPEG or JPEG 2000, which don't allow the user to access a single pixel without decompressing larger blocks or even the whole file.
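    The "access any pixel" property comes from fixed-rate blocks. A sketch of the address arithmetic for a DXT1/BC1-style format (8 bytes per 4x4 block):

        BLOCK_BYTES = 8   # BC1: 16 pixels always packed into 8 bytes

        def block_offset(x, y, tex_width):
            blocks_per_row = tex_width // 4
            return ((y // 4) * blocks_per_row + (x // 4)) * BLOCK_BYTES

        # Pixel (1000, 700) of a 2048-wide texture: one small read at a known
        # offset, versus decoding a large chunk of a JPEG to reach the same pixel.
        print(block_offset(1000, 700, 2048))   # 718800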

    What we could do here is use these denser compression formats on disc and then recompress in memory to the GPU-friendly format. The tradeoff is between IO time and the CPU cycles used to do the operation.
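    A toy model of that tradeoff (both throughput figures are assumptions, and a real loader would overlap the decompression with the reads):

        DISC_MBPS = 36.0       # assumed 8x BD read speed
        INFLATE_MBPS = 200.0   # assumed CPU decompression throughput

        def load_plain(mb):
            return mb / DISC_MBPS    # read GPU-ready data straight off the disc

        def load_dense(mb, disc_ratio):
            # smaller read, plus CPU work to expand it back to GPU format
            return mb * disc_ratio / DISC_MBPS + mb / INFLATE_MBPS

        print(load_plain(100))        # ~2.8 s
        print(load_dense(100, 0.5))   # ~1.9 s: the denser disc format wins here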

    For vertex data, displacement mapping could be viewed as an extreme compression method. Similarly, heightmaps used for terrain, or voxels, could be seen as rather extreme forms of compression. If we start to have large models and lots of vertex data, even simple zip compression might give considerable savings. This gen, vertex data is probably small enough not to warrant much compression, as there is not much to be gained?
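    The heightmap case makes the "extreme compression" point concrete (grid size and formats assumed):

        N = 1024                              # terrain grid, one height byte per point
        on_disc_mb = N * N * 1 / 2**20        # 1.0 MB heightmap on disc
        in_ram_mb = N * N * (3 * 4) / 2**20   # 12.0 MB of float32 xyz positions after expansion

        print(on_disc_mb, in_ram_mb)          # 12x magnification, before normals or indices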

    Similarly, any data that is uncompressed on disc, or compressed suboptimally, can benefit from compressing the asset on disc and decompressing/recompressing it once transferred to memory.

    Some data, such as movies on disc, might already use the best practical compression, and there is not much (if anything) to be gained without lowering quality.

    AI might start by reading the control points for navigation and some basic decision trees from disc, but then start dynamically building structures based on player actions and game engine feedback. This dynamic behaviour happens at runtime and eats more memory than what was actually read from the disc.
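    In sketch form (the structures are invented for illustration), the disc pays for the static part only:

        nav_points = [(0, 0), (10, 0), (10, 10)]   # read from disc, fixed cost

        observed = []                              # grown at runtime, never stored on disc
        def record_player_position(pos):
            observed.append(pos)                   # memory now scales with play, not disc size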

    For rendering, games use lots of buffers to do calculations: shadows, stereoscopic rendering, deferred shading, dynamic lightmaps/volumes, z-buffer, normal buffer and so on. All this data is generated on the fly based on level geometry and not necessarily read from disc (prebaked content is an exception, though). It's not a big jump to expect more advanced rendering next gen, requiring more precision and more interim buffers. It also might be that we will have higher AA or higher-resolution rendering next gen, eating into usable RAM, unless consoles continue to use an Xbox 360-like eDRAM solution and tiling, which can use less memory than rendering everything in one go to main memory.
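    To give a feel for the interim-buffer cost, here is a rough deferred-shading G-buffer budget (this layout is an assumption, not any particular engine's):

        W, H = 1280, 720
        targets_bpp = {
            "albedo": 4,          # RGBA8
            "normals": 8,         # RGBA16F
            "depth_stencil": 4,   # D24S8
            "motion": 4,          # RG16 for motion blur / reprojection
        }
        total = sum(W * H * bpp for bpp in targets_bpp.values())
        print(total / 2**20)      # ~17.6 MB at 720p, before shadow maps or post buffers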
     
    #1025 manux, Mar 15, 2011
    Last edited by a moderator: Mar 15, 2011
  6. -tkf-

    Legend

    Joined:
    Sep 4, 2002
    Messages:
    5,632
    Likes Received:
    36
  7. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    8,709
    Likes Received:
    608
    Location:
    Treading Water
    There's certainly going to be an issue with digital distribution for anyone with slower connections and data caps.

    That article is fucking horribly written. They lowered the default quality to preserve bandwidth for customers, but it's just a switch to turn on full quality.
     
  8. -tkf-

    Legend

    Joined:
    Sep 4, 2002
    Messages:
    5,632
    Likes Received:
    36
    From reading around the net, it seems that plenty of customers have a real issue with caps with this ISP, and they can't pick someone else either. Perfect!
     
  9. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    41,623
    Likes Received:
    12,627
    Location:
    Under my bridge
    My evening broadband is becoming useless: 0.69 Mbps in the evening, one tenth of what can be got in the day. I'm thinking there are just too many users. Though fibre is being rolled out across the country, my particular village, surrounded by fibre-enabled townsteads, isn't even on the list to be considered!
     
  10. JPT

    JPT
    Veteran

    Joined:
    Apr 15, 2007
    Messages:
    1,924
    Likes Received:
    299
    Location:
    Oslo, Norway
    Shifty, would you mind giving this one a go?

    http://netalyzr.icsi.berkeley.edu/

    Once when your net is good and once when it's bad; then send me the links to your results, either in a PM or posted in this thread.

    One of my bad test results is this one, due to me playing with some ULA IPv6 stuff here.

    http://n5.netalyzr.icsi.berkeley.edu/summary/id=3210a1cd-32179-18cb64d7-cdd7-44f2-98be

    And on our standard guest hotspot in the office

    http://n5.netalyzr.icsi.berkeley.edu/summary/id=3210a1cd-32185-f79c4c56-174c-4c35-b83c
     
    #1030 JPT, Mar 31, 2011
    Last edited by a moderator: Mar 31, 2011
  11. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    41,623
    Likes Received:
    12,627
    Location:
    Under my bridge
    Just done a slow one; will do a quick one tomorrow. Server bandwidth wasn't measurable in this instance.
     
  12. eastmen

    Legend Subscriber

    Joined:
    Mar 17, 2008
    Messages:
    9,998
    Likes Received:
    1,503
    FLASH! Ahaha! Savior of the universe!

    I'm back...... and so are Intel and Micron, announcing the first 20nm MLC NAND for SSD use.


    http://www.anandtech.com/show/4271/intel-micron-announce-first-20nm-mlc-nand-flash-for-use-in-ssds





    16GB SD cards are $20 now. A nice 30% drop would mean $14 at retail. It will most likely cost less than $5 for the manufacturer to produce.


    It's going to hit not in 2012 or 2013 but in the 2nd half of 2011. This is a major step up, and it's my understanding that reads don't hurt flash, it's the writes. So they could reduce the size further for consoles by going with TLC instead of MLC; at 25nm that saved about 20% die space. So we could see closer to a 50% drop in costs by the end of this year for what we'd want to consider for a console cart.
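    Chaining the claimed savings together (every factor here is just an assumption taken from the post, including the starting cost):

        cost_25nm_mlc = 5.00                   # assumed cost of a 16GB MLC part today
        cost_20nm_mlc = cost_25nm_mlc * 0.70   # ~30% saving from the node shrink
        cost_20nm_tlc = cost_20nm_mlc * 0.80   # ~20% further die saving from TLC

        print(cost_20nm_tlc)                   # ~2.80: a ~44% drop, in the ballpark of "closer to 50%"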
     
  13. -tkf-

    Legend

    Joined:
    Sep 4, 2002
    Messages:
    5,632
    Likes Received:
    36
    The new SSD generation didn't really show a price drop versus capacity; maybe these 20nm parts will help.

    But maybe it would be good to have a clear idea of what you propose.

    SSDs instead of hard drives, and games developed and released on flash?

    Or how does it work?
     
  14. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    8,709
    Likes Received:
    608
    Location:
    Treading Water
    Why do you insist on using SSD prices? What is the point?
     
  15. eastmen

    Legend Subscriber

    Joined:
    Mar 17, 2008
    Messages:
    9,998
    Likes Received:
    1,503
    Why SSD prices? You have to remember that the newest SSDs are faster, yet prices have still come down. I bought my Vertex 2, a 128GB drive, for $170; that's how much I paid for my 60GB Vertex 1.


    An SD card form factor should be good enough. Remember, even cheap SD cards are at 10MB/s, with some going up to close to 30MB/s, and CF cards can hit 100MB/s.
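    At those rates, reading a whole game image is quick enough to be interesting (assuming the quoted speeds are sustained):

        def read_time_s(gb, mbps):
            return gb * 1024 / mbps

        print(read_time_s(8, 30))    # ~273 s to read 8GB from a fast SD card
        print(read_time_s(8, 100))   # ~82 s from a CF-class card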

    I propose something slightly larger than a 3DS card.

    So basically you go to the store and buy the game just like you do now, except it's on flash. It allows so many benefits.

    The other thing I'd do is add multiple slots so you can have multiple games in at once. I envision Gears 5 being able to access Gears 4 maps for multiplayer if you have Gears 4 in another slot. This would reduce the used market.
     
    #1035 eastmen, Apr 26, 2011
    Last edited by a moderator: Apr 26, 2011
  16. eastmen

    Legend Subscriber

    Joined:
    Mar 17, 2008
    Messages:
    9,998
    Likes Received:
    1,503
    SanDisk and Toshiba announce 19nm NAND

    http://www.theinquirer.net/inquirer/news/2045533/sandisk-toshiba-announce-19nm-nand-flash-memory




    They said they are working on 3-bits-per-cell parts, which should drive costs down further as they are smaller. Of course it cuts the number of writes, but we really only need it to be written to a few times at most.
     
  17. -tkf-

    Legend

    Joined:
    Sep 4, 2002
    Messages:
    5,632
    Likes Received:
    36
    Ok, still trying to get the message, sorry :)

    You want the consoles to have SSDs or standard hard drives?
    And then deliver the games on 16GB(?) cards?
     
  18. eastmen

    Legend Subscriber

    Joined:
    Mar 17, 2008
    Messages:
    9,998
    Likes Received:
    1,503
    Yeah, I figure you get rid of the optical drive and the 2.5-inch drive and instead put in a single-platter 1TB 3.5-inch drive. You will greatly reduce the size of the console.

    Then you ship the games on 4/8/16/32/64 GB cards, depending on what's needed. Obviously some games will still fit fine in less than 8 gigs of space, some even in just 4 gigs. Some would need more.
     
  19. eastmen

    Legend Subscriber

    Joined:
    Mar 17, 2008
    Messages:
    9,998
    Likes Received:
    1,503
    http://www.tomshardware.com/reviews/sdxc-sdhc-uhs-i,2940-9.html

    Thought this was interesting in regards to the new Class 10 cards.

    Looks like access time is 1.15 ms.

    Random reads are hitting 50MB/s.

    Sequential reads are able to hit the 60MB/s mark.

    SD cards are getting quite fast, and a custom package with a dumb RAID 0 controller could really push these numbers. You could be looking at 120MB/s.
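    The striping claim is simple arithmetic if the controller just alternates stripes across the cards (idealised; it ignores controller overhead):

        CARD_SEQ_MBPS = 60                     # the Class 10 sequential figure above

        def striped_mbps(num_cards):
            return num_cards * CARD_SEQ_MBPS   # each card serves every Nth stripe in parallel

        print(striped_mbps(2))                 # 120, the number quoted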
     
  20. -tkf-

    Legend

    Joined:
    Sep 4, 2002
    Messages:
    5,632
    Likes Received:
    36