Why doesn't anyone use an FPGA?

Discussion in 'Console Technology' started by Flux, May 3, 2010.

  1. hoho

    Veteran

    Joined:
    Aug 21, 2007
    Messages:
    1,218
    Likes Received:
    0
    Location:
    Estonia
430k-3.5M gates: isn't that fewer gates than a single SIMD lane in a modern GPU?
     
  2. Crossbar

    Veteran

    Joined:
    Feb 8, 2006
    Messages:
    1,821
    Likes Received:
    12
    They are talking about ASIC-equivalent gates; I don't know exactly what that means. Anyway, the programmable logic is capable of up to 1080 GMACS peak DSP performance, according to the specs here.
    http://www.xilinx.com/products/silicon-devices/epp/zynq-7000/silicon-devices/index.htm
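As a rough sketch of where a headline number like 1080 GMACS could come from: peak DSP throughput is roughly DSP slices × MACs per slice per cycle × clock. The slice count and clock below are illustrative assumptions for a large Zynq-7000 part, not values checked against the datasheet:

```python
# Rough sketch: peak GMACS ~= DSP slices x MACs/slice/cycle x Fmax.
# The slice count and Fmax are illustrative assumptions, not
# datasheet values for any specific Zynq-7000 device.
dsp_slices = 2020        # assumed DSP48E1 slice count
macs_per_cycle = 1       # one multiply-accumulate per slice per cycle
fmax_ghz = 0.535         # assumed DSP-block clock in GHz

peak_gmacs = dsp_slices * macs_per_cycle * fmax_ghz
print(f"peak throughput: {peak_gmacs:.0f} GMACS")
```

Note this is a theoretical peak; it assumes every DSP slice is doing useful work every cycle, which real designs rarely achieve.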

    On top of that it includes a dual-core ARM Cortex-A9 at 800 MHz capable of running Linux; not too shabby in my opinion.

    I think these descriptions of automotive applications such as lane assist give an idea that FPGAs are quite good at real-time image processing.


    That is why I think FPGAs can be an interesting alternative for EyeToy/Kinect-type data processing, especially considering that the console business is also very cost sensitive.
     
  3. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,740
    Likes Received:
    11,217
    Location:
    Under my bridge
    Why an FPGA instead of a custom designed ASIC?
     
  4. Crossbar

    Veteran

    Joined:
    Feb 8, 2006
    Messages:
    1,821
    Likes Received:
    12
    I can see two main reasons why FPGAs can be found in car applications today, and the same reasons would, to a certain degree, also apply to consoles.

    1. The cost advantage of ASICs is no longer what it used to be. The number of transistors that can be crammed into 1 mm² of silicon with a modern process is just crazy. The silicon cost of a pretty powerful FPGA is low from the start, so there is simply no longer that much to gain by converting it to an ASIC.

    2. The fact that an FPGA can be re-programmed gives developers the flexibility to change the logic right up until the day the product is released, and it also offers the option of fixing bugs and adding new functionality after release by flashing new firmware to the device.

    The second reason is obviously quite a strong argument for consoles, where the firmware is regularly updated.
     
  5. aaronspink

    Veteran

    Joined:
    Jun 20, 2003
    Messages:
    2,641
    Likes Received:
    64
    The cost advantage is due to low volume. Total expected car sales this year: 14 million. Luxury-level cars make up a fraction of that. Luxury cars with these systems make up an even smaller fraction. Then you factor in the various manufacturers all using different designs/equipment, and you are looking at an annual volume per design with an upper bound of roughly 100K or so. This is simply too low a level to make an ASIC viable. As the feature becomes more common and becomes a Tier 1 supplier item, the volume will increase, potentially making an ASIC viable, or the functionality will simply be annexed into the base ASICs over time.

    This is only viable when the basic functionality is new.

    Maybe for prototyping, but not for production, when you are already doing an ASIC and can amortize the costs over tens of millions of parts.
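The amortization argument can be put in numbers with a toy break-even model. All dollar figures below are invented for illustration, not real pricing:

```python
# Toy ASIC-vs-FPGA break-even model; all figures are invented
# illustrations, not real quotes.
asic_nre = 10_000_000   # assumed one-time mask/design cost ($)
asic_unit = 5.0         # assumed per-unit ASIC cost ($)
fpga_unit = 40.0        # assumed per-unit FPGA cost ($), no NRE

def total_cost(volume: int, nre: float, unit: float) -> float:
    """Total program cost at a given production volume."""
    return nre + unit * volume

# Volume above which the ASIC's lower unit cost repays its NRE.
break_even = asic_nre / (fpga_unit - asic_unit)
print(f"break-even: {break_even:,.0f} units")

# At ~100K units/design (the automotive case) the FPGA is cheaper;
# at console volumes (tens of millions) the ASIC wins decisively.
for volume in (100_000, 10_000_000):
    print(volume,
          total_cost(volume, asic_nre, asic_unit),
          total_cost(volume, 0.0, fpga_unit))
```

With these made-up numbers the crossover sits a few hundred thousand units, which is why the same part can be an obvious FPGA at automotive volumes and an obvious ASIC at console volumes.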
     
  6. Crossbar

    Veteran

    Joined:
    Feb 8, 2006
    Messages:
    1,821
    Likes Received:
    12
    No, it's not just because of that. The playing field has also changed due to the process advances I described. Just as an example, I recently chose the MCU for one of our sub-systems and did some investigation as part of the decision making. Even though it's a very simple system, there was no reason in our case to go for an 8-bit MCU instead of a 32-bit MCU. The price advantage that existed 10 years ago simply isn't the same. I guess with the new Cortex-M0 it will be totally erased. However, 8-bit MCUs will continue to live on for ages due to all the legacy SW.

    Hey, I even found an article to support my anecdotal evidence. :smile:

    And thanks for telling me I've got a luxury car (lane assist + back assist), my wife will be happy when I tell her. :smile:
    You are right that it's not yet standard equipment in many cars, but I was surprised to find that so many car manufacturers offer this kind of functionality.

    ?

    Sony and MS have been prototyping their console firmware for quite some time now...
     
  7. aaronspink

    Veteran

    Joined:
    Jun 20, 2003
    Messages:
    2,641
    Likes Received:
    64
    This has been true for quite a while, actually; there aren't really many spaces where it makes sense to use anything less than a 32-bit SoC/MCU, and those where it does make sense tend to be fairly special-purpose (rad-hard, etc.). The simple fact is that most 8-bit/16-bit MCUs haven't been ported to anything close to modern processes or packaging, and as such, real-world costs for 32-bit SoCs/MCUs are basically the same.

    Hard to claim anecdotal evidence when the whole column is about a lack of evidence...



    Both lane assist and back assist are sold on a very, very small fraction of the cars available. For the most part you are looking at high-end SUVs, luxury cars, and minivans. Very much not a standard feature. Once it is, it will just be subsumed into the standard cadre of SoCs in a car.


    Once the functionality is standard or widespread it gets subsumed into standard parts and FPGAs aren't used for it anymore. This isn't actually new and has been going on since the beginning of electronics.


    Firmware is primarily updated to fix SOFTWARE bugs and add SOFTWARE features. For FPGAs, post-release updates are very, very rare (it had better damn well work as shipped). FPGA designs are still hardware designs and are validated and verified to similar standards as ASICs before release. Software is software, and if a software engineer has ever heard of real validation/verification, they probably spent the next 10 months having nightmares. For both ASICs and FPGAs, the vast majority of design resources go to validation/verification. For software, it is rare that more than 10% of resources are spent on validation/verification.
     
  8. Crossbar

    Veteran

    Joined:
    Feb 8, 2006
    Messages:
    1,821
    Likes Received:
    12
    Just wow! I can't believe I am reading this. Ever heard of the MicroBlaze soft processor? Hell, Xilinx even uses MicroBlaze to implement other IP blocks that they offer. Take your time and think about that for a minute. Xilinx uses a soft processor to implement IP blocks. Then go and ask someone who actually programs FPGAs today why they think Xilinx uses soft processors. Please do this before posting another knee-jerk response filled with assumptions.
     
  9. aaronspink

    Veteran

    Joined:
    Jun 20, 2003
    Messages:
    2,641
    Likes Received:
    64
    I understand FPGAs and I understand ASICs. What I wrote was in no way knee-jerk. Nor was it filled with assumptions. If you have specific issues with what I wrote, I suggest you address them specifically.
     
  10. 3dcgi

    Veteran Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    2,436
    Likes Received:
    264
    I don't get why anyone would question an FPGA as being a hardware design. If you put an FPGA in a car, video board, etc., it had better be verified more thoroughly than most software. I haven't used an FPGA in a design in 10 years, and at that time they were easier than ASICs because pre-silicon validation wasn't as important, but they weren't planned to be patched on a regular basis.
     
  11. Crossbar

    Veteran

    Joined:
    Feb 8, 2006
    Messages:
    1,821
    Likes Received:
    12
    Yes, I have issues with what you wrote.
    1. Please explain how soft processors fit into your FPGA/software world.
    2. Please explain why soft processors have become increasingly popular.
    3. Please explain whether there is a difference in the quality requirements of an application, from the user perspective, depending on whether the application uses an MCU or an FPGA.
     
  12. aaronspink

    Veteran

    Joined:
    Jun 20, 2003
    Messages:
    2,641
    Likes Received:
    64
    Cores are just IP blocks, whether they are in an FPGA or an ASIC. There's nothing new or particularly novel about them. The only thing that matters is that they were properly verified and validated so that they correctly run whatever code is run on them.

    Using cores over hardwired logic is merely a trend that dates back to the early 80s.

    From an end-user perspective, people just want things to work. Customers don't give a flip about hardware vs. software. From a product-design perspective, software tends to have a fraction of the validation and verification coverage of actual hardware, be it a soldered/wire-wrapped board of LS/7400 chips, ASICs, or FPGAs. The simple fact is that software, historically and today, has many more bugs post-release than hardware, as a direct result of the amount of resources spent to validate and verify each. This is fairly obvious to anyone in the field.

    FPGAs are by and large subjected to the same levels of validation and verification as other hardware because of all the real-world issues involved in actually releasing a programming update for an FPGA. Simply put, people almost never release updates that change the programming of an FPGA. It happens about as often as people releasing firmware/BIOS updates to work around a bug in an ASIC.
     