Is Ivy Bridge HD4000 more powerful than PS360?

Discussion in 'Console Technology' started by gongo, May 30, 2012.

  1. gongo

    Regular

    Joined:
    Jan 26, 2008
    Messages:
    582
    Likes Received:
    12
    http://www.lostcircuits.com/mambo//...sk=view&id=105&Itemid=1&limit=1&limitstart=16

    IIRC HD4000 = 2 ROPs, 25.6 GB/s memory bandwidth, 16 EUs at 1.15 GHz (≈ 294 GFLOPS SP?)
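
    Back-of-the-envelope on that figure, assuming Gen7's commonly cited two 4-wide MAD units per EU (an assumption, not something stated in the thread):

    16 EUs × 8 lanes × 2 flops (MAD) × 1.15 GHz ≈ 294 GFLOPS single precision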

    At 720p...it runs Far Cry 2 and RE5 well...the latter almost too well: 60 fps! Haswell is rumored...and Intel rumors are almost always true...to have 40 EUs!!

    Granted, it took the best microchip company 5~6 years to come up with an SoC with the power of the large HD twins...nothing extreme...though we should ask: why have Sony and MS not shrunk the PS360 into smaller casings yet?

    Are we no longer able to have...PSone/PStwo-style slims of the HD consoles?
     
  2. eastmen

    Legend Subscriber

    Joined:
    Mar 17, 2008
    Messages:
    9,983
    Likes Received:
    1,494
    Hmm

    The CPU is $313 by itself; couple it with a $200 mobo, $50 RAM and a $150 SSD.


    Seems like we still aren't quite there yet, although the much cheaper Trinitys are there at a price that rivals the cost of the PS360.
     
  3. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,322
    Likes Received:
    1,120
    I'm going to guess yes. It competes with Trinity, which at 384 SPs is surely a more powerful GPU than the PS360's.

    Two ROPs is awfully low for what it's worth; the consoles have 8. Do these ROPs run at 1.15 GHz? If so, it's more like 4+ in console terms.

    I'm guessing they're more capable ROPs as well. The CPU will be much more powerful than the PS360's too; regardless, it's obviously running benchmarks at 720p without issue.
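
    The fillrate arithmetic behind that, assuming the ROPs run at the full 1.15 GHz core clock and taking Xenos's 8 ROPs at 500 MHz:

    HD4000: 2 ROPs × 1.15 GHz ≈ 2.3 Gpixels/s
    Xenos: 8 ROPs × 0.5 GHz = 4.0 Gpixels/s

    i.e. the two fast ROPs are worth roughly 4.6 console-clocked ROPs.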
     
  4. hoho

    Veteran

    Joined:
    Aug 21, 2007
    Messages:
    1,218
    Likes Received:
    0
    Location:
    Estonia
    That expensive stuff is about as necessary as a gold-plated case for the console.

    Though I agree with the general point that the build is more expensive than the consoles, that wasn't the original question.
     
  5. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,322
    Likes Received:
    1,120
    Its price is higher, not necessarily its cost. Ivy Bridge is a very small chip: 160 mm^2.
     
  6. french toast

    Veteran

    Joined:
    Jan 5, 2012
    Messages:
    1,667
    Likes Received:
    9
    Location:
    Leicestershire - England
    Haswell will put the hammer down next year: 40 EUs, likely some dedicated L4 cache and other improvements for massive bandwidth, combined with another new architecture, not to mention the REAL 22nm after maturing like a fine cheese over the course of 12 months. AMD will be in serious trouble.

    Not that I want AMD to go down the plughole; quite the contrary, I own AMD products. I just want AMD to be able to match or exceed Haswell next year.

    Integrated graphics have gone into overdrive the last few years thanks to competition. If AMD goes, then we will be looking at slow, stale technological progression at over-inflated prices.

    Back on topic: yeah, I would say the HD4000 is on par with an Xbox 360, and if you count the CPU alongside it and swapped the pair into the 360, I think games would improve.
     
  7. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,322
    Likes Received:
    1,120
    AMD is competing on 32nm, though; if they go to 22nm and, let's say, double the 384 SPs they currently have, yeah...

    Out of curiosity, was the die size of the newest 360 iteration ever measured? Anand never measured it because he didn't want to tear the heat spreader off, but I seem to remember we got a measurement from somebody...
     
  8. HMBR

    Regular

    Joined:
    Mar 24, 2009
    Messages:
    416
    Likes Received:
    105
    Location:
    Brazil
    What? Ivy Bridge can be used in a $50 motherboard,
    and soon they are going to release the lower-end Ivy Bridges: cheap mobile Core i3s, and the special desktop Core i3 models with the HD4000 (most will use the HD2500). So I expect that soon, for some $150, you will be able to buy an i3 + HD4000 + H61 motherboard + cheap RAM.

    But in reality, wasn't Llano already faster? You can buy a 400-SP Llano for a really low price right now...
     
  9. sebbbi

    Veteran

    Joined:
    Nov 14, 2007
    Messages:
    2,924
    Likes Received:
    5,288
    Location:
    Helsinki, Finland
    We had this discussion already last year. The AMD E350 netbook APU (Brazos, 1.6 GHz) runs many console ports at 30 fps if you set details to low (matching console settings) and lower the resolution to 1280x720 (matching console resolution). So you could argue that Brazos already matches the 7-year-old consoles in many games. The Ivy Bridge CPU is around 8x faster than the E350 on average (http://www.anandtech.com/bench/Product/328?vs=551), but the GPU speed difference isn't that big. Games are GPU-bound most of the time on all integrated GPUs, so the extra CPU performance likely wouldn't improve gaming performance at all.

    That's twice the frame rate of the console version, so you could argue it's twice as powerful as the current consoles (in running games designed for current consoles). However, it's not that simple. The console frame rate is often more stable than the PC frame rate, because the developer can specifically optimize the code for console bottlenecks. There are too many different PC CPU/GPU configurations to properly optimize away all the bottlenecks (improve the minimum frame rate). Of course, if the game were properly optimized just for Ivy Bridge (or E350) it would likely run faster (and if the 3D API could be skipped completely on PC, there would be another extra speed boost). But back to reality: basically nobody optimizes their games specially for low-end integrated GPUs (high-end GPUs are often a much higher priority). The good thing is that we finally have integrated (laptop) GPUs that are capable of running console ports at a playable frame rate (and with slightly higher detail settings to boot).
     
    #9 sebbbi, May 31, 2012
    Last edited by a moderator: May 31, 2012
  10. tunafish

    Regular

    Joined:
    Aug 19, 2011
    Messages:
    542
    Likes Received:
    171
    At present, the cheapest CPU with HD Graphics 4000 is $235, but as soon as Intel gets to launching their dual cores, that's going to go down to $150 or so.

    Why on earth would you do that? Intel has spent the last five hardware generations moving the performance-sensitive stuff from the motherboard to the CPU die. There really isn't much left. Unless you plan on really heavy overclocking or need some less common ports, there is never a reason to pay more than ~$120 for a motherboard. And if you don't plan to overclock at all, the $70 Gigabyte board is the best choice.

    After Intel releases Haswell, which moves the VRM on-package, there won't be *any* sensible reason left to buy an expensive motherboard. None at all. I wonder how the manufacturers are going to sell their expensive-edition motherboards then.


    You can get 4GB of 1600 MHz RAM for ~$25. No real need for more in a budget box.

    Seriously? For a low-budget box? How about a $50 HDD instead.
     
  11. Fafalada

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    2,773
    Likes Received:
    49
    That kind of neatly coincides with SB being close to consoles IME, and IVB roughly doubling the GPU.

    The neat thing is that it's now actually possible to do things on PC like use leftover CPU to run free MLAA or something. Granted, that assumes you actually spend time optimizing for a specific type of hardware, but the recent integrated chipsets arguably make for a viable platform to optimize for, unlike most others. A minimal sketch of the idea follows.
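
    A hedged sketch of what "MLAA on leftover CPU" could look like, showing just the first (edge detection) pass; the packed-RGBA layout, names, and the 0.1 threshold are illustrative assumptions, not anything from the thread:

```cpp
// Hedged sketch: the first pass of MLAA (luminance edge detection) done on the CPU.
// The packed-RGBA layout, buffer names and the 0.1f threshold are illustrative
// assumptions; a real implementation adds edge-pattern classification and blending.
#include <cmath>
#include <cstdint>
#include <vector>

static float Luma(uint32_t px) {
    // Rec. 601 luma from packed 8-bit RGBA (R in the low byte here, by assumption).
    float r = ((px >>  0) & 0xFF) / 255.0f;
    float g = ((px >>  8) & 0xFF) / 255.0f;
    float b = ((px >> 16) & 0xFF) / 255.0f;
    return 0.299f * r + 0.587f * g + 0.114f * b;
}

// Flags pixels whose left/top neighbour differs by more than the threshold.
// Later passes would walk these edges and compute coverage-based blend weights.
void DetectEdges(const std::vector<uint32_t>& fb, std::vector<uint8_t>& edges,
                 int width, int height, float threshold = 0.1f) {
    for (int y = 1; y < height; ++y) {
        for (int x = 1; x < width; ++x) {
            float c = Luma(fb[y * width + x]);
            bool left = std::fabs(c - Luma(fb[y * width + (x - 1)])) > threshold;
            bool top  = std::fabs(c - Luma(fb[(y - 1) * width + x])) > threshold;
            edges[y * width + x] = (left ? 1u : 0u) | (top ? 2u : 0u);
        }
    }
}
```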
     
  12. TheD

    Newcomer

    Joined:
    Nov 16, 2008
    Messages:
    214
    Likes Received:
    0
    Are you really, really sure you are thinking of the E350?

    A quick search on YouTube shows a lot of videos of the E350 running games at frame rates that are really, really crappy (BF3 single-player at 640x480 on lowest at 19 fps, Black Ops at 800x600 on lowest settings at 9 fps).
     
  13. gongo

    Regular

    Joined:
    Jan 26, 2008
    Messages:
    582
    Likes Received:
    12
    Interesting...very interesting...when I created this thread, my impression was that the HD4000 and PS360 each win in certain areas...the HD4000 does not have the fillrate and memory bandwidth but comes with faster...shaders...? I did not expect to hear developers hinting that IVB is twice as fast as the PS360! Granted, RE5 and other Capcom 'PC ports' were built for PC too!

    So I went to YouTube looking for HD4000 gameplay videos...this guy recorded many recent games at 720p running on the HD4000! Nice!
    Some games were very smooth, some jerky...could be due to Fraps, and/or his i7 3770K having HT on...what do you guys think?
    http://www.youtube.com/watch?v=bF0X1Dou20k&feature=BFa&list=PLED1F6B94AC0738F9

    Think that got misreported...only the dual-core ULV Haswells for ultrabooks get the VRM on die...or was it on package...? Either way, it is interesting how Intel accomplished that...I took a look at my mATX HTPC, and those VRMs...MOSFETs...caps are big and many!
     
  14. gongo

    Regular

    Joined:
    Jan 26, 2008
    Messages:
    582
    Likes Received:
    12
    One more thing about the HD4000 that was not widely reported: the GPU gets to use the CPU's 8 MB LLC...that is one big scratchpad, eDRAM-like thingy...!?! Do the resident devs feel there are optimizations to be had with 8 MB of fast...cache??

    So the IGP shares the same L3 the cores use...on a pretty tiny microchip!
     
  15. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,957
    Likes Received:
    4,553
    I believe the GPU's access to the L3 cache in Sandy/Ivy Bridge was designed to be completely transparent (thus not "optimizable") to game developers.

    Not that any developer would ever bother to make specific optimizations for Intel iGPUs.
     
  16. sebbbi

    Veteran

    Joined:
    Nov 14, 2007
    Messages:
    2,924
    Likes Received:
    5,288
    Location:
    Helsinki, Finland
    There was a forum where users collected their E350 gaming benchmark scores. Over 50 games were listed, and roughly half of them ran at around 30 fps or higher. I can't find it anymore, but here are some scores:

    Battlefield 2 (2005) - 1024x768, all max, 25-35 fps
    Crysis 2 (2011) - 1280x720, 20 fps
    Dirt 3 (2011) - 1024x600, very low, 25-28 fps
    Mass Effect 2 (2011) - 1024x600, 20 fps
    Call of Duty: Modern Warfare 2 (2009) - 1280x720, 25 fps

    Here are some more from Anandtech:
    Batman: Arkham Asylum, low: 29 fps
    Battlefield 2, low: 50.8 fps
    Company of Heroes, low: 44 fps
    Need for Speed: World, low: 33.5 fps
    Quake 4, low: 57.3 fps

    The Metro and BF3 PC versions are unfortunately too much for the E350. There are some other games, such as Civilization V, that are completely unplayable (lots of draw calls). Draw calls are actually a bottleneck for Intel drivers as well. I hope that DX 11.1 (and DX 12) further reduces draw call overhead, because spending lots of extra cycles doing extra work doesn't sound right on (battery-operated) tablets and mobile phones (Windows 8 & Windows Phone 8). Luckily for us, Windows 8 already runs games a few percent faster, so OS/driver/service-side optimization seems to be a big focus for them right now.
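
    To make the draw call point concrete: each call is a round-trip through the runtime and driver, and batching techniques like instancing exist precisely to collapse many calls into one. A hedged D3D11 sketch, where the context, the counts, and the per-instance vertex buffer are assumed to be set up elsewhere:

```cpp
// Hedged sketch: turning N per-object draw calls into one instanced call in D3D11.
// `context`, the counts, and the per-instance vertex buffer are assumed to be set
// up elsewhere (input layout elements marked D3D11_INPUT_PER_INSTANCE_DATA).
#include <d3d11.h>

void DrawManyObjects(ID3D11DeviceContext* context, UINT indexCount, UINT instanceCount) {
    // Naive version: one driver round-trip per object.
    //   for (UINT i = 0; i < instanceCount; ++i)
    //       context->DrawIndexed(indexCount, 0, 0);

    // Instanced version: a single call submits all copies; per-object transforms
    // come from the instance vertex buffer instead of per-call constant updates.
    context->DrawIndexedInstanced(indexCount, instanceCount, 0, 0, 0);
}
```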
     
  17. Grall

    Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,172
    Location:
    La-la land
    Would it actually be possible to render tiles - I assume the Intel HD graphics works on tiles like previous GMA solutions - to L3, then read that back in with the CPU for MLAA processing, without having to take a roundtrip via bandwidth-limited (starved, really) main memory? I'd be extremely surprised if that was the case, but it'd be really cool of course.

    Although with IVB being OpenCL-certified, I guess it would be extremely beneficial if you could access GPU results directly on-chip without having to hand everything off via main memory. Much less latency, of course, and less power spent (always a concern these days), and the less bandwidth wasted on needless transactions hither and yon, the better.
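
    A sketch of how that handoff can at least be approximated through OpenCL's mapping API; whether Ivy Bridge's driver actually avoids the copy is an assumption here, and the names are illustrative:

```cpp
// Hedged sketch: on a shared-memory iGPU, mapping an OpenCL buffer allocated with
// CL_MEM_ALLOC_HOST_PTR often hands the CPU a pointer to the GPU's results with no
// copy through a separate memory pool. Whether Ivy Bridge's driver does this is an
// assumption; names are illustrative and error checks are omitted.
#include <CL/cl.h>
#include <cstddef>

void* MapResultsForCpu(cl_context ctx, cl_command_queue queue,
                       size_t size, cl_mem* outBuf) {
    cl_int err = CL_SUCCESS;
    *outBuf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_ALLOC_HOST_PTR,
                             size, nullptr, &err);

    // ... enqueue kernels that write into *outBuf ...

    // Map instead of clEnqueueReadBuffer: ideally a pointer exchange, not a transfer.
    return clEnqueueMapBuffer(queue, *outBuf, CL_TRUE, CL_MAP_READ,
                              0, size, 0, nullptr, nullptr, &err);
}
```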
     
  18. Svensk Viking

    Regular

    Joined:
    Oct 11, 2009
    Messages:
    507
    Likes Received:
    61
    Aren't draw calls what DX11 multithreaded rendering is supposed to help with?

    Civilization V is confirmed to use it, and Nvidia GPUs showed clear performance improvements when their drivers enabled support for it.
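
    For readers who haven't used it: DX11 multithreaded rendering means recording work on deferred contexts and replaying it on the immediate context. A minimal sketch, with error handling omitted and function names that are illustrative:

```cpp
// Hedged sketch of DX11 multithreaded rendering: a worker thread records commands
// on a deferred context, and the render thread replays the baked command list.
#include <d3d11.h>

ID3D11CommandList* RecordOnWorkerThread(ID3D11Device* device) {
    ID3D11DeviceContext* deferred = nullptr;
    device->CreateDeferredContext(0, &deferred);

    // ... issue state changes and draw calls on `deferred` here ...

    ID3D11CommandList* commandList = nullptr;
    deferred->FinishCommandList(FALSE, &commandList);  // bake into a replayable list
    deferred->Release();
    return commandList;
}

void ReplayOnRenderThread(ID3D11DeviceContext* immediate, ID3D11CommandList* list) {
    // Cheap if the driver supports command lists natively; emulated (and slower)
    // by the runtime otherwise, which is what the Nvidia driver work addressed.
    immediate->ExecuteCommandList(list, FALSE);
    list->Release();
}
```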
     
  19. sebbbi

    Veteran

    Joined:
    Nov 14, 2007
    Messages:
    2,924
    Likes Received:
    5,288
    Location:
    Helsinki, Finland
    Ivy Bridge has full-speed AVX (256-bit-wide vectors) and new float16<->float32 conversion vector instructions as well. It would be pretty good for image post-processing (especially 16-bit float HDR data). Performance analysis (clock rate) data shows that Ivy Bridge CPU clocks are usually very low when it's running games, so there's certainly a lot of untapped performance left. Most games just utilize 2 or 3 threads, and the rest could crunch 256-bit AVX data at full speed. But so far I don't know of any game that supports AVX yet.
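
    The conversion instructions mentioned above are exposed as the F16C compiler intrinsics; a minimal sketch of widening and narrowing eight values at a time (function names are mine; build with AVX and F16C enabled, e.g. -mavx -mf16c):

```cpp
// Hedged sketch of float16<->float32 conversion via the F16C intrinsics.
#include <immintrin.h>
#include <cstdint>

void HalfToFloat8(const uint16_t* src, float* dst) {
    // Load 8 packed half floats and widen them to 8 single-precision floats.
    __m128i halves  = _mm_loadu_si128(reinterpret_cast<const __m128i*>(src));
    __m256  singles = _mm256_cvtph_ps(halves);
    _mm256_storeu_ps(dst, singles);
}

void FloatToHalf8(const float* src, uint16_t* dst) {
    // Narrow 8 single-precision floats back to packed halves, rounding to nearest.
    __m256  singles = _mm256_loadu_ps(src);
    __m128i halves  = _mm256_cvtps_ph(singles, _MM_FROUND_TO_NEAREST_INT);
    _mm_storeu_si128(reinterpret_cast<__m128i*>(dst), halves);
}
```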
    On early AMD drivers, recorded command buffers didn't improve performance at all compared to issuing the same draw calls again every time (I tested on a Radeon 5850). I remember reading that both AMD and Nvidia optimized their command buffer implementations after the Civ V release. They must have first implemented the feature simply by fully re-performing the recorded draw calls at command buffer execution time, instead of transforming those commands into a hardware display list at capture time and just putting an execute token + pointer into the ring buffer at execution time. Executing a command buffer shouldn't cost much more than issuing a single draw call (but I am not sure what kind of extra safety checks PC drivers must perform, so it might be a bit more expensive on PC).

    It seems that Intel hasn't yet optimized their command buffer support. Their driver team must be much smaller than those of companies that have GPUs as their main area of business. AMD and Nvidia release new drivers monthly, while Intel does so a few times a year.
     
  20. TheWretched

    Regular

    Joined:
    Oct 7, 2008
    Messages:
    830
    Likes Received:
    23
    AMD effectively stopped their monthly driver releases today... but AMD's drivers aren't the bee's knees either (and don't even get me started on Linux).

    Intel has a VERY bad reputation concerning their GPU drivers... Not sure about Poulsbo, but that was PowerVR-derived... so it wasn't even their code to begin with. They "work"... but just barely. Never had to suffer them for long, luckily.
     