Haswell vs Kaveri

If the GPU doesn't have at least 3GB of RAM, it won't be future proof at all. Games are starting to go beyond 2GB, and the new console generation has just started.

At the absolute highest settings, perhaps. You aren't likely to play at those settings with an i3 + low/midrange card combo, nor with an APU.

He brought up "future proof" and that is nonsense with this low end hardware. You can't even run middle of the road settings with a 1GB card anymore in some cases. It can run out of RAM and stutter like mad doing PCIe transfers.

That wasn't his point. His point was that the latter system gives better gaming performance today and tomorrow (i.e. in the future) compared to the former system. Note that if one really felt the need for 2GB of RAM, one could get a GTX 650 OC 2GB for a similar price to a GTX 750 1GB (but the latter card would probably be way faster in most scenarios, considering it is already faster than the GTX 650 Ti, with superior performance per watt too).

An interesting tidbit is one can configure the AMD APUs for 2GB VRAM. That might outperform a 1GB card doing PCIe transfers. I have a 560 Ti 1GB that I use fairly often and it tends to stutter noticeably on recent games unless the textures are kept pretty low quality. It's not a GPU performance thing, it's the crippling PCIe swapping.

Like I said earlier, one could configure the discrete GPU setup with a 2GB card too, but I doubt one will find many scenarios where a 2GB GTX 650 OC will even come close to a 1GB GTX 750 in performance in any modern game. At 1080p with moderate details, these low-to-midrange GPUs are usually not bottlenecked by VRAM capacity anyway. That said, I do agree that it would be worthwhile stepping up to a 2GB GTX 750 Ti if the budget is flexible enough.
 
Any game that is going to balk at "only" 2GB will not have sufficient processing power to make that game perform well anyway.

Aside from that, there are a number of 2GB cards that perform incredibly well with BF4 at ultra DX11 settings - a game that supposedly needs 3GB of VRAM to perform acceptably.

I think a 2GB card is still an OK buy. Thief, for example, will fit in 2GB as long as SSAA is disabled and you are around 1080p. I haven't played BF4.

You really do not want to go to 1GB anymore though. There are a lot of games, even some from a couple of years ago, that will overload 1GB.
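To put rough numbers on why SSAA in particular blows past a fixed VRAM budget, here is a back-of-the-envelope sketch. The buffer count and formats are illustrative assumptions for a generic deferred renderer, not measurements from Thief or any other game:

```python
# Back-of-the-envelope VRAM estimate for screen-sized render targets.
# The buffer list is a hypothetical G-buffer (4 color targets + depth,
# 4 bytes per pixel each) -- an assumption, not game data.

def render_target_bytes(width, height, bytes_per_pixel, ssaa_factor=1):
    """Memory for one screen-sized buffer; SSAA multiplies the pixel count."""
    return width * height * bytes_per_pixel * ssaa_factor

def render_targets_mib(width, height, ssaa_factor=1, buffers=5):
    """Total MiB for all assumed screen-sized buffers."""
    total = buffers * render_target_bytes(width, height, 4, ssaa_factor)
    return total / (1024 * 1024)

no_ssaa = render_targets_mib(1920, 1080)          # ~40 MiB of render targets
with_4x_ssaa = render_targets_mib(1920, 1080, 4)  # ~158 MiB, 4x the footprint
```

Render targets are only one slice of VRAM use (textures usually dominate), but the point stands: 4x SSAA multiplies every screen-sized buffer's footprint by four, which is why disabling it is the first lever on a 2GB card.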
 
He brought up "future proof" and that is nonsense with this low end hardware. You can't even run middle of the road settings with a 1GB card anymore in some cases. It can run out of RAM and stutter like mad doing PCIe transfers.

The question is that for that budget, I feel the second option is better and more future proof.
If you are going to play the latest games with high detail, then this is not the computer for you.
For €15 more you have 2GB cards, but do you have the money?
If yes, O.K. then. If not, then if you can, use your old card for some time more and buy a 2GB card later.
 
http://techreport.com/review/26166/gigabyte-brix-pro-reviewed/3

[images: benchmark charts from the techreport Brix Pro review]
 
http://www.brightsideofnews.com/new...-details-leaked---soc-for-the-mainstream.aspx

One of the main innovations of Carrizo is the integrated Fusion Controller Hub (FCH). It will feature two SATA 6Gb/s ports, four USB 3.0 ports and eight USB 2.0 ports.

On desktops Carrizo will drop into the known FM2+ socket, and the integrated FCH will be disabled and the external FCH of the board used. This is an understandable tradeoff to avoid changing the motherboard infrastructure again. While some users might dislike the fact that there is dormant silicon in AMD's upcoming APU for the desktop, the functionality is not sufficient to satisfy the needs of desktop users. If anything, it would be a nice-to-have addon to the current standalone FCH features. On the laptop side the connectivity should be sufficient, though.

AMD sure loves to do things inefficiently.
 
There's DDR4 support, and I will pick out this passage.

Speaking of packages, there is another BGA package called SP2 mentioned in the document. Since leaked roadmaps mentioned a server version called Toronto in a BGA package, it might be this one. The document lists some minor difference in terms of maximum supported memory frequencies for SP2 (i.e. 2133 maximum for SODIMMs with a single populated channel with dual rank modules on SP2 vs 1866 on FP4) and additional memory technologies (RDIMM, LRDIMM – i.e. Registered DIMMs) supported on SP2.

and also

The PCI Express complex reveals another tradeoff AMD had to make. Unless there is a mistake in the document due to its early state, AMD considerably reduced the number of available PCIe lanes, from 24 total to 16 total. The discrete graphics connectivity only features 8 lanes that can be split into two x4 links. The general purpose core features 4 lanes that can be split into up to four x1 links, two x2 links, a combination of x1 and x2 links, or a single x4 link. On the desktop version four additional lanes are available for the UMI interface to the FCH. On the BGA versions these lanes are not available, as the internal FCH is connected to these ports. It is unclear yet why the graphics connectivity was chopped down, and it is an aspect I will try to shed some light on at a later date.
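The lane-splitting options quoted above can be made concrete with a tiny sketch; the tuples below simply restate the configurations the article lists for the 4 general-purpose lanes:

```python
# Valid splits of Carrizo's 4 general-purpose PCIe lanes,
# as listed in the quoted article.
valid_splits = [
    (1, 1, 1, 1),  # up to four x1 links
    (2, 2),        # two x2 links
    (2, 1, 1),     # a combination of x2 and x1 links
    (4,),          # a single x4 link
]

# Every configuration must account for exactly 4 lanes.
assert all(sum(split) == 4 for split in valid_splits)
```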

And better ECC memory support as well (flagging memory which has had an unrecoverable error, ECC on cache and I/O).

So the Carrizo is the new server chip! Registered DDR3 DIMMs will make it easily support at least 128GB of RAM. DDR4 may allow more. And I can't see a reason why this variant can't be used on desktop boards too, like a soldered-on, well-binned chip that uses and introduces DDR4.
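The "at least 128GB" figure is easy to sanity-check. The channel and slot counts and the module size below are assumptions (dual-channel, two DIMMs per channel, 32GB DDR3 RDIMMs), not specs from the leaked document:

```python
# Sanity check of the "at least 128GB" claim with registered DIMMs.
# All three values are assumptions for a plausible server board,
# not figures from the leaked Carrizo document.
channels = 2           # assumed dual-channel memory controller
dimms_per_channel = 2  # assumed 2 slots per channel
gib_per_rdimm = 32     # large registered DDR3 modules

total_gib = channels * dimms_per_channel * gib_per_rdimm  # 128
```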

The PCIe x8 directly translates into the need for an x8 slot on a server. I'm sure this will go into some mini-ITX servers. The 8-core Atom and LGA 1150 have recently gone into that new market.
Lastly it's all congruent with Carrizo eventually introducing a new socket. The chip is a SoC, memory controller is updated. The first model would be like the Phenom II 940, so to speak.
 
You mean like the integrated graphics in most modern CPUs (and >90% of Intel's sales) that is unused on desktops with discrete graphics?

Then again, the portion of <2 year desktops that have no discrete GPU should be quite substantial.



Memory speeds for the Kaveri setup are omitted in there, but it looks like the Iris Pro 5200 is mostly set back by power envelope. Quite a surprise for me.
 
Then again, the portion of <2 year desktops that have no discrete GPU should be quite substantial.

The point is that "wasted" silicon is hardly unusual in computing, especially in desktops. If you build a Haswell-based gaming PC, then you will have discrete graphics and something like 25% of your CPU will never be used. That's unfortunate but if it means notebooks can work without discrete graphics, it's a good tradeoff.

The same goes for Carrizo's southbridge.
 
Well, after two minutes of research on techreport, I call complete bullshit on the FPS numbers they have in the Brix Pro review.

Here are the IQ settings + results they posted during the A8-7600 review for Battlefield 4:

[images: BF4 image-quality settings and FPS results from the A8-7600 review]

And these are from the Brix Pro review:

[images: BF4 image-quality settings and FPS results from the Brix Pro review]



So according to techreport, the A8-7600 gets ~10% lower average FPS at 768p and Low settings than at 1080p and Medium settings.
 
Are you sure those benchmarks were run on the same scene? A different scene would easily explain the difference.

I corrected my post.
Techreport shows lower performance at the low IQ settings and half the pixel resolution.

It's simply impossible, no matter what the situation.
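The impossibility claim reduces to a simple invariant: on the same chip and the same scene, lowering both the resolution and the quality settings should never reduce the frame rate. A sketch of that sanity check (the FPS values here are hypothetical, not techreport's actual figures, which are in the posted charts):

```python
def consistent(fps_heavier_load, fps_lighter_load):
    """Same GPU, same scene: the lighter workload (lower resolution
    and lower IQ settings) should never yield fewer FPS."""
    return fps_lighter_load >= fps_heavier_load

# Hypothetical values mirroring the complaint: ~10% lower FPS at
# 768p/Low than at 1080p/Medium fails the check.
fps_1080p_medium = 30.0  # hypothetical
fps_768p_low = 27.0      # hypothetical, ~10% lower

consistent(fps_1080p_medium, fps_768p_low)  # -> False
```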
 