Why doesn't Ivy Bridge have a Quad x4 PCIe config option so that we can use quad 7970s without an extra PLX bridge? After all, it's PCIe 3.0, so we still have 4 GB/s of bandwidth per card.
Hi Ian. Congratulations, very nice work. I could not check all 23 pages of comments, but I think there should be an update including Core 2 Quad, as it is still one of the most-used configs. A Q9450/9550, for instance?
Same, Q9450 here with 8 GB RAM on Win8, would love to see it in the charts. Do I just need a new graphics card (5850 now), or a whole new computer instead?
Wow, it's been a while since I've seen an E-ATX case on AnandTech pictured with an actual full-size E-ATX motherboard installed to show what it looks like; I'm almost shocked. It would be nice if you guys could get a few motherboard makers to give you boards in all sizes, even non-functional display boards, so you can use them in case reviews to show what a case looks like with different-sized boards installed.
Shhhhhh. You're using too much of that stuff called sense. It might spread, and suddenly everyone would want case reviews that reflect what people would actually install in 99% of the cases they review, instead of mini-ITX for every review. I mean, surely putting a mini-ITX or micro-ATX board into every case review has no impact on how the actual case comes across, right?
Good point - apart from not testing the FX-4xxx processors (I don't have any), the FX-4xxx uses the AM3+ platform, while the FM2 platform is newer and its chipsets offer more native USB 3.0 / SATA 6 Gbps as well as a UEFI BIOS from the ground up. The FX-4xxx is still a relevant choice with its L3 cache, and a couple of newer boards have been released to try to get the best from the 990FX chipset. Out of what I have tested so far, though, the A8 makes the most sense if you're looking at pure 1-GPU gaming. If I get an FX-4100 in, it will be tested and conclusions adjusted accordingly - there's no point suggesting a CPU I haven't tested and can't back up with data.
I entirely agree that it'd be wrong to make conclusions without data. However, I feel like the APU recommendation ought to go with some sort of "however..." caveat.
I look forward to your FX4 and FX6 results, however. I was initially not at all sold on these chips, but now that the prices have come down and the FX6 is often priced against Intel's i3, they are much more compelling.
If you're going to compare it against Trinity APUs, wouldn't it be fairer to get an FX-4300, based on the same Piledriver core? See if that L3 cache makes enough of a difference? It's more up to date as well...
If you live near a Microcenter, you can get an FX-4130 (3.8 GHz) and motherboard for $99. That leaves quite a bit of room to get a better GPU, and probably a better overall gaming experience for a given amount of money. I upgraded from an X4-955 to a 3570K about six months ago, and have to admit that I barely notice the performance increase in games; I would probably have been better off spending the $250 on a better video card. I do like the extra speed while using Handbrake, though, and my son likes my old X4-955, which was a big upgrade from his previous setup.
I sympathize. I have similar hardware (Phenom II X4 processor) and I've been looking for a good reason to upgrade, but can't really find one. Regardless, those crazy motherboard + processor deals at Microcenter sure are tempting!
This. I have an overclocked AMD Phenom X4 830 with an overclocked Asus 6850 and not much $$... dang it, honestly I'll just get in trouble with my wife for spending money to upgrade a PC that in her eyes works perfectly fine. I can probably get away with the GPU, as I can swap that out much more quickly without her noticing.
Actually, you're looking at an Athlon X4 740 / 750k. That's the top end of the Trinity line-up with the GPU disabled and an accompanying price cut, but with the same cache structure and motherboard chipsets as the Trinity systems tested here.
Seems like it's up to the sale, but I'd be more tempted by the FX 6350 over the FX 4350 given the pricing on Newegg.
More cores is more better, especially if you're making the sacrifice to use the 990FX chipset without PCIe 3.0 (and the FM1/2 chipsets also lack this anyway).
That said, I'd probably wait for a good sale on the FX 8350 and just go with that if I were considering AMD at all.
I wouldn't (and didn't), mostly because I'm one of those quirky desktop users who wants to use as little power and produce as little heat as possible to reduce fan noise, yet is still after speeeeed. When I was looking (last year), AMD didn't really offer me much in the way of CPUs or GPUs.
I live in hope that AMD will pop out something Volcanic or Steamroll the competition, but sense seems to suggest they won't.
Mmm, not done by a true gamer as it doesn't address a number of things:
1) Not everyone wants to run the game at max settings getting 30 fps. Many want 60, or in my case 120 fps, as that's what my monitor can do. To do this we turn the graphics down a bit, but that makes us much more likely to be CPU bound. Remember, you can generally turn down graphics settings to ease the strain on the GPU for higher fps, but CPU load is much more fixed - you can't lower the resolution or turn off AA to fix a CPU bottleneck!
2) Min fps is key, not average fps. This I learned years ago playing UT2004. That game might return 60 fps most of the time while admiring the scenery, but in the middle of an intense fight with multiple players, fps could halve or even quarter. It's obviously in the middle of a firefight that you most need the high fps to win.
3) There's a huge difference between single-player games and online. Basically, most single-player games also run on consoles, so they run like a dream on most PC CPUs, as even the slower ones are more powerful. However, go onto a 64-player server (which a console can't do) and watch the fps tank - suddenly the CPU is being worked much harder. BF3 and UT-engine games all do this when you get on a large server.
Hence your conclusions are wrong, imo. You want an overclocked Intel quad core - an i5-750 o/c'd to about 4 GHz or better, really. Why? Because it's still not far off as fast as you'll get - the latest Intel CPUs still have four cores, IPC isn't much better, and they only clock a little higher than that.
I suppose it depends on what you define as "sizable"? Perhaps an i5-2500K would be better, but even an i5-750 @ 4 GHz vs an i5-3570K @ 4.5 GHz isn't a huge increase in CPU power - 25-30% maybe (hyper-threading aside, which generally isn't much help in games).
I played a lot of clan-based BF2/BF2142 for a long while. "True gamer" is often a misnomer anyway, perpetuated by those who want to categorize others or announce their own true nature.
1) The push will always be towards the highest settings at which you can hit that 60-120 FPS ideal. If some of the games we see today can't hit 60 on a single GPU at 1440p, at 4K it's all going to tank. Many games tested in this review hit 60+ FPS with two or more GPUs, which was the point of this article to begin with.
2) Min FPS falls under the issue of statistical reporting. If you run a game benchmark (Dirt 3) and in one scene of genuine gameplay there is a 6-car pileup, it would show the min FPS of that one scene. So if that happened on an FX-8350 and its min FPS dropped to 20, while other CPUs that didn't hit that scene showed around 90 FPS minimum, how is that easily reported and conveyed in a reasonable way to the public? A certain amount of acknowledgement is made of the fact that we're taking overall average numbers, and that users will apply brain matter with regard to an 'average minimum'.
3) This is a bit obvious, but try doing 1400 tests on 64-player servers while keeping any level of consistency. If this is your usage scenario, then you'll know what concessions you have to make.
An i5-750 on an older chipset also misses out on newer features - native SATA 6 Gbps, for example, for an awesome RAID-0 setup. This could be the limiting factor in your gaming PC. We will be testing that generation for the next update of this testing :)
As written in the review, the numbers we have taken are but a small subset of everything that is possible, and we can only draw conclusions from the numbers we have taken. There are other numbers available online which may be more relevant to you, but these are the ones from our test-bed situations. Your setup is different from someone else's, which is a different usage scenario again - testing them all would require a few years in Narnia. But suggestions are more than welcome!
I suppose "true gamer" does sound a bit elitist, by that I really meant someone who plays not benchmarks. I agree it's hard to test min fps in 64 player BF3 matches, but that's the sort of moment when your choice of cpu matters, not in for example in a canned off-line BF3 benchmark. As you are advising on cpu buying choices for gaming it is pretty important.
My personal experience is that offline canned benchmarks reporting average fps suggest you need a much less powerful CPU than you really do once you take your fancy new rig online in the latest super-popular multiplayer game. Particularly as in that game you pretty quickly start playing to win and are willing to sacrifice some fancy settings to get the fps up, so you don't lose again as you try to hit that annoying, fast-moving 15-year-old while your fps is tanking :)
Therefore, while it's fine to advise people who only want to play offline console ports using benchmarking as you did, it just doesn't work for the rest of us.
It sounds more than a bit elitist: it is elitist. For every gamer that spends 10-20 hours of time each week in multiplayer gaming (MMORPG, or whatever FPS you want to name, or World of Tanks, etc.), there are likely at least ten times as many gamers that generally stick to single player games. What's more, that sort of definition of "true gamer" may as well just say "high school or early 20s with little life outside of the digital realm." Yes, that's a relatively big demographic, but there are many 20, 30, 40, and even 50-somethings that still play a fair amount of games, but never bother with the multiplayer stuff. In fact, I'd say that of the 30+ year old people I know well, less than 1% would meet your "true gamer" requirement, while 5% would still be "gamers".
The purpose of this article is to give a scientific basis for comparison within the boundaries of realistic testing deadlines. I would be interested to see you produce something as statistically rigorous based on performance numbers taken from online gaming. If you managed to do it before said numbers became irrelevant due to changes to the game code I would be utterly flabbergasted.
There is no way to recreate or capture all the variables/scenarios to repeatedly benchmark a firefight in BF3 across multiple systems. The results from this hardware review are relevant because they are easily repeatable by others and provide a fair baseline to compare systems. The point of this study is not what CPU you need to play BF3 or Crysis at max settings; it's: how much of a bandwidth bottleneck is there with a single-GPU setup? What happens in reality with multi-GPU setups? How well does the new AMD architecture (because "true gamers" want to save $$ to buy games) compare to Intel?
What you have to do, as a "true gamer" and someone who has enough wits about them, is extrapolate the results to your scenario, because everyone's will be different. And honestly, anyone who plays FPSs... the "true gamers"... will know what you pointed out. It's insanely obvious even the first time you play a demanding multiplayer FPS like BF3.
I, however, play single-player 99% of the time. The only online FPS I'll play now is CS.
"2) Min FPS falls under the issue of statistical reporting. If you run a game benchmark (Dirt3) and in one scene of genuine gameplay there is a 6-car pileup, it would show the min FPS of that one scene. So if that happened on an FX-8350 and min-FPS was down to 20 FPS when others didn't have this scene were around 90 FPS for minimum, how is that easily reported and conveyed in a reasonable way to the public? A certain amount of acknowledgement is made on the fact that we're taking overall average numbers, and that users would apply brain matter with regard to an 'average minimum'."
The point of a benchmark is to provide a consistent test that can be replicated exactly on multiple systems. If you're not able to do that, then you aren't really benchmarking anything. That's why 99% of games are not tested in multiplayer but rather in single-player experiences the tester can strictly control (i.e. with test demos). If for some reason the game engine is just that unpredictable even in a strictly controlled test situation, you could do multiple trials and take an average of the minimums.
Minimum FPS is an extremely necessary test, and it's easily possible to do. Other sites include it with all of their gaming benchmarks.
But yeah, statistics is extremely complex and error-prone. I once read that a large share of the statistics in scientific publications contain errors to some degree (though not necessarily errors that make the results and conclusions completely wrong!).
Or, if you actually know such a "special scene" can happen, discard all tests where it happened.
The main issue here is actually available time, i.e. the amount of work. Averages over three runs aren't really that great; if you could run everything 100 times, such "special scenes" would be irrelevant.
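For what it's worth, the "average minimum" idea above is simple to compute. Here is a minimal C++ sketch, assuming you log the minimum FPS of each repeated benchmark run; the numbers are invented, and the trimming step matches the "discard runs with a special scene" suggestion:

```cpp
#include <algorithm>
#include <cstdio>
#include <numeric>
#include <vector>

// Mean of per-run minimum FPS values, so one freak scene (a 6-car
// pileup) doesn't define the whole result on its own.
double averageMinimum(const std::vector<double>& perRunMinimums) {
    return std::accumulate(perRunMinimums.begin(), perRunMinimums.end(), 0.0)
           / perRunMinimums.size();
}

int main() {
    // Minimum FPS recorded in, say, five repeated Dirt 3 runs
    // (made-up numbers; one run hit a pileup outlier).
    std::vector<double> mins = {88.0, 91.5, 20.0, 90.2, 89.7};
    std::printf("average minimum: %.1f FPS\n", averageMinimum(mins));

    // A trimmed variant drops the best and worst runs first.
    std::sort(mins.begin(), mins.end());
    std::vector<double> trimmed(mins.begin() + 1, mins.end() - 1);
    std::printf("trimmed average minimum: %.1f FPS\n", averageMinimum(trimmed));
}
```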
P55 boards can offer very good RAID0 performance with SSDs, or more importantly RAID1 or RAID10 (I hope those with RAID0 have some kind of sensible backup strategy). See my results:
One will obviously get more out of newer SSDs on native SATA3 motherboards for the sequential tests, but newer tech won't help 4K numbers that much. In reality, few would notice the difference between each type of setup. This is especially true given how many later motherboards use the really awful Marvell controllers for most of their SATA3 ports (such a shame only a couple are normally driven by the Intel or other chipset); performance would be better on an older Intel SATA2 port. I expect many just use the non-Marvell ports if they can.
What matters is to have an SSD setup of some kind in the first place. My P55 system (875K) boots very quickly with a Vertex 3, gives a higher 3DMark13 physics score than a 3570K, and GPU performance with 2x 560 Ti is better than a stock 680. It's really the previous generation of hardware that can present more serious bottlenecks (S775, AM2, DDR2, etc.), but even then results can often be surprisingly decent, e.g. an oc'd Phenom II 965.
Also, RAID0 with SSDs often negates their small-I/O advantage. Depending on the game/task, this means SSD RAID0 can at times be slower than a single good SSD.
Dribble is right in that respect, improvements are often not as significant as people think or expect (I've read sooo many posts from those who have been disappointed with their upgrades), though it does vary by game, settings, etc. Games which impose a heavier CPU loading (physics, multiplayer, AI etc.) might see more useful speedups from a better CPU, but not always. There are so many factors involved, it can become complicated very quickly.
Your 120 Hz screen has a frame latency of about 8 ms, meaning it effectively can't show you more than 60 new fps. Anything above that and it shows you the same pixel twice. So basically you are watching reruns, and anyone who states that he can tell a difference between 60 fps and 60+ fps is basically kidding himself.
Really great review and testing. As for CPUs to add to the list, you could add some very cheap solutions like the G1610 and G2020 to see how these $40-60 chips perform against all the other chips, or simply compare them to an older E6700 like the one in the test. Other than that, you could also add a 3820 to the testing simply to lower the cost of the X79 setup, making it a little more mainstream vs a $600 3930K.
A8 for single-GPU gaming with a 7970? Really? Just because your limited run of four games didn't show anything wrong with the A8 doesn't mean the A8 is going to perform properly in other games. Play Hitman with it and a 7970, or multiplayer BF3, then see if you're still going to recommend the A8.
What's with all the people in here who don't understand statistics?! You can't do scientifically rigorous multiplayer testing and produce useful results. The time required alone to test... the mind boggles.
Keep in mind that the article's title doesn't start with 'Statistical Analysis of...', but rather 'Choosing a...'.
That's important. While you can't 'properly' benchmark multiplayer games, you can make reasonable inferences and use them to support your conclusions. The reality being exposed here is that Ian's benchmarks are really only useful for choosing a CPU for single-player games, not that there's a damn thing wrong with that.
However, it's not unreasonable for people to point out that the gaming situations requiring real CPU power to maintain playability are not covered in a 'Choosing a Gaming CPU' article.
In a multiplayer situation, you'll likely get similar ratios of performance, just lower average FPS. It's pretty easy to assume an X2 or i3 or other dual core is not going to hold up well, as these results support. But how in the hell are you supposed to have a baseline to compare systems in a multiplayer scenario? Do you have any idea what a cluster fuck that would be, even to compare just one game across only two systems, let alone as many as this review has?
This review helps CPU buyers because they can look at these results, and multi-GPU setups, and see where the bottleneck will occur first. That doesn't mean there won't be more bottlenecks, but at least you can see which part of your system you should upgrade first.
Nice article - would like to have seen an AMD AM2 setup for comparison though. Sadly, I don't like the obvious Intel slant, with comments like a "noticeable gap" between Intel and AMD CPUs, yet it's under 1 fps! I challenge you to actually see a 1 fps difference without a meter...
I didn't say gap for the small 1 FPS differences, I said split. Whenever I said gap, there is a sizeable difference, ~10%. For the small FPS difference in Dirt 3 + one GTX 580, I said "Similar to the one 7970 setup, using one GTX 580 has a split between AMD and Intel that is quite noticeable. Despite the split, all the CPUs perform within 1.3 FPS, meaning no big difference." Please don't misinterpret my results when I cater to your issue word for word. If you have an issue with a *specific* analysis, please let me know.
Again I disagree - you use those words, chosen carefully, and the implication is obvious: "gap" and "split" imply a considerable distance between the two, when in reality there is none. At least AnandTech has finally started using real-world resolutions and not the pointless 800x600. Poor choice of ambiguous words in writing.
I disagree; "gap" and "split", particularly taken in context, are very clear in the text. What's more, for someone that appears to be worried about a single word choice, you're at the same time ignoring most of the other words.
Gap: A break or hole in an object or between two objects. Split: A tear, crack, or fissure in something, esp. down the middle or along the grain.
There's a split between AMD and Intel, but in many cases not a gap.
I have a Q9400 coming in from a family member for the next update to this review :) Putting more cards in the review might multiply the time out too much :/ If there are more requests to try more mid-range cards, I might move to that and retest everything, if I can get the cards in. The 7970s/580s were the only ones I really had on hand to test multi-GPU.
Core2Quad, like my Q9450-ish: I'd only like to know whether buying a modern 660 or similar would be hampered too much by that CPU. Not very interested in multi-card configs. Great review you did, but I only looked at the single-card table. I think most people try to balance separate CPU and GPU upgrade cycles.
This is a great article for comparing CPUs across multiple GPUs. I'd also be curious to see how different GPUs scale: is a single $400 card better than two $200 cards? Given the choice between one or two $400 cards, I'm going to say two is better than one. Taking it to the extreme, you could ask whether four $100 cards beat one $400 card. That would probably be going too far, since you'd need an expensive motherboard to support four GPUs, but I think it would make a useful article about GPUs.
Thank you for putting the DP Xeon platform in. I imagine it is a niche market but a platform parallel to that in an older generation would be a huge help. I have an aging LGA 771 Asus Z7S-WS board with (2) e5472 procs with (1) 7950 w/boost. The system was built for 3D rendering and architectural work and as 2 systems are not affordable, this became my gaming machine as well. Other than putting my own benchmarks up against what I can find here or other sites it is very hard for me to decide when and to what to upgrade. I greatly appreciate the Xeon inclusion on this as there are some (few?) who fall into the work + play on a single machine scenario.
Maybe this is a dumb question, but shouldn't the Core Parking updates also be beneficial to the APUs? They're still using the same module architecture as the FX chips.
That's planned for the next update, after Haswell launch. At the time I completely forgot and went on to the next platform. Need to pull out the FM2 test bed, install an OS and retest them - another day of testing at least (!). But it's on the 'to do' list.
This analysis is great! And extremely useful for anyone contemplating a gaming build in the near future (as I am). I look forward to seeing your updates and more articles like this.
Btw, minor typo ("future") at the very end--"but we hope on expanding this in the fuiture."
You agree with the fact your intelligence has diminished? Okay.
I would LOVE to see you design a microprocessor as complex as one of AMD's. Their processors actually perform admirably in highly threaded workloads, while their current architecture is weak in the IPC department. Their CPUs are by no means weak and should still be recommended in some circumstances, such as their APU range. Please try to post intelligently; I know it's hard for you.
I know they're more difficult to get hold of, but I'd be curious how some of the lower-power stuff, like the i7-3770T or the i5-3570T, would do. Even an i5-3550S would be pretty interesting, I think.
I mean, I know there are a lot of gamers that just want as powerful (or conversely, as cheap) a CPU as possible, but it would be interesting to see whether Intel's more 'efficient' (for lack of a better word) chips do nearly as well.
I would be curious to see how "low-power" parts do as well, though that would be a secondary desire behind seeing these tests done on multiple monitor configurations.
Odd. The i5-3570K is a very popular CPU and it doesn't get attention or a recommendation? Does that mean the previous tests by numerous websites directing thousands of consumers to build with this CPU somehow became irrelevant? I could have sworn the rule of thumb was to go with an i5-3570K instead of an i7 if you're not into heavy audio/video work, yet here that doesn't appear to be the case. Very interesting.
I didn't have one to hand and couldn't get one in. We don't work in an office at AT, we're spread across the world. The nearest I had to it was the 2500K, which is an IPC decrease. i5-3570K (and the Haswell equivalent) should be in the next update :)
It's an IPC decrease, but it overclocks far better than the 3570K due to the cap material (TIM) issue; the end result is the 2500K will be faster overall. I still think the 2500K is a better buy, assuming one can get one. Unless of course one is willing to replace the cap material with something better; then the 3570K will be an easy winner.
Could you investigate further how AMD fares so badly in Civilization V? Could it be that RTS games are harder to optimize for multithreading, thus favoring Intel CPUs?
Civ V is more dependent on the CPU than the GPU, and in this case that's where AMD's shortcomings in single-threaded performance show. It will be very interesting to see what happens in these scenarios when AMD starts releasing HSA-capable APUs. When coupled with a discrete GPU, will they be able to manage both the integrated and discrete components to an advantage in games like Civ 5 and other CPU-demanding strategy games?
I would also like to see an AMD Phenom II X3 720BE. That processor was very popular back in the day and also has pretty good OC capability. I am getting ready to build a new machine and I'd like to see how my current setup compares to newer Piledriver and Haswell chips. Great review BTW!
Uhm, did you not just read this article? Unless you are running multiple GPUs, AMD's CPUs are fine, with the exception of Civ 5, which is CPU bound. Outside of that one case, saving $150 by buying AMD makes a lot of sense, especially if it allows you to put that money into a better GPU.
You are wrong, sir; there are WAY more CPU-bound games than you think. Almost any MMO falls into this category, Skyrim is another, and so does any game that only uses one or two cores - sadly, that's a lot of games. Trust me, I love AMD and have used them for years, but after upgrading everything I was still getting poor performance in most of the games I play. I broke down and bought a 2600K after a lot of research, and wow, was it an improvement over my 1100T (6-core AMD).
It's sites like this and slanted tests like these that kept me with AMD for years. Glad I finally figured it out, and I still hold out hope AMD will improve their IPC and that future games will use more cores properly.
I'm getting close to doing so, as his "contributions" are completely useless. Vote here for banning or not -- I'm inclined to just leave it be for now, but if he continues to post prolifically with nothing meaningful, I'll take action.
Please ban him. There really is no reason to leave trolls like this on the forum, they contribute nothing and constantly derail meaningful discussion. I can't think of a single reason to not ban him (and other) trolls.
Ban this loser. He simply cannot leave his fanboy status alone long enough to evaluate an outstanding SCIENTIFIC analysis. I love both Intel and AMD CPUs. Both have their places; it just depends on what your objectives are.
Jarred, I would fully agree with banning any person who continually makes no contribution to the discussion. These comment sections often supply me with useful information, and can be read as a continuation of the article itself. Having to hunt for the valuable opinions amongst piles of cretins and idiots makes me want to go elsewhere.
Hmmm. So the bottom line is my 2007-vintage QX6850 is perfectly good at 1080p so long as I get a decent GPU.
It's a bizarro state of affairs when a six-year-old CPU is perfectly happy running cutting-edge games. Not sure if I should blame the rise of the GPU or the PS3/Xbox 360 for holding back gaming engines for so long!
In games that are CPU limited (like Skyrim or Arkham Asylum), no. I continue to get the impression from both personal experience and articles/reviews like this that once you have "enough" CPU power, the biggest limiting factor is the GPU. "Enough" often seems to be a dual core operating at 3.0GHz, but newer titles and CPU bound titles continue to raise the bar.
Agreed. Especially in multiplayer situations. Try running PlanetSide 2 or Natural Selection 2 with a core2quad like I do. It isn't pretty. But just about any other singleplayer game... sure, no problem.
So... these were all tested on a single monitor? Though the article has lots of interesting information, I'd argue that doing these tests on a three monitor 1440p setup would show much more useful information that consumers looking at these setups would be able to apply to their purchasing decisions. It's great to see more reviews on different CPU + multiple GPU configurations, as well as the limitations of such settings, but by limiting such tests to an increasingly unlikely usage scenario of a single monitor, the data becomes somewhat esoteric.
Did you mean three 1080p monitors (i.e. 5760x1080) by any chance? 7680x1440 is a very, very rare setup especially for a gamer. For work purposes (e.g. graphics designer, video editor etc) it can be justified as the extra screen estate can increase productivity, but I've never seen a gamer with such setup (heck, the monitors alone will cost you close to $2000!). I'm not saying there aren't any but it's an extreme minority and I'm not sure if it's worth it to spend hours, even days, testing something that's completely irrelevant to most of our readers.
Furthermore, while I agree that 5760x1080 tests would be useful, keep in mind that Ian already spent months on this article. The testing time would pretty much double if you added a second monitor configuration, as you'd have to run all tests on both configs. Maybe this is something Ian can add later? There is always the trouble of timing: if you start including every possible thing, your article will be waaay outdated by the time it's ready to be published.
I didn't mean three 1080p monitors, which does seem to be the "common" three-monitor configuration I've seen most gamers going for (since it's cheap to do with 24" panels being under $200 a pop). My 27" S-IPS 2560x1440 panel cost about $300, so I'm not sure where you're getting the $2000 figure from... and if you spend $1500-$2000 on the graphics subsystem, why wouldn't you spend at least half as much on the monitors?
Most modern high-end graphics cards should easily handle three 1080p monitors in a three-card config, possibly even a two-card config. A roundup like this would be much more useful to consumers if it included such information, as well as showing just how well the different CPU and GPU combos work with multiple monitors.
I have 3x 1080p and a 7970; in modern games it isn't possible to get 60 fps without turning settings way down. You really need 2x 7970 to maintain 60+ fps.
I'm doing lots of tests that should help in your case. If you want me to test anything specific, feel free to PM. I have the same 875K, but also 2500K, 2700K, 3930K, Ph2 965, QX9650 and many others.
I'm really surprised that minimum FPS wasn't also tested. Testing just average FPS is not that informative about the actual experience you will have. Given the choice between two CPUs, I'd take one averaging 70 fps with a minimum of 50 over one that averages 80 fps but has a minimum of 30.
Perhaps some games are more CPU limited; I'm thinking of MMOs like PlanetSide 2, where there are a lot of players at once. Not sure how you'd benchmark that game, though.
Ian, I know you are a BL2 fan. The game is written on an old UT engine, I'm told, so its performance scaling isn't the same as some of these other titles. The method of testing you used is similar to how I buy my own equipment and what I recommend to others. With the same 3770K clocked at stock 3.9 GHz I can only get about 57 fps with my GTX 670; when it's OC'd to 4.7 GHz, that same scene becomes GPU limited at 127 fps on my 144 Hz LCD. I'm glad you posted this.

When people ask for my advice on what hardware to buy, I always tell them they should aim for a resolution first (1080p, for example), then decide what game they want to play and at what performance preset (mid settings at 120 Hz, say), then buy a GPU/CPU combo that complements those settings. If the budget allows, up the hardware a tier or two. Too many times I see people just buy a top-tier GPU and wonder why their fps is lower than expected. My way, your expectations are met and then, budget allowing, exceeded. I hope you start a trend with this report, so that others can go this route when performing upgrades.
The article is a good start! Pity it didn't include the Tomb Raider benchmark that anyone can run, nor a discussion of the badly implemented Windows timer frequency that Lucas Hale documented with his "TimerResolution" program. HyperMatrix found that lowering the default timer resolution from 10 ms down to 1 ms allowed for "Crysis 3 - 30% Framerate and Performance".
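For reference, lowering the timer resolution the way TimerResolution does comes down to the documented Windows multimedia timer API. A minimal sketch, assuming a Windows build environment (the Sleep call is just a stand-in for a real game or benchmark workload):

```cpp
#include <windows.h>
#pragma comment(lib, "winmm.lib")

int main() {
    // Ask the OS for a 1 ms scheduler tick instead of the ~10-15.6 ms
    // default; this is essentially what the TimerResolution tool does.
    if (timeBeginPeriod(1) != TIMERR_NOERROR)
        return 1;

    Sleep(5000); // stand-in for the actual game/benchmark workload

    // Undo the request: it's reference-counted system-wide, and
    // holding it raises idle power draw.
    timeEndPeriod(1);
    return 0;
}
```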
Awesome article, thanks! Is it possible to include some sort of gaming physics testing? Now that PhysX is beginning to catch some momentum, it'd be great to see whether a 4-module (8-core) AMD processor handles physics better than a comparable 4-core Intel one, and at what point a dedicated physics card starts to make sense, if at all.
It'd also be nice if a "mainstream gaming" article could be made too. Benchmarks at 1080p with cards like the 660 Ti and 7850, for instance. No need for 3-way SLI/CF on those, so you won't need as much time in Narnia. :)
Interesting read, although I find it too focused to be of much general use (or useful future reference). I'd like to have seen, for example, how an E8500 holds up (too big a gap between the E6500 and i5-2500), as well as at least ONE game I would even bother playing (Skyrim/Witcher/etc.), and of course, as you mentioned, a slightly bigger sampling of graphics cards.
Anywho, I realize this wasn't meant to be exhaustive (I do appreciate having the CPU/GPU benches available here as a good reference, though), and I do like the detail and length of explanation you went into.
Regarding your comments on the role of artificial intelligence in game performance/programming: I've just finished a course in AI, and while implementations may vary quite a bit from game to game, many AI programs can be reduced to highly-parallel brute-force computation, simply evaluating the resulting states of many potential decisions for a numerical representation of their desirability, then selecting the best option from the set of evaluated actions. Obviously this is something that will vary greatly from game to game, but in games with many independent AI managed elements, I would expect a certain amount of the processing to be offloaded to the GPU.
Other than that I agree with you on the demands of AI in games; my professor (who specializes in game AI and has experience in the industry) said that the AI is usually given about 10% of the CPU time in a game, so it's rarely a limiting factor.
I'm still working through the whole article (really enjoying it so far) so I'm sure I'll have many more comments/questions later.
Based on previous CUDA experience, CUDA doesn't like a lot of IF statements in its routines. So if you're offloading different AI parts onto the GPU, unless all the elements are being put through the same set of if commands (and states), it won't work too well, with some warps taking a lot longer than others if there is large branch divergence. It's a task suited to MIMD environments, like a CPU. Then again, it really depends on the game. Clever AI is already here, because we confine it to a self-created system. One could argue that the bots in CounterStrike are not particularly smart, but the system can put their accuracy up to 100% to make it harder. It's a lot of give and take, perhaps. It is times like these I wish I had done CompSci rather than Chemistry :) I need to take one of those MIT online AI courses. You know, in between testing!
I suppose conditionals would make offloading some AI components to the GPU impractical, but there still remains a subset of AI computations which seem very GPU-friendly to me. State evaluation functions seem like a prime example: the CPU would be responsible for deciding which options to evaluate, building an array of states to be evaluated by the GPU. These situations probably don't come up very often in FPSs, but in something like Civilization I can see it being quite common.
I've actually got to head over to that class now; I'll ask the professor if he knows of any AIs using GPU computing in modern games.
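To make the state-evaluation idea above concrete, here is a rough CUDA C++ sketch. The GameState fields and scoring weights are entirely hypothetical; the point is a single branch-light kernel scoring a flat array of candidate states, with the pick-the-best step left on the CPU:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical packed game state; fields invented for illustration.
struct GameState {
    float resources, threat, territory;
};

// One thread scores one candidate state. Every thread runs the same
// arithmetic, so warps stay converged (no divergent 'if' chains).
__global__ void evaluateStates(const GameState* states, float* scores, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        scores[i] = 2.0f * states[i].resources
                  - 1.5f * states[i].threat
                  + 1.0f * states[i].territory;
}

int main() {
    const int n = 1024;                 // candidate moves this turn
    GameState* dStates; float* dScores;
    cudaMalloc(&dStates, n * sizeof(GameState));
    cudaMalloc(&dScores, n * sizeof(float));
    // (The CPU would fill dStates via cudaMemcpy here.)

    evaluateStates<<<(n + 255) / 256, 256>>>(dStates, dScores, n);
    cudaDeviceSynchronize();

    // Copy scores back and pick the best option on the CPU.
    float scores[n];
    cudaMemcpy(scores, dScores, sizeof(scores), cudaMemcpyDeviceToHost);
    int best = 0;
    for (int i = 1; i < n; ++i)
        if (scores[i] > scores[best]) best = i;
    std::printf("best candidate: %d\n", best);

    cudaFree(dStates); cudaFree(dScores);
}
```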
Like Ian said, GPUs aren't good 'branch' processors, but I do see where you're coming from. Things like real physics, audio environment maps, and pre-rendered lighting maps could be fed to AI routines running on the CPU. This would allow for a much greater 'simulation awareness' for AI actions.
I spoke with my professor and he said that, as far as he knows, many people have discussed the prospect of using GPUs for AI, but nobody has actually done so yet. He's going to ask some friends of his at major game studios to see if they are working on it.
He did agree with me that there are some aspects that could be computed on a GPU, but a lot of the existing AI methods are inherently sequential, so offloading it to the GPU will require new algorithms in many cases.
You may wish to check nVidia's GTC conference website, where you can find some GPU AI research. nVidia has also published various PDF slides on GPU path planning.
If you look deeper into some specific AI domains such as, say, AI planning (first used in F.E.A.R. in 2005, lately used in Killzone 3 and Transformers: Fall of Cybertron), you can find papers investigating the use of GPUs.
One of the bottom lines of current GPU AI research is that GPUs crunch large amounts of data very fast, so, currently, there is not much hope in using the many GPU threads for the tiny amounts of data involved in state-space search.
And here is the title of a 2011 GPU AI Planning paper (research; not yet in a game): "Exploiting the Computational Power of the Graphics Card: Optimal State Space Planning on the GPU". You should be able to find the PDF on the web.
My 2 cents is that it's a good topic for a final paper.
Thanks again. I think I will be doing GPU AI as my final paper; I'll probably try to implement the A* family as massively parallel, or maybe a local beam search using hundreds of hill-climbing threads.
Keep it simple is the best advice. It's better to have a running algorithm than none, even if it's slow.
Also, ask your advisor whether he'd want you to compare against a CPU implementation of your own, in order to evaluate the pros and cons between your sequential implementation and your parallel implementation. I did NOT write "evaluate gains from sequential to parallel", as GPU programming is currently not fully understood, probably not even by nVidia engineers.
Finally, here is book title: "CUDA Programming: A Developer's Guide to Parallel Computing with GPUs". But there are many others these days.
Thanks a lot for all your input. I intend to evaluate not only the advantages of GPU computing but its weak points as well, so I'll be sure to demonstrate the differences between a sequential algorithm, a parallel CPU algorithm, and a massively parallel GPU algorithm.
Could you test the Q6600 and i7-920 in your next roundup? I have many PC gaming friends, and we all seem to have a Q6600, i7-920, or 2500k in our rigs. Thanks! Great job on the article.
I too have a Q6600, but it would be interesting to see the high end (non-extreme edition) Core 2s as well: E8600 & Q9650. Just for yucks, perhaps a socket 775 Pentium 4 could also make an appearance? :)
You've got a lot of data there. And it's good data if your main purpose is to compare a Radeon HD 7970 to a GeForce GTX 580. Unfortunately, most of it is worthless if you're trying to isolate CPU performance, which is the ostensible purpose of the article. You've gone far out of your way to try to make games GPU-limited so that you wouldn't be able to tell what the various CPUs can do when they're the main limiting factors.
Loosely, the CPU has to do any work to run a game that isn't done by the GPU. The contents of this can vary wildly from game to game. Unless you're using DirectX 11 multithreaded rendering, only one thread can communicate with the video card at a time. But that one rendering thread mostly consists of passing data to the video card, so you don't do much in the way of real computations there. You do sort some things so that you don't have to switch programs, textures, and so forth more often than necessary, though you can have a separate sorting thread if you're (probably unreasonably) worried that this is going to mean too much work for the rendering thread.
Actually determining what data needs to be passed to the video card can comprise the bulk of the CPU work that a game needs to do. But this portion is mostly trivial to scale to as many threads as you care to--at least within reason. It's a completely straightforward producer-consumer queue with however many "producer" threads you want and the rendering thread as the single "consumer" thread that takes the data set up by other threads and passes it along to the video card.
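A minimal sketch of that producer-consumer arrangement with standard C++ threads; DrawCall and the printf standing in for the actual GPU submission are hypothetical:

```cpp
#include <condition_variable>
#include <cstdio>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Hypothetical "draw command" produced by game threads and consumed
// by the single rendering thread that talks to the video card.
struct DrawCall { int meshId; };

std::queue<DrawCall> pending;
std::mutex m;
std::condition_variable cv;
bool done = false;

void producer(int id) {               // any number of these
    for (int i = 0; i < 3; ++i) {
        std::lock_guard<std::mutex> lock(m);
        pending.push({id * 100 + i});
        cv.notify_one();
    }
}

void renderThread() {                 // exactly one of these
    for (;;) {
        std::unique_lock<std::mutex> lock(m);
        cv.wait(lock, [] { return !pending.empty() || done; });
        if (pending.empty() && done) break;
        DrawCall dc = pending.front();
        pending.pop();
        lock.unlock();
        std::printf("submit mesh %d to GPU\n", dc.meshId); // stand-in for the API call
    }
}

int main() {
    std::thread consumer(renderThread);
    std::vector<std::thread> producers;
    for (int id = 1; id <= 4; ++id) producers.emplace_back(producer, id);
    for (auto& p : producers) p.join();
    { std::lock_guard<std::mutex> lock(m); done = true; }
    cv.notify_one();
    consumer.join();
}
```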
Not quite all of the work of setting up data for the GPU is trivial to break into as many threads as necessary, though. At the start of a new frame, you have to figure out exactly where the camera is going to go in that frame. This is likely going to be very fast (e.g., tens or hundreds of microseconds), but it does need to be done before you go compute where everything else is relative to the camera.
While I haven't programmed AI, I'd expect that you could likewise break it up into as many threads as you cared to, as you could "save" the state of the game at some instant in time and have separate threads compute what the AI has to do based on the state of the game at that moment, without needing to know anything about what other game characters were choosing at the same time. Some games are heavy on AI computations, while online games may do essentially no AI computations client-side, so this varies wildly from game to game.
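Sketching that snapshot idea (decide() is a hypothetical stand-in for real AI logic): freeze the world into an immutable copy, fan the per-agent decisions out across threads, and only apply the results once everyone has chosen:

```cpp
#include <cstdio>
#include <functional>
#include <future>
#include <vector>

// Hypothetical frozen copy of the game world at one instant.
struct WorldSnapshot { int tick; /* positions, health, etc. */ };
enum class Action { Attack, Flee, Patrol };

// Each agent reads only the immutable snapshot, so decisions can run
// on any number of threads with no locking. Stand-in logic only.
Action decide(const WorldSnapshot& world, int agentId) {
    return (agentId + world.tick) % 2 ? Action::Attack : Action::Patrol;
}

int main() {
    WorldSnapshot snap{42};
    std::vector<std::future<Action>> jobs;
    for (int id = 0; id < 8; ++id)
        jobs.push_back(std::async(std::launch::async, decide,
                                  std::cref(snap), id));
    // Apply every decision only after all agents have chosen, so no
    // agent sees another's same-frame choice.
    for (int id = 0; id < 8; ++id)
        std::printf("agent %d -> %d\n", id, static_cast<int>(jobs[id].get()));
}
```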
A game engine may do a lot of other things besides these, such as processing inputs, loading data off of the hard drive, sending data over the Internet, or whatever. Some such things can't be readily scaled to many CPU cores, but if you count by CPU work necessary, few games will have all that much stuff to do other than setting up data for the GPU and computing AI.
But most of the work that a CPU has to do doesn't care what graphical settings you're using. Anything that isn't part of the graphics engine certainly doesn't care. The only parts of the CPU side of a game engine that care what monitor resolution you're using are likely to be a handful of lines to set the resolution when you change it, and a few lines to check whether an object is off camera and therefore doesn't need to be processed in that particular frame - and culling such objects is likely done mostly to save on GPU load. Any settings that can be adjusted in video drivers (e.g., anti-aliasing or anisotropic filtering) are done almost entirely on the video card and carry a negligible CPU load.
Thus, if you're trying to isolate CPU performance, you turn down or off settings that don't affect the CPU load. In particular, you want a very low monitor resolution, no anti-aliasing, no anisotropic filtering, and no post-processing effects of any sort. Otherwise, you're just making the game mostly GPU bound, and you end up with data that looks like most of what you've collected.
Furthermore, even if you do the measurements properly, there's also the question of whether the games you've chosen are representative of what most people will play. If you grab the games that you usually benchmark for video card reviews, then you're going out of your way to pick games that are unrepresentative. Tech sites like this that review hardware tend to gravitate toward badly coded games that aren't representative of most of the games that people will play. If this video card gets 200 frames per second at max settings in one game and that video card gets 300, what's the difference in real-world game experience? If you want to differentiate between video cards, you need games that are more demanding, and simply being really inefficient is one way to do that.
Of course, if you were trying to see how different CPUs affect performance in a mostly GPU-limited game, that could be interesting in an esoteric sense. It would probably tend to favor high single-threaded performance, because the only differences you'd be able to pick out are due to things that happen between frames, which is when the video card is most likely to be forced to wait on the CPU briefly.
But if you were trying to do that, why not just use a Radeon HD 5450? The question answers itself.
If you would like to get some data that will be more representative of how games handle CPUs, then you'll need to do some things very differently. For starters, use just a single powerful GPU, to avoid any CrossFire or SLI weirdness. A GeForce GTX Titan is ideal, but a Radeon HD 7970 or GeForce GTX 680 would be fine. For that matter, if you're not stupid about picking graphical settings, something weaker like a Radeon HD 7870 or GeForce GTX 660 would probably work just fine. But you need to choose the graphical settings intelligently, by turning down or off any graphical settings that don't affect CPU load. In particular, anti-aliasing, anisotropic filtering, and all post-processing effects should be completely off. Use a fairly low monitor resolution; certainly no higher than 1920x1080, and you could make a good case for 1366x768.
And then don't pick your usual set of games that you use to do video card reviews. You chose those games precisely because they're outliers that won't give a good gauge of CPU performance, so they'll sabotage your measurements if you're trying to isolate CPU performance. Rather, pick games that you rejected from video card reviews because they were unable to distinguish between video cards very well. If the results are that in a typical game this processor can deliver 200 frames per second and that one can do 300, then so be it. If a Core i5-3570K and an FX-6300 can deliver hundreds of frames per second in most games (as is likely if the game runs well on, say, a 2 GHz Core 2 Duo), then you shouldn't shy away from that conclusion.
"While I haven't programmed AI..." Doesn't that make most of your other assumptions and guesses related to this area invalid?
As for the rest, the point of the article isn't to compare HD 7970 with GTX 580, or to look at pure CPU scaling; rather, it's to look at CPU and GPU scaling in games at settings people are likely to use with a variety of CPUs, which necessitates using multiple motherboards. Given that in general people aren't going to buy two or three GPUs to run at lower resolutions and detail settings, the choice to run 1440p makes perfect sense: it's not so far out of reach that people don't use it, and it will allow the dual, triple, and quad GPU setups room to stretch (when they can).
The first section shows a CPU performance comparison, just as background to the gaming comparisons. We can see how huge the gap is in CPU performance between a variety of processors, but how does that translate to gaming, and in particular, how does it translate to gaming with higher-performance GPUs? People don't buy a Radeon HD 5450 for serious gaming, and those who do buy one likely don't play demanding games.
For the rest: there is no subset of games that properly encompasses "what people actually play". But if we're looking at what people play, it's going to include a lot of Flash games and Facebook games that work fine on Intel HD 4000. I guess we should just stop there? In other words, we know the limitations of the testing, and there will always be limitations. We could list many more flaws or questions that you haven't, but if you're interested in playing games on a modern PC, and you want to know a good choice for your CPU and GPU(s), the article provides a good set of data to help you determine whether you might want to upgrade or not. If you're happy playing at 1366x768 and Medium detail, no, this won't help much. If you want minimum detail and maximum frame rate at 1080p, it's also generally useless. I'd argue, however, that the people looking for either of those are far fewer in number, or at least, if they do exist, they're not looking to research gaming performance until it affects them.
Ian, thanks for this. I'd really like to see how these tests change at even higher resolutions - three-monitor setups of 5760x1080, for example. There are folks claiming that the additional PCIe lanes in the i7 E-series make for significantly better performance. Your results don't bear this out. If anything, the 3930K is behind or sometimes barely ahead (considering error margins, it's arguably on par with the regular i7 chips). I own an i7-2700K and a 3930K.
Awesome review! Very impressed with the effort and time put into this! Thanks a lot! It'd be cool if you could maybe fit an i7-860 in somewhere. Socket 1156 is feeling left out :P I have an i7-860...
Great data for people who want to overload their video card and figure out which CPU will help them do it. But it's basically worthless for gamers who want to make games run smoothly and look nice and want to know what CPU will help them do it.
Would you do video card benchmarks by running undemanding games at minimum settings on an old single-core Celeron processor? That's basically the video card equivalent of treating this as a CPU benchmark. The article goes far out of its way to make things GPU-bound so that you can't see differences between CPUs, both through the games chosen and the settings within those games.
But hey, if you want to compare a Radeon HD 7970 to a GeForce GTX 580, this is the definitive article for it and there will never be a better data set for that.
Troll much? The article clearly didn't go too far out of the way to make things GPU bound, as evidenced by the fact that two of the games aren't GPU bound even with a single 7970. How many people out there buy a 7970 to play at anything less than 1080p -- or even at 1080p? I'd guess most 7970 owners are running at least 1440p or multi-monitor...or perhaps just doing Bitcoin, but that's not really part of the discussion here, unless the discussion is GPU hashing prowess.
If they're not GPU bound with a single 7970, then why does adding a second 7970 (or a second GTX 580) greatly increase performance in all four games? That can't happen if you're looking mostly at a CPU bottleneck, as it means that the CPU is doing a lot more work than before in order to deliver those extra frames. Indeed, sometimes it wouldn't happen even if you were purely GPU bound, as CrossFire and SLI don't always work properly.
If you're trying to compare various options for a given component, you try to do tests where the different benchmark results will mostly reflect differences in the particular component that you're trying to test. If you're trying to compare video cards, you want differences in scores to mostly reflect video card performance rather than being bottlenecked by something else. If you're trying to compare solid state drives, you want differences in scores to mostly reflect differences in solid state drive performance rather than being bottlenecked by something else. And if you're trying to compare processors, you want differences in scores to mostly reflect differences in CPU performance, not to get results that mostly say, hey, we managed to make everything mostly limited by the GPU.
When you're trying to do benchmarks to compare video cards, you (or whoever does video card reviews on this site) understand this principle perfectly well. A while back, there was a review on this site in which the author (it might have been you; I don't care to look it up) specifically said that he wanted to use Skyrim, but it was clearly CPU bound for a bunch of video cards, so it wasn't included in the review.
If you're not trying to make the games largely GPU bound, then why do you go to max settings? Why don't you turn off the settings that you know put a huge load on the GPU and don't meaningfully affect the CPU load? If you're doing benchmarking, the only reason to turn on settings that you know put a huge load on the GPU and no meaningful load on anything else is precisely that you want to be GPU bound. That makes sense for a video card review. Not so much if you're trying to compare processors.
You go to max settings because that's what most people with a 7970 (or two or three or four) are going to use. This isn't a purely CPU benchmark article, and it's not a purely GPU benchmark article; it's both, and hence, the benchmarks and settings are going to have to compromise somewhat.
Ian could do a suite of testing at 640x480 (or maybe just 1366x768) in order to move the bottleneck more to the CPU, but no one in their right mind plays at that resolution with a high-end GPU. On a laptop, sure, but on a desktop with an HD 7970 or a GTX 580? Not a chance! And when you drop settings down to minimum (or even medium), it does change the CPU dynamic a lot -- less textures, less geometry, less everything. I've encountered games where even when I'm clearly CPU limited, Ultra quality is half the performance of Medium quality.
Basically, for the most part the single-GPU game tests tell us absolutely nothing about the CPU, because save for a couple of especially old or low-end CPUs, none of them even comes close to hindering the already completely saturated GPU. The 2-4 GPU configurations are much more interesting because they show actual differences between CPU and motherboard configurations. I do think it would be interesting to also show a low-resolution test, which would help reveal the impact of CrossFire/SLI overhead versus a single more powerful GPU and could more directly expose the CPU limit.
Very interesting article. And a lot of unwarranted criticism in the comments.
I'm kind of disappointed that the dual Xeons failed so many benchmarks. I was looking to see how I should upgrade my venerable 2x 5150 machine - whether to go with fast dual cores or similar-speed quad cores. But all the benchmarks for the Xeons were either "the same as every other CPU" or "no results".
Oh well, I have more important things to upgrade on it anyways. And I realize that "people using Xeon 5150s for gaming" is a segment about as big as "Atom gamers".
I would love to see something like an E3-1230 tested. It's around the same price as an i5-3570K but has no graphics, a bigger cache, and Hyper-Threading, though no overclocking and a 100 MHz lower clock. It should be similar to an i7-3770 for around 60% of the price.
So let me get this straight. The engineers are idiots, yet you want them to go and work for other companies, the best candidate there being Intel. Also, thanks to this apparent mental handicap, they use Intel processors... oh, I get it, you're pro AMD!
Just a note, as I keep seeing posts going on about PlanetSide 2 being CPU limited. It is not. I am saying this from experience, having played it just fine on a Core2Duo-based system. The problem you're having is most likely either your ping or your GPU. I was running a GTX 460 1GB card and it was fine for that particular game. I have just upgraded to a new CPU because the main game I play (MechWarrior Online) is far more CPU bound, being based on CryEngine 3.
I think it should be noted that online multiplayer games are a different beast. Multiplayer is typically more CPU intensive, and if you're looking to maintain completely smooth gameplay without any dips in framerate or stuttering then the CPU becomes more important than it is for single player gaming.
Also, would you consider benchmarking live online streaming of games? It would be great to see how much of a benefit the 3930K has over other chips, and whether Piledriver can pull ahead of the i5s definitively.
Your sample size is statistically beyond irrelevant, which prevents the scientist in me from drawing any conclusions from it. In addition, claiming any sort of causal relationship between results is outright scientifically wrong, even if the sample size were statistically relevant. From an engineering standpoint, the X79 systems, with ample headroom in every relevant department, would be the best choice to avoid possible bottleneck/contention issues in the largest possible number of different workloads.
Any recent system with a recent CPU and a recent midrange graphics card can play a game, and can often play it well. Advising a Core i7-3770K based on a statistically irrelevant benchmark while disregarding systems architecture is something that neither the scientist, the software engineer, nor the hobbyist in me can get behind in any way.
Hyperbole, you have a new friend: meet Markus! "Beyond irrelevant", "any conclusions", "outright scientifically wrong", "ample headroom", "every relevant department", "best choice"....
Let me guess: you have an X79 system, and it works great for you, and thus anyone even suggesting that it might not be the best thing since sliced bread is something you can't even think about in any way. This article is focused on gaming, and if you want to do things besides gaming yes, you will need to consider other facets of the system build. At the same time, if all you're looking for is a good gaming setup, perhaps with two or three GPUs, I have trouble imagining anyone recommending something other than i7-3770K right now (unless the recommendation is to "wait for Haswell").
Let me give you a few reasons why, even if the scientist may not necessarily agree, the software engineer and hobbyist would definitely avoid SNB-E and workstations. 1) Overall power requirements (they still matter). 2) Quick Sync (may not be perfect quality, but dang it's fast). 3) Better performance in many games with two GPUs, no matter what paper specs and system architectures might say.
And that, Jared, is how to shut down this arrogant, condescending self-titled expert/scientist. I guess he must think the rest of us are bozos who come here for the comic relief?
My comment was about the testing method not being scientifically sound even though the author makes it a point to refer to the "well-adjusted scientist" in himself. There's a huge number of games out there as well as a lot of different mid-range to high-end video cards. Recommending an i7 3770K on the basis of one resolution tested and only 4 games is something that you absolutely cannot call science.
I am, among other responsibilities, a software engineer, and I don't actively avoid Sandy Bridge E and workstations.
My criticism of the methods used and the conclusions drawn is valid criticism, especially given that the article is dressed up as science.
If you're going to make recommendations based on statistics and, for whatever reason, decide to disregard engineering and the science behind systems design, you're going to need a far larger sample size than what was used here.
You can deflect this all you want by quoting power usage and Quick Sync, but while power usage should be a factor, this test was not about Quick Sync. If it had been, they would not have tested X79 systems at all ;-)
From both work and hobby I know a lot of power users, and gaming remains one of the most demanding, and one of the *most prevalent* demanding, uses of a modern PC. Throwing a more powerful system aside and disregarding engineering needs to be done with a lot more care and thoroughness, all of which is missing here.
Answering valid criticism with scorn and aggression is also very telling. Perhaps you're more insecure than you thought you were?
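For readers wondering what a statistically defensible comparison would even look like, here is a minimal sketch in Python. The FPS numbers are invented for illustration, and the approach (a t-based confidence interval over repeated runs) is just one reasonable way to quantify the sample-size point being argued above, not anything from the article:

```python
# Minimal sketch: is N runs of one benchmark enough to separate two CPUs?
# All FPS numbers here are made up for illustration.
import statistics

runs_fps = [98.2, 101.5, 99.8, 97.4, 102.1]   # five hypothetical repeats of one test

mean = statistics.mean(runs_fps)
sem = statistics.stdev(runs_fps) / len(runs_fps) ** 0.5   # standard error of the mean
ci95 = 2.78 * sem                                         # t-value for 4 degrees of freedom

print(f"{mean:.1f} FPS +/- {ci95:.1f} (95% confidence)")
# If two CPUs' intervals overlap, the test cannot separate them; halving
# the interval width takes roughly four times as many runs.
```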
Great review, congrats! This comes at a perfect time for me. I just ordered a Qnix QX2710 1440p 27 inch monitor from eBay and a couple of 670s to work along with my 2500K overclocked to 4.5 GHz. It seems I will be amazed with that upgrade, let's see! Cheers
You honestly would be fine running most games at that rez with a single 670. After a driver install, I forgot to turn SLI back on for like a week and didn't really notice much of a difference in most of my games. Two of most high end cards, from what I continue to hear on forums, can easily power three 1080p monitors at high settings as well. I've not been able to find much information for two and three card setups powering three 1440p or higher res monitors though.
Request: I've seen a few sites do these and they always use a large GPU. I understand removing the GPU from the equation to focus on the CPUs, but I would like to see these tests done with a 7770. If I'm dropping $1000+ on GPUs, I'm not thinking about buying an $80 CPU. A great question I haven't seen answered is how much CPU do I need for a normal mid-range card? If I am looking to get a mid-range GPU, do I need an i5-3570K? Would an X2 555 provide the same gaming performance? At what point does spending more on the CPU stop providing an improvement in performance?
I'd like to suggest putting the i7-3820 into the next article. The 3960X and 3930K are both six-core CPUs, making platform comparison with all the other quad-cores in the article more difficult.
Certainly you should retain the six-core CPUs so we can see their potential, but adding the 3820 would allow for direct comparison of the X79 platform vs. other platforms when all are running a quad-core CPU.
Hi Ian, fantastic article, has led me to rethink a lot of things, especially the scaling at work due to PCI-E 3.0 and price performance for low end systems. Seems that low-mid cpu and decent gpu is still the way to roll for future builds.
Don't know if it would be possible, but it would be interesting to see the difference between an SR-2 and an SR-X, especially considering PCI-E 3.0 and the move to a newer CPU architecture.
It would also be nice to see a 980X or 990X on X58, or a Q6600, to see the benefits of moving up from C2Q/C2D to Core i-series... But you probably don't have time :)
Again, great article; it has made me rethink my original thoughts on AMD's 8350 and the caliber of AnandTech's comments... Cheers from Australia.
I haven't even read the entire article yet, but I can tell it's going to be awesome due to the sheer thoroughness on display on the first page.
Thank you so very much for going well beyond what other reviewers do by just reporting a single run for each setup without delving deeper into the "why". Truly noble; and I would say you can honestly call yourself a scientist. :)
I think anyone reading this article thoroughly will come away with a better sense of how multidimensional the questions about which mainboard, CPU, and GPU to buy are. It isn't just a matter of looking at a few 2D graphs and picking the top solution (though that might serve to get you in the ballpark of what you want).
Once again, I come away better educated and with more of a sense of what is going on with hardware combinations. Well done, and thanks! I'm looking forward to more of this type about these subjects.
I'd like to see an i7-3820 in action, since I have one in this rig right now! I ran the PoVRay benchmark to see where it placed and scored 1626.25. I've overclocked my CPU to 4.6 GHz though, so I'd still like to see where a stock 3820 places in these benchmarks. I'm also interested to know if quad-channel memory makes any difference... Great article! Keep it up, I look forward to seeing more results.
Why is there no E6700 on some of the graphs? The E6400 is biblically underpowered and under-cached; we already knew that. A quad would be a better comparison anyway, as many still have them for games and productivity. And another thing... is PCIe 1.1 playing any role in those bad Core 2 Duo FPS numbers? Why not use a P45 or something else with PCIe 2.0?
Ian, When I saw this article I thought I was hallucinating. It is exactly what I’ve been scrubbing the web looking for. I recently purchased a Korean 1440P IPS display (very happy with it) and it is giving my graphics card (4850X2) a workout for the first time. I’m running an older Phenom II platform and have been trying to determine what the advantages of just buying a new card are versus upgrading the whole platform. Most reviews looking at bottlenecking are quite old and are at ridiculously low resolutions. CPU benchmarks by themselves don’t tell me what I wanted to know. After all, I don’t push the system much except when gaming. This article hit the spot. I wanted to know how a 7950 or 7970 paired with my system would fare. An additional follow up would be to see if any slightly lower end cards (7870, 650Ti etc.) show any CPU related effects. I have been debating the merits of going with a pair of these cards vs a single 7970. Thanks for the great review.
Also of note is Digital Foundry polling several developers whether they would recommend Intel or AMD as the better choice to future proof one's computer for next gen games. 100% recommended AMD over Intel.
Obviously a lot going on behind the scenes here with AMD leveraging their console wins and deep involvement at every level of next gen game development. With that in mind one might expect Kaveri to be an absolute gaming beast.
Great read. I read through the article and the comments yesterday, but had an extra thought today. Sorry if it has been mentioned by somebody else, but there seems to be a lot of discussion about MMORPGs and their CPU demands. Perhaps you could just do a scaled-down test with 3-4 CPUs to see how they handle an online scenario. It won't be perfectly repeatable, but it could give some general advice about what is being stressed. I would assume it is the CPU that is getting hit hard, but perhaps it is simply rendering that causes the FPS to decrease.
Other than that, I would like to see my CPU tested, of course :) (Athlon II X4 630) But I think I can infer where my hardware puts me. Off the bottom of the charts!
Great work; however, I would certainly question two things:
Why did you use GTX 580s? Anyone springing for 4 GPUs isn't going to go for two-plus-year-old ones (a point you made yourself in the methodology section explaining why power consumption wasn't at issue here).
Why would you test all of this hardware for single monitor resolutions? Surely you must be aware that people using these setups (unless used for extreme overclocking or something) are almost certainly gaming at 3x1080p, 3x1440p, or even 3x1600p.
Also of concern to me would be 3D performance, although that may be a bit more niche than even 4-GPU configurations.
Great article! The only gripe I would have (and yes I know the reasoning behind it is explained) is the decision not to include Crysis 3 in the testing.
The reason I make that gripe is that even though it has no timedemo functionality and adds more work, it is the closest thing to a next-gen game we have right now, and it is also the *only* game I've seen that reliably eats up as much CPU power and as many cores as you give it. It would have been interesting to see it here.
Core i7-860 overclocked to 3.6 GHz, GTX 580 SLI at PCIe 2.0 x8/x8 = min 44, average 49. Final scene, destroying the Ceph Alpha. No overclock on the GPUs, but plenty of headroom. Not scientific, but it would be useful to see the same scene if someone has a more up-to-date processor.
Great review; the CPU has definitely become less important. I used to change my CPU around every 18 months or my system would show signs of struggling. I bought my i7-860 in 2009 and it is sitting alongside two GTX 580s (SLI x8/x8). Nearly four years seems like an eternity, and having run my first GTX 580 since early 2011, this is the longest I have stuck with the same GPU. It shows you that game developers don't challenge the hardware like they used to.
People who make comments like this do not understand that it is about making a properly balanced system so that you get maximum bang for your buck. This takes skill and a proper understanding of hardware capabilities and technology. On a gaming system you can trade down on a processor and buy a better GPU (or an SSD, or both). When you get it right you get more FPS for the same or less money, faster loading times, and overclocking headroom to use at a later date.
@Ian Cutress: Hello Ian, please get hold of a quad-core Intel Core 2 CPU (Q9450, Q9550, Q9650, QX9650, QX9770) and include it in your testing. I don't know where you get that "many people are still on Core 2 Duos" - maybe you have seen some sort of market research? I still use a QX9650 for gaming (WoW and SW:TOR - MMOs) and I am very happy with its performance. It would be nice to see how the high-end Core 2 CPUs measure up against modern CPUs.
What a great article. As I am in the process of making my mind up what to buy in the next two months, this answers so many of my questions. Thank you! My main unanswered question somehow seemed to get missed - or did I not read correctly? The results on the games show the 2-CPU config as scoring zero! Is that because the software wouldn't run on that rig, or what?
Great article. For those of us still gaming at 1080p with single GPU set ups, it is nice to know that it doesn't really matter if I spend a little less on the CPU and divert those funds toward a higher end GPU.
First, a well done piece with lots of great info. Now then, I would love to see this same kind of look done at 1080p resolutions with a mid-range card like a 7870 or 7850. Would also love to see some other games added to the mix, like a modern MMO, or Skyrim, which is a bit harder on the CPU.
Yes, Intel has great engineers, not to forget the business gurus. Intel did a great job on Larrabee, chose(?) not to be in the next-gen consoles, and it will be jam tomorrow in the tablet and smartphone market. In April Intel reported a downturn of 25% in profits, which it attributed to a decline in the PC market. As AnandTech has just shown, the GPU is where the money is at; the CPU is a passenger, and time-to-replace is extending. Intel makes good processors, but it is also a one-trick pony that has failed to move with the times.
You thought of so many things to consider, yet when you say "we know what's missing", you still forgot several things. I didn't read all the comments, so excuse me if someone already mentioned it. But what's missing is several games - Crysis 3, for example, or Far Cry 3.
You mentioned that 1440p is a niche (it's 2560x1440, by the way; 1440p isn't technically a resolution). So why didn't you test at 1920x1080? Not only are games more prone to being CPU-limited there, but games like Crysis 3 or Far Cry 3 are actually more demanding than games like DiRT 3.
The reason I mention this is that I've found there to be a rather big difference between an X4 970 and a 3960X in those games at ultra settings in Crysis 3 and FC3, with a GTX 660, 670 and 680. I know AnandTech doesn't report minimums, but if you take the time to do as many runs as you did, you can scientifically establish that the minimum FPS is also greatly affected by slower CPUs.
The reason I respond after not having done so in years is that I found your suggestion to pair an A8-5600K with a high-end GPU to be disputable, and that's me being mild. Especially since you more or less went and said that 'other' websites or tests didn't do their testing properly.
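On that minimum-FPS point: here is a minimal sketch (Python; the file name frametimes.csv and its one-millisecond-value-per-line format are assumptions, in the spirit of a FRAPS frame-time dump) of how the same runs that produce an average can also yield minimum and 1% low figures:

```python
# Minimal sketch: turning a per-frame time log (milliseconds, one value per
# line) into average, minimum, and "1% low" FPS.
# "frametimes.csv" and its format are assumptions for illustration.

def fps_summary(path):
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]
    frame_ms.sort(reverse=True)                      # slowest frames first
    worst_1pct = frame_ms[: max(1, len(frame_ms) // 100)]
    avg_fps = 1000 * len(frame_ms) / sum(frame_ms)
    min_fps = 1000 / frame_ms[0]                     # single slowest frame
    low_1pct_fps = 1000 * len(worst_1pct) / sum(worst_1pct)
    return avg_fps, min_fps, low_1pct_fps

avg, mn, low = fps_summary("frametimes.csv")
print(f"avg {avg:.1f} / min {mn:.1f} / 1% low {low:.1f} FPS")
# Two CPUs with near-identical averages can still differ a lot in the
# 1% low figure, which is what you actually feel as stutter.
```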
The recommendation also doesn't take into consideration the upgrade path of the PC. If you were to follow this suggestion, the probability of having to do a full CPU and motherboard upgrade instead of just the GPU when you next need to upgrade is going to be significantly higher. Most people don't want to do a full system upgrade after 2-3 years because they are CPU limited on the new title they want to play. I say spend the extra $100-$150 on a better CPU and potentially make the PC last another two years.
Ian, this is real research and journalism. This kind of in-depth reporting on hardware is exactly what keeps me coming back to anandtech, year after year. Your efforts are appreciated!
I would love to see this kind of write-up that covers surround/eyefinity resolutions. I've been fairly impressed with how my 7950 handles games across three monitors, and I've been an nVidia fan for years.
I wonder why games are vastly more parallel on the GPU side of things than the CPU side. If a game can utilize 2048 SPs, why doesn't adding 2 or 4 more CPU cores help much?
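One plausible way to frame that question is Amdahl's law. This sketch is illustrative only; the 70% parallel fraction is an invented assumption, not a measured figure for any real game:

```python
# One way to frame it: pixel shading is almost perfectly parallel, but a
# game's simulation loop has a large serial fraction, so CPU core counts
# stop helping quickly. The 70%-parallel figure below is an assumption.

def speedup(parallel_fraction, cores):
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

for cores in (2, 4, 8):
    print(cores, "cores ->", round(speedup(0.70, cores), 2), "x")
# 2 cores -> 1.54x, 4 cores -> 2.11x, 8 cores -> 2.58x: past four cores the
# serial 30% dominates, while the GPU's embarrassingly parallel pixel work
# keeps scaling across thousands of shader processors.
```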
Love this information it was an eye opener. Great Job Ian!
To choose a gaming CPU is a question I am asked to answer nearly daily by clients or friends in my line of work. While your concluding recommendations are spot on given the information you provided, I wouldn't often find myself giving out the same advice. The reason behind this is the future upgrade path of the PC. My apologies if this has already been pointed out in the comments, as I haven't read every one yet.
Most people seeking a PC upgrade have just started playing a new title and have hit a wall. They are unable to play this new game at the resolution and detail they feel to be the minimum they can put up with. This wall is mostly a CPU or GPU limitation but sometimes it’s both. Of these upgrades the new graphics card is significantly less expensive than a full system upgrade, can be installed easily by most people, and doesn't leave you without a PC or any down time. On the other hand a full system upgrade is expensive, not everyone can put it all together, and often requires an OS reinstallation with data backup.
Let’s say an average gamer (not necessarily you and me) purchases a nice new gaming rig today for whatever reason. It’s likely that within two years or so they are going to hit a wall again. At this point most people have hit the GPU limitation and are able to upgrade the graphics card and away they go again for another one to two years. After hitting this wall for the second time it’s most likely time for a full system upgrade. This process could be only two years for some of us but for others it’s going to be four to five.
What I'm trying to point out is that we can recommend a CPU that is the cheapest while still not limiting our current GPU, and get the best possible FPS per dollar right now. But if we do this, it's far more likely we are going to run into a CPU bottleneck early in the upgrade path, and instead of forking out a few hundred for a new graphics card after a year or two, we might end up having to replace both the CPU and motherboard as well.
For this reason I could not recommend an AMD A8-5600K or an equivalent Intel CPU to be purchased with an HD 7970 or GTX 580 unless you plan to never upgrade your graphics card. Spend the extra $100 to $150 on a better CPU and potentially make the PC last another two years. Maybe the inclusion of some popular titles like Battlefield 3 or PlanetSide 2 would have significantly changed your concluding recommendations. The information provided gives us a good indication of where the CPU bottleneck comes into play, but I think the upgrade path of the PC, along with what games are being played, needs to be given a lot more weight for an accurate recommendation to be made. Having said that, I could be totally wrong and have recommended the wrong gaming builds for years.
I can see a lot of work here, but only for a future that won't exist for a good long while. You tested at a res that is too high and not representative of reality today, based on this dumb idea that we'll all buy $400+ monitors. This is the same crap Ryan tries to push (see the 660 Ti comments section; he pushed it then, when they were $600 and bought via eBay with your Visa sent to Korea... ROFLMAO - read how I destroyed his responses to me; click ALL comments so you can just Ctrl-F both of us). So raise your hand if you're planning on buying a $400+ monitor to go with an expensive $300 card, only to end up at say 28fps in a game like Sleeping Dogs (avg... so the game is unplayable, as minimums would be, oh I don't know, 15fps?). I don't see any hands raised. So we'll be lucky if MAXWELL in Q1 (or whatever Volcanic does for AMD in Q4 this year) will be playable at 1440p. Translation: we'll all buy 1920x1200 or 1080p for a while to come unless we own more than one card. Raise your hand if you have multi-GPUs. According to the steampowered.com hardware survey, that number (last I checked) was under 2%. You're wasting your time. Start writing for the 98% instead of the 2%. I just wasted MY time reading this crap.
REALITY: We are all gaming at 1920x1200 or 1080p (or worse, below this). This should be the focus. This would show LARGE separations between CPUs, with Intel kicking the crap out of AMD, and show that you wouldn't want to touch that A8-5600K with a 10ft pole. Why? The 7970 would not be the limiter, or at least not every time like here. What % of people have 3-4 GPUs? Give me a break; this is what you see as the future? $1200 in GPUs and a $400+ monitor? You're pandering to a crowd that almost doesn't exist at all. For what? To make an AMD CPU seem buy-able?
The data in this article will be useful in 3 years+ when you can hit 1440p at above 30fps MINIMUM on most cards. Today, however, we needed to see which CPU matters at a resolution that doesn't make a 7970 look like a piece of outdated trash. You're pretty special today if you have a 7970 or better for a GPU.
More AMD CYA if you ask me. Just like we're still waiting months for Ryan to do an FCAT testing article... LOL. We'll be waiting for that until xmas, I'd guess, unless AMD doesn't get the prototype driver done by then, which means we'll never see FCAT testing here... ROFL.
Ryan has ignored TWO articles now on FCAT. It didn't make the 7990 review, and part 2 of the FCAT article never even came. Just keep delaying; your site's credibility is going down the drain while everyone else tells it like it is. AMD & their drivers currently SUCK (CPU & GPU). Their CPUs suck; hence running at a res that shows all your games can't run without multi-GPU and hit 30fps+ MINIMUM - meaning at this res they ALL require more than one GPU, making CPU choice a non-issue of course. Their GPUs are great but the drivers suck, so they give away games by the truckload to try to sell a GPU vs. exceptional NV drivers. Let's face it, the best hardware in the world sucks if drivers can't live up to the hardware. Unfortunately AMD blew all their R&D on consoles that are about to die on the vine, instead of GREAT drivers to go with a GREAT GPU.
What do you end up with when you spend your wad on consoles instead of drivers? FCAT showing you suck, runts, stutter, Enduro that falls short on notebooks (see the recent notebookcheck 7970M article; it was even mentioned here, oddly... LOL) and CrossFire that is abysmal, at times showing NEGATIVE scaling for more than one GPU, vs... again, NV drivers that have none of these issues. Optimus works (hence NV beats this drum repeatedly and justifiably) and so does their SLI. While AMD sucked for a year (see the recent HardOCP driver review of AMD & NV), NV got to sit on their butts (driver butts) waiting for AMD to finally get done with consoles and make a "Never Settle" driver that actually performed the way the cards should have OUT OF THE BOX! Thank god for the Never Settle drivers in Nov, or Nvidia wouldn't have released monthly driver enhancements from Dec-May... ROFL. People would be stuck with the same perf today as out of the box from NV (as HardOCP showed, they didn't improve 1fps all year until AMD caught them... well duh, no point giving out free perf when you're blowing your enemy away all year).
Mark my words... AMD will be writing off R&D for consoles soon. Even Activision's Kotick said just last week that consoles (for all the reasons I've said repeatedly here and at Tom's Hardware etc.) have a very tough road ahead vs. all the new ones coming out. Sales of the Wii U are off 50% after the xmas pop. Just one quarter after debut, nobody cares already! He basically said they'll be watching for the same on the next two (PS4/Xbox 720). When that happens, no games will be made going forward for this crap as we all move to tablet/phone/cheaper console or PC (for ultimate gaming).
Video killed the radio star. Cheap Android gaming killed the console star... LOL. Ouya, Steambox, Shield (PC to TV here!), Wikipad, Razer Edge, GamePop etc... all stuff that will help kill the consoles, and stuff they have never faced before. It was always the big 3, but this time it's the big 3 with the little 6-10 plus a billion phones & tablets chasing them and our gaming time... ROFL. The writing has been on the wall for a LONG while. As usual, AMD management screws up. Wisely, NV passed on a dying market and only spent $10 mil on Shield and Grid respectively... ROFL. Dirk Meyer wouldn't be doing this crap. They were idiots letting him go, thinking he didn't get it. He had a mobile strategy; it just wasn't one that made their CORE products suck while creating it. Management has PIPE dreams. Dirk had REALITY dreams.
"If I were gaming today on a single GPU, the A8-5600K (or non-K equivalent) would strike me as a price competitive choice for frame rates, as long as you are not a big Civilization V player and don’t mind the single threaded performance. The A8-5600K scores within a percentage point or two across the board in single GPU frame rates with both a HD7970 and a GTX580, as well as feels the same in the OS as an equivalent Intel CPU."
AMD CYA. Total lie. Drop this crap down to 1080p and watch the Intel chips separate the men from the boys, and in MORE than just Civ 5. ALL games would show separation, I'd guess. You must have found this out, which immediately made you up the res, huh? Did AMD threaten the free money or something if you showed them weak, or if Ryan managed to run FCAT testing?... LOL.
"We know the testing done here today looks at a niche scenario - 1440p at Max Settings using very powerful GPUs. The trend in gaming, as I see it, will be towards the higher resolution panels, and with Korean 27" monitors coming into the market, if you're ok with that sort of monitor it is a direction to take to improve your gaming experience."
Seriously? "If you're ok with EBAYing your $400 "KOREAN" monitor this is a great way to improve your gaming at under 30fps minimum in all games...ROFL. Reworded for clarity Ian :)
NICHE situation is correct in that first sentence... LOL. Again, start paying attention to your audience, which is the 98%, not the NICHE 2% or less. I'm debating buying a 2nd 1920x1200 (I already have 2 monitors, one Dell 24 and a 22in at 1680x1050) instead of your NICHE, just because of what you showed here. 1440p is going to be difficult to run ABOVE 30fps MIN for ages. I'd spend most of my gaming time on the smaller Dell 24 at 1920x1200, I think. So I'm debating buying the same thing again in 27in. I want a bigger screen, but not if I can't run 30fps for another 2-3 vid card revs (Maxwell rev 2?). This is just like I described above with AMD's GPU. Great hardware, but worthless without drivers that work right too. A Korean monitor may look great, but what is it worth if you require $600+ in vid cards to have a prayer of 30fps? I'd rather buy a Titan, not upgrade the monitor, and hit well above 30fps on my Dell 24 at 1920x1200 all day! THAT is a great gaming experience I can live with. I can't live with a BEAUTIFUL SLIDE SHOW on a Korean monitor off eBay... LOL. I realize you can get a few here in the US now, but you get the point. Next to your situation, a regular niche looks like the 98%... LOL. Your situation is a NICHE of the NICHE. Check the Steam survey if you don't get what I just said. http://store.steampowered.com/hwsurvey/ Less than 1% run the res you tested here. That's a niche of a niche, right? The entire group of people above 1920x1200 is less than 2% added all up (and this is out of probably a few hundred MILLION Steam users). Just click the monitor res and it will break them out for you. You wrote an article for NOBODY to show AMD doesn't suck vs. Intel? Start writing for EVERYBODY (that other 99%) and you'll be making recommendations for INTEL ONLY.
I'm not saying anything bad against Ian here; clearly he did a lot of work. But whoever is pushing these articles instead of FCAT etc. is driving this website into useless land. You guys didn't even mention NV's killer quarter (AGAIN). Profits up 29% over last year; heck, everything was up even in a supposedly bad PC period (PC sales off 14%... no effect on Nvidia sales... LOL). They sell cards because their drivers don't suck and a new one comes out for every AAA title either before or on the day the game comes out! That's what AMD should be doing instead of console dev. They gave up the CPU race for consoles too! I'll be glad when this gen (not out yet) of consoles is DEAD. Maybe they will finally stop holding us back on PCs. They stuck us with 720p and DX9 for years, and they're set to stick us at 1080p for another 8 years. They also allowed NV to not do anything to improve drivers for a year (due to AMD not catching them until Never Settle at the end of Nov 2012). But maybe not this time... LOL. DIE CONSOLES DIE! :)
Here's what happens when you show 1080p with details down... the CPUs part like the Red Sea: http://www.tomshardware.com/reviews/neverwinter-pe... Look at that separation! "It's a little surprising that the Core i3-3220, FX-4170, and Phenom II X4 960 aren't able to manage a minimum of 30 FPS, though they come close. The dual-core chips are stuck at about 20 FPS, and the FX-8350 does a bit better with a 31 FPS floor that averages closer to 41 FPS. Only Intel's Core i5-3550 demonstrates a significantly better result, and we have to assume that higher-end Core processors are really what it takes to let AMD's single-GPU flagship achieve its best showing."
Note only two CPUs managed above 30fps minimum! I guess you need a good CPU for more than just Civ 5, huh? You should have run at this res with details down to show how bad AMD currently is. PEOPLE, listen to me now: buy AMD CPUs only if you're REALLY taxed in the wallet and can't afford INTEL! I love AMD, but if you value your gaming fun (meaning above 30fps) and have a decent card, for the love of god, BUY INTEL. This was a test with a SINGLE 7970 GHz. AMD's CPUs are light years away from taxing their own top-end GPUs. But Intel's aren't. The bottom to top in this article at Tom's was 17fps to 41fps. THAT IS HUGE! And they didn't even show the top i7s. It would likely go up into the 50s or 60s then.
Anandtech (not really blaming Ian himself here) is steering people into stupid decisions and hiding AMD's weaknesses in CPUs here, and in FCAT/GPUs with Ryan. I can't believe I'm saying this, but Tom's Hardware is actually becoming better than Anandtech... LOL. WOW, I said that out loud. I never thought that would happen. It's 4:50am, so I'm not going to grammar/spellcheck; the nazis can have fun with it if desired. :) Too bad I got to this article a week late.
http://techreport.com/review/23246/inside-the-seco... THE REAL CPU ARTICLE YOU SHOULD READ. Note the separation from top to bottom in Skyrim here is 58fps for AMD up to 108fps for Intel... See my point? Leave it to Scott Wasson (the guy who broke out the need for FCAT, along with Ryan Shrout at PCPer I guess) to write the REAL article on why you don't want a slow CPU for ANY game. This is what happens at 1080p, people! Note the FX-8350 and 1100T are nowhere NEAR Intel in this review in ANY game tested. The Phenom II X4 980 is slow as molasses also! Note also that Scott discusses frame times, which show AMD sucks. Welcome to stutter that isn't just because of the GPU... LOL. "All of them remain slower than the Intel chips from two generations back, though."
And this one sums it up best on the conclusion at techreport's article: "We don't like pointing out AMD's struggles any more than many of you like reading about them. It's worth reiterating here that the FX processors aren't hopeless for gaming—they just perform similarly to mid-range Intel processors from two generations ago. If you want competence, they may suffice, but if you desire glassy smooth frame delivery, you'd best look elsewhere. Our sense is that AMD desperately needs to improve its per-thread performance—through IPC gains, higher clock speeds, or both—before they'll have a truly desirable CPU to offer PC gamers. "
Only Anandtech has AMD rose-colored glasses, people. READ ELSEWHERE for real reporting. So AMD doesn't even offer a desirable CPU for gamers... LOL. Sad but true. Tom's shows it, TechReport shows it, and if I had more time, people, I could really rip these guys apart at Anandtech by posting a few more CPU tell-alls. This site keeps putting up stuff that HIDES AMD's deficiencies. I'd like to buy an AMD CPU this round, but I'd be an idiot if I did as a gamer. i7-4770K for me, sir. Spend whatever you can on a Haswell-based system (it supposedly takes Broadwell later) and wait for 20nm GPUs until xmas/Q1, when the real gain will come (even the low end should get a huge bump). Haswell comes next month; you can wait for the FUTUREproof (if there is such a thing) socket one more month. Trust me. You'll be happy :)
I'd put more links, but this site will see too many and call me a spammer...UGH.
Well, they have previously done worse to me :) I presented data in the 660 Ti article and called out their obvious lies even with their own data (LOTS of Ryan's own benchmarks were used to show the lies), which prompted Jarred to call me an Ahole and say my opinion was UNINFORMED ;). Ryan was claiming his article wasn't for above 1920x1080 (or 1200), but he was pitching me $600 Korean monitors (the same ones mentioned here) that you had to buy from eBay, giving your Visa to a nobody in Korea. Seriously? It could not even be bought on Amazon from anyone with more than a SINGLE review, which I pointed out was probably the guy reviewing himself :) He had no about page on his site, no support etc., not even a phone number, just an email if memory serves. It was laughable. After I took Ryan down, Jarred attacked ME, not the data.
What do you expect a person to do after that?
They've been snowing people for a long time with articles like this.
Where is FCAT article part 2? Where are the FCAT results from the 7990? We are still waiting for both, and will continue to, as I keep saying, until AMD fixes their junk drivers and, I guess, gives a green light for Ryan to finally write about FCAT for REAL. This is a pro-AMD site (it used to be more neutral!). I really didn't write this hoping to get love from the viewers. I just wanted the data correctly presented, which other sites did with aplomb. You don't have to like me, or the data; just realize it makes sense, as shown in the post via links to others saying it. NOT me.
People who stopped at "same crap Ryan" were not my intended audience ;) I can hate a person (well, I never do) and still value the data in a great argument made by said person. I don't care about them as long as it makes sense. The person means nothing. As I said above, I don't blame IAN really; he's just doing what he's told. I even admired the work he put into it. I just wish that work could have been dedicated to data actually useful to 98% of us instead of nobody, rather than hiding AMD's weaknesses. AMD is not a CPU I could recommend currently at all for anything unless you are totally strapped for cash. Even then, I'd say save for another month or something and come home with Intel. I'm not really an Intel fan either... LOL. I was selling AMD back when Asus was leaving their name off their boards (fear of Intel) and putting their motherboards in WHITE boxes! Intel should have had to pay AMD 15B (they made 60+B over the years screwing AMD like this). They had the best CPUs, and Intel managed to stall with nasty tactics until they caught them. I love some of Intel's chips but generally hate the company. But I'd consider myself a D-Bag for not telling people the truth and shafting their computer purchase by doing it. If I personally want to help AMD and buy a chip I think is way behind, great - I've done my part (I won't, just saying). But I wouldn't tell my friends to do it while claiming AMD is great right now. That's not a friend. Anandtech's readers are, in a sense, their friends (we keep reading or they go out of business, right?). Hiding things on a massive scale here is not what friends do for friends, is it?
I didn't expect any favorable comments from consoles lovers either :)
Agreed. What he wrote is offensive, emotional, and hardly objective. However, there's a truth hidden in there somewhere. Consider the following scenario.
Here are a few suggestions. Since most users who would spend $500 on a flagship video card, $600-$800 on a 1440p monitor, and God knows how much more on the rest of the system aren't likely to skimp on CPU choice to save a hundred bucks, a different testing scenario might produce more useful information for the masses (regarding cheaper CPUs for gaming).
A more likely market for an AMD CPU in a gaming rig would be people on a tight budget - when every buck matters and the emphasis is on getting as fast a GPU as possible. In my opinion, it'd be quite useful to test various AMD CPUs which are cheaper than an Intel quad-core, paired with a 650 Ti Boost and/or 600-series and/or a similarly-priced AMD video card at 1080p. Of course, this would raise yet another question - are Intel dual-cores faster than similarly-priced AMD quad-cores in this mid-range gaming scenario?
Suggestions for other CPUs:
Core i5-3350P - baseline Intel quad-core performance (the cheapest Intel quad-core on the market)
Pentium G2120 - should perform similarly to an i3 for gaming (costs less)
Celeron G1610 - the cheapest Intel CPU
So you're agreeing with a guy who says it's OK to HATE someone, but I'm the evil person for pointing out data that is incorrect? HATE? That's not a bit strong? "We might all hate this guy (for good reason)". And people are calling ME offensive? WOW. This reminds me of the gay people who claim to be tolerant, but god forbid any person says something against them (Chick-fil-A comes to mind). They want that person tarred and feathered, smeared in the media so they never work again, put out of business, called names; they cheer people who commit violence against them, etc... Nice... No double standards there. Another example: Stacey Dash voting for Romney. They called a BLACK woman who spoke her mind a RACIST... ROFL. What? They tore that chick apart merely for having a very well-spoken (IMHO) opinion and pretty good reasons for saying it. She didn't sound stupid (despite what anyone thinks about her opinion), but they tarred and feathered her for saying something anti-Obama... :( She's a very classy chick if you ask me, and they still pick on her (saw some ripping on her on Roku last week - MSNBC or something).
Not sure what his reason is anyway. Did I attack you guys personally? I even let Ian himself off the hook and left the problem at the doorstep of whoever is directing these articles to be written this way. What bothers me most is all the "great article" and "nice job" comments on an article that is very wrong and advocates buying a very low-end AMD CPU vs. Intel, saying it's going to be OK. IT WON'T be, and in FAR more than just Civ 5, as I showed via the other hardware sites.
What part wasn't objective? My data? The other websites showing the opposite of this site? I can't change their data, and there is nothing objective to discuss when the data is just patently wrong, as proven.
People can argue I'm not objective on my console beliefs (though they're backed by sales data, and I freely admit I hate consoles... LOL, yet I own an Xbox 360 and 2 PS2s - go figure - I just don't want another one holding my games at 1080p for 8 years), and the new gen at xmas may sell very well (we'll know in 9-10 months if they can sell past the xmas pop), but the PC comments and data I provided are facts based on data from Steam's hardware survey, HardOCP, Tom's, and TechReport. I could have gone with another group too - PCPer, Guru3D etc. - but with too many links this site says your post is spam.
If it was offensive, they need thicker skins, or they should stop writing stuff that other sites totally refute. These guys KNOW that when you drop down to 1080p the CPU is going to SHOW, rather than the GPUs - which aren't as taxed at 1080p - making it look like any CPU can get the job done. Well yeah, any CPU will do, but only when you push the GPUs so far they beg for mercy. To me, saying that stuff in the article is a LIE when they know what happens when you turn it down. I wouldn't be so harsh if they were just ignorant of the data, but Anandtech is NOT ignorant. They've been benchmarking the heck out of this crap for ~15 years (I think he started the year I started my 8-year PC business, in '97!). I guess you can't call people out for what they're doing today without being called offensive, emotional (LOL), and not objective. I couldn't have written that post if they had tested where 98% of us play, at 1080p, right?
What are they doing here at anandtech? Why would they do this? They know what steampowered shows, I basically said the same stuff to Ryan in the 660TI article ages ago but with even MORE proof and using his own articles to prove my points. I used HIS benchmarks.
Ask yourself why we are still waiting for the FCAT articles (now we're up to 2 or more... part 2 of the first, and the 7990 data, etc.). Ryan said we'd see them in a week. We are into months now. http://www.anandtech.com/show/6862/fcat-the-evolut... Where's part 2? He still hasn't given us ONE ounce of data using it.
"In part one of our series on FCAT, today we will be taking a high-level overview of FCAT. How it works, why it’s different from FRAPS, and why we are so excited about this tool. Meanwhile next week will see the release of part two of our series, in which we’ll dive into our FCAT results, utilizing FCAT to its full extent to look at where FCAT sees stuttering and under what conditions."
That's from his Part1 linked above. How long do we wait?
Just for kicks: http://www.anandtech.com/show/6910/origin-genesis-... "Overall anything short of 5760 with 4x MSAA fails to make a 3rd Titan worthwhile. On the other hand, you do need at least 2 Titans to handle MSAA even at 2560"
Ok, so I need to spend $2000 on two titans to handle some MSAA at 2560 OVERALL in the tested games (heck one hits under 30fps in a game he tested at 1080p in that review). Raise your hand if you think IAN's article is correct...ROFL.
"In three of our games, having a single GPU make almost no difference to what CPU performs the best. "
Yeah in a res that according to Ryan's article taps out two $1000 titans...Then you're right. All cpu's are the same because the Titans are crying for some relief :) Their recommendation here: "A CPU for Single GPU Gaming: A8-5600K + Core Parking updates"
"The A8-5600K will also overclock a little, giving a boost, and comes in at a stout $110, meaning that some of those $$$ can go towards a beefier GPU or an SSD."
No way... So you'll buy a $110 CPU and, according to Ryan's article on the Titan box, $2000 worth of Titans to go with it, to run at the resolution Anandtech thinks is important (2560x1440).
How do I respond to that without being offensive? You should hear what I'm saying in my brain right now... ROFL. The sad part is people are reading reviews like this and thinking they're correct. Look at the first comments on this article - "nice work" etc... Really? I don't see a bunch of HATERS in my comments anyway. Just a few who at least half agree with what I said ;) Yourself included. Your example proves, to some degree, that I didn't waste my time.
Sorry if you think my "truth" was hidden. I was attempting to make it more "in your face" for simplicity's sake. Maybe I failed a bit... LOL. Can't please everyone, I guess.
Nice...What reason? I defamed a hero of yours? Are they doing you any favors by hiding reality? Can you say after reading the links the other sites are wrong? The point of the links showing the exact opposite of this site is so you JUDGE Anandtech yourselves. I really don't want one of my favorite sites to go away. I just want them to start reporting FACTS as they are without the snow.
I don't feel I have to be politically correct all day for that. People need to get over that PC garbage and get thicker skins. We are FAR too sensitive today. It's like nobody can take criticism these days, and the person who gives it is evil... LOL.
For the sake of your PC purchase, if you intend on buying on their advice, read the links I gave, guys. I'm trying to save people from getting burned! Like me or hate me, the data does NOT lie. You just have to look at it and judge for yourself. When one CPU scores 58 vs. another at 108, there is a SERIOUS reason to pick the proper CPU (just one example from above). If you're seriously broke, I'm all for AMD at that point (great integrated graphics with Richland probably makes for a pretty decent experience), but if not... INTEL. But in either case I wouldn't buy EITHER now. Wait for Haswell (Broadwell goes in it later... important, maybe) or Richland, which really makes low-end gaming possibly pretty fun, I think (at least you can play, that is). In laptops, maybe Haswell with GT3e makes sense, as it should get near AMD or blow by them with 128 MB in there. But that's not going to desktops. Integrated on desktops from Intel is still useless IMHO and won't affect discrete sales one bit for AMD or NV.
I don't agree with your analysis on consoles, but everything else, sure. Gaming for 98% of people is 1080p. That's why I laugh when people quote the Titan on ANYTHING (which happens surprisingly often here). No one has a Titan, so why even talk about such a card, saying "AMD has no answer for it"? Well, no one even has the card anyway except for a couple of people. I agree also with the resolution thing. It makes no sense that so many reviews are catered to high-resolution and multi-monitor setups.
People have been wondering why NV and AMD have increased top-of-the-line graphics card prices, and it's because, quite simply, few people have everything needed to exploit such cards. I'd get a 7970, but I don't have a multi-monitor setup or a high-resolution monitor, so what's the point?
Console-wise, I think the Wii U was bad for any comparison. It was an upgrade that really brought nothing extra. People who have a Wii don't care about graphics, so most of the upgrades of the Wii U are meaningless to Wii owners. The new Xbox and PS4 will be much better in terms of sales. Those console gamers have been dying for a graphics boost.
In the end, though, your response explains to me GPU pricing today and why top-of-the-line GPUs are costing more and more. A smaller percentage of people are buying them, because GPUs that are lower end, or older, are perfectly capable of doing the tasks needed by gamers today. Maybe that will change when monitors drop in price and more people game at higher resolutions, but for now most people do 1080p, and that's the sweet spot. I know that's the ONLY resolution I ever look at and care about.
Thanks... Console arguments are like Ford vs. Chevy, right? How many people won that argument back in the day? :)
If console sales continue for 6 months after the xmas pop (unlike the Wii U etc. that died, as Kotick and others point out; the Wii U off 50% says something, the Vita and 3DS etc. are down from the last revs too, and software is off for all of them as well), I'll come back and say YOU, sir, were right :) You have my word. Of course, it goes without saying I'll be saying the exact opposite if it doesn't happen.
Regarding why we need more power...I can show situations where 1080P brought the top end to unplayable. Hardocp just did this. http://hardocp.com/article/2013/03/12/crysis_3_vid... They had to turn some settings down even on 680 and 7970ghz and cards below this really turned stuff off (670 etc). People can say, well this or that doesn't make much difference visually, but the point is you can't have everything on without more power (maxwell/volcano should finally make everything on 1080p playable with ALL details on, no sacrifice at all in anything I'd hope). "Crysis 3 plays a lot better at 1080p resolution, 1920x1080. At 1080p the GeForce GTX 680 and Radeon HD 7970 GHz Edition are able to push the graphics to very high and play with the best experience possible in the game. Granted, we have to use SMAA Medium in order to achieve this. It will most likely take next generation single-GPU video cards to allow us to play at SMAA High 4X at very high at 1080p."
Tomb Raider has the same issues, only worse I guess: http://hardocp.com/article/2013/03/20/tomb_raider_... "If you are interested in playing Tomb Raider the NVIDIA GeForce GTX 680 provided the fastest performance at 1080p, and was the only single GPU video card capable of playing with 2X SSAA at this resolution. At 2560x1600 the AMD Radeon HD 7970 GHz Edition CrossFire setup will provide more performance. For gaming on a budget, or at resolutions lower than 1080p, the GeForce GTX 660 Ti is an excellent option."
So the 660 Ti I almost bought is for LOWER than 1080p?... ROFL. OUCH. As they point out, you need two cards for above 1080p; only the 680 survived 1080p itself, and only at 2X SSAA. I can cite more examples too, but this makes the point. Even 1080p is tough for top-end cards if gaming as the devs intended, with all the candy on, is attempted. We need more power, and 20nm should give us this from either company, I hope. I hope I'll have enough of a reason to buy 1440p for a few games, then flop it over to my Dell 1920x1200 when the new cards can't hack the 27in I plan to buy (if I do; I might stick with 27in at 1080p, but I like having 2 native resolutions on the desk to play whichever my card can handle). It's comical that Ryan was pushing 1440p in the 660 Ti article, but HardOCP says that card is for BELOW 1080p... LOL.
Well, if even older dual-core CPUs and the weaker AMD parts don't scale at all with a single GPU, it would seem to me like a $60 Pentium or even a $40 Celeron at a bit below 3 GHz might make a great companion for the typical $200 GPU for a Full HD gamer. It would be interesting to add any one of those low-cost Ivy Bridge parts to the comparison to see how they keep up with their Core ix counterparts.
One more comment on FCAT missing - From the 7990 review: " The end result is that we’re not going to have FCAT data for today’s launch, as there simply hasn’t been enough time to put it together. FCAT was specifically designed for multi-GPU testing so this is an ideal use case for it and we’d otherwise like to have it, but without complete results it’s not very useful. Sorry guys.
The good news is that this means we have (and will be compiling) FCAT results for our cards based on the very latest drivers. So we’ll get to the bottom of frame pacing on the 7990, GTX 690, and more with an FCAT article later this week or early next week. So please stay tuned for that."
So we're 3 weeks later and STILL no review with this data. Again, people, recognize the delay tactics here. In another week it will be a MONTH. This is on top of the wait for the FCAT part 2 article I already mentioned.
"Our goal with FCAT was to run an in-depth article about it shortly before the launch of the 7990 as a preparatory article for today’s launch. However like most ambitious goals, that hasn’t panned out."
It's not really ambitious when EVERYBODY else is already presenting data article after article. Just keep making excuses. Take a good look at the credibility of this site, people, and judge these guys yourselves. Ryan Shrout seems to be able to pump out article after article on FCAT, including his reviews of the 7990, Titan etc... every article discusses it at this point. Is Ryan Shrout at PCPer.com so much more effective than this huge website? Ryan's asking for donations to upgrade his camera equipment for recording podcast-type stuff etc. How many people do you have working here compared to his little site? Which I love, BTW. Great site, and he has nearly doubled the funding he asked for :)
At some point I hope people start asking you guys more questions after looking at my posts pointing out stuff most just seem to miss. People will eventually JUDGE this site accordingly if you keep this stuff up. I sincerely hope this site returns to good neutral data soon. You can start with an FCAT article that makes other sites like PCper seem as small as they are.
Are you still trying to figure out how to use it or something? Call Ryan Shrout :)
Ian: I noticed you were GPU bound a lot. Doesn't this sort of defeat the test? (I think you were GPU bound more than 50% of the time.) I'm curious why you didn't use Eyefinity or NVIDIA Surround to test the quad graphics setup; with that much power under the hood it's almost a necessity. Anyway, I don't mean to criticize the review; I think it still had some very useful information. I just think that the conclusion wasn't complete if you're GPU bound. Note: and decreasing the graphics settings so that you're CPU bound is unrealistic - nobody with quad graphics is going to reduce the graphics quality so they're CPU bound.
Edit: I just read through (most of) the comments above. And, while 98% (doubtful, but OK) may play on a single 1080p screen, the fact is that high-end graphics are a waste of money for a single 1080p monitor. And, while some games (like Skyrim and Civ V) use a lot of processor, that type of scenario is not indicative of most games. Note: also, most of those 98% single-screen 1080p users probably DON'T have a top-of-the-line graphics card (i.e. a 680 or 7970, much less a Titan, 690 or 7990). They probably have a $200-$300 graphics card and a $100-$250 CPU (i.e. mainstream). Nor do any of those with less than top-of-the-line cards *need* anything more than a single 1080p monitor and a mid-range CPU (of which the AMD or Intel variety will do just fine for 98% of those 98% with a single monitor). From my point of view, this article set out to find out how much the CPU is used in gaming. Does it make sense, then, to put a limit on the graphics capabilities? Of course not. So you go with the high-end (top-of-the-line) graphics solution. But in the end, the graphics capability was still limited by the screen resolution - you couldn't really see what the GPUs are capable of because they couldn't really stretch their legs (and, in turn, the CPUs were never stretched to their limits to feed such a request).
I participate in F@H, and as such I also use my GPUs. I've noticed that (depending on the work unit) the GPUs can take as much as 20% of the CPU to keep them fed. Is gaming really that much different? The CPU is needed to feed the GPU, and to do those functions that cannot be done on the GPU. For folding, it doesn't matter how fast something gets done, so a faster CPU isn't imperative. But for gaming, the speed of the CPU and its ability to keep relevant data going to the GPU does matter. When the CPU can't keep up with the GPU, you get low minimum frame rates and a general "slow" feeling from the game. So, yes, I agree minimum frame rates are important when determining what CPU to use when feeding a high-end graphics solution (more so when using more than a single-GPU solution). But you still have to let the GPUs stretch their legs to see how much of the CPU is being used - and that will determine if a CPU is good or not (min frame rates and CPU usage with high-end graphics at appropriate resolutions).
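That intuition can be put into a toy model. This is a simplification, not how any real engine schedules work, and all the millisecond figures are made up:

```python
# Toy model of the bottleneck arithmetic in the comment above: each frame
# costs some CPU time (simulation, draw-call submission) and some GPU time
# (rendering), and the slower of the two sets the frame rate.
# All millisecond figures are invented assumptions.

def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    return 1000 / max(cpu_ms_per_frame, gpu_ms_per_frame)

print(fps(cpu_ms_per_frame=8, gpu_ms_per_frame=20))   # ~50 FPS, GPU bound:
                                                      # a faster CPU changes nothing
print(fps(cpu_ms_per_frame=8, gpu_ms_per_frame=10))   # ~100 FPS: CPU and GPU are
                                                      # close; either upgrade helps
# Raising the resolution inflates gpu_ms_per_frame, which is why 1440p
# testing hides CPU differences that 1080p exposes.
```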
Wow, the sheer amount of 'content' that Jian guy is producing is amazing. You could probably publish a few books worth of comments by now. Is it really necessary to hit everybody up with a 1000-word reply?
What I don't get is why you actually do this. You don't agree with what's been tested and how the data has been interpreted; okay, that is your right. And, yes, some of the conclusions drawn might be controversial; but what's your problem? Why don't you just voice your opinion once and leave it be? What are you doing here - are you some sort of freedom fighter for objective data on the internet?
You complain about how AnandTech are doing it wrong and claim that your own observations are objective and valid. From your point of view they might be, but what you are forgetting is that testing hardware is so vast a field, with so many variables that it's impossible to scientifically claim that ANY conclusion is objective, since the very essence of what you're dealing with precludes that. Everything (in hardware testing) is subjective - live with that truth.
It's not about having "tough skin", but having manners and being civilized. You can't expect people to listen to you and take you seriously if you're being rude even if your arguments are valid. Try a more gentle approach - I guarantee your message, whatever it might be, will travel further.
Remember, this is not an article about choosing a CPU for 1080p gaming; also, it's not complete. It provides information for people to interpret in their own way. Yes, it draws conclusions at the end that I too think are best left unsaid; but why can't you just look past them? What is your problem? What are you trying to change here? If you don't like AnandTech so much, why don't you just... leave?
Am I supposed to not respond now? You just said I have no manners, am uncivilized, and have no objectivity, and previously that I'm offensive and it's OK to HATE me… ROFL. POT, MEET KETTLE. If you were to take your own advice, shouldn't you have just said "you could word it differently, but I agree with the data" and left it at that? No, you took it much further, with what amounts to an ad hominem attack on ME. You posted 333 words yourself to do it. :) But thanks for recognizing the work I put in :) I can type 60+ wpm though, so not that much effort really, and two to three times that with Dragon NaturallySpeaking Premium easily (pick up a copy if you can't keep up - 1600 words in about 9 minutes... ROFL, v12.5 rocks). The homework takes time, but that was already done before they wrote this article, as I read everything I can find on stocks I track and parts I'm interested in.
I've watched this site (and Tom's) since they were born. 1997, I think, here. I did leave Tom's when Tom Pabst himself forced out Van Smith over the SYSmark crap years ago (and removed his name from ALL the articles he wrote there, putting "Tom's Hardware staff" or some such in place of Van's name). That was AWFUL to watch, and I had loved reading Tom Pabst's stuff for years. Millions of people were snowed there while they made AMD look like crap in articles, with SYSmark flagging Intel chips and turning off SSE on AMD. Eventually people like Van, I, and others said enough that people took notice, and it devalued his site before he sold it. Rightfully so, if you ask me, as he was basically an Intel shill at that point, as many had pointed out by then.
At some point somebody has to stand up and tell the truth like Van tried to do. It cost him his job, but the message made it through. Someone has to be willing to “take the hate” for other people's benefit. :) Or nothing will ever get fixed right? People reviewing stuff for millions need some kind of checks and balances right? There are NONE right now in our govt and look what’s happening there as they spend us into bankruptcy amid scandal after scandal kicking our financial future down the road time and again. If we had checks and balances for REAL our president would be in jail along with many dirty congress members on both sides (he just got caught wiretapping the AP – freedom of speech is being trampled, gun rights assaulted, our constitution is attacked at every turn!). People are DEAD possibly because this guy did NOTHING to save them in Benghazi for 7 hours under attack. What happened in Boston? Etc…I'm seeing the same stuff happen here that happened at Tomshardware. Someone has to correct them instead of congratulating them right? Otherwise so many people will make the wrong purchasing decisions based on bad advice from influential and supposedly trusted people (I still like this site, just want back to the neutral stance it used to have for years). In this economy I'd be thanking anyone who takes the time and effort to attempt to save me from buying a piece of junk with my hard earned money. In a nutshell this is why I take the time to show another side for people to consider. They don’t have to believe me, that’s the point of the links, quotes from those links etc. I WANT you to look at the data and make up your own minds. Either it costs this site tons of hits eventually and wakes them up or they need to be put out of business. If nobody ever complained about Win8 how long would we get crap like that? Look how fast it got an 8.1 version as a response and the product manager fired. Put their feet to the fire or they don’t stop ever.
Anand would have to be seeing his site's traffic go down. http://www.alexa.com/siteinfo/anandtech.com# If someone takes the time to prove you’re putting up bad data article after article and there is no defense put up (because there isn’t a defense) you are eventually taken down. Jared attacked me in Aug 2012. Pity you can’t go back a year but you can see this site is sliding at least at alexa for the last 6 months. Until they quit yanking our chains I’ll keep yanking theirs if my time allows! Toms went from 10mil to 2mil in just a couple years. I’m not sure what he sold for but it was far less than he’d have gotten before attacking Van, the article shenanigans etc.
Tell me, what parts of my comments were UNCIVILIZED or RUDE? Did I call anyone a name? Say they are stupid? Did I attack ANYONE personally? Did I do what you did? Actually I did quite the opposite. I said they are NOT ignorant and know exactly what they're doing here (hmm, insinuated intelligence…That’s a good comment right?). I even let Ian off multiple times (he's just doing what he's told no doubt) and noted from the get go he did a lot of work, but due to "someone" pushing bad data to hide AMD's faults it's all wasted. I attacked the crap this site is pushing (crap too harsh for you?), not any of the people themselves (who I'm sure are probably nice guys - well, I can't say that about them all, Jarred attacked ME not the data when I buried Ryan's conclusions and benchmarks). Did I swear at someone? Did I spew hate like the guy who gave a one liner to me? He's claiming its ok to HATE me? When did I ever cross a line like that? Is a debate of the facts worthy of HATE today?
If you hate the length of my post don't read it. Take your own advice, move along please. Was it necessary for you to post 1000 words back? :) I'd say even the HATERS took me seriously (the only ones that responded besides Tential – what 2 total plus a polite tential?) and saw the arguments were valid and listened. ALL of them did in their own way. Only the first below wasn’t rude as you say and just discussed what I was saying- tential - no flare up from him, just good old fashioned debate: "I don't agree with your analysis on consoles but everything else sure. Gaming for 98% of people is 1080p."
Tential clearly got the message despite our console differences (they weren’t the point really). I’m sure tons of others did even if they’re silent about it. I used to be SILENT. You can’t argue with steampowered.com’s data, nor everyone else showing the res you SHOULD be running here. You can confirm via techreport, hardocp, tomshardware, etc I gave plenty of links and quotes for people to analyze.
"We might all hate this guy (for good reason) but the words he writes regarding CPU performance in this article have a lot of truth."
WOW...But at least he saw the truth, and his name is hilarious to me :) Did I attack back? NOPE. Even when he seriously crossed a line IMHO I did nothing but a polite rebuttal with some questions – still waiting for why he thinks it’s ok to HATE people for simple comments, but I don’t mind either way, even he got the message. Worse you agreed with the hate...LOL
Here’s you: "Agreed. What he wrote is offending, emotional and hardly objective. However, there's a truth hidden in there somewhere. Consider the following scenario."
Comic, I said nothing bad about people, just their data. But to you, it's OK to hate me for it and then toss comments about my character...This goes back to the double standard I mentioned in my previous posts.
There is nothing wrong with a vigorous debate of the facts in any case and I was CIVIL though critical. This was an article about the proper choice of a GAMER cpu. As presented the data is lies as they presented a situation that doesn’t exist (as even you pointed out in your scenario basically). It would be just "incorrect" if they didn't know what they were doing. But they DO know. They know they’re hiding FCAT data as I pointed out. AMD only talks to them as Guru3d recently pointed out (hilbert did). Odd, yes?
I find it funny I already answered your questions before with comments like this (but why not do another 1600 word essay for you) :) : “People will eventually JUDGE this site accordingly if you keep this stuff up. I sincerely hope this site returns to good neutral data soon.”
This doesn’t tell you why I’m doing it? I claim OTHER websites I pointed to are OBJECTIVE and VALID. I piled on with my own observations, but I was merely quoting others who all disagree with this site. That’s not subjective, that’s FACT. It’s not my point of view; it is the same one as EVERY other site reporting this type of data. Hardocp, Techreport, PCper, Tomshardware. How many do I need before you call me objective? I can give more sites and another 1000 words of quotes…LOL. I can scientifically claim that the resolution they chose here - to make all cpu’s show the same perf because the gpu is bottlenecking everything - represents less than 1% of the population, and I will be RIGHT. Introducing a variable that totally invalidates the entire premise of the experiment is not subjective; it’s misleading at best and easily proved wrong as I have done. My message travelled far enough as nobody missed it as far as I can tell. Mission accomplished, gentle or NOT ;)
If you don’t like my posts, To quote you: “why can't you just look past them? What is your problem?” “why don't you just... leave?” :) Gee, it seems I've upset you ;)
"What are you doing here - are you some sort of freedom fighter for objective data on the internet?"
Already answered and YES, why not :) What are you doing here? Are you some kind of smart alec that objects to people voicing their RELEVANT opinions in a "comment" section? Silly me, I thought that's what this section is for. Can we get back to discussing the data now? You've distracted us all from the topic at hand long enough and it isn't changing the data one bit.
Sorry for seriously crossing the line good sir but I still reserve the right to hate you if I choose. A wise man once wrote “We are FAR to sensitive today. It's like nobody can take a criticism these days and the person who gives it is evil...LOL.” <--- this is you =). Keep in mind I was also the first one to agree with you… What you write never fails to bring a smile to my face TheJian, and I hope you don’t stop pointing out the truth any time soon. Just try to keep the next comment shorter so we can read it without so much scrolling..... we don't all own LCDs with 1440+ vertical pixels like we are told to. In the end all we can pray for is a few less gamers to run out and buy an A8-5600K for their HD7970 and for a few of your points to be taken into consideration next time round.
First of all, I’d like to apologize for this long-delayed response – I simply didn’t have the time.
Truly epic. To start off, you haven't upset me, really; not before and not now - I was genuinely curious as to what it is that you think you're accomplishing by all this (not just this article, others as well). Thus, I set forth to playfully provoke you into responding. Success. Now that you’ve answered, and to be fair – more clearly than expected, I have a better understanding of what urges you to do what you do. Such a peculiar case you are, I am fascinated – are you a troll or aren’t you? Somewhere in between I guess. The arguments you provide are sound, although I still think they’re a bit… let’s not use a word as I’m sure you will twist it into a meaning of your choosing (not originally intended); and most of what you say is, well, adequate – all that makes you not-troll after all. Despite the fact that you would’ve probably responded to anything anyway, I still feel that a ‘thank you’ on my side is necessary for your taking the time to respond; and I’m not being ironic here.
Now, let’s get a few things out of the way. Note that I’m neither defending nor criticizing AnandTech, I’m simply voicing an opinion just the way you are. Very important – I never said it was okay to hate you or anybody for that matter, you deduced that yourself. I simply agreed with the gist of what OwnedKThxBye said. You cannot cling to every word you read online; I don’t think anybody here truly feels hate, certainly not me. People just throw words around in the heat of the moment just the way you debate vigorously, I’m sure you understand that. The semantic field of the word ‘hate’ in 21st century contemporary English is huge, especially when used in this type of discourse.
Why would you blame me for distracting “us all” from the topic at hand when you are the King of Sidetracking? Gotta love your insights on US politics – it’s like watching one of those documentaries on History and the like. My favorite part is about “gun rights” – nice, so eloquently put. The only reason we still have the Second Amendment is because the US cannot just change the Bill of Rights, which is part of the oldest acting constitution in the world – it’s a matter of national pride. The reason it was written is a historical occurrence no longer valid. During Colonial times the settlers had to harbor British soldiers who often mistreated them, and so the settlers needed a means of protection. That is how the Second Amendment came to be. Obviously, this is no longer the case. You could argue the right to bear arms is part of Americanness, but this doesn’t change the fact that the original, intended reason for the Second Amendment is a thing of the past.
Checks and balances for the consumer computer industry – so amusing. Manufacturers, Reviewers and Consumers each checking on the others; that is such a utopian concept. You say it doesn’t work for a country’s government, so how do you expect it to work for an industry where money is king? There would always be hidden agendas, you can’t stop that.
I believe I’ve discovered a new form of entertainment, and that is reading Jian’s comments. You, sir, are crazy. I don’t mean this as an insult. Keep on fighting the good fight, I can’t wait to read more of your comments; and, please, never stop sidetracking and using internet abbreviations such as LOL.
This is partially at that Jian guy and at everyone. I understand the desire for high end GPU reviews, but using your OWN earlier posts, you stated that the majority of people game at 1080p. If that's the case, what's the point of pushing for a 7990, Titan, FCAT review when quite frankly NO ONE HAS THOSE CARDS, according to your own data and posts from the previous page?
To me it seems like you're just trolling; however, because you brought up the point of affordability, I think that's where the majority of reviews should target. YES I want to see how the 7970 and the GTX 680 perform, yes I want to see the next gen too, but I really don't think we should waste so much time on multi GPU setups that under 1% of the gaming community has.
How about more reviews on upgrade paths, Price to Performance, how to get the most performance at a reasonable price point. That's what I care to see. Any review in which the hardware being tested exceeds 2k (I mean additional hardware), to me is just boring because at the end of the day, I'm not buying two titans, or two 7990s, or even 3 7970s.
This is of course my PERSONAL opinion, but considering data backs it up, I'd like to see some more reviews cater to the average (when I say average I mean average in terms of the gamer who reads reviews and makes educated price to performance ratio choices) gamer.
This review kind of tries to do that, but in all reality we aren't gaming at 1440p, so more reviews on how to get the best performance at 1080p for a good price, while leaving us a decent upgrade path, would be nice.
Could you possibly go to some slightly older processors and GPUs? In particular the i7-990X would be a great start, and the lower and upper end of AMD's 6000 series would be nice too (it seems a LOT of people upgraded from the 5000 series to the 7000 series this year). A Witcher 2 benchmark would be nice as well, as max settings with Ubersampling turned on are extremely taxing on both CPU and GPU because of how inefficient CDProjekt's RED engine is.
62 yrs old, play ~150 hrs a month. Ready to build new PC. Know next to nothing about building new PC. Read various forums and articles and find the comment sections are great at clearing up some of what I didn't understand in the main article. That being said, this is one of the most entertaining comment sections I've read in a while, and was pretty informative. It's helped me put into perspective my hardware choices. Please let's agree to disagree, but in a respectable manner. Thank you all for your comments and responses, it's an education.
Ian - since the new 4K TVs are out, i think these types of reviews are very indicative of what we can expect once we are able to hook a PC up (using multiple outputs - such as eyefinity or nVidia surround) to a single input 4K TV. for those who don't know, the new 4K standard (3840x2160) is equivalent to eyefinity or nVidia surround at 1080p, but with 4 monitors instead of 3, and in a normal 16x9 format rather than the super wide 3 screen setups. ie: --|--|-- vs ==|== note: equivalent resolution, but not actually 4 monitors :) (quick pixel-count check below)
can't wait for THAT testing to begin. assuming an owner can turn off overscan (so you can see the taskbar at the bottom) i indeed intend to purchase one (likely, soon) and would definitely want to hook my PC to it. my GTX690 would likely be able to do OK at such a resolution, but i would eventually want to get another 690 - as soon as i could figure out how to utilize the second card with only a single HDMI input on the TV.
as far as Blu-ray content - if you wait... it will come :)
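For anyone wanting to sanity-check the equivalence described above, the pixel counts line up exactly; a trivial Python check (numbers only, nothing vendor-specific):

    # 4K UHD vs. 1080p pixel counts - the "four 1080p monitors" comparison above.
    uhd = 3840 * 2160   # 8,294,400 pixels
    fhd = 1920 * 1080   # 2,073,600 pixels
    print(uhd / fhd)    # 4.0 -> one 4K panel equals four 1080p screens' worth of pixels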
Very nice and long explanation of what most already knew: GPU bound games in single player do not stress the CPU much. However, once you go online or play a CPU bound game this information is worthless, as AMD will come crashing down at about 40% less fps and your GPU won't be the bottleneck.
These differences in video processing benchmarks amount to nothing, as the clips are too short and simple. Take a full blown Avatar or Inception movie on Blu-ray and make it smaller using Handbrake or such. Suffer through the process and you will see that the video obtained with a processor that doesn't cost $1000 is trash: it lacks audio and captions on some parts, or some frames are lost. And then you will see the light: why people are buying processors with no faults from the processing line.
K404 - Monday, May 6, 2013 - link
AWESOME. Sure, it's not an exhaustive list of CPUs, but it shows enough to point a LOT of people in the right direction. Nice one Ian!
Ortanon - Wednesday, May 8, 2013 - link
Agreed. Very nice work.
blanarahul - Wednesday, May 8, 2013 - link
Why doesn't Ivy Bridge have Quad x4 PCIe config option so that we can use Quad 7970 without using an extra PLX bridge? After all it's PCIe 3.0 so we still have 4 GB/s of bandwidth per card.
IanCutress - Wednesday, May 8, 2013 - link
Intel limitations so you buy an X79/S2011. The PLX chip is a workaround for that limitation, of course, and helps expand motherboard product lines.
xautau - Monday, May 13, 2013 - link
Hi Ian. Congratulations. Very nice work.
I could not check all 23 pages of comments, but I think there must be an update including C2Quad as it still is one of the most used configs. Q9450/9550 for instance?
IanCutress - Wednesday, May 15, 2013 - link
I have a Q9400 in right now, and I am probing around for something more like a QX9650 as well :)
Ian
Stupido - Friday, May 17, 2013 - link
Maybe I can lend you my Q9650? ;) (it has been clocked @4GHz 24/7 for a few years already)
Pjotr - Wednesday, May 15, 2013 - link
Same, Q9450 here with 8 GB RAM on Win8, would love to see it in the charts. Do I just need a new graphics card (5850 now), or a whole new computer instead?
Phynaz - Wednesday, May 8, 2013 - link
Wow, that's one large pile of work. You gotta love this stuff.
SunLord - Wednesday, May 8, 2013 - link
Wow it's been a while since I've seen an E-ATX case on anandtech pictured with an actual full size E-ATX motherboard installed in it to show what it looks like - I'm almost shocked. Would be nice if you guys could get a few motherboard makers to give you some boards in all sizes, even if they're non-functional display boards, so you can use them in case reviews to show what the case looks like with different sized boards installed.
HisDivineOrder - Thursday, May 9, 2013 - link
Shhhhhh. You're using too much of that stuff called sense. It might spread and suddenly everyone would want case reviews that reflect what anyone who'd install a motherboard in 99% of the cases they review instead of miniITX for every review. I mean, there's no way that putting a miniITX or microATX into every case review isn't going to impact the actual case being reviewed, is there?
IS THERE?
crimson117 - Thursday, May 9, 2013 - link
Agreed - why review a full tower with a micro atx? Who builds like that?
Gigaplex - Sunday, May 12, 2013 - link
I do. Is that a problem?
Blibbax - Wednesday, May 8, 2013 - link
If you're recommending the A8-5600K, but for people with a discrete GPU, aren't you really recommending the FX-4*** series?
Ortanon - Wednesday, May 8, 2013 - link
Excellent question.
IanCutress - Wednesday, May 8, 2013 - link
Good point - apart from not testing the FX-4xxx processors (I don't have any), the FX-4xxx uses an AM3 platform - the FM2 platform is both newer and the chipsets offer more native USB 3.0 / SATA 6 Gbps as well as a UEFI BIOS from the ground up. The FX-4xxx is still a relevant choice with its L3 cache, and a couple of newer boards have been released to try and get the best from the 990FX chipset. Though out of what I have tested so far, the A8 makes the most sense if you're looking at pure 1-GPU gaming. If I get an FX-4100 in, it will be tested and conclusions adjusted if it performs similarly - there's no point suggesting a CPU I haven't tested and can't back it up with data.
Ian
Blibbax - Wednesday, May 8, 2013 - link
I entirely agree that it'd be wrong to make conclusions without data. However, I feel like the APU recommendation ought to go with some sort of "however..." caveat.
I look forward to your FX4 and FX6 results, however. I was initially not at all sold on these chips, but now that the prices have come down and the FX6 is often priced against Intel's i3, they are much more compelling.
DeathReborn - Wednesday, May 8, 2013 - link
There's always the Athlon II X4 750K BE which still uses FM2 but lacks the IGP.
Cow86 - Wednesday, May 8, 2013 - link
If you're going to compare it against Trinity APU's, then wouldn't it be fairer to get an FX-4300, based on the same piledriver core? See if that L3 cache makes enough of a difference? More up to date as well...
kmmatney - Wednesday, May 8, 2013 - link
If you live near a Microcenter, you can get an FX-4130 (3.8 GHz) and motherboard for $99. That leaves quite a bit of room to get a better GPU, and probably a better overall gaming experience for a given amount of money. I upgraded from an X4-955 to a 3570K about 6 months ago, and have to admit that I barely notice the performance increase in games, and would have probably been better off spending the $250 on a better video card. I do like the extra speed while using Handbrake, though, and my son likes my old X4-955 that was a big upgrade from his previous setup.
TrackSmart - Wednesday, May 8, 2013 - link
I sympathize. I have similar hardware (Phenom II X4 processor) and I've been looking for a good reason to upgrade, but can't really find one. Regardless, those crazy motherboard + processor deals at Microcenter sure are tempting!
frozen ox - Thursday, May 9, 2013 - link
This. I have an overclocked AMD Phenom X4 830 with an overclocked Asus 6850 and not much $$...dang it, honestly i'll just get in trouble with my wife for spending $$ to upgrade a PC that in her eyes works perfectly fine. I can probably get away with the GPU, as I can swap that out much quicker without her noticing.
Spunjji - Wednesday, May 8, 2013 - link
Actually, you're looking at an Athlon X4 740 / 750k. That's the top end of the Trinity line-up with the GPU disabled and an accompanying price cut, but with the same cache structure and motherboard chipsets as the Trinity systems tested here.
HisDivineOrder - Thursday, May 9, 2013 - link
Seems like it's up to the sale, but I'd be more tempted by the FX 6350 over the FX 4350 given the pricing on Newegg.
More cores is more better, especially if you're making the sacrifice to use the 990FX chipset without PCIe 3.0 (and the FM1/2 chipsets also lack this anyway).
That said, I'd probably wait for a good sale on the FX 8350 and just go with that if I were considering AMD at all.
I wouldn't (and didn't), mostly because I'm one of those quirky desktop users who wants to use as little power and produce as little heat as possible to reduce fan noise, yet is still after speeeeed. When I was looking (last year), AMD didn't really offer me much in the way of CPUs or GPUs.
I live in hope that AMD will pop out something Volcanic or Steamroll the competition, but sense seems to suggest they won't.
SirZ - Wednesday, May 8, 2013 - link
Celeron 300A
LOL
mwildtech - Wednesday, May 8, 2013 - link
WTF, you should dismiss this comment; all of his are shit.
Kabij2289 - Wednesday, May 8, 2013 - link
Great review once again Ian :)
But I noticed a typo on Metro 2033 4x 7970 "16x/18x/8x/8x" :)
IanCutress - Wednesday, May 8, 2013 - link
Thanks :)
kbnj123 - Wednesday, May 8, 2013 - link
On your CPU chart you have the Intel i7 3960X and 3930k listed as Ivy Bridge architecture. These should be Sandy Bridge-E if I'm not mistaken.
IanCutress - Wednesday, May 8, 2013 - link
Thanks :) Copy/paste error :facepalm:
Dribble - Wednesday, May 8, 2013 - link
Mmm, not done by a true gamer as it doesn't address a number of things:
1) Not everyone wants to run the game at max settings getting 30fps. Many want 60, or in my case 120fps as that's what my monitor can do. To do this we turn down graphics a bit, but this makes us much more likely to be cpu bound. Remember generally you can turn down the graphics settings to ease strain on the gpu for higher fps, but cpu settings are much more fixed - you can't lower the resolution or turn off AA to fix cpu bottlenecks!
2) Min fps is key, not average fps. This I learned years ago playing ut2004. That game might return 60fps most of the time while admiring the scenery, but when you were in the middle of an intense fight with multiple players fps could halve or even quarter. It's obviously in the middle of a firefight that you most need the high fps to win.
3) There's a huge difference between single player games and online. Basically most single player games also run on consoles so they run like a dream on most PC cpu's as even the slower ones are more powerful. However go onto a 64 player server (which a console can't do) and watch the fps tank - suddenly the cpu is being worked much harder. BF3, UT engined games all do this when you get on a large server.
Hence your conclusions are wrong imo. You want an o/c intel quad core - i5 750 o/c to about 4ghz+ or better really. Why that - because basically it's still not far off as fast as you'll get - the latest intel cpu's still have 4 cores, ipc isn't much better and they only clock a little higher than that.
maximumGPU - Wednesday, May 8, 2013 - link
i'm pretty sure there's a sizeable jump moving from an i5 750 to 3570K, in both ipc and potential for overclock.
Dribble - Wednesday, May 8, 2013 - link
I suppose it depends on what you define "sizable" as? Perhaps an i5-2500K would be better, but even with an i5 750 @ 4GHz vs an i5-3570K @ 4.5GHz we aren't talking huge increases in cpu power - 25-30% maybe (hyperthreading aside, which generally isn't much help in games).
IanCutress - Wednesday, May 8, 2013 - link
I very much played a lot of clan-based BF2/BF2142 for a long while. 'True Gamer' is often a misnomer anyway, perpetuated by those who want to categorize others or want to announce their own true nature.
1) The push will always be towards the highest settings at which you can hit that 60-120 FPS ideal. If some of the games we see today can't hit 60 on a single GPU at 1440p, at 4K it's all going to tank. Many games tested in this review hit 60+ above two GPUs which was the point of this article to begin with.
2) Min FPS falls under the issue of statistical reporting. If you run a game benchmark (Dirt3) and in one scene of genuine gameplay there is a 6-car pileup, it would show the min FPS of that one scene. So if that happened on an FX-8350 and min-FPS was down to 20 FPS, while others that didn't have this scene were around 90 FPS for their minimum, how is that easily reported and conveyed in a reasonable way to the public? A certain amount of acknowledgement is made on the fact that we're taking overall average numbers, and that users would apply brain matter with regard to an 'average minimum'.
3) This is a bit obvious, but try doing 1400 tests on 64 player servers and keeping any level of consistency. If this is your usage scenario, then you'll know what concessions you will have to make.
An i5-750 using an older chipset also misses out on some of the newer features - native SATA 6Gbps, for example, for an awesome RAID-0 setup. This could be the limiting factor in your gaming PC. We will be testing that generation for the next update of this testing :)
As written in the review, the numbers we have taken are but a small subset of everything that is possible, and we can only draw conclusions from the numbers we have taken. There are other numbers available online which may be more relevant to you, but these are the ones under our test-bed situations. Your setup is different from someone elses, which is a different usage scenario from others - testing them all would require a few years in Narnia. But suggestions are more than welcome!
Ian
darckhart - Wednesday, May 8, 2013 - link
I agree with Dribble's post above, but your reply was also well thought out and written, just like your article. Keep up the good work. Thanks!
Dribble - Wednesday, May 8, 2013 - link
I suppose "true gamer" does sound a bit elitist, by that I really meant someone who plays not benchmarks. I agree it's hard to test min fps in 64 player BF3 matches, but that's the sort of moment when your choice of cpu matters, not in for example in a canned off-line BF3 benchmark. As you are advising on cpu buying choices for gaming it is pretty important.My personal experience is the offline canned benchmarks giving average fps say you require a cpu a lot less powerful then you really do when you take your fancy new rig online in the latest super popular multi player game. Particularly as in that game you pretty quickly start playing to win and are willing to sacrifice some fancy settings to get the fps up so you don't loose again as you try to hit that annoying fast moving 15 year old while your fps is tanking :)
Therefore while it's fine to advise those people who only want to play offline console ports using benchmarking as you did, it just doesn't work for the rest of us.
JarredWalton - Wednesday, May 8, 2013 - link
It sounds more than a bit elitist: it is elitist. For every gamer that spends 10-20 hours of time each week in multiplayer gaming (MMORPG, or whatever FPS you want to name, or World of Tanks, etc.), there are likely at least ten times as many gamers that generally stick to single player games. What's more, that sort of definition of "true gamer" may as well just say "high school or early 20s with little life outside of the digital realm." Yes, that's a relatively big demographic, but there are many 20, 30, 40, and even 50-somethings that still play a fair amount of games, but never bother with the multiplayer stuff. In fact, I'd say that of the 30+ year old people I know well, less than 1% would meet your "true gamer" requirement, while 5% would still be "gamers".
Says the 39-year-old fuddy duddy.
Spunjji - Wednesday, May 8, 2013 - link
The purpose of this article is to give a scientific basis for comparison within the boundaries of realistic testing deadlines. I would be interested to see you produce something as statistically rigorous based on performance numbers taken from online gaming. If you managed to do it before said numbers became irrelevant due to changes to the game code I would be utterly flabbergasted.
Dribble - Thursday, May 9, 2013 - link
No, the purpose of this article is to recommend cpu's for gaming.
frozen ox - Thursday, May 9, 2013 - link
There is no way to recreate or capture all the variables/scenarios to repeatedly benchmark a firefight in BF3 across multiple systems. The results from this hardware review are relevant, because they are easily repeatable by others and provide a fair baseline to compare systems. The point of this study is not what CPU do I need to play BF3 or Crysis at max settings; it's how much bandwidth bottleneck is going on with a single GPU setup? What happens in reality with multi-GPU setups? How well does the new AMD architecture (because "true gamers" want to save $$ to buy games) compare to Intel?
What you have to do, as a "true gamer" and someone who has enough wits about them, is extrapolate the results to your scenario, because everyone's will be different. And honestly, anyone who plays FPS... the "true gamers", will know what you pointed out. It's insanely obvious even the first time you play a demanding FPS MMOG like BF3.
I however, play single player 99% of the time. Only online FPS I'll play now is CS.
Pheesh - Wednesday, May 8, 2013 - link
"2) Min FPS falls under the issue of statistical reporting. If you run a game benchmark (Dirt3) and in one scene of genuine gameplay there is a 6-car pileup, it would show the min FPS of that one scene. So if that happened on an FX-8350 and min-FPS was down to 20 FPS when others didn't have this scene were around 90 FPS for minimum, how is that easily reported and conveyed in a reasonable way to the public? A certain amount of acknowledgement is made on the fact that we're taking overall average numbers, and that users would apply brain matter with regard to an 'average minimum'."The point of a benchmark is to provide a consistent test that can be replicated exactly on multiple systems. If you're not able to do that then you aren't really benchmarking anything. That's why 99% of games are not tested in multiplayer but rather single player in experiences they can strictly control. (i.e. with test demos). If for some reason the game engine is just that unpredictable even in a strictly controlled test situation you could do multiple trials to take a minimum average.
Minimum FPS is an extremely necessary test and it's easily possible to do. Other sites include it with all of their gaming benchmarks.
Spunjji - Wednesday, May 8, 2013 - link
That doesn't necessarily mean that the numbers they give you are worth a damn...
beginner99 - Thursday, May 9, 2013 - link
"Minimum FPS is an extremely necessary test and its easily possible to do. Other sites include it with all of their gaming benchmarks."Or you could do 5 runs, discard the worst and best and average the rest (min, max average FPS).
http://en.wikipedia.org/wiki/Truncated_mean
But yeah statistics is extremely complex and error prone. I once read that a large amount of statistics in scientific publications have errors to a certain degree (but not necessarily making the results and conclusions completely wrong!!!)
Or if you actually know such a "special scene" can happen, discard all tests where it happened.
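For what it's worth, the truncated mean linked above takes only a few lines to implement; a minimal Python sketch with made-up FPS numbers (the 20 FPS run standing in for the hypothetical six-car-pileup scene discussed earlier):

    # Average five benchmark runs after discarding the best and worst run,
    # so a single freak scene doesn't dominate the reported minimum FPS.
    def truncated_mean(runs, trim=1):
        ordered = sorted(runs)
        kept = ordered[trim:len(ordered) - trim]
        return sum(kept) / len(kept)

    min_fps_runs = [20.0, 88.0, 90.0, 91.0, 93.0]  # hypothetical per-run minimum FPS
    print(round(truncated_mean(min_fps_runs), 1))  # 89.7 - the outlier run is ignored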
beginner99 - Thursday, May 9, 2013 - link
The main issue here is actually available time or the amount of work. Averages over 3 runs aren't really that great. If you could run everything 100 times such "special scenes" would be irrelevant.
mapesdhs - Monday, May 20, 2013 - link
Ian,
P55 boards can offer very good RAID0 performance with SSDs, or more importantly RAID1 or RAID10 (I hope those with RAID0 have some kind of sensible backup strategy). See my results: http://www.sgidepot.co.uk/misc/ssd_tests.txt

One will obviously get more out of newer SSDs using native SATA3 mbds for the sequential tests, but newer tech won't help 4K numbers that much. In reality few would notice the difference between each type of setup. This is especially true given how many later mbds use the really awful Marvell controllers for most of the SATA3 ports (such a shame only a couple are normally controlled by the Intel or other chipset); performance would be better with an older Intel SATA2. I expect many just use the non-Marvell ports if they can.

What matters is to have an SSD setup of some kind in the 1st place. My P55 system (875K) boots very quickly with a Vertex3, gives a higher 3DMark13 physics score than a 3570K, and GPU performance with 2x 560 Ti is better than a stock 680. It's really the previous gen of hw which can present more serious bottlenecks (S775, AM2, DDR2, etc.), but even then results can often be surprisingly decent, eg. oc'd Ph2 965, etc.

Also, RAID0 with SSDs often negates the potential of small I/O performance. Depending on the game/task, this means SSD RAID0 might at times be slower than a single good SSD.

Dribble is right in that respect; improvements are often not as significant as people think or expect (I've read sooo many posts from those who have been disappointed with their upgrades), though it does vary by game, settings, etc. Games which impose a heavier CPU loading (physics, multiplayer, AI etc.) might see more useful speedups from a better CPU, but not always. There are so many factors involved, it can become complicated very quickly.
Ian.
Felix_Ram - Sunday, May 26, 2013 - link
Your 120 hz screen has a frame latency of about 8 ms. Meaning it effectively can't show you more than 60 new fps. Anything above that it shows you the same pixel twice. So basically, you are watching reruns, and anyone who states that he can tell a difference between 60 fps and +60fps is basically kidding himself.
http://forums.anandtech.com/showthread.php?t=23049...
http://forums.steamgames.com/forums/showthread.php...
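The figure being corrected in the follow-up below is just the reciprocal of the refresh rate; a quick Python check of the intervals involved:

    # Interval between refreshes for common panel refresh rates.
    for hz in (60, 120, 144):
        print(f"{hz} Hz -> {1000 / hz:.2f} ms per refresh")
    # prints: 60 Hz -> 16.67 ms, 120 Hz -> 8.33 ms, 144 Hz -> 6.94 ms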
Felix_Ram - Sunday, May 26, 2013 - link
Can't edit. A screen latency of about 16 ms*
tehh4ck3r - Wednesday, May 8, 2013 - link
You should test a Phenom II X4-965 and an i5-3570K.
B-Unit1701 - Wednesday, May 8, 2013 - link
And throw in a 45nm Core2, preferably over 3.0GHz
boulard83 - Wednesday, May 8, 2013 - link
Really great review and testing. As for the CPUs to add to the list, you could add some very cheap solutions like the G1610 and G2020 to see how these $40-60 chips perform against all the other chips, or simply compare to an older E6700 like the one in the test. Other than that, you could also add a 3820 to the testing simply to lower the cost of the X79 setup, making it a little more mainstream vs a $600 3930K.
IanCutress - Wednesday, May 8, 2013 - link
I have those three CPUs in the 'ones I want for the next update'. I'm of course going to try and get them :)
Ian
boulard83 - Thursday, May 9, 2013 - link
Thanks for the answer Ian! :)
whyso - Wednesday, May 8, 2013 - link
A8 for single gpu gaming with a 7970? Really? Just because your limited run of 4 games did not show anything wrong with the a8 does not mean that the a8 is going to perform properly with other games. Play Hitman with it and a 7970 or multiplayer BF3, then see if you are still going to recommend the a8.
HalloweenJack - Wednesday, May 8, 2013 - link
anand have run BF3 with it and it's perfectly fine.
whyso - Wednesday, May 8, 2013 - link
Multiplayer?
Spunjji - Wednesday, May 8, 2013 - link
What's with all the people in here who don't understand statistics?! You can't do scientifically rigorous multiplayer testing and produce useful results. The time required alone to test... the mind boggles.
airmantharp - Wednesday, May 8, 2013 - link
Keep in mind that the article's title doesn't start with 'Statistical Analysis of...', but rather 'Choosing a...'.
That's important. While you can't 'properly' benchmark multiplayer games, you can make reasonable inferences and use them to support your conclusions. The reality being exposed here is that Ian's benchmarks are really only useful for choosing a CPU for single-player games, not that there's a damn thing wrong with that.
However, it's not unreasonable for people to point out that the gaming situations requiring real CPU power to maintain playability are not covered in a 'Choosing a Gaming CPU' article.
felang - Wednesday, May 8, 2013 - link
+ 1,000,000
frozen ox - Thursday, May 9, 2013 - link
In a multiplayer situation, you'll likely get similar ratios of performance, just lower average FPS. It's pretty easy to assume an X2 or i3 or other dual core is not going to hold up well, as these results support. But how in the hell are you supposed to have a baseline to compare systems in a multiplayer scenario? Do you have any idea what a cluster fuck that would be, even to compare just one game across only two systems, let alone as many as this review has?
This review helps CPU buyers because they can look at these results, and multi-GPU setups, and see where the bottleneck will occur first. That doesn't mean there won't be more bottlenecks, but at least you can see which part of your system you should upgrade first.
felang - Wednesday, May 8, 2013 - link
Agree 100%. 4 games is not enough to reach a conclusion (dated ones at that). An A8 definitely is not going to cut it in more demanding games.
HalloweenJack - Wednesday, May 8, 2013 - link
nice article - would like to have seen an AMD AM2 setup for comparison though. Sadly I don't like the obvious intel slant - with comments like `noticeable gap` between intel and amd cpu`s, yet it's under 1 fps! I challenge you to actually see a 1fps difference without a meter...
IanCutress - Wednesday, May 8, 2013 - link
I didn't say gap with the small 1 FPS differences, I said split. Whenever I said gap, there is a sizeable difference ~10%. For the small FPS difference in Dirt 3 + one GTX 580, I said "Similar to the one 7970 setup, using one GTX 580 has a split between AMD and Intel that is quite noticeable. Despite the split, all the CPUs perform within 1.3 FPS, meaning no big difference.". Please don't misinterpret my results when I cater for your issue word for word. If you have an issue with a *specific* analysis, please let me know.
HalloweenJack - Wednesday, May 8, 2013 - link
again I disagree - you use the words, chosen carefully - the implication is obvious. `gap` and `split` imply a considerable distance between the 2, when in reality there is none. at least anandtech has finally started using real world resolutions and not the pointless 800x600. poor choice of ambiguous words in writing.
HalloweenJack - Wednesday, May 8, 2013 - link
I must ask though - why does civ and total war do poorly on AMD? and will you be adding an AM2 rig - say a 9850?
JarredWalton - Wednesday, May 8, 2013 - link
I disagree; "gap" and "split", particularly taken in context, are very clear in the text. What's more, for someone that appears to be worried about a single word choice, you're at the same time ignoring most of the other words.Gap: A break or hole in an object or between two objects.
Split: A tear, crack, or fissure in something, esp. down the middle or along the grain.
There's a split between AMD and Intel, but in many cases not a gap.
ThomasS31 - Wednesday, May 8, 2013 - link
Yes. A Core2Quad would be nice to see.
Also some midrange video cards, like HD7870 and GTX660/Ti.
ThomasS31 - Wednesday, May 8, 2013 - link
My point is that if you are on a budget, but have a C2Quad system... you may not need a new CPU for a new mid-range videocard.
Though I admit these are very close to A8-A10 performance, so if that is enough, a C2Q as well might be good.
BTW a very good article... are you planning on doing the same for GPUs? :)
IanCutress - Wednesday, May 8, 2013 - link
I have got a Q9400 coming in from a family member for the next update to this review :) Putting more cards in the review might multiply it out too much time wise :/ If there are more requests to try more mid-range cards, I might move to that and retest everything, if I can get the cards in. The 7970s/580s were the only ones I really have to hand to test multi-GPU.
Ian
beepboy - Wednesday, May 8, 2013 - link
You're right about the cards, a waste of time - unless it's more budget oriented.
Pjotr - Wednesday, May 15, 2013 - link
Core2Quad, like my Q9450-ish - I'd only like to know whether a modern 660 or similar would be hampered too much by that CPU. Not very interested in multi-card configs. Great review you did, but I only looked at the single-card table. I think most people try to balance the single CPU vs GPU upgrade cycles.
aburhinox - Wednesday, May 8, 2013 - link
This is a great article to compare cpus across multiple gpus. I'd also be curious to see how different GPUs scale. I'd like to see if a single $400 card is better than 2 $200 cards. I'm going to say that given the choice between one or two $400 cards, two is better than one. Going to the extreme would get you to ask, if you want to go crazy, if four $100 cards is better than 1 $400 card. That would probably be going too far since you have to end up with expensive motherboards to support four gpus. But I think that would make a useful article about gpus.
dwatterworth - Wednesday, May 8, 2013 - link
Thank you for putting the DP Xeon platform in. I imagine it is a niche market but a platform parallel to that in an older generation would be a huge help. I have an aging LGA 771 Asus Z7S-WS board with (2) e5472 procs with (1) 7950 w/boost. The system was built for 3D rendering and architectural work and as 2 systems are not affordable, this became my gaming machine as well. Other than putting my own benchmarks up against what I can find here or other sites it is very hard for me to decide when and to what to upgrade. I greatly appreciate the Xeon inclusion on this as there are some (few?) who fall into the work + play on a single machine scenario.
xinthius - Wednesday, May 8, 2013 - link
You, my friend, just made me laugh well and truly, thank you.
"his intelligence will diminish gradually." Speaking from experience I see?
kyuu - Wednesday, May 8, 2013 - link
Maybe this is a dumb question, but shouldn't the Core Parking updates also be beneficial to the APUs? They're still using the same module architecture as the FX chips.
IanCutress - Wednesday, May 8, 2013 - link
That's planned for the next update, after Haswell launch. At the time I completely forgot and went on to the next platform. Need to pull out the FM2 test bed, install an OS and retest them - another day of testing at least (!). But it's on the 'to do' list.
Ian
kyuu - Wednesday, May 8, 2013 - link
Cool, thanks for the reply Ian.
antonyt - Wednesday, May 8, 2013 - link
This analysis is great! And extremely useful for anyone contemplating a gaming build in the near future (as I am). I look forward to seeing your updates and more articles like this.
Btw, minor typo ("future") at the very end--"but we hope on expanding this in the fuiture."
IanCutress - Wednesday, May 8, 2013 - link
Fixed the typo, cheers :)
felang - Wednesday, May 8, 2013 - link
Only if you plan to play single player games only
xinthius - Wednesday, May 8, 2013 - link
You agree with the fact your intelligence has diminished? Okay.
I would LOVE to see you design a microprocessor as complex as one of AMD's. Their processors actually perform admirably in highly threaded workloads, while their current architecture is weak in the IPC department. Their CPUs are by no means weak and should still be recommended in some circumstances, such as their APU range. Please try and post intelligently, I know it's hard for you.
jhoff80 - Wednesday, May 8, 2013 - link
I know they're more difficult to get a hold of, but I'd be curious how some of the lower power stuff, like the i7-3770T or the i5-3570T, would do. Even an i5-3550S would be pretty interesting, I think.
I mean, I know there's a lot of gamers that just want as powerful (or conversely, as cheap) a CPU as possible, but it would be interesting to see if Intel's more 'efficient' (for lack of a better word) chips do nearly as well.
TheInternal - Wednesday, May 8, 2013 - link
I would be curious to see how "low-power" parts do as well, though that would be a secondary desire behind seeing these tests done on multiple monitor configurations.
The0ne - Wednesday, May 8, 2013 - link
Odd. The i5-3570K is a very popular CPU and it doesn't get attention or recommendation? Does that mean that previous tests by numerous websites indicating and directing thousands of consumers to build with this CPU somehow became irrelevant? I could have sworn that the rule of thumb was you go with an i5-3570K instead of an i7 if you're not into heavy audio/video work, but yet here it doesn't appear to be the case. Very interesting.
IanCutress - Wednesday, May 8, 2013 - link
I didn't have one to hand and couldn't get one in. We don't work in an office at AT, we're spread across the world. The nearest I had to it was the 2500K, which is an IPC decrease. i5-3570K (and the Haswell equivalent) should be in the next update :)
Ian
mapesdhs - Monday, May 20, 2013 - link
It's an IPC decrease, but it oc's far better than the 3570K due to the cap material issue; end result is the 2500K will be faster overall. I still think the 2500K is a better buy, assuming one can get them. Unless of course one is willing to replace the cap material with something better, then the 3570K will be an easy winner.
Ian.
sherlockwing - Wednesday, May 8, 2013 - link
Could you investigate more into how AMD failed in Civilization V? Could it be that RTS games are harder to multithread-optimize, thus favoring Intel CPUs?
frozen ox - Thursday, May 9, 2013 - link
Civ V is more dependent on the CPU than the GPU, and in this case that's where AMD's shortcomings in single-threaded performance show. It will be very interesting to see what happens in these scenarios when AMD starts releasing HSA capable APUs. When coupled with a discrete GPU, will they be able to manage both the integrated and discrete components to an advantage in games like Civ 5 and other CPU demanding strategy games?
xinthius - Wednesday, May 8, 2013 - link
I'm alright, thanks. You will find that I stated admirably. Compare the price difference between each SKU.
kyuu - Wednesday, May 8, 2013 - link
(No) Thanks for your input, but that's not what I was asking.
beepboy - Wednesday, May 8, 2013 - link
Great job Ian! I'm really interested to see 680s in the picture please!
creed3020 - Thursday, May 9, 2013 - link
Yeah I find it strange to go with GCN on the AMD side but then use Fermi on the NVIDIA side. Kepler would have been a better match.
tedders - Wednesday, May 8, 2013 - link
I would also like to see an AMD Phenom II X3 720BE. That processor was very popular back in the day but also has pretty good OC capability. I am getting ready to build a new machine and I'd like to see how my current setup would compare to newer Piledriver and Haswell chips. Great review BTW!
aznchum - Wednesday, May 8, 2013 - link
A reference Geforce GTX 580 should have 1.5 GB of RAM, not 2GB. Minor typo :P.
kyuu - Wednesday, May 8, 2013 - link
Seriously Anand, people like this are why we need an ignore/block function for the comment threads.
calyth - Wednesday, May 8, 2013 - link
Got a link for the core parking updates?
Kogies - Thursday, May 9, 2013 - link
Try these, I think they will be the ones!
http://support.microsoft.com/kb/2646060
http://support.microsoft.com/kb/2645594
Stuka87 - Wednesday, May 8, 2013 - link
Uhm, did you not just read this article? Unless you are running multi-GPUs, AMD's CPUs are fine. With the exception of Civ5, which is CPU bound. But outside of that one case, saving $150 by buying an AMD makes a lot of sense. Especially if it allows you to put that money into a better GPU.
silverblue - Wednesday, May 8, 2013 - link
It's sans2212. He popped up again the other day on Toms Hardware. Ignore it.
Xistence - Wednesday, May 22, 2013 - link
You are wrong sir, there are WAY more cpu bound games than you think; almost any MMO will fall into this category, Skyrim is another one, and most games that only use one or two cores - sadly this is a lot of games. Trust me I love AMD and have used them for years, but after upgrading everything I was still getting poor performance in most of the games I play. I broke down and bought a 2600k after a lot of research and wow was it an improvement over my 1100t (6 core amd).
It's sites like this and slanted tests like these that kept me with AMD for years; glad I finally figured it out, and I still hold out hope AMD will improve their IPC along with future games using more cores properly.
rpsgc - Wednesday, May 8, 2013 - link
Can you please ban this 'Intellover' troll?
JarredWalton - Wednesday, May 8, 2013 - link
I'm getting close to doing so, as his "contributions" are completely useless. Vote here for banning or not -- I'm inclined to just leave it be for now, but if he continues to post prolifically with nothing meaningful, I'll take action.
Egg - Wednesday, May 8, 2013 - link
Please, yes, ban
silverblue - Thursday, May 9, 2013 - link
Seconded... it gets tiresome after a while.
Donniesito - Thursday, May 9, 2013 - link
Please ban
jjmcubed - Thursday, May 9, 2013 - link
ban please
jjmcubed - Thursday, May 9, 2013 - link
I did not mean Donniestio... me smart good.
iamezza - Thursday, May 9, 2013 - link
Please ban him. There really is no reason to leave trolls like this on the forum, they contribute nothing and constantly derail meaningful discussion.
I can't think of a single reason to not ban him (and other) trolls.
creed3020 - Thursday, May 9, 2013 - link
Please ban, I come down here to read constructive comments not useless dribble.
R3MF - Thursday, May 9, 2013 - link
ban.
smuff3758 - Thursday, May 9, 2013 - link
Ban this loser. He simply can not leave his fan boy status alone long enough to evaluate an outstanding SCIENTIFIC analysis. I love both Intel and AMD CPUs. Both have their places; it just depends on what your objectives are.
colonelclaw - Thursday, May 9, 2013 - link
Jarred, I would fully agree with banning any person who continually makes no contribution to the discussion. These comment sections often supply me with useful information, and can be read as a continuation of the article itself. Having to hunt for the valuable opinions amongst piles of cretins and idiots makes me want to go elsewhere.
extide - Thursday, May 9, 2013 - link
Please ban him, and I consider myself a pretty solidly Intel guy, but you have to be realistic. Sheesh!
Blibbax - Thursday, May 9, 2013 - link
Ban him and anyone else remotely similar.
duploxxx - Friday, May 10, 2013 - link
if all would start voting to ban this person there would be a huge amount of thread replies :)
Jon Tseng - Wednesday, May 8, 2013 - link
Hmmm. So bottom line is my 2007-vintage QX6850 is perfectly good at 1080p so long as I get a decent GPU.
Bizarro state of affairs when a 6 year old CPU is perfectly happy running cutting edge games. Not sure if I should blame the rise of the GPU or the PS3/XBox360 for holding back gaming engines for so long!
TheInternal - Wednesday, May 8, 2013 - link
In games that are CPU limited (like Skyrim or Arkham Asylum), no. I continue to get the impression from both personal experience and articles/reviews like this that once you have "enough" CPU power, the biggest limiting factor is the GPU. "Enough" often seems to be a dual core operating at 3.0GHz, but newer titles and CPU bound titles continue to raise the bar.
Azusis - Wednesday, May 8, 2013 - link
Agreed. Especially in multiplayer situations. Try running PlanetSide 2 or Natural Selection 2 with a core2quad like I do. It isn't pretty. But just about any other singleplayer game... sure, no problem.
TheInternal - Wednesday, May 8, 2013 - link
So... these were all tested on a single monitor? Though the article has lots of interesting information, I'd argue that doing these tests on a three monitor 1440p setup would show much more useful information that consumers looking at these setups would be able to apply to their purchasing decisions. It's great to see more reviews on different CPU + multiple GPU configurations, as well as the limitations of such settings, but by limiting such tests to an increasingly unlikely usage scenario of a single monitor, the data becomes somewhat esoteric.
Kristian Vättö - Wednesday, May 8, 2013 - link
Did you mean three 1080p monitors (i.e. 5760x1080) by any chance? 7680x1440 is a very, very rare setup, especially for a gamer. For work purposes (e.g. graphics designer, video editor etc) it can be justified as the extra screen estate can increase productivity, but I've never seen a gamer with such a setup (heck, the monitors alone will cost you close to $2000!). I'm not saying there aren't any but it's an extreme minority and I'm not sure if it's worth it to spend hours, even days, testing something that's completely irrelevant to most of our readers.
Furthermore, while I agree that 5760x1080 tests would be useful, keep in mind that Ian already spent months doing this article. The testing time would pretty much double if you added a second monitor configuration as you'd have to run all tests on both configs. Maybe this is something Ian can add later? There is always the trouble of timing, as if you start including every possible thing, your article will be waaay outdated when it's ready to be published.
TheInternal - Thursday, May 9, 2013 - link
I didn't mean three 1080p monitors, which does seem to be the "common" three monitor configuration I've seen most gamers going for (since it's cheap to do with 24" panels being under $200 a pop). My 27" S-IPS 2560x1440 panel cost about $300, so I'm not sure where you're getting the $2000 figure from... and if you spend $1500-$2000 on the graphics subsystem, why wouldn't you be spending at least half as much on the monitors?
Most modern high-end graphics cards should be able to easily handle three 1080p monitors in a three card config... possibly a two card config... a round up like this would be much more useful to consumers if it did include such information... as well as show just how well the different CPU and GPU combos worked with multiple monitors.
iamezza - Thursday, May 9, 2013 - link
I have 3 x 1080p and a 7970, on modern games it isn't possible to get 60fps without turning settings way down. Really need 2 x 7970 to maintain 60+ fps
TheInternal - Saturday, May 11, 2013 - link
I'm guessing it does 30+ FPS comfortably though?
Arnulf - Wednesday, May 8, 2013 - link
Are you retarded or just an imbecile?
marc1000 - Wednesday, May 8, 2013 - link
good work Ian!
that's a LOT of data, but the best part is the explanation of WHY. hope it makes matters clear.
side note: it was nice to see the link to www.adrenaline.com.br! those guys are insane indeed! =D
Doomtomb - Wednesday, May 8, 2013 - link
I have an i7-875K. I would like you to include an i7 from the Westmere/Nehalem generation. Thanks!
mapesdhs - Monday, May 20, 2013 - link
I'm doing lots of tests that should help in your case. If you want me to test anything specific,
feel free to PM. I have the same 875K, but also 2500K, 2700K, 3930K, Ph2 965, QX9650
and many others.
Ian.
Pheesh - Wednesday, May 8, 2013 - link
I'm really surprised that minimum FPS wasn't also tested. Testing just for average FPS is not that informative about the actual experience you will have. Given the choice between two CPUs, I'd take one averaging 70 fps but with a minimum of 50 over one that averages 80 fps but has a minimum of 30.
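As a quick illustration of why averages hide stutter, here is a minimal sketch (the frame times are made up, not from the article's data) that reports average FPS alongside a 99th-percentile frame time, i.e. the "1% low":

```cpp
// Sketch: average FPS vs. "1% low" from a list of per-frame times in ms.
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    // Hypothetical capture: mostly ~16.7 ms frames with occasional 33 ms spikes.
    std::vector<double> frameMs = {16.7, 16.8, 33.4, 16.6, 16.7, 33.5, 16.9, 16.7};
    double sum = 0;
    for (double t : frameMs) sum += t;
    double avgFps = 1000.0 * frameMs.size() / sum;      // average FPS
    std::sort(frameMs.begin(), frameMs.end());          // ascending frame times
    double p99 = frameMs[frameMs.size() * 99 / 100];    // 99th-percentile frame time
    printf("avg %.1f fps, 1%% low %.1f fps\n", avgFps, 1000.0 / p99);
    return 0;
}
```

Two runs can share the same average while one has a far worse 1% low, which is exactly the difference the commenter is describing.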
mip1983 - Wednesday, May 8, 2013 - link
Perhaps some games are more CPU limited; I'm thinking of MMOs like PlanetSide 2 where there are a lot of players at once. Not sure how you'd benchmark that game though.
bebimbap - Wednesday, May 8, 2013 - link
Ian, I know you are a BL2 fan. The game is written on an old UT engine I'm told, so its performance scaling isn't the same as some of these other titles. The method of testing you used was similar to how I buy my own equipment and recommend to others.
With my same 3770K clocked at stock 3.9GHz I can only get about 57fps with my GTX 670. When it is OC'd to 4.7GHz that same scene becomes GPU limited at 127fps on my 144Hz LCD. I'm glad you posted this. When people ask for my advice on what hardware to buy, I always tell them that they should aim for a resolution first, 1080p for example, then decide what game they want to play and at what performance presets, say mid settings at 120Hz, then buy a GPU/CPU combo that complements those settings. If your budget allows, then up the hardware a tier or two. Too many times do I see people just buy a top-tier GPU and wonder why their fps is lower than expected. My way your expectations are met, then if budget allows, are exceeded. I hope you start a trend with this report, so that others can go this route when performing upgrades.
Michaelangel007 - Wednesday, May 8, 2013 - link
The article is a good start! Pity it didn't include the Tomb Raider benchmark that anyone can run, nor a discussion of the badly implemented Windows timer frequency that Lucas Hale documented with his "TimerResolution" program. HyperMatrix found lowering the default timer resolution from 10ms down to 1ms allowed for "Crysis 3 - 30% Framerate and Performance" gains.
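For reference, the knob such tools turn boils down to a single documented Win32 call; a minimal sketch (not code from TimerResolution itself) looks like this:

```cpp
// Sketch: request a 1 ms Windows timer resolution instead of the ~10-15 ms
// default -- the same setting tools like TimerResolution expose. Link winmm.lib.
#include <windows.h>
#pragma comment(lib, "winmm.lib")

int main() {
    timeBeginPeriod(1);   // ask for 1 ms global timer granularity
    // ... run the game or workload here; Sleep() and timed waits now wake
    // on ~1 ms boundaries instead of ~10-15 ms ones ...
    Sleep(5000);
    timeEndPeriod(1);     // must be paired with the matching timeBeginPeriod
    return 0;
}
```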
Patrese - Wednesday, May 8, 2013 - link
Awesome article, thanks! Is it possible to include some sort of gaming physics testing? Now that PhysX is beginning to catch some momentum, it'd be great to see if an 8-core (4-module) AMD processor handles physics stuff better than a comparable 4-core Intel one, and at what point a dedicated physics card starts to make sense, if at all.
It'd also be nice if a “mainstream gaming” article could be made too. Benchmarks at 1080p with cards like the 660 Ti and 7850, for instance. No need for 3-way SLI/CF on those, so you'll not need as much time in Narnia. :)
araczynski - Wednesday, May 8, 2013 - link
Interesting read, although I find it too focused to be of much general use (or useful future reference). I'd like to have seen, for example, how an E8500 holds up (too big of a gap between the E6500 and i5-2500), as well as at least ONE game I would even bother playing (Skyrim/Witcher/etc.), and of course, like you mentioned, a slightly bigger sampling of graphics cards (I think you mentioned that).
Anywho, I realize this wasn't meant to be anything exhaustive (I do appreciate having the CPU/GPU benches available here as a good reference though), and I do like the detail/explanation length you went into.
So thanks :)
xinthius - Wednesday, May 8, 2013 - link
But AMD offers good price-to-performance at lower tiers; they should be recommended.
yougotkicked - Wednesday, May 8, 2013 - link
Regarding your comments on the role of artificial intelligence in game performance/programming: I've just finished a course in AI, and while implementations may vary quite a bit from game to game, many AI programs can be reduced to highly parallel brute-force computation, simply evaluating the resulting states of many potential decisions for a numerical representation of their desirability, then selecting the best option from the set of evaluated actions. Obviously this is something that will vary greatly from game to game, but in games with many independent AI-managed elements, I would expect a certain amount of the processing to be offloaded to the GPU.
Other than that I agree with you on the demands of AI in games; my professor (who specializes in game AI and has experience in the industry) said that the AI is usually given about 10% of the CPU time in a game, so it's rarely a limiting factor.
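To sketch that brute-force pattern concretely (a toy example; GameState and scoreState are invented names, not from any real engine or the article):

```cpp
// Sketch: brute-force evaluation of candidate AI actions on the GPU.
// Each thread scores one resulting game state; the host picks the best.
#include <cstdio>
#include <cuda_runtime.h>

struct GameState { float myHealth, enemyHealth, distToEnemy; };

__device__ float scoreState(const GameState& s) {   // "desirability" heuristic
    return s.myHealth - s.enemyHealth - 0.1f * s.distToEnemy;
}

__global__ void evaluateStates(const GameState* states, float* scores, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) scores[i] = scoreState(states[i]);
}

int main() {
    const int n = 1024;                              // candidate actions
    GameState* dStates; float* dScores;
    cudaMalloc(&dStates, n * sizeof(GameState));
    cudaMalloc(&dScores, n * sizeof(float));
    // ... CPU fills dStates with the outcome of each potential decision ...
    evaluateStates<<<(n + 255) / 256, 256>>>(dStates, dScores, n);
    float scores[n];
    cudaMemcpy(scores, dScores, n * sizeof(float), cudaMemcpyDeviceToHost);
    int best = 0;
    for (int i = 1; i < n; ++i)
        if (scores[i] > scores[best]) best = i;      // argmax = chosen action
    printf("best action index: %d\n", best);
    cudaFree(dStates); cudaFree(dScores);
    return 0;
}
```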
I'm still working through the whole article (really enjoying it so far) so I'm sure I'll have many more comments/questions later.
IanCutress - Wednesday, May 8, 2013 - link
Based on previous CUDA experience, CUDA doesn't like a lot of IF statements in its routines. So if you're offloading different AI parts onto the GPU, unless all the elements are being put through the same set of if commands (and states), it won't work too well, with some warps taking a lot longer than others if there is heavy branch divergence. It's a task suited to MIMD environments, like a CPU. Then again, it really depends on the game. Clever AI is already here, because we confine it to a self-created system. One could argue that the bots in CounterStrike are not particularly smart, but the system can put their accuracy up to 100% to make it harder. It's a lot of give and take, perhaps. It is times like these I wish I did CompSci rather than Chemistry :) I need to take one of those MIT online AI courses. You know, in between testing!
Ian
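A toy kernel illustrating the divergence effect described here (purely hypothetical agent types and costs, invented for illustration):

```cpp
// Sketch: branchy per-agent AI on the GPU. Threads in a 32-wide warp that
// take different branches execute serially, so one slow path stalls the warp.
#include <cuda_runtime.h>

__global__ void updateAgents(const int* agentType, float* work, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    if (agentType[i] == 0) {
        work[i] += 1.0f;                         // cheap path
    } else {
        for (int k = 0; k < 1000; ++k)           // expensive path
            work[i] = work[i] * 0.999f + 0.001f;
        // If even one thread in a warp lands here, the whole warp pays
        // roughly the expensive path's cost -- "some warps taking a lot
        // longer than others".
    }
}

int main() {
    const int n = 1024;
    int* dType; float* dWork;
    cudaMalloc(&dType, n * sizeof(int));
    cudaMalloc(&dWork, n * sizeof(float));
    cudaMemset(dType, 0, n * sizeof(int));       // all-cheap here; mix types to see the stall
    updateAgents<<<(n + 255) / 256, 256>>>(dType, dWork, n);
    cudaDeviceSynchronize();
    cudaFree(dType); cudaFree(dWork);
    return 0;
}
```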
yougotkicked - Wednesday, May 8, 2013 - link
I suppose conditionals would make offloading some AI components to the GPU impractical, but there still remains a subset of AI computations which seem very GPU-friendly to me. State evaluation functions seem like a prime example: the CPU would be responsible for deciding which options to evaluate, building an array of states to be evaluated by the GPU. These situations probably don't come up very often in FPSs, but in something like Civilization I can see it being quite common.
I've actually got to head over to that class now; I'll ask the professor if he knows of any AIs using GPU computing in modern games.
airmantharp - Wednesday, May 8, 2013 - link
Like Ian said, GPUs aren't good 'branch' processors, but I do see where you're coming from. Things like real physics, audio environment maps, and pre-render lighting maps could be fed to AI routines running on the CPU. This would allow for a much greater 'simulation awareness' for AI actions.
yougotkicked - Wednesday, May 8, 2013 - link
I spoke with my professor and he said that, as far as he knows, many people have discussed the prospect of using GPUs for AI, but nobody has actually done so yet. He's going to ask some friends of his at some major game studios to see if they are working on it.
He did agree with me that there are some aspects that could be computed on a GPU, but a lot of the existing AI methods are inherently sequential, so offloading them to the GPU will require new algorithms in many cases.
TheQweaker - Thursday, May 9, 2013 - link
You may wish to check nVidia's GTC conference web site, where you can find some GPU AI research. Also, nVidia published various PDF slides on GPU path planning.
If you look deeper into some specific AI domains such as, say, AI planning (first used in F.E.A.R. in 2005, lately used in Killzone 3 and Transformers: Fall of Cybertron) you can find papers investigating the use of GPUs.
One of the bottom lines of current GPU AI research is that GPUs crunch large amounts of data very fast, so currently there is not much hope in using the many GPU threads on the tiny amounts of data involved in state-space search.
Hoping this helps.
-- The Qweaker.
yougotkicked - Thursday, May 9, 2013 - link
Thanks for pointing me towards those papers, they look pretty interesting and I've been looking for a topic to write my final paper on ;)
TheQweaker - Friday, May 10, 2013 - link
Just in case, here is a pointer to the nVidia GPU AI path finding page in the developer zone: https://developer.nvidia.com/gpu-ai-path-finding
And here is the title of a 2011 GPU AI Planning paper (research; not yet in a game): "Exploiting the Computational Power of the Graphics Card: Optimal State Space Planning on the GPU". You should be able to find the PDF on the web.
My 2 cents is that it's a good topic for a final paper.
-- The Qweaker.
yougotkicked - Friday, May 10, 2013 - link
Thanks again, I think I will be doing GPU AI as my final paper; I'll probably try to implement the A* family as massively parallel, or maybe a local beam search using hundreds of hill-climbing threads.
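A sketch of what a climber-per-thread kernel might look like (the 1-D objective f() is a stand-in for a real evaluation function; curand is CUDA's device-side RNG library):

```cpp
// Sketch: hundreds of independent hill-climbers, one per GPU thread.
// Each restarts from its own random seed and keeps only improving moves.
#include <cuda_runtime.h>
#include <curand_kernel.h>

__device__ float f(float x) { return -(x - 3.0f) * (x - 3.0f); }  // maximize

__global__ void hillClimb(float* best, int steps, unsigned long long seed) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    curandState rng;
    curand_init(seed, i, 0, &rng);               // per-thread RNG stream
    float x = curand_uniform(&rng) * 10.0f;      // random restart in (0, 10]
    for (int s = 0; s < steps; ++s) {
        float cand = x + (curand_uniform(&rng) - 0.5f);  // small random step
        if (f(cand) > f(x)) x = cand;            // greedy: keep improvements
    }
    best[i] = x;  // host (or a reduction kernel) then picks the global best
}

int main() {
    const int n = 256;                           // number of climbers
    float* dBest;
    cudaMalloc(&dBest, n * sizeof(float));
    hillClimb<<<1, n>>>(dBest, 1000, 42ULL);
    cudaDeviceSynchronize();
    cudaFree(dBest);
    return 0;
}
```

Because each climber is fully independent, this is the embarrassingly parallel end of local search; the harder part, as noted above, is the inherently sequential frontier bookkeeping in A*.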
TheQweaker - Saturday, May 11, 2013 - link
Nice project. 2 more cents.
Keep it simple is the best advice. It's better to have a running algorithm than none, even if it's slow.
Also, ask your advisor whether he'd want you to compare with a CPU implementation of yours, in order to evaluate the pros and cons between your sequential implementation and your // implementation. I did NOT write "evaluate gains from seq to //", as GPU programming is currently not fully understood, probably not even by nVidia engineers.
Finally, here is a book title: "CUDA Programming: A Developer's Guide to Parallel Computing with GPUs". But there are many others these days.
OK. That w
TheQweaker - Saturday, May 11, 2013 - link
as my last post.
-- The Qweaker.
(sorry for the cut, I wrongly clicked on submit)
yougotkicked - Monday, May 13, 2013 - link
Thanks a lot for all your input. I intend to evaluate not only the advantages of GPU computing but its weak points as well, so I'll be sure to demonstrate the differences between a sequential algorithm, a parallel CPU algorithm, and a massively parallel GPU algorithm.
Azusis - Wednesday, May 8, 2013 - link
Could you test the Q6600 and i7-920 in your next roundup? I have many PC gaming friends, and we all seem to have a Q6600, i7-920, or 2500K in our rigs. Thanks! Great job on the article.
IanCutress - Wednesday, May 8, 2013 - link
I have a Q9400 coming in soon from family - getting one of the Nehalem/Westmere range is definitely on my to-do list for the next update :)
sonofgodfrey - Thursday, May 9, 2013 - link
I too have a Q6600, but it would be interesting to see the high-end (non-Extreme Edition) Core 2s as well: E8600 & Q9650. Just for yucks, perhaps a socket 775 Pentium 4 could also make an appearance? :)
gonks - Wednesday, May 8, 2013 - link
I knew it some time ago, but this proves once again that it's time to upgrade my good old C2D (Conroe) E6600 @ 3.2GHz.
Quizzical - Wednesday, May 8, 2013 - link
You've got a lot of data there. And it's good data if your main purpose is to compare a Radeon HD 7970 to a GeForce GTX 580. Unfortunately, most of it is worthless if you're trying to isolate CPU performance, which is the ostensible purpose of the article. You've gone far out of your way to try to make games GPU-limited so that you wouldn't be able to tell what the various CPUs can do when they're the main limiting factors.
Loosely, the CPU has to do any work to run a game that isn't done by the GPU. The contents of this can vary wildly from game to game. Unless you're using DirectX 11 multithreaded rendering, only one thread can communicate with the video card at a time. But that one rendering thread mostly consists of passing data to the video card, so you don't do much in the way of real computations there. You do sort some things so that you don't have to switch programs, textures, and so forth more often than necessary, though you can have a separate sorting thread if you're (probably unreasonably) worried that this is going to mean too much work for the rendering thread.
Actually determining what data needs to be passed to the video card can comprise the bulk of the CPU work that a game needs to do. But this portion is mostly trivial to scale to as many threads as you care to--at least within reason. It's a completely straightforward producer-consumer queue with however many "producer" threads you want and the rendering thread as the single "consumer" thread that takes the data set up by other threads and passes it along to the video card.
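A minimal host-side sketch of that producer-consumer shape (C++11 threads; DrawBatch and submitToGPU are hypothetical names, not from any real engine):

```cpp
// Sketch: N producer threads build draw data; one render thread consumes it
// and is the only thread that talks to the video card.
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

struct DrawBatch { /* sorted draw calls, texture/shader ids, ... */ };

std::queue<DrawBatch> batches;
std::mutex m;
std::condition_variable cv;
bool done = false;

void producer() {                        // one of many worker threads
    for (int i = 0; i < 100; ++i) {
        DrawBatch b{};                   // ... cull, sort, fill the batch ...
        { std::lock_guard<std::mutex> lk(m); batches.push(b); }
        cv.notify_one();
    }
}

void renderThread() {                    // the single consumer
    for (;;) {
        std::unique_lock<std::mutex> lk(m);
        cv.wait(lk, [] { return !batches.empty() || done; });
        if (batches.empty() && done) break;
        DrawBatch b = batches.front(); batches.pop();
        lk.unlock();
        // submitToGPU(b);               // hypothetical: issue the draw calls
    }
}

int main() {
    std::vector<std::thread> workers;
    for (int i = 0; i < 4; ++i) workers.emplace_back(producer);
    std::thread render(renderThread);
    for (auto& w : workers) w.join();
    { std::lock_guard<std::mutex> lk(m); done = true; }
    cv.notify_one();
    render.join();
    return 0;
}
```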
Not quite all of the work of setting up data for the GPU is trivial to break into as many threads as necessary, though. At the start of a new frame, you have to figure out exactly where the camera is going to go in that frame. This is likely going to be very fast (e.g., tens or hundreds of microseconds), but it does need to be done before you go compute where everything else is relative to the camera.
While I haven't programmed AI, I'd expect that you could likewise break it up into as many threads as you cared to, as you could "save" the state of the game at some instant in time and have separate threads compute what each AI has to do based on the state of the game at that moment, without needing to know anything about what other game characters were choosing at the same time. Some games are heavy on AI computations, while online games may do essentially no AI computations client-side, so this varies wildly from game to game.
A game engine may do a lot of other things besides these, such as processing inputs, loading data off of the hard drive, sending data over the Internet, or whatever. Some such things can't be readily scaled to many CPU cores, but if you count by CPU work necessary, few games will have all that much stuff to do other than setting up data for the GPU and computing AI.
But most of the work that a CPU has to do doesn't care what graphical settings you're using. Anything that isn't part of the graphics engine certainly doesn't care. The only parts of the CPU side of a game engine that care what monitor resolution you're using are likely to be a handful of lines to set the resolution when you change it, and a few lines to check whether an object is off camera and therefore doesn't need to be processed in that particular frame--and culling such objects is likely done mostly to save on the GPU load. Any settings that can be adjusted in video drivers (e.g., anti-aliasing or anisotropic filtering) are done almost entirely on the video card and carry a negligible CPU load.
Thus, if you're trying to isolate CPU performance, you turn down or off settings that don't affect the CPU load. In particular, you want a very low monitor resolution, no anti-aliasing, no anisotropic filtering, and no post-processing effects of any sort. Otherwise, you're just trying to make the game mostly GPU bound, and end up with data that looks like most of what you've collected.
Furthermore, even if you do the measurements properly, there's also the question of whether the games you've chosen are representative of what most people will play. If you grab the games that you usually benchmark for video card reviews, then you're going out of your way to pick games that are unrepresentative. Tech sites like this that review hardware tend to gravitate toward badly-coded games that aren't representative of most of the games that people will play. If this video card gets 200 frames per second at max settings in one game and that video card gets 300, what's the difference in real-world game experience? If you want to differentiate between different video cards, you need games that are more demanding, and simply being really inefficient is one way to do that.
Of course, if you were trying to see how different CPUs affect performance in a mostly GPU-limited game, that can be interesting in an esoteric sense. It would probably tend to favor high single-threaded performance, because the only differences you'd be able to pick out are due to things that happen between frames, which is the time when the video card is most likely to be forced to wait on the CPU briefly.
But if you were trying to do that, why not just use a Radeon HD 5450? The question answers itself.
If you would like to get some data that will be more representative of how games handle CPUs, then you'll need to do some things very differently. For starters, use just a single powerful GPU, to avoid any CrossFire or SLI weirdness. A GeForce GTX Titan is ideal, but a Radeon HD 7970 or GeForce GTX 680 would be fine. For that matter, if you're not stupid about picking graphical settings, something weaker like a Radeon HD 7870 or GeForce GTX 660 would probably work just fine. But you need to choose the graphical settings intelligently, by turning down or off any graphical settings that don't affect CPU load. In particular, anti-aliasing, anisotropic filtering, and all post-processing effects should be completely off. Use a fairly low monitor resolution; certainly no higher than 1920x1080, and you could make a good case for 1366x768.
And then don't pick your usual set of games that you use to do video card reviews. You chose those games precisely because they're outliers that won't give a good gauge of CPU performance, so they'll sabotage your measurements if you're trying to isolate CPU performance. Rather, pick games that you rejected from video card reviews because they were unable to distinguish between video cards very well. If the results are that in a typical game, this processor can deliver 200 frames per second and that one can do 300, then so be it. If a Core i5-3570K and an FX-6300 can deliver hundreds of frames per second in most games (as is likely if the game runs well on, say, a 2 GHz Core 2 Duo), then you shouldn't shy away from that conclusion.
JarredWalton - Wednesday, May 8, 2013 - link
"While I haven't programmed AI..." Doesn't that make most of your other assumptions and guesses related to this area invalid?As for the rest, the point of the article isn't to compare HD 7970 with GTX 580, or to look at pure CPU scaling; rather, it's to look at CPU and GPU scaling in games at settings people are likely to use with a variety of CPUs, which necessitates using multiple motherboards. Given that in general people aren't going to buy two or three GPUs to run at lower resolutions and detail settings, the choice to run 1440p makes perfect sense: it's not so far out of reach that people don't use it, and it will allow the dual, triple, and quad GPU setups room to stretch (when they can).
The first section shows a CPU performance comparison, just as background to the gaming comparisons. We can see how huge the gap is in CPU performance between a variety of processors, but how does that translate to gaming, and in particular, how does it translate to gaming with higher performance GPUs? People don't buy a Radeon HD 5450 for serious gaming, and those who do likely aren't playing demanding games.
For the rest: there is no subset of games that properly encompasses "what people actually play". But if we're looking at what people play, it's going to include a lot of Flash games and Facebook games that work fine on Intel HD 4000. I guess we should just stop there? In other words, we know the limitations of the testing, and there will always be limitations. We can list many more flaws or questions that you haven't, but if you're interested in playing games on a modern PC, and you want to know a good choice for your CPU and GPU(s), the article provides a good set of data to help you determine whether you might want to upgrade or not. If you're happy playing at 1366x768 and Medium detail, no, this won't help much. If you want minimum detail and maximum frame rate at 1080p, it's also generally useless. I'd argue however that the people looking for either of those are far fewer in number, or at least if they do exist they're not looking to research gaming performance until it affects them.
wcg66 - Wednesday, May 8, 2013 - link
Ian, thanks for this. I'd really like to see how these tests change at even higher resolutions - 3-monitor setups of 5760x1080, for example. There are folks claiming that the additional PCIe lanes in the i7 E-series make for significantly better performance. Your results don't bear this out. If anything the 3930K is behind or sometimes barely ahead (if you consider error margins, arguably it's on par with the regular i7 chips). I own an i7 2700K and a 3930K.
Moon Patrol - Wednesday, May 8, 2013 - link
Awesome review! Very impressed with the effort and time put into this! Thanks a lot!
It'd be cool if you could maybe fit an i7 860 in there somewhere. Socket 1156 is feeling left out :P I have an i7 860...
Quizzical - Wednesday, May 8, 2013 - link
Great data for people who want to overload their video card and figure out which CPU will help them do it. But it's basically worthless for gamers who want to make games run smoothly and look nice and want to know what CPU will help them do it.
Would you do video card benchmarks by running undemanding games at minimum settings and using an old single-core Celeron processor? That's basically the video card equivalent of treating this as a CPU benchmark. The article goes far out of its way to make things GPU-bound so that you can't see differences between CPUs, both by the games chosen and the settings within those games.
But hey, if you want to compare a Radeon HD 7970 to a GeForce GTX 580, this is the definitive article for it and there will never be a better data set for that.
JarredWalton - Wednesday, May 8, 2013 - link
Troll much? The article clearly didn't go too far out of the way to make things GPU bound, as evidenced by the fact that two of the games aren't GPU bound even with a single 7970. How many people out there buy a 7970 to play at anything less than 1080p -- or even at 1080p? I'd guess most 7970 owners are running at least 1440p or multi-monitor... or perhaps just doing Bitcoin, but that's not really part of the discussion here, unless the discussion is GPU hashing prowess.
Quizzical - Wednesday, May 8, 2013 - link
If they're not GPU bound with a single 7970, then why does adding a second 7970 (or a second GTX 580) greatly increase performance in all four games? That can't happen if you're looking mostly at a CPU bottleneck, as it means that the CPU is doing a lot more work than before in order to deliver those extra frames. Indeed, sometimes it wouldn't happen even if you were purely GPU bound, as CrossFire and SLI don't always work properly.
If you're trying to compare various options for a given component, you try to do tests where the different benchmark results will mostly reflect differences in the particular component that you're trying to test. If you're trying to compare video cards, you want differences in scores to mostly reflect video card performance rather than being bottlenecked by something else. If you're trying to compare solid state drives, you want differences in scores to mostly reflect differences in solid state drive performance rather than being bottlenecked by something else. And if you're trying to compare processors, you want differences in scores to mostly reflect differences in CPU performance, not to get results that mostly say, hey, we managed to make everything mostly limited by the GPU.
When you're trying to do benchmarks to compare video cards, you (or whoever does video card reviews on this site) understand this principle perfectly well. A while back, there was a review on this site in which the author (who might be you; I don't care to look it up) specifically said that he wanted to use Skyrim, but it was clearly CPU-bound for a bunch of video cards, so it wasn't included in the review.
If you're not trying to make the games largely GPU bound, then why do you go to max settings? Why don't you turn off the settings that you know put a huge load on the GPU and don't meaningfully affect the CPU load? If you're doing benchmarking, the only reason to turn on settings that you know put a huge load on the GPU and no meaningful load on anything else is precisely that you want to be GPU bound. That makes sense for a video card review. Not so much if you're trying to compare processors.
JarredWalton - Wednesday, May 8, 2013 - link
You go to max settings because that's what most people with a 7970 (or two or three or four) are going to use. This isn't a purely CPU benchmark article, and it's not a purely GPU benchmark article; it's both, and hence, the benchmarks and settings are going to have to compromise somewhat.Ian could do a suite of testing at 640x480 (or maybe just 1366x768) in order to move the bottleneck more to the CPU, but no one in their right mind plays at that resolution with a high-end GPU. On a laptop, sure, but on a desktop with an HD 7970 or a GTX 580? Not a chance! And when you drop settings down to minimum (or even medium), it does change the CPU dynamic a lot -- less textures, less geometry, less everything. I've encountered games where even when I'm clearly CPU limited, Ultra quality is half the performance of Medium quality.
IndianaKrom - Friday, May 10, 2013 - link
Basically, for the most part the single-GPU game tests tell us absolutely nothing about the CPU, because save for a couple of especially old or low-end CPUs, none of them even come close to hindering the already completely saturated GPU. The 2-4 GPU configurations are much more interesting because they show actual differences between different CPU and motherboard configurations. I do think it would be interesting to also show a low-resolution test, which would help reveal the impact of CrossFire/SLI overhead versus a single more powerful GPU and could more directly expose the CPU limit.
Zink - Wednesday, May 8, 2013 - link
You should use a DSLR and edit the pictures better. The cover image is noisy and lacks contrast.
makerofthegames - Wednesday, May 8, 2013 - link
Very interesting article. And a lot of unwarranted criticism in the comments.
I'm kind of disappointed that the dual Xeons failed so many benchmarks. I was looking to see how I should upgrade my venerable 2x5150 machine - whether to go with fast dual-cores, or with similar-speed quad-cores. But all the benchmarks for the Xeons were either "the same as every other CPU" or "no results".
Oh well, I have more important things to upgrade on it anyways. And I realize that "people using Xeon 5150s for gaming" is a segment about as big as "Atom gamers".
Spunjji - Wednesday, May 8, 2013 - link
Crap troll is crap.
lorribot - Wednesday, May 8, 2013 - link
Would love to see something like an E3-1230 tested; it is around the same price as an i5-3570K but has no graphics, a bigger cache, and Hyper-Threading, though no overclocking and a 100MHz lower clock. Should be similar to an i7-3770 for around 60% of the price.
Spunjji - Wednesday, May 8, 2013 - link
So let me get this straight. The engineers are idiots, yet you want them to go and work for other companies, the best candidate there being Intel. Also, thanks to this apparent mental handicap, they use Intel processors... oh, I get it, you're pro AMD!
bikerbass77 - Wednesday, May 8, 2013 - link
Just a note, as I keep seeing posts going on about PlanetSide 2 being CPU limited. It is not. I am saying this from experience, having played it just fine on a Core2Duo-based system. The reason you are most likely having problems will either be your ping or your GPU. I was running with a GTX 460 1GB card and it was fine for that particular game. I have just upgraded to a new CPU because the main game I play (MechWarrior Online) is far more CPU bound, being based on CryEngine 3.
cusideabelincoln - Wednesday, May 8, 2013 - link
I think it should be noted that online multiplayer games are a different beast. Multiplayer is typically more CPU intensive, and if you're looking to maintain completely smooth gameplay without any dips in framerate or stuttering, then the CPU becomes more important than it is for single-player gaming.
Also, would you consider benchmarking live online streaming of games? It would be great to see how much of a benefit the 3930K would have over other chips, and whether Piledriver can pull ahead of the i5s definitively.
Markus_Antonius - Wednesday, May 8, 2013 - link
Your sample size is statistically beyond irrelevant, which prevents the scientist in me from drawing any conclusions from it. In addition, claiming any sort of causal relationship between results is outright scientifically wrong even if the sample size were statistically relevant. From an engineering standpoint, the X79 systems, with ample headroom in every relevant department, would be the best choice to avoid any possible bottlenecks/contention issues in the largest possible number of different workloads.
Any recent system with a recent CPU and recent midrange graphics card can play a game, and can often play it well. Advising a Core i7 3770K based on a statistically irrelevant benchmark while disregarding systems architecture is something that neither the scientist, the software engineer, nor the hobbyist in me can get behind in any way.
JarredWalton - Wednesday, May 8, 2013 - link
Hyperbole, you have a new friend: meet Markus! "Beyond irrelevant", "any conclusions", "outright scientifically wrong", "ample headroom", "every relevant department", "best choice"...
Let me guess: you have an X79 system, and it works great for you, and thus anyone even suggesting that it might not be the best thing since sliced bread is something you can't even think about in any way. This article is focused on gaming, and if you want to do things besides gaming, yes, you will need to consider other facets of the system build. At the same time, if all you're looking for is a good gaming setup, perhaps with two or three GPUs, I have trouble imagining anyone recommending something other than the i7-3770K right now (unless the recommendation is to "wait for Haswell").
Let me give you a few reasons why, even if the scientist may not necessarily agree, the software engineer and hobbyist would definitely avoid SNB-E and workstations: 1) Overall power requirements (they still matter). 2) Quick Sync (may not be perfect quality, but dang it's fast). 3) Better performance in many games with two GPUs, no matter what paper specs and system architectures might say.
smuff3758 - Thursday, May 9, 2013 - link
And that, Jarred, is how to shut down this arrogant, condescending, self-titled expert/scientist. I guess he must think the rest of us are bozos who come here for the comic relief?
Markus_Antonius - Sunday, May 12, 2013 - link
My comment was about the testing method not being scientifically sound, even though the author makes it a point to refer to the "well-adjusted scientist" in himself. There's a huge number of games out there as well as a lot of different mid-range to high-end video cards. Recommending an i7 3770K on the basis of one resolution tested and only 4 games is something that you absolutely cannot call science.
I am, among other responsibilities, a software engineer, and I don't actively avoid Sandy Bridge E and workstations.
My criticism of the methods used and the conclusions drawn is valid criticism, especially in the face of the article being given the appearance of being science.
If you're going to do recommendations based on statistics and for whatever reason decide to disregard engineering and the science behind systems design you're going to need a far larger sample size than what was used here.
You can deflect this all you want by quoting power usage and Quick Sync, but while power usage should be a factor, this test was not about Quick Sync. If it had been, they would not have tested X79 systems at all ;-)
From both work and hobby I know a lot of power users and gaming remains one of the most demanding uses and one of the *most prevalent* demanding uses of a modern PC. Throwing a more powerful system aside and disregarding engineering needs to be done with a lot more care and thoroughness, all of which is missing here.
Answering valid criticism with scorn and aggression is also very telling. Perhaps you're more insecure than you thought you were?
Badelhas - Wednesday, May 8, 2013 - link
Great review, congrats!
This comes at a perfect time for me: I just ordered a Qnix QX2710 1440p 27-inch monitor from eBay and a couple of 670s to work along with my 2500K OCed to 4.5GHz. It seems I will be amazed with that upgrade, let's see!
Cheers
TheInternal - Thursday, May 9, 2013 - link
You honestly would be fine running most games at that rez with a single 670. After a driver install, I forgot to turn SLI back on for like a week and didn't really notice much of a difference in most of my games. Two of most high-end cards, from what I continue to hear on forums, can easily power three 1080p monitors at high settings as well. I've not been able to find much information for two- and three-card setups powering three 1440p or higher-res monitors though.
nvalhalla - Wednesday, May 8, 2013 - link
Request: I've seen a few sites do these and they always use a large GPU. I understand removing the GPU from the equation to focus on the CPUs, but I would like to see these tests done with a 7770. If I'm dropping $1000+ on GPUs, I'm not thinking about buying an $80 CPU. A great question I haven't seen answered is how much CPU do I need for a normal mid-range card? If I am looking to get a mid-range GPU, do I need an i5 3570K? Would an X2 555 provide the same gaming performance? At what point does spending more on the CPU stop providing an improvement in performance?
khanov - Wednesday, May 8, 2013 - link
Thanks for the article, interesting read.
I'd like to suggest putting the i7-3820 into the next article. The 3960X and 3930K are both 6-core CPUs, making platform comparison with all the other quad cores in the article more difficult.
Certainly you should retain the 6-core CPUs so we can see their potential, but adding the 3820 would allow for direct comparison of the X79 platform vs other platforms when all are running a quad-core CPU.
steve_rogers42 - Thursday, May 9, 2013 - link
Hi Ian, fantastic article; it has led me to rethink a lot of things, especially the scaling at work due to PCIe 3.0 and price/performance for low-end systems. Seems that a low-mid CPU and a decent GPU is still the way to roll for future builds.
Don't know if it would be possible, but it would be interesting to see the difference between an SR-2 and an SR-X, especially considering PCIe 3.0 and the move to a newer CPU architecture.
It'd also be nice to see a 980X or 990X on X58, or a Q6600, to see the benefits of moving up from C2Q/C2D to Core i... But you probably don't have time :)
Again, great article; it has made me rethink my original thoughts on AMD's 8350 and the caliber of Anandtech's comments...
Cheers from Australia.
pandemonium - Thursday, May 9, 2013 - link
I haven't even read the entire article yet, but I can tell it's going to be awesome due to the outstanding thoroughness on the first page.
Thank you so very much for going well beyond what other reviewers do by just reporting a single run for each setup without delving deeper into the "why". Truly noble; and I would say you can honestly call yourself a scientist. :)
Sabresiberian - Thursday, May 9, 2013 - link
I think anyone reading this article thoroughly will come away with a better sense of how multidimensional the questions about which mainboard, CPU, and GPU to buy are. It isn't just a matter of looking at a few 2D graphs and picking the top solution (though that might serve to get you in the ballpark of what you want).
Once again, I come away better educated and with more of a sense of what is going on with hardware combinations. Well done, and thanks! I'm looking forward to more articles of this type on these subjects.
smuff3758 - Thursday, May 9, 2013 - link
Well said. Finally, a simple thank you.
LaxerFL - Thursday, May 9, 2013 - link
I'd like to see an i7-3820 in action, since I have one in this rig right now! I ran the PoVRay benchmark to see where it placed and scored 1626.25. I've OCed my CPU to 4.6GHz, but I'd still like to see where a stock 3820 places in these benchmarks. I'm also interested to know if quad-channel memory makes any difference...
Great article! Keep it up, I look forward to seeing more results.
SuperVeloce - Thursday, May 9, 2013 - link
Why is there no E6700 on some of the graphs? The E6400 is biblically underpowered and under-cached; we already knew that. A quad would be a better comparison anyway, as many still have them for games and productivity. And another thing... is PCIe 1.1 playing any role in those bad Core 2 Duo fps numbers? Why not use a P45 or something with PCIe 2.0?
guskline - Thursday, May 9, 2013 - link
Why didn't you use the GTX 680? Comparing a Radeon 7970 to a GTX 580? Please!
Pjotr - Wednesday, May 29, 2013 - link
He wasn't comparing the graphics cards. He was showing how much the CPU matters on an older (580) and newer (7970) card.
SetiroN - Thursday, May 9, 2013 - link
Is MCT supposed to refer to SMT/Hyper-Threading?
I believe you're the only person in the world using that acronym. Where did you pull it from?
IanCutress - Wednesday, May 15, 2013 - link
MultiCore Turbo, as per my article on the subject: http://www.anandtech.com/show/6214/multicore-enhan...
Sometimes called MultiCore Enhancement or MultiCore Acceleration. MCT is a compromise name.
DoTT - Thursday, May 9, 2013 - link
Ian, when I saw this article I thought I was hallucinating. It is exactly what I've been scrubbing the web looking for. I recently purchased a Korean 1440p IPS display (very happy with it) and it is giving my graphics card (4850X2) a workout for the first time. I'm running an older Phenom II platform and have been trying to determine the advantages of just buying a new card versus upgrading the whole platform. Most reviews looking at bottlenecking are quite old and are at ridiculously low resolutions. CPU benchmarks by themselves don't tell me what I wanted to know. After all, I don't push the system much except when gaming. This article hit the spot. I wanted to know how a 7950 or 7970 paired with my system would fare. An additional follow-up would be to see if any slightly lower-end cards (7870, 650 Ti, etc.) show any CPU-related effects. I have been debating the merits of going with a pair of these cards vs a single 7970.
Thanks for the great review.
phrank1963 - Thursday, May 9, 2013 - link
I use a G850 and a 7700 and would like to see lower-end testing. I chose the G850 after looking a long time at benchmarks across the web.
GamerGirl - Thursday, May 9, 2013 - link
Very nice and helpful article, thx!
smuff3758 - Thursday, May 9, 2013 - link
Thought we were banning you!
spigzone - Thursday, May 9, 2013 - link
Also of note: Digital Foundry polled several developers on whether they would recommend Intel or AMD as the better choice to future-proof one's computer for next-gen games. 100% recommended AMD over Intel.
Obviously there is a lot going on behind the scenes here, with AMD leveraging their console wins and deep involvement at every level of next-gen game development. With that in mind, one might expect Kaveri to be an absolute gaming beast.
http://www.eurogamer.net/articles/digitalfoundry-f...
evonitzer - Thursday, May 9, 2013 - link
Great read. I read through the article and the comments yesterday, but had an extra thought today. Sorry if it has been mentioned by somebody else, but there seems to be a lot of discussion about MMORPGs and their CPU demands. Perhaps you could just do a scaled-down test with 3-4 CPUs to see how they handle an online scenario. It won't be perfectly repeatable, but it could give some general advice about what is being stressed. I would assume it is the CPU that is getting hit hard, but perhaps it is simply rendering that causes the FPS to decrease.
Other than that, I would like to see my CPU tested, of course :) (Athlon II X4 630) But I think I can infer where my hardware puts me. Off the bottom of the charts!
Treckin - Thursday, May 9, 2013 - link
Great work, however I would certainly question 2 things:
Why did you use GTX 580s? Anyone springing for 4 GPUs isn't going to go for two-plus-year-old ones (a point you made yourself in the methodology section explaining why power consumption wasn't at issue here).
Why would you test all of this hardware for single monitor resolutions? Surely you must be aware that people using these setups (unless used for extreme overclocking or something) are almost certainly gaming at 3x1080p, 3x1440p, or even 3x1600p.
Also of concern to me would be 3D performance, although that may be a bit more niche than even 4-GPU configurations.
DigitalFreak - Thursday, May 9, 2013 - link
Do people not read the article? He works with what he has on hand and what he can get access to.
tackle70 - Thursday, May 9, 2013 - link
Great article! The only gripe I would have (and yes, I know the reasoning behind it is explained) is the decision not to include Crysis 3 in the testing.
The reason I make that gripe, even though the game has no timedemo functionality and adds more work, is that it is the closest thing to a next-gen game we have right now, and it is also the *only* game I've seen that reliably eats up as much CPU power and as many cores as you give it. It would have been interesting to see it here.
SurrenderMonkey - Thursday, May 9, 2013 - link
Core i7 860 overclocked at 3.6GHz, GTX 580 SLI PCIe 2.0 x8/x8 = min 44, average 49. Final scene, destroying the Ceph Alpha. No overclock on the GPUs but plenty of headroom. Not scientific, but it would be useful to see the same scene if someone has a more up-to-date processor.
SurrenderMonkey - Thursday, May 9, 2013 - link
Res: 1920 x 1080
DigitalFreak - Thursday, May 9, 2013 - link
Suckie suckie, two dolla?
SurrenderMonkey - Thursday, May 9, 2013 - link
Great review; the CPU has definitely become less important. I used to change my CPU around every 18 months or my system would show signs of struggling. I bought my i7 860 in 2009 and it is sitting alongside two GTX 580s (SLI x8/x8). Nearly four years seems like an eternity; the GTX 580 I first got in early 2011 is the longest I have kept the same GPU. Shows you that game developers don't challenge the hardware like they used to.
SurrenderMonkey - Thursday, May 9, 2013 - link
People who make comments like this do not understand that it is about making a properly balanced system so that you get maximum bang for your bucks. This takes skill and a proper understanding of hardware capabilities and technology. On a gaming system you can trade down on a processor and buy a better GPU (or an SSD, or both). When you get it right you get more FPS for the same or less money, faster loading times, and overclocking headroom to use at a later date.
oSHINSAo - Thursday, May 9, 2013 - link
Well, I thought my 2600K was old... but looking at it, it's close to the 3rd-gen i7-3770K... will stick with it, and focus on getting a CrossFireX config...
T1K0L P0G1 - Friday, May 10, 2013 - link
EXCELLENT WORK!!!
gnasen - Friday, May 10, 2013 - link
Nice article. Still missing a few of the legends: Q9550, i7-920, i5-750.
Achaios - Friday, May 10, 2013 - link
@Ian Cutress: Hello Ian, please get hold of a quad-core Intel Core 2 CPU (Q9450, Q9550, Q9650, QX9650, QX9770) and include it in your testing. I don't know where you get that "many people are still on Core 2 Duos" - maybe you have seen some sort of market research? I still use a QX9650 for gaming (WoW and SW:TOR - MMOs) and I am very happy with its performance. It would be nice to see how the high-end Core 2 CPUs measure up against modern CPUs.
Andy_1 - Friday, May 10, 2013 - link
What a great article! As I am in the process of making my mind up about what to buy in the next two months, this answers so many of my questions. Thank you! My main unanswered question somehow seemed to get missed - or did I not read correctly?
Q: The results on the games show the 2-CPU config as scoring zero! Is that because they wouldn't run the software on the rig, or what?
ajlueke - Friday, May 10, 2013 - link
Great article. For those of us still gaming at 1080p with single-GPU setups, it is nice to know that it doesn't really matter if I spend a little less on the CPU and divert those funds toward a higher-end GPU.
Gamer4Life63 - Friday, May 10, 2013 - link
First, a well done piece with lots of great info. Now then, I would love to see this same kind of look done at 1080p with a mid-range card like a 7870 or 7850. Would also love to see some other games added to the mix, like a modern MMO or Skyrim, that are a bit harder on the CPU.
SurrenderMonkey - Friday, May 10, 2013 - link
Yes, Intel has great engineers, not to forget the business gurus. Intel did a great job on Larrabee, chose(?) not to be in the next-gen consoles, and it will be jam tomorrow in the tablet and smartphone market. In April Intel reported a downturn of 25% in profits, which it attributed to a decline in the PC market. As Anandtech has just shown, the GPU is where the money is at; the CPU is a passenger, and time-to-replace is extending. Intel makes good processors, but it is also a one-trick pony that has failed to move with the times.
MarcVenice - Friday, May 10, 2013 - link
You thought of so many things to consider, yet when you say "we know what's missing", you forgot quite a few things. I didn't read all the comments, so excuse me if someone already mentioned it. But what's missing is several games - Crysis 3, for example, or Far Cry 3.
You mentioned that 1440p is a niche (it's 2560x1440 btw; 1440p isn't technically a resolution). So why didn't you test at 1920x1080? Not only are games more prone to being CPU-limited, but games like Crysis 3 or Far Cry 3 are actually more demanding than games like DiRT 3.
Reason I mention this is that I've found there to be a rather big difference between an X4 970 and a 3960X in those games, at ultra settings in Crysis 3 and FC3, with a GTX 660, 670 and 680. I know Anandtech doesn't report minimums, but if you take the time to do as many runs as you did, you can scientifically establish that the minimum fps is also greatly affected by slower CPUs.
Reason I respond after not having done so in years is that I found your suggestion to pair a 5600K with a high-end GPU to be disputable, and that's me being mild. Especially since you more or less went and said that 'other' websites didn't do their testing properly.
OwnedKThxBye - Monday, May 13, 2013 - link
I agree 100%, MarcVenice.
The recommendation also doesn't take into consideration the upgrade path of the PC. If you were to follow this suggestion, the probability of having to do a full CPU and motherboard upgrade instead of just the GPU when you next need to upgrade is going to be significantly higher. Most people don't want to do a full system upgrade after 2-3 years because they are CPU limited on the new title they want to play. I say spend the extra $100-$150 on a better CPU and potentially make the PC last another two years.
lesherm - Friday, May 10, 2013 - link
Ian, this is real research and journalism. This kind of in-depth reporting on hardware is exactly what keeps me coming back to anandtech, year after year. Your efforts are appreciated!
TheQweaker - Saturday, May 11, 2013 - link
I agree with this.
-- The Qweaker.
Animebando - Friday, May 10, 2013 - link
I would love to see this kind of write-up cover surround/Eyefinity resolutions. I've been fairly impressed with how my 7950 handles games across three monitors, and I've been an nVidia fan for years.
CookieKrusher - Saturday, May 11, 2013 - link
Good to know that my 2500K is still taking care of business. Now if I could just upgrade this GTX 460 this year, I'll be golden. :-)
Tchamber - Saturday, May 11, 2013 - link
I wonder why games are vastly more parallel on the GPU side of things than the CPU side. If a game can utilize 2048 SPs, why doesn't adding 2 or 4 more CPU cores help much?
ShieTar - Tuesday, May 14, 2013 - link
Because all parts of the code which can be run in parallel are already running on the GPU, and the CPU is stuck with the code that needs to be serial.
OwnedKThxBye - Sunday, May 12, 2013 - link
Love this information, it was an eye opener. Great job Ian!
How to choose a gaming CPU is a question I am asked nearly on a daily basis by clients or friends in my line of work. While your concluding recommendations are spot on given the information you provided, I wouldn't often find myself giving out the same advice. The reason behind this is the future upgrade path of the PC. My apologies if this has already been pointed out in the comments, as I haven't read every one yet.
Most people seeking a PC upgrade have just started playing a new title and have hit a wall. They are unable to play this new game at the resolution and detail they feel to be the minimum they can put up with. This wall is mostly a CPU or GPU limitation, but sometimes it's both. Of these upgrades, a new graphics card is significantly less expensive than a full system upgrade, can be installed easily by most people, and doesn't leave you without a PC or any downtime. On the other hand, a full system upgrade is expensive, not everyone can put it all together, and it often requires an OS reinstallation with data backup.
Let’s say an average gamer (not necessarily you and me) purchases a nice new gaming rig today for whatever reason. It’s likely that within two years or so they are going to hit a wall again. At this point most people have hit the GPU limitation and are able to upgrade the graphics card and away they go again for another one to two years. After hitting this wall for the second time it’s most likely time for a full system upgrade. This process could be only two years for some of us but for others it’s going to be four to five.
What I'm trying to point out is that we can recommend a CPU that is the cheapest while still not limiting our current GPU, and get the best possible FPS per dollar right now. But if we do this, it's far more likely we are going to run into a CPU bottleneck early in the upgrade path, and instead of forking out a few hundred for a new graphics card after a year or two, we might end up having to replace both the CPU and motherboard as well.
For this reason I could not recommend an AMD A8-5600K or an equivalent Intel CPU to be purchased with a HD7970 or GTX580 unless you plan to never upgrade your graphics card. Spend the extra $100 to $150 on a better CPU and potentially make the PC last another two years. Maybe the inclusion of some popular titles like Battlefield 3 or PlanetSide 2 would have significantly changed your concluding recommendations. The information provided gives us a good indication of where the CPU bottleneck comes into play but I think the upgrade path of the PC along with what games are being played need to be given a lot more weight for an accurate recommendation to be made. Having said that I could be totally wrong and have recommended the wrong gaming builds for years.
TheJian - Monday, May 13, 2013 - link
I can see a lot of work, but only for a future that won't exist for a good long while. You tested at a res that is too high and not showing reality today, based on this dumb idea that we'll all buy $400+ monitors. This is the same crap Ryan tries to push (see the 660ti comments section, he pushed it then when they were $600 and ebay via your visa to korea...ROFLMAO - read as I destroyed his responses to me, click ALL comments so you can just CTRL-F both of us). So raise your hand if you're planning on buying a $400+ monitor, to go with an expensive $300 card, only to end up at say 28fps in a game like Sleeping Dogs (avg... so the game is unplayable, as minimums would be, oh I don't know, 15fps?). I don't see any hands raised. So we'll be lucky if MAXWELL in Q1 (or whatever Volcanic does for AMD Q4 this year) will be playable at 1440p. Translation: we'll all buy 1920x1200 or 1080p for a while to come unless we own more than one card. Raise your hand if you have multi-gpu's. According to steampowered.com's hardware survey that number (last I checked) was under 2%. You're wasting your time. Start writing for the 98% instead of the 2%. I just wasted MY time reading this crap.
REALITY: We are all gaming at 1920x1200 or 1080p (or worse, below this). This should be the focus. This would show LARGE separations in cpus and Intel kicking the crap out of AMD, and that you wouldn't want to touch that A8-5600 with a 10ft pole. Why? The 7970 would not be the limiter, or at least not every time like here. What % of the people have 3-4 gpus? Give me a break, is this what you see as the future? $1200 in gpus and a $400+ monitor? You're pandering to a crowd that almost doesn't exist at all. For what? To make an AMD cpu seem buy-able?
The data in this article will be useful in 3yrs+ when you can hit 1440p at above 30fps MINIMUM on most cards. Today, however, we needed to see which CPU matters at a resolution that doesn't make a 7970 look like a piece of outdated trash. You're pretty special today if you have a 7970 or up for a GPU.
More AMD CYA if you ask me. Just like we're still waiting months for Ryan to do an FCAT testing article...LOL. We'll be waiting for that until xmas I'd guess unless AMD doesn't get the prototype driver done by then, which means we'll never see FCAT testing here...ROFL.
Ryan has ignored TWO articles now on FCAT. It didn't make the 7990 review, and part 2 of the FCAT article never even came. Just keep delaying; your site's credibility is going down the drain while everyone else tells it like it is. AMD & their drivers currently SUCK (cpu & gpu). Their cpu's suck; hence running at a res that shows all your games can't run without multi-gpu and hit 30fps+ MINIMUM - meaning at this res they ALL require more than one gpu, making cpu choice a non-issue of course. Their gpu's are great but the drivers suck, so they give away games by the truckload to try to sell a gpu vs. exceptional NV drivers. Let's face it, the best hardware in the world sucks if drivers can't live up to the hardware. Unfortunately AMD blew all their R&D on consoles that are about to die on the vine, instead of GREAT drivers to go with a GREAT gpu.
What do you end up with when you spend your wad on consoles instead of drivers? FCAT showing you suck, runts, stutter, enduro that lacks on notebooks (see notebookcheck 7970m article recently, it was even mentioned here oddly...LOL) and CF that is abysmal and at times showing NEGATIVE scaling for more than one gpu vs....again, NV drivers that have none of these issues. Optimus works (hence nv beats this drum repeatedly and justifiably) and so does their SLI. While AMD sucked for a year (see hardocp driver review for AMD & NV recently) NV got to sit on their butts (driver butts) waiting for AMD to finally get done with consoles and make a "Never Settle" driver that actually performed the way the cards should have OUT OF THE BOX! Thank god for never settle drivers in Nov or Nvidia wouldn't have released monthly driver enhancements from Dec-May...ROFL. People would be stuck with the same perf today as out of the box from NV (as hardocp showed they didn't improve 1fps all year until AMD caught them...well duh, no point giving out free perf when blowing your enemy away all year).
Mark my words... AMD will be writing off R&D for consoles soon. Even Activision's Kotick just said last week that consoles (for all the reasons I've said repeatedly here and at tomshardware etc.) have a very tough road ahead vs. all the new ones coming out. Sales of the Wii U were off 50% after the xmas pop. Just one quarter after debut, nobody cares already! He basically said they'll be watching for the same on the next two (PS4/Xbox 720). When that happens no games will be made going forward for this crap, as we all move to tablet/phone/cheaper console or PC (for ultimate gaming).
Video killed the radio star. Cheap android gaming killed the console star....LOL.
Ouya, Steambox, Shield (PC to TV here!), Wikipad, Razer Edge, GamePop, etc... All stuff that will help kill the consoles, and stuff they have never faced before. It was always the big 3, but this time it's the big 3 with 6-10 little players plus a billion phones & tablets chasing them and our gaming time...ROFL. The writing has been on the wall for a LONG while. As usual AMD management screws up. Wisely, NV passed on a dying market and only spent $10 mil each on Shield and Grid...ROFL. Dirk Meyer wouldn't be doing this crap. They were idiots letting him go, thinking he didn't get it. He had a mobile strategy, it just wasn't one that made their CORE products suck while creating it. Management has PIPE dreams. Dirk had REALITY dreams.
http://www.tomsguide.com/us/Next-Generation-Bobby-...
Kotick saying consoles are dead - well, he almost says it. "Wait and see" is basically the same thing...LOL.
"If I were gaming today on a single GPU, the A8-5600K (or non-K equivalent) would strike me as a price competitive choice for frame rates, as long as you are not a big Civilization V player and don’t mind the single threaded performance. The A8-5600K scores within a percentage point or two across the board in single GPU frame rates with both a HD7970 and a GTX580, as well as feels the same in the OS as an equivalent Intel CPU."
AMD CYA. Total lie. Drop this crap down to 1080p and watch the Intel chips separate the men from the boys, and in MORE than just Civ 5. ALL games would show separation, I'd guess. You must have found this out, which immediately made you up the res, huh? Did AMD threaten the free money or something if you showed them weak, or if Ryan managed to run FCAT testing?...LOL.
"We know the testing done here today looks at a niche scenario - 1440p at Max Settings using very powerful GPUs. The trend in gaming, as I see it, will be towards the higher resolution panels, and with Korean 27" monitors coming into the market, if you're ok with that sort of monitor it is a direction to take to improve your gaming experience."
Seriously? "If you're ok with EBAYing your $400 "KOREAN" monitor this is a great way to improve your gaming at under 30fps minimum in all games...ROFL. Reworded for clarity Ian :)
NICHE situation is correct in that first sentence...LOL. Again, start paying attention to your audience, which is the 98%, not the NICHE 2% or less. I'm debating buying a 2nd 1920x1200 (I already have 2 monitors, a Dell 24 and a 22in at 1680x1050) instead of your NICHE, just because of what you showed here. 1440p is going to be difficult to run ABOVE a 30fps MIN for ages. I'd spend most of my gaming time on the smaller Dell 24 at 1920x1200, I think, so I'm debating buying the same thing again in 27in. I want a bigger screen, but not if I can't run 30fps for another 2-3 vid card revs (Maxwell rev 2?). This is just like I described above with AMD's GPU: great hardware, but worthless without drivers that work right too. A Korean monitor may look great, but what is it worth if you require $600+ in vid cards to have a prayer of 30fps? I'd rather buy a Titan, not upgrade the monitor, and hit well above 30fps on my Dell 24 at 1920x1200 all day! THAT is a great gaming experience I can live with. I can't live with a BEAUTIFUL SLIDE SHOW on a Korean monitor off eBay...LOL. I realize you can get a few here in the US now, but you get the point. This setup makes a regular niche look like the 98%...LOL. Your situation is a NICHE of the NICHE. Check the steampowered survey if you don't get what I just said.
http://store.steampowered.com/hwsurvey/
Less than 1% run the res tested here. That's a niche of a niche, right? The entire group of people above 1920x1200 is less than 2% all added up (and this is out of probably a few hundred MILLION Steam users). Just click the monitor res and it will break them out for you. You wrote an article for NOBODY to show AMD doesn't suck vs. Intel? Start writing for EVERYBODY (that other 99%) and you'll be making recommendations for INTEL ONLY.
I'm not saying anything bad against Ian here; clearly he did a lot of work. But whoever is pushing these articles instead of FCAT etc. is driving this website into useless land. You guys didn't even mention NV's killer quarter (AGAIN). Profits up 29% over last year; heck, everything was up even in a supposedly bad PC time (PC sales off 14%...no effect on Nvidia sales...LOL). They sell cards because their drivers don't suck and a new one comes out for every AAA title either before or on the day the game ships! That's what AMD should be doing instead of console dev. They gave up the CPU race for consoles too! I'll be glad when this gen of consoles (not out yet) is DEAD. Maybe they will finally stop holding us back on PCs. They stuck us with 720p and DX9 for years, and they're set to stick us at 1080p for another 8yrs. They also allowed NV to not do anything to improve drivers for a year (due to AMD not catching them until Never Settle at the end of Nov 2012). But maybe not this time...LOL. DIE CONSOLES DIE! :)
Here's what happens when you test at 1080p with details down...the CPUs part like the Red Sea:
http://www.tomshardware.com/reviews/neverwinter-pe...
Look at that separation!
"It's a little surprising that the Core i3-3220, FX-4170, and Phenom II X4 960 aren't able to manage a minimum of 30 FPS, though they come close. The dual-core chips are stuck at about 20 FPS, and the FX-8350 does a bit better with a 31 FPS floor that averages closer to 41 FPS. Only Intel's Core i5-3550 demonstrates a significantly better result, and we have to assume that higher-end Core processors are really what it takes to let AMD's single-GPU flagship achieve its best showing."
Note only two CPUs managed above a 30fps minimum! I guess you need a good CPU for more than just Civ 5, huh? You should have run at this res with details down to show how bad AMD currently is. PEOPLE, listen to me now: buy AMD CPUs only if you're REALLY taxed in the wallet and can't afford INTEL! I love AMD, but if you value your gaming fun (meaning above 30fps) and have a decent card, for the love of god, BUY INTEL. This was a test with a SINGLE 7970 GHz. AMD's CPUs are light years away from taxing their own top-end GPUs. But Intel's aren't. The bottom-to-top spread in that Tom's article was 17fps to 41fps. THAT IS HUGE! And they didn't even show the top i7s. It would likely go up into the 50s or 60s then.
Anandtech (not really blaming Ian himself here) is steering people into stupid decisions, hiding AMD's weaknesses in CPUs here and in FCAT/GPUs with Ryan. I can't believe I'm saying this, but Tomshardware is actually becoming better than Anandtech...LOL. WOW, I said that out loud. I never thought that would happen. It's 4:50am so I'm not going to grammar/spellcheck; the nazis can have fun if desired. :) Too bad I got to this article a week late.
http://techreport.com/review/23246/inside-the-seco...
THE REAL CPU ARTICLE YOU SHOULD READ. Note the separation from top to bottom in Skyrim here: 58fps for AMD up to 108fps for Intel...see my point? Leave it to Scott Wasson (the guy who broke out the need for FCAT, along with Ryan Shrout at pcper I guess) to write the REAL article on why you don't want a slow CPU for ANY game. This is what happens at 1080p, people! Note the FX-8350 and 1100T are nowhere NEAR Intel in this review in ANY game tested. The Phenom II X4 980 is slow as molasses also! Note also that Scott discusses frame times, which show AMD sucks. Welcome to stutter that isn't just because of the GPU...LOL.
" All of them remain slower than the Intel chips from two generations back, though. "
And this one, from the conclusion of techreport's article, sums it up best:
"We don't like pointing out AMD's struggles any more than many of you like reading about them. It's worth reiterating here that the FX processors aren't hopeless for gaming—they just perform similarly to mid-range Intel processors from two generations ago. If you want competence, they may suffice, but if you desire glassy smooth frame delivery, you'd best look elsewhere. Our sense is that AMD desperately needs to improve its per-thread performance—through IPC gains, higher clock speeds, or both—before they'll have a truly desirable CPU to offer PC gamers. "
Only Anandtech has the AMD rose-colored glasses, people. READ ELSEWHERE for real reporting. So AMD doesn't even offer a desirable CPU for gamers...LOL. Sad but true. Toms shows it, techreport shows it, and if I had more time, people, I'd really rip these guys at Anandtech apart by posting a few more CPU tell-alls. This site keeps putting up stuff that HIDES AMD's deficiencies. I'd like to buy an AMD CPU this round, but I'd be an idiot if I did as a gamer. i7-4770K for me, sir. Spend whatever you can on a Haswell-based system (it supposedly takes Broadwell later) and wait until Xmas/Q1 for 20nm GPUs, when the real gain will come (even the low end should get a huge bump). Haswell comes next month; you can wait one more month for the FUTUREproof socket (if there is such a thing). Trust me. You'll be happy :)
I'd put more links, but this site will see too many and call me a spammer...UGH.
colonelclaw - Monday, May 13, 2013 - link
You lost me at '...same crap Ryan...'. Never a great idea to preface a wall of text with an insult.
TheJian - Tuesday, May 14, 2013 - link
Well, they have previously done worse to me :) I presented data in the 660 Ti article and called out their obvious lies even with their own data (LOTS of Ryan's own benchmarks were used to show the lies), which prompted Jarred to call me an A-hole and say my opinion was UNINFORMED ;). Ryan was claiming his article wasn't for above 1920x1080 (or 1200), but he was pitching me $600 Korean monitors (the same ones mentioned here) you had to buy from eBay, giving your Visa to a nobody in Korea. Seriously? It could not even be bought on Amazon from anyone with more than a SINGLE review, which I pointed out was probably the seller reviewing himself :) He had no about page on his site, no support etc., not even a phone#, just an email if memory serves. It was laughable. After taking Ryan down, Jarred attacked ME, not the data. What do you expect a person to do after that?
They've been snowing people for a long time with articles like this.
Where is FCAT article part 2? Where are the FCAT results from the 7990? We are still waiting for both, and will continue to - as I keep saying - until AMD fixes their junk drivers and, I guess, gives a green light for Ryan to finally write about FCAT for REAL. This is a pro-AMD site (it used to be more neutral!); I really didn't write it hoping to get love from the viewers. I just wanted the data correctly presented, which other sites did with aplomb. You don't have to like me, or the data, just realize it makes sense as shown in the post via links to others saying it. NOT me.
People who stopped at "same crap ryan" were not my intended audience ;) I can hate a person (well, I never do) and still value the data in a great argument made by said person. I don't care about them as long as it makes sense. The person means nothing. As I said above, I don't blame Ian really; he's just doing what he's told. I even admired the work he put into it. I just wish that work could have been dedicated to data actually useful to 98% of us instead of nobody, while hiding AMD's weaknesses. AMD is not a CPU I could recommend currently at all, for anything, unless you are totally strapped for cash. Even then, I'd say save for another month or something and come home with Intel. I'm not really an Intel fan either...LOL. I was selling AMD back when Asus was leaving their name off their boards (fear of Intel) and putting their motherboards in WHITE boxes! Intel should have had to pay AMD $15B (they made $60B+ over the years screwing AMD like this). AMD had the best CPUs and Intel managed to stall with nasty tactics until they caught up. I love some of Intel's chips but generally hate the company. But I'd consider myself a D-bag for not telling people the truth and shafting their computer purchase by doing it. If I personally want to help AMD and buy a chip I think is way behind, great - I've done my part (I won't, just saying). But I wouldn't tell my friends to do it while claiming AMD is great right now. That's not a friend. Anandtech's readers are in a sense their friends (we keep reading or they go out of business, right?). Hiding things on a massive scale here is not what friends do for friends, is it?
I didn't expect any favorable comments from console lovers either :)
OwnedKThxBye - Tuesday, May 14, 2013 - link
We might all hate this guy (for good reason) but the words he writes regarding CPU performance in this article have a lot of truth.
yhselp - Tuesday, May 14, 2013 - link
Agreed. What he wrote is offensive, emotional and hardly objective. However, there's a truth hidden in there somewhere. Consider the following scenario. Here are a few suggestions. Since most users who would spend $500 on a flagship video card, $600-$800 on a 1440p monitor and God knows how much more on the rest of the system aren't likely to skimp on CPU choice to save a hundred bucks, a different testing scenario might produce more useful information for the masses (regarding cheap/er CPUs for gaming).
A more likely market for an AMD CPU in a gaming rig would be people on a tight budget - when every buck matters and the emphasis is on getting as fast a GPU as possible. In my opinion, it'd be quite useful to test various AMD CPUs which are cheaper than an Intel quad-core, paired with a 650 Ti Boost and/or another 600-series and/or similarly-priced AMD video card at 1080p. Of course, this would raise yet another question - are Intel dual-cores faster than similarly-priced AMD quad-cores in this mid-range gaming scenario?
Suggestions for other CPUs:
Core i5-3350P – baseline Intel quad-core performance (cheapest Intel quad-core on the market)
Pentium G2120 – should perform similarly to an i3 for gaming (costs less)
Celeron G1610 – cheapest Intel CPU
TheJian - Wednesday, May 15, 2013 - link
So you're agreeing with a guy who says it's OK to HATE someone, but I'm the evil person for pointing out data that is incorrect? HATE? That's not a bit strong? "We might all hate this guy (for good reason)". And people are calling ME offensive? WOW. This reminds me of the gay people who claim to be tolerant, but god forbid any person says something against them (Chick-fil-A comes to mind). They want that person tarred and feathered, smeared in the media so they never work again, put out of business, called names; they cheer people who commit violence against them etc...Nice...no double standards there. Another example: Stacey Dash voting for Romney. They called a BLACK woman who spoke her mind a RACIST...ROFL. What? They tore her apart merely for having a very well-spoken (IMHO) opinion and pretty good reasons for it. She didn't sound stupid (despite what anyone thinks about her opinion), but they tarred and feathered her for saying something anti-Obama... :( She's a very classy lady if you ask me, and they still pick on her (saw some ripping on her on Roku last week - MSNBC or something).

Not sure what his reason is anyway. Did I attack you guys personally? I even let Ian himself off the hook and left the problem at the doorstep of whoever is directing these articles to be written this way. What bothers me most is all the "great article" / "nice job" comments on an article that is very wrong and advocates buying a very low-end AMD CPU vs. Intel, saying it's going to be ok. IT WON'T be, and in FAR more than just Civ 5, as I showed via other hardware sites.
What part wasn't objective? My data? The other websites showing the opposite of this site? I can't change their data, and there is nothing to objectively debate when the data here is just patently wrong, as proved.
People can argue I'm not objective in my console beliefs (though they're backed by sales data, and I freely admit I hate consoles...LOL, yet I own an Xbox 360 and two PS2s - go figure - I just don't want another one holding my games at 1080p for 8yrs), and the new gen at Xmas may sell very well (we'll know in 9-10 months if they can sell past the Xmas pop). But the PC comments and data I provided are facts based on data from steampowered's survey, hardocp, toms, and techreport. I could have gone with another group too - pcper, guru3d etc. - but post too many links and this site calls you a spammer.
If it was offensive, they need thicker skin, or they should stop writing stuff that other sites totally refute. These guys KNOW that when you drop down to 1080p, the CPU is going to SHOW, rather than the GPUs (which aren't as taxed at 1080p) making it look like any CPU can get the job done. Well yeah, any CPU - but only when you push the GPUs so far they beg for mercy. To me, saying that stuff in the article is a LIE when they know what happens when you turn it down. I wouldn't be so harsh if they were just ignorant of the data, but Anandtech is NOT ignorant. They've been benchmarking the heck out of this stuff for ~15 years (I think he started the same year I started my 8yr PC business, in '97!). I guess you can't call people out for what they're doing today without being called offensive, emotional (LOL) and not objective. I couldn't have written that post if they had tested where 98% of us play, at 1080p, right?
What are they doing here at anandtech? Why would they do this? They know what steampowered shows, I basically said the same stuff to Ryan in the 660TI article ages ago but with even MORE proof and using his own articles to prove my points. I used HIS benchmarks.
Ask yourself why we are still waiting for the FCAT articles (we're up to two or more now: part 2 of the first, the 7990 data, etc.). Ryan said we'd see them in a week. We are into months now.
http://www.anandtech.com/show/6862/fcat-the-evolut...
Where's part2? He still hasn't given us ONE ounce of data using it.
"In part one of our series on FCAT, today we will be taking a high-level overview of FCAT. How it works, why it’s different from FRAPS, and why we are so excited about this tool. Meanwhile next week will see the release of part two of our series, in which we’ll dive into our FCAT results, utilizing FCAT to its full extent to look at where FCAT sees stuttering and under what conditions."
That's from his Part1 linked above. How long do we wait?
Just for kicks:
http://www.anandtech.com/show/6910/origin-genesis-...
"Overall anything short of 5760 with 4x MSAA fails to make a 3rd Titan worthwhile. On the other hand, you do need at least 2 Titans to handle MSAA even at 2560"
Ok, so I need to spend $2000 on two Titans to handle some MSAA at 2560 OVERALL in the tested games (heck, one game he tested in that review hits under 30fps at 1080p). Raise your hand if you think Ian's article is correct...ROFL.
"In three of our games, having a single GPU make almost no difference to what CPU performs the best. "
Yeah, at a res that, according to Ryan's article, taps out two $1000 Titans...then you're right, all CPUs are the same, because the Titans are crying for some relief :)
Their recommendation here:
"A CPU for Single GPU Gaming: A8-5600K + Core Parking updates"
"The A8-5600K will also overclock a little, giving a boost, and comes in at a stout $110, meaning that some of those $$$ can go towards a beefier GPU or an SSD."
No way...so you'll buy a $110 CPU and, according to Ryan's article on the Titan box, $2000 worth of Titans to go with it to run at the resolution Anandtech thinks is important (2560x1440).
How do I respond to that without being offensive? You should hear what I'm saying in my brain right now...ROFL. The sad part is people are reading reviews like this and thinking they're correct. Look at the first comments on this article, "nice work" etc...really? I don't see a bunch of HATERS in my comments anyway, just a few who at least half agree with what I said ;) Yourself included. Your example proves, to some degree, that I didn't waste my time.
Sorry if you think my "truth" was hidden. I was attempting to make it more "in your face" for simplicity's sake. Maybe I failed a bit...LOL. Can't please everyone, I guess.
TheJian - Tuesday, May 14, 2013 - link
Nice...what reason? I defamed a hero of yours? Are they doing you any favors by hiding reality? Can you say, after reading the links, that the other sites are wrong? The point of the links showing the exact opposite of this site is so you can JUDGE Anandtech yourselves. I really don't want one of my favorite sites to go away. I just want them to start reporting FACTS as they are, without the snow.

I don't feel I have to be politically correct all day for that. People need to get over that PC garbage and get thicker skins. We are FAR too sensitive today. It's like nobody can take a criticism these days, and the person who gives it is evil...LOL.
For the sake of your PC purchase, if you intend to buy on their advice, read the links I gave, guys. I'm trying to save people from getting burned! Like me or hate me, the data does NOT lie. You just have to look at it and judge for yourself. When one CPU scores 58 vs. another at 108, there is a SERIOUS reason to pick the proper CPU (just one example from above). If you're seriously broke, I'm all for AMD at that point (great integrated graphics, with Richland probably making for a pretty decent experience), but if not...INTEL. In either case, though, I wouldn't buy EITHER right now. Wait for Haswell (Broadwell supposedly goes in the same socket later...important, maybe) or Richland, which really makes low-end gaming possibly pretty fun I think (at least you can play, that is). In laptops, maybe Haswell with GT3e makes sense, as it should get near AMD or blow by them with that 128MB in there. But that's not going to desktops. Integrated on desktops from Intel is still useless IMHO and won't affect discrete sales one bit for AMD or NV.
tential - Tuesday, May 14, 2013 - link
I don't agree with your analysis on consoles, but everything else, sure. Gaming for 98% of people is 1080p. That's why I laugh when people quote Titan on ANYTHING (which happens surprisingly often here). No one has a Titan, so why even talk about such a card, saying "AMD has no answer for it"? Well, no one even has the card anyway except for a couple of people. I agree also with the resolution thing. It makes no sense that so many reviews are catered to high-resolution and multi-monitor setups. People have been wondering why NV and AMD have raised prices on top-of-the-line GFX cards, and it's because, quite simply, few people have everything needed to exploit such cards. I'd get a 7970, but I don't have a multi-monitor setup or a high-resolution monitor, so what's the point?
Console-wise, I think the Wii U was bad for any comparison. It was an upgrade that really brought nothing extra. People who have a Wii don't care about graphics, so most of the upgrades of the Wii U are meaningless to Wii owners. The new Xbox and PS4 will be much better in terms of sales. Those console gamers have been dying for a graphics boost.
In the end, though, your response explains to me GPU pricing today and why top-of-the-line GPUs are costing more and more. A smaller percentage of people are buying them, because lower-end or older GPUs are perfectly capable of doing the tasks needed by gamers today. Maybe that will change when monitors drop in price and more people game at higher resolutions, but for now most people do 1080p, and that's the sweet spot. I know that's the ONLY resolution I ever look at and care about.
TheJian - Wednesday, May 15, 2013 - link
Thanks...console arguments are like Ford vs. Chevy, right? How many people won that argument back in the day? :)

If console sales continue for 6 months after the Xmas pop (unlike the Wii U etc. that died, as Kotick and others point out - Wii U off 50% says something; Vita, 3DS etc. down from their last revs too, with software off as well for all of them), I'll come back and say YOU, sir, were right :) You have my word. Of course, it goes without saying I'll be saying the exact opposite if it doesn't happen.
Regarding why we need more power...I can show situations where 1080p brought the top end to its knees. Hardocp just did this.
http://hardocp.com/article/2013/03/12/crysis_3_vid...
They had to turn some settings down even on the 680 and 7970 GHz, and cards below this (670 etc.) really turned stuff off. People can say this or that doesn't make much difference visually, but the point is you can't have everything on without more power (Maxwell/Volcanic Islands should finally make everything playable at 1080p with ALL details on, no sacrifice at all in anything, I'd hope).
"Crysis 3 plays a lot better at 1080p resolution, 1920x1080. At 1080p the GeForce GTX 680 and Radeon HD 7970 GHz Edition are able to push the graphics to very high and play with the best experience possible in the game. Granted, we have to use SMAA Medium in order to achieve this. It will most likely take next generation single-GPU video cards to allow us to play at SMAA High 4X at very high at 1080p."
Tomb Raider has the same issues, only worse I guess:
http://hardocp.com/article/2013/03/20/tomb_raider_...
"If you are interested in playing Tomb Raider the NVIDIA GeForce GTX 680 provided the fastest performance at 1080p, and was the only single GPU video card capable of playing with 2X SSAA at this resolution. At 2560x1600 the AMD Radeon HD 7970 GHz Edition CrossFire setup will provide more performance. For gaming on a budget, or at resolutions lower than 1080p, the GeForce GTX 660 Ti is an excellent option."
So the 660 Ti I almost bought is for LOWER than 1080p?...ROFL. OUCH. As they point out, you need two cards for above 1080p, and only the 680 survived 1080p itself, and only at 2xSSAA. I can cite more examples too, but this makes the point. Even 1080p is tough for top-end cards if gaming as the devs intended, with all the candy on, is attempted. We need more power, and 20nm should give us this from either company, I hope. I hope I'll have enough of a reason to buy 1440p for a few games, then flop over to my Dell 1920x1200 when the new cards can't hack the 27in I plan to buy (if I do - I might stick with 27in at 1080p, but I like having 2 native resolutions on the desk to play whichever my card can handle). It's comic that Ryan was pushing 1440p in the 660 Ti article, but hardocp says that card is for BELOW 1080p...LOL.
ShieTar - Monday, May 13, 2013 - link
Well, if even older dual-core CPUs and the weaker AMD parts don't scale at all with a single GPU, it would seem to me like a $60 Pentium or even a $40 Celeron at a bit below 3GHz might make a great companion for the typical $200 GPU of a Full-HD gamer. It would be interesting to add any one of those low-cost Ivy Bridge parts to the comparison to see how they keep up with their Core ix counterparts.
trajan2448 - Tuesday, May 14, 2013 - link
As soon as I saw Crossfire I stopped reading.
TheJian - Wednesday, May 15, 2013 - link
One more comment on the missing FCAT data, from the 7990 review: "The end result is that we're not going to have FCAT data for today's launch, as there simply hasn't been enough time to put it together. FCAT was specifically designed for multi-GPU testing so this is an ideal use case for it and we'd otherwise like to have it, but without complete results it's not very useful. Sorry guys.
The good news is that this means we have (and will be compiling) FCAT results for our cards based on the very latest drivers. So we’ll get to the bottom of frame pacing on the 7990, GTX 690, and more with an FCAT article later this week or early next week. So please stay tuned for that."
So we're 3 weeks later and STILL no review with this data. Again, people, recognize the delay tactics here. In another week it will be a MONTH. This is on top of the wait for the FCAT part 2 article I already mentioned.
"Our goal with FCAT was to run an in-depth article about it shortly before the launch of the 7990 as a preparatory article for today’s launch. However like most ambitious goals, that hasn’t panned out."
It's not really ambitious when EVERYBODY else is already presenting the data, article after article. Just keep making excuses. Take a good look at the credibility of this site, people, and judge these guys yourselves. Ryan Shrout seems able to pump out article after article on FCAT, including his reviews of the 7990, Titan etc...every article discusses it at this point. Is Ryan Shrout at PCper.com so much more effective than this huge website? Ryan's asking for donations to upgrade his camera equipment for recording podcast-type stuff etc. How many people work here compared to his little site? Which I love, BTW. Great site, and he has nearly doubled the funding he asked for :)
At some point I hope people start asking you guys more questions after looking at my posts pointing out stuff most just seem to miss. People will eventually JUDGE this site accordingly if you keep this stuff up. I sincerely hope this site returns to good neutral data soon. You can start with an FCAT article that makes other sites like PCper seem as small as they are.
Are you still trying to figure out how to use it or something? Call Ryan Shrout :)
bds71 - Wednesday, May 15, 2013 - link
Ian: I noticed you were GPU bound a lot. Doesn't this sort of defeat the test? (I think you were GPU bound more than 50% of the time.) I'm curious why you didn't use Eyefinity or nVidia Surround to test the quad graphics setup - with that much power under the hood it's almost a necessity. Anyway, I don't mean to criticize the review; I think it still had some very useful information. I just think that the conclusion wasn't complete if you're GPU bound. Note: decreasing the graphics settings so that you're CPU bound is unrealistic - nobody with quad graphics is going to reduce the graphics quality so they're CPU bound.
bds71 - Wednesday, May 15, 2013 - link
Edit: I just read through (most of) the comments above. And while 98% (doubtful, but OK) may play on a single 1080p screen, the fact is that high-end graphics are a waste of money for a single 1080p monitor. And while some games (like Skyrim and Civ V) use a lot of processor, that type of scenario is not indicative of most games. Note: also, most of those 98% single-screen 1080p users probably DON'T have a top-of-the-line graphics card (ie: 980 or 7970, much less Titan, 690 or 7990). They probably have a $200-$300 graphics card and a $100-$250 CPU (ie: mainstream). Nor do any of those less-than-top-of-the-line users *need* anything more than a single 1080p monitor and a mid-range CPU (of which the AMD or Intel variety will do just fine for 98% of those 98% with a single monitor).

From my point of view, this article set out to find out how much the CPU is used in gaming. Does it make sense then to put a limit on the graphics capabilities? Of course not. So you go with the high-end (top-of-the-line) graphics solution. But in the end, the graphics capability was still limited by the screen resolution - you couldn't really see what the GPUs are capable of, because they couldn't really stretch their legs (and, in turn, the CPUs were never stretched to their limits to feed such a request).

I participate in F@H, and as such I also use my GPUs. I've noticed that (depending on the work unit) the GPUs can take as much as 20% of the CPU to keep them fed. Is gaming really that much different? The CPU is needed to feed the GPU, and to do those functions that cannot be done on the GPU. For folding, it doesn't matter how fast something gets done, so a faster CPU isn't imperative. But for gaming, the speed of the CPU and its ability to keep relevant data going to the GPU does matter. When the CPU can't keep up with the GPU, you get slow minimum frame rates and a general "slow" feeling from the game. So, yes, I agree minimum frame rates are important when determining what CPU to use to feed a high-end graphics solution (more so when using more than a single GPU). But you still have to let the GPUs stretch their legs to see how much of the CPU is being used - and that will determine if a CPU is good or not (min frame rates and CPU usage with high-end graphics at appropriate resolutions).
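(A rough illustration of that last point - my own sketch with an invented frame-time trace, nothing from the article: a "minimum FPS" figure is just the worst one-second slice of the trace, which is why it exposes a CPU that can't keep the GPU fed even when the average looks fine.)

    # Worst frames-per-second over any ~1-second sliding window of a trace.
    def min_fps(frame_times_ms):
        window = []            # frame times currently inside the window
        elapsed = 0.0          # time covered by the window, in ms
        worst = float("inf")
        for t in frame_times_ms:
            window.append(t)
            elapsed += t
            while elapsed > 1000.0:        # trim window back to ~1 second
                elapsed -= window.pop(0)
            if elapsed >= 900.0:           # only score near-full windows
                worst = min(worst, len(window))
        return worst

    # Invented trace: smooth 60 fps, a CPU hitch, then smooth again.
    trace = [16.7] * 55 + [50.0] * 12 + [16.7] * 55
    print("average FPS:", round(1000.0 * len(trace) / sum(trace), 1))  # ~50
    print("minimum FPS:", min_fps(trace))                              # ~34

The average barely flinches at the hitch; the worst window drops it by a third. That gap is exactly what a CPU-limited minimum frame rate feels like.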
yhselp - Wednesday, May 15, 2013 - link
Wow, the sheer amount of 'content' that Jian guy is producing is amazing. You could probably publish a few books' worth of comments by now. Is it really necessary to hit everybody up with a 1000-word reply?

What I don't get is why you actually do this. You don't agree with what's been tested and how the data has been interpreted; okay, that is your right. And, yes, some of the conclusions drawn might be controversial; but what's your problem? Why don't you just voice your opinion once and leave it be? What are you doing here - are you some sort of freedom fighter for objective data on the internet?
You complain about how AnandTech are doing it wrong and claim that your own observations are objective and valid. From your point of view they might be, but what you are forgetting is that testing hardware is so vast a field, with so many variables that it's impossible to scientifically claim that ANY conclusion is objective, since the very essence of what you're dealing with precludes that. Everything (in hardware testing) is subjective - live with that truth.
It's not about having "tough skin", but having manners and being civilized. You can't expect people to listen to you and take you seriously if you're being rude even if your arguments are valid. Try a more gentle approach - I guarantee your message, whatever it might be, will travel further.
Remember, this is not an article about choosing a CPU for 1080p gaming; also, it's not complete. It provides information for people to interpret in their own way. Yes, it draws conclusions at the end that I too think are best left unsaid; but why can't you just look past them? What is your problem? What are you trying to change here? If you don't like AnandTech so much, why don't you just... leave?
TheJian - Thursday, May 16, 2013 - link
Am I supposed to not respond now? You just said I have no manners, am uncivilized, have no objectivity, and previously that I'm offensive and it's ok to HATE me…ROFL. POT - MEET KETTLE. If you were to take your own advice, shouldn't you have just said "you could word it differently but I agree with the data" and left it at that? No, you took it much further, with what amounts to an ad hominem attack on ME. You posted 333 words yourself to do it. :) But thanks for recognizing the work I put in :) I can type 60+ wpm though, so not that much effort really, and two to three times that with Dragon NaturallySpeaking Premium easily (pick up a copy if you can't keep up - 1600 words in about 9 minutes...ROFL, v12.5 rocks). The homework takes time, but that was already done before they wrote this article, as I read everything I can find on stocks I track and parts I'm interested in.

I've watched this site (and toms) since they were born. 1997, I think, here. I did leave toms when Tom Pabst himself forced out Van Smith over the sysmark crap years ago (and removed Van's name from ALL of the articles he wrote there, putting "tomshardware staff" or some such in its place). That was AWFUL to watch, and I had loved reading Tom Pabst's stuff for years. Millions of people were snowed there while they made AMD look like crap in articles, with sysmark flagging Intel chips and turning off SSE on AMD. Eventually people like Van, I and others said enough that people took notice, and it devalued his site before he sold it. Rightfully so, if you ask me, as he was basically an Intel shill at that point, as many had pointed out by then.
At some point somebody has to stand up and tell the truth like Van tried to do. It cost him his job, but the message made it through. Someone has to be willing to "take the hate" for other people's benefit. :) Or nothing will ever get fixed, right? People reviewing stuff for millions need some kind of checks and balances, right? There are NONE right now in our govt, and look what's happening there as they spend us into bankruptcy amid scandal after scandal, kicking our financial future down the road time and again. If we had checks and balances for REAL, our president would be in jail along with many dirty congress members on both sides (he just got caught wiretapping the AP - freedom of speech is being trampled, gun rights assaulted, our constitution attacked at every turn!). People are possibly DEAD because this guy did NOTHING to save them in Benghazi for 7 hours under attack. What happened in Boston? Etc…

I'm seeing the same stuff happen here that happened at Tomshardware. Someone has to correct them instead of congratulating them, right? Otherwise many people will make the wrong purchasing decisions based on bad advice from influential and supposedly trusted people (I still like this site; I just want it back to the neutral stance it held for years). In this economy I'd be thanking anyone who takes the time and effort to try to save me from buying a piece of junk with my hard-earned money. In a nutshell, this is why I take the time to show another side for people to consider. They don't have to believe me; that's the point of the links, the quotes from those links etc. I WANT you to look at the data and make up your own minds. Either it eventually costs this site tons of hits and wakes them up, or they need to be put out of business. If nobody ever complained about Win8, how long would we keep getting crap like that? Look how fast it got an 8.1 version as a response, and the product manager fired. Hold their feet to the fire or they never stop.
Anand would have to be seeing his site's traffic go down.
http://www.alexa.com/siteinfo/anandtech.com#
If someone takes the time to prove you're putting up bad data article after article, and no defense is put up (because there isn't one), you are eventually taken down. Jarred attacked me in Aug 2012. Pity you can't go back a year, but you can see this site has been sliding, at least on Alexa, for the last 6 months. Until they quit yanking our chains I'll keep yanking theirs, time allowing! Toms went from 10mil to 2mil in just a couple of years. I'm not sure what he sold for, but it was far less than he'd have gotten before attacking Van, the article shenanigans etc.
Tell me, what parts of my comments were UNCIVILIZED or RUDE? Did I call anyone a name? Say they are stupid? Did I attack ANYONE personally? Did I do what you did? Actually, I did quite the opposite. I said they are NOT ignorant and know exactly what they're doing here (hmm, insinuated intelligence…that's a good comment, right?). I even let Ian off multiple times (he's just doing what he's told, no doubt) and noted from the get-go he did a lot of work, but that due to "someone" pushing bad data to hide AMD's faults it's all wasted. I attacked the crap this site is pushing (is "crap" too harsh for you?), not any of the people themselves (who I'm sure are probably nice guys - well, I can't say that about them all; Jarred attacked ME, not the data, when I buried Ryan's conclusions and benchmarks). Did I swear at someone? Did I spew hate like the guy who gave me a one-liner? He's claiming it's ok to HATE me? When did I ever cross a line like that? Is a debate of the facts worthy of HATE today?
If you hate the length of my posts, don't read them. Take your own advice, move along please. Was it necessary for you to post 1000 words back? :) I'd say even the HATERS took me seriously (the only ones that responded besides tential - what, 2 total, plus a polite tential?) and saw the arguments were valid, and listened. ALL of them did in their own way. Only the first below wasn't "rude", as you put it, and just discussed what I was saying - tential - no flare-up from him, just good old-fashioned debate:
"I don't agree with your analysis on consoles but everything else sure. Gaming for 98% of people is 1080p."
Tential clearly got the message despite our console differences (they weren't really the point). I'm sure tons of others did even if they're silent about it. I used to be SILENT. You can't argue with steampowered.com's data, nor with everyone else showing the res you SHOULD be testing here. You can confirm via techreport, hardocp, tomshardware, etc. I gave plenty of links and quotes for people to analyze.
"We might all hate this guy (for good reason) but the words he writes regarding CPU performance in this article have a lot of truth."
WOW...but at least he saw the truth, and his name is hilarious to me :) Did I attack back? NOPE. Even when he seriously crossed a line, IMHO, I did nothing but post a polite rebuttal with some questions - still waiting to hear why he thinks it's ok to HATE people for simple comments, but I don't mind either way; even he got the message. Worse, you agreed with the hate...LOL
Here’s you:
"Agreed. What he wrote is offending, emotional and hardly objective. However, there's a truth hidden in there somewhere. Consider the following scenario."
Comical - I said nothing bad about the people, just their data. But to you it's OK to hate me for it and then toss out comments about my character...This goes back to the double standard I mentioned in my previous posts.
There is nothing wrong with a vigorous debate of the facts in any case, and I was CIVIL though critical. This was an article about the proper choice of a GAMER's CPU. As presented, the data lies: they tested a situation that doesn't exist (as even you basically pointed out in your scenario). It would be just "incorrect" if they didn't know what they were doing. But they DO know. They know they're hiding FCAT data, as I pointed out. AMD only talks to them, as Guru3d (Hilbert) recently pointed out. Odd, yes?
I find it funny that I already answered your questions before with comments like this (but why not do another 1600-word essay for you :) ):
“People will eventually JUDGE this site accordingly if you keep this stuff up. I sincerely hope this site returns to good neutral data soon.”
This doesn’t tell you why I’m doing it? I claim OTHER websites I pointed to are OBJECTIVE and VALID. I piled on with my own observations, but I was merely quoting others who all disagree with this site. That’s not subjective that’s FACT. It’s not my point of view; it is the same one as EVERY other site reporting this type of data. Hardocp, Techreport, PCper, Tomshardware. How many do I need before you call me objective? I can give more sites and another 1000 words of quotes…LOL. I can scientifically claim the resolution they chose here to make all cpu’s show the same perf because the gpu is bottlenecking everything, represents less than 1% of the population and I will be RIGHT. Introducing a variable that totally invalidates the entire premise of the experiment is not subjective, it’s misleading at best and easily proved wrong as I have done. My message travelled far enough as nobody missed it as far as I can tell. Mission accomplished, gentle or NOT ;)
If you don’t like my posts, To quote you:
“why can't you just look past them? What is your problem?”
“why don't you just... leave?”
:) Gee, it seems I've upset you ;)
"What are you doing here - are you some sort of freedom fighter for objective data on the internet?"
Already answered and YES, why not :) What are you doing here? Are you some kind of smart alec that objects to people voicing their RELEVANT opinions in a "comment" section? Silly me, I thought that's what this section is for. Can we get back to discussing the data now? You've distracted us all from the topic at hand long enough and it isn't changing the data one bit.
OwnedKThxBye - Thursday, May 16, 2013 - link
Sorry for seriously crossing the line, good sir, but I still reserve the right to hate you if I choose. A wise man once wrote: "We are FAR too sensitive today. It's like nobody can take a criticism these days and the person who gives it is evil...LOL." <--- this is you =). Keep in mind I was also the first one to agree with you… What you write never fails to bring a smile to my face, TheJian, and I hope you don't stop pointing out the truth any time soon. Just try to keep the next comment shorter so we can read it without so much scrolling..... we don't all own LCDs with 1440+ vertical pixels like we are told to. In the end, all we can pray for is a few less gamers running out to buy an A8-5600K for their HD7970, and for a few of your points to be taken into consideration next time round.
yhselp - Sunday, May 26, 2013 - link
First of all, I’d like to apologize for this long-delayed response – I simply didn’t have the time.

Truly epic. To start off, you haven’t upset me, really; not before and not now – I was genuinely curious as to what it is you think you’re accomplishing by all this (not just this article, others as well). Thus, I set forth to playfully provoke you into responding. Success. Now that you’ve answered, and to be fair – more clearly than expected – I have a better understanding of what urges you to do what you do. Such a peculiar case you are, I am fascinated – are you a troll or aren’t you? Somewhere in between, I guess. The arguments you provide are sound, although I still think they’re a bit… let’s not use a word, as I’m sure you will twist it into a meaning of your choosing (not the originally intended one); and most of what you say is, well, adequate – all of that makes you not-troll after all. Despite the fact that you would’ve probably responded to anything anyway, I still feel that a ‘thank you’ on my side is necessary for your taking the time to respond; and I’m not being ironic here.
Now, let’s get a few things out of the way. Note that I’m neither defending nor criticizing AnandTech; I’m simply voicing an opinion, just the way you are. Very important – I never said it was okay to hate you, or anybody for that matter; you deduced that yourself. I simply agreed with the gist of what OwnedKThxBye said. You cannot cling to every word you read online; I don’t think anybody here truly feels hate, certainly not me. People just throw words around in the heat of the moment, just the way you debate vigorously; I’m sure you understand that. The semantic field of the word ‘hate’ in 21st-century contemporary English is huge, especially when used in this type of discourse.
Why would you blame me for distracting “us all” from the topic at hand when you are the King of Sidetracking? Gotta love your insights on US politics – it’s like watching one of those documentaries on History and the like. My favorite part is about “gun rights” – nice, so eloquently put. The only reason we still have the Second Amendment is that the US cannot just change the Bill of Rights, which is part of the oldest acting constitution in the world – it’s a matter of national pride. The reason it was written is a historical occurrence no longer valid. During Colonial times the settlers had to harbor British soldiers, who often mistreated them, and so the settlers needed a means of protection. That is how the Second Amendment came to be. Obviously, this is no longer the case. You could argue the right to bear arms is part of Americanness, but that doesn’t change the fact that the original, intended reason for the Second Amendment is a thing of the past.
Checks and balances for the consumer computer industry – so amusing. Manufacturers, reviewers and consumers each checking on the others; that is such a utopian concept. You say it doesn’t work for a country’s government; how do you expect it to work for an industry where money is king? There will always be hidden agendas; you can’t stop that.
I believe I’ve discovered a new form of entertainment, and that is reading Jian’s comments. You, sir, are crazy. I don’t mean this as an insult. Keep on fighting the good fight, I can’t wait to read more of your comments; and, please, never stop sidetracking and using internet abbreviations such as LOL.
azdood - Wednesday, May 15, 2013 - link
Hi Ian, have you ever considered testing time between turns on Civ5? CPU makes a HUGE difference, especially as you get deep into a game.
tential - Thursday, May 16, 2013 - link
This is aimed partially at that Jian guy and at everyone. I understand the desire for high-end GPU reviews, but using your OWN earlier posts, you stated that the majority of people game at 1080p. If that's the case, what's the point of pushing for a 7990, Titan, FCAT review when quite frankly NO ONE HAS THOSE CARDS, according to your own data and posts from the previous page?

To me it seems like you're just trolling. However, because you brought up the point of affordability, I think that's where the majority of reviews should target. YES I want to see how the 7970 and the GTX 680 perform, yes I want to see the next gen too, but I really don't think we should waste so much time on multi-GPU setups that under 1% of the gaming community has.
How about more reviews on upgrade paths, price to performance, how to get the most performance at a reasonable price point? That's what I care to see. Any review in which the hardware being tested exceeds $2k (I mean the additional hardware) is just boring to me, because at the end of the day I'm not buying two Titans, or two 7990s, or even 3 7970s.
This is of course my PERSONAL opinion, but considering the data backs it up, I'd like to see more reviews catering to the average gamer (and by average I mean the gamer who reads reviews and makes educated price-to-performance choices).
This review kind of tries to do that, but in all reality we aren't gaming at 1440p, so more reviews on how to get the best performance at 1080p for a good price, while leaving us a decent upgrade path, would be nice.
FriedZombie - Friday, May 17, 2013 - link
Could you possibly go back to some slightly older processors and GPUs? In particular, the i7-990X would be a great start, and the lower and upper end of AMD's 6000 series would be nice too (it seems a LOT of people upgraded from the 5000 series to the 7000 series this year). A benchmark of The Witcher 2 would be nice as well, as max settings with Ubersampling turned on is extremely taxing on both CPU and GPU because of how inefficient CD Projekt's RED engine is.
ol1bit - Friday, May 17, 2013 - link
All I can say is WOW! Nice work!
qulckgun - Sunday, May 19, 2013 - link
62 yrs old, play ~150 hrs a month. Ready to build a new PC. Know next to nothing about building a new PC. I read various forums and articles and find the comment sections are great at clearing up some of what I didn't understand in the main article. That being said, this is one of the most entertaining comment sections I've read in a while, and it was pretty informative. It's helped me put my hardware choices into perspective. Please, let's agree to disagree, but in a respectable manner. Thank you all for your comments and responses; it's an education.
This was a great article! I'm surprised you didn't use a QX9770 for socket 775. Any reason for that?
bds71 - Wednesday, May 22, 2013 - link
Ian - since the new 4K TVs are out, I think these types of reviews are very indicative of what we can expect once we are able to hook a PC up (using multiple outputs, such as Eyefinity or nVidia Surround) to a single-input 4K TV. For those who don't know, the new 4K standard (3840x2160) is equivalent to Eyefinity or nVidia Surround at 1080p, but with 4 monitors instead of 3, and in a normal 16:9 format rather than the super-wide 3-screen setups. ie: --|--|-- vs ==|== (note: equivalent resolution, but not actually 4 monitors :)

Can't wait for THAT testing to begin. Assuming an owner can turn off overscan (so you can see the taskbar at the bottom), I indeed intend to purchase one (likely soon) and would definitely want to hook my PC to it. My GTX 690 would likely be able to do OK at such a resolution, but I would eventually want to get another 690 - as soon as I could figure out how to utilize the second card with only a single HDMI input on the TV.
As far as Blu-ray content - if you wait....it will come :)
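(The pixel math behind that "4 monitors instead of 3" comparison, for anyone who wants to check it - just an illustrative snippet, not from the comment above:)

    # 4K UHD is exactly four 1080p panels' worth of pixels, in a 2x2 grid.
    assert 3840 * 2160 == 4 * (1920 * 1080)   # 8,294,400 == 4 * 2,073,600
    print(3840 * 2160, "pixels vs", 1920 * 1080, "per 1080p panel")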
Xistence - Wednesday, May 22, 2013 - link
Very nice and long explanation of what most already knew: GPU-bound games in single player do not stress the CPU much. However, once you go online or play a CPU-bound game, this information is worthless, as AMD will come crashing down at about 40% lower fps and your GPU won't be the bottleneck.
These differences in video processing benchmarks amount to nothing, as the clips are too short and simple. Take a full-blown Avatar or Inception movie on Blu-ray and shrink it using Handbrake or the like.
Suffer through the process and you will see that the video produced by a processor that doesn't cost $1000 is trash: it lacks audio and captions in some parts, or some frames are lost.
And then you will see the light: why people are buying processors with no faults from the production line.