Hard Choices: AMD CPU update

Points and very possibly a prize...

Bulldozer and Zambezi begat Piledriver and Vishera. Say that three times before bed for a month and you might just remember AMD’s latest chips. So it goes with silly CPU code names. Completely lost you? AMD has just tweaked its FX-branded PC processors. Out goes the AMD FX 8100 series, in comes the AMD FX 8300 series. But am I bovered? And should you care? Read on to find out.

The first thing to appreciate is that the revised Piledriver cores inside the 8300 series do not represent a dramatic new processor architecture. We’re still talking 32nm SOI manufacturing tech and largely the same core design as existing Bulldozer FX processors. Indeed, you don’t even get any extra cache memory. Has anything actually changed?

Most obvious is an uptick in clockspeed, model for model. The existing table topper is the eight-core 8150. It’s nominally clocked at 3.6GHz and Turbos up to 4.2GHz. The new 8350 cranks things up to 4GHz and 4.2GHz.

Yup, that’s no change in top Turbo frequency. But that’s fine by me since you’ll only get the full 4.2GHz with half or fewer of the cores fully loaded. In that context, the 400MHz increase in the basic, all-cores-enabled clockspeed is pretty useful.

It’s a similar situation with the more affordable eight-core 8320. However, the six-core and quad-core models get a boost on both basic and Turbo frequencies, albeit a more modest 200MHz across the board.

Proc-in-a-box: AMD has tweaked its FX chips

Elsewhere, AMD has polished up areas of the chip that I frankly struggle to care about and scarcely dare bore you with. Like more aggressive schedulers, improved branch prediction and replacing soft-edge flops with hard-edge flops where possible (if you don’t ask, I won’t tell), which helps with power consumption. Like I said, no big changes.

Thus it’s really clocks and pricing that are of interest. As regards the latter, we’re looking at pretty much the same pricing as the outgoing equivalents. Well, a quick scan of the usual online suspects suggests a small premium over the 8100 series, but hopefully that will cool off over the coming weeks.

Put it all together and these new Piledriver chips are not about to blow Intel’s de facto gaming king, the Core i5-3570K, off the map. Or even the old 2500K, for that matter. However, since we last covered CPUs in depth, I’ve done a bit of soul searching and my attitude to AMD processors has shifted slightly.

To be absolutely clear, the Core i5 is still my pick. But I’m increasingly coming round to the idea of good-enough computing in some scenarios. In fact, you could argue that Intel is, too, since its own CPU upgrades have become increasingly incremental of late.

I’ve just taken the new Core i7-3970X for a spin and it’s a pretty epic case of sandbagging. It still has two cores locked out despite the stupid £800 price tag. And it’s barely any faster than the existing Core i7-3960X.

Meanwhile, back in the meat of the mid-range CPU market, Intel has been stuck on four cores for several years. And that won’t change when its new Haswell generation of CPUs arrives next year. For a while now, Intel has been mainly concerned with mobile when it comes to regular client-type PCs rather than servers or workstations.

At the same time, Intel does a lot of things that rather get my goat. Like switching off HyperThreading in certain chips. Or locking out overclocking in most of its processors, switching CPU sockets at the drop of a hat and restricting software updates for its storage technology to the very latest chipsets.

HyperThreading: Sir Not Appearing In This CPU

Meanwhile, AMD offers straightforward, reliable hardware that may not be the fastest but makes sense at certain price points. Typically, you get fully unlocked chips and platforms with maximum backwards compatibility. You could argue AMD only does things this way because it’s playing perennial catch-up. But whatever the reason, it’s a lot less antagonistic to its customers than Intel.

The problem for you fine fellows of RPS, of course, is that the application type that makes Intel’s chips look best just so happens to be games. At the same time, 120Hz monitor availability is beginning to pick up. And I do love 120Hz, both in-game and on the desktop.

I’ve just had a play with the first mainstream monitor pitched primarily on its 120Hz prowess, the Iiyama ProLite G2773HS. It will, of course, work with NVIDIA’s 3D Vision tat. But Iiyama has given it 120Hz chops mainly for the benefits that brings in 2D mode. So while PC game engines haven’t exactly been busting boundaries of late in terms of CPU loads, if you want to make the most of that 120Hz panel, there’s no getting round it. You need an Intel chip.
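To put a rough number on why 120Hz is so demanding: the frame-time budget halves versus a 60Hz panel, so the CPU has far less time to prepare each frame. A trivial sketch of the arithmetic (the function name is mine, purely illustrative):

```python
# Frame-time budget at a given refresh rate: the CPU and GPU together must
# finish each frame inside this window to feed every single refresh.
def frame_budget_ms(refresh_hz):
    return 1000 / refresh_hz

print(round(frame_budget_ms(60), 2))   # → 16.67
print(round(frame_budget_ms(120), 2))  # → 8.33
```

That 8.3ms window is why games that are CPU-throttled at 60Hz become even more so at 120Hz.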

All of which means I must grudgingly maintain my Core i5 recommendation for gamers who play a lot of system-intensive titles. For everyone else, and that includes serious gamers who frequent, shall we say, a more cerebral and less visual games catalogue, I’m much more enthusiastic about the AMD option of late.


  1. Stardog says:

If they had names that made sense I might actually be able to take a look at AMD processors… This has been the case for well over a decade.

  2. Dana says:

    I couldn’t care less about a brand. But I will always pick the best in price-performance.

    • SelfEsteemFund says:

This has always been my philosophy too, I did almost feel bad about switching to Intel a while ago but until AMD pull their fingers out they’ll continue to lose market share. Steam’s HW survey says it all, it’s never been this one-sided link to i.imgur.com link to store.steampowered.com

      • SkittleDiddler says:

        Intel has a great marketing department.

      • ThTa says:

        It’s not so much that AMD has to “pull their fingers out”, it’s that Intel spends roughly eight times as much on R&D per year and can easily afford to – they spend more on R&D than AMD makes in pure revenue. This gap in performance is only going to increase unless AMD finds a different approach, leveraging their increasingly few advantages. The main thing they had over Intel for all these years was GPU performance, but even that gap is rapidly closing as Intel’s IGPs continue to improve massively.

        And really, AMD isn’t the only one suffering from Intel’s sheer market dominance and R&D abilities. The ARM SoC sector is going to suffer, as well – Intel was the laughing stock of mobile processing when they committed* around 2010, with units that were far too power- and heat-intensive to compete, and even their performance was relatively poor. But in just two years, their mobile Atom (currently Medfield) line has managed to catch up with a sector that’s been going strong for over a decade. It’s not quite there yet, but that achievement should be pretty indicative of things to come, and Intel isn’t about to give up just because their actual products aren’t profitable yet: They intend to dominate the mobile space, just as they dominate the PC space.

        *(Do note that they had MID-oriented initiatives several years prior, but their roadmap has been massively accelerated since 2010-2011.)

    • PoulWrist says:

      Me too. I had AMD for many years now, and I had Intel before that. Now, I’m back on Intel. Why? Because it was just the best value for the money, and I expect to only upgrade CPU every 3 years.

  3. varangian says:

    >But I’m increasingly coming round to the idea of good-enough computing in some scenarios

    I wonder how important ever faster CPUs are any more. My system is about 5 years old (at least, can’t remember exactly when I bought the bits) with an Intel dual core E6750 cpu inside. I play the usual range of games, GTA, Skyrim, Crysis, Xcom etc. and I’ve yet to find one that stresses out the CPU. I upgraded the graphics card a couple of years back as the original was just carried over from the previous h/w configuration and was therefore c. 2004 vintage. That did make a difference as I was able to turn on higher quality graphics. No doubt sooner or later some epic game will come along that needs a faster CPU but the days of upgrading every couple of years just to be able to play the latest games seem to be over.

    • monsterZERO says:

      Arma 2 is that epic game (at least for me).

    • fish99 says:

You’d actually be surprised how much FPS you lose by not having a good CPU. Overclocking my i5-760 (quad) from 3.16GHz to 3.8GHz gets me around 20% extra FPS in most games, even though those games don’t use anywhere near 100% CPU capacity at 3.16GHz. Doesn’t exactly make sense to me, but there you go.

Of course whether you need that extra 20% is another discussion. For me, trying to run Skyrim in stereo 3D with Nvidia’s 3D Vision “tat” (as Jeremy puts it), where everything needs to be rendered twice, it makes a significant difference.

    • Emeraude says:

      Yeah, as someone who’s been suffering from upgrade fatigue for years now, I can’t thank enough the usual suspects in the “holding the PC back” narrative (whether they’re guilty or not). Especially since most of that raw power has for the most part been used for things I find rather meaningless; with a few notable exceptions, we’re still playing the same games, or even more simplistic ones, only with a new coat of paint. And the ones that *are* doing something interesting and new benefit more from good design than raw power anyway.

      Your mileage may vary. Please let’s not talk about Dwarf Fortress.

    • fish99 says:

      Btw I just noticed you list GTA. GTA4 definitely benefits a lot from a fast quad core.

    • Jason Moyer says:

      I have a 5770 GPU, which is pretty modest for 1080p gaming, and I recently upgraded from a 32-bit XP system with an E7200 Core2Duo to a 64-bit Win 7 system with a stock i5-3570. I’ve seen a slight improvement in the framerates of every game I play, although the main benefit has been performance stability. But anyway, even something as crappy as an old 5770 is bottlenecked on slower CPU’s.

      • Eddy9000 says:

I’ve been running my i5 with a 5770 for the past 2 years or so, currently playing Dishonored on max graphics at 30fps. It’s something I really like about the console development cycle, it almost guarantees that your setup will be adequate for the next 4 years. Won’t even think about upgrading till the next gens come out. Sure it means that graphics aren’t necessarily pushed as high as they might be on PC-developed games, but the constraints seem to really make optimisation a priority.

        • d34thm0nk3y says:

          They only make it a priority to optimise their games to run well on consoles. They seem to just hope that PCs will all have better hardware to be able to compensate for not optimising.

          • Eddy9000 says:

There seems to be some crossover however, games like Dishonored and Skyrim seem to work very well on ageing hardware.

      • Casimir's Blake says:

        I wouldn’t call a Radeon 5770 “crappy”.

        I have a Phenom II X6 1055T-based system running a passively-cooled 5750. XCOM plays almost flawlessly at 1920×1200 (not 1080!) on mostly-high settings.

        I would argue that, if one owns a very good Core 2 (E6600 or above, E7xxx, certainly any Quad) or a Phenom II at a decent clock rate with a similar video card, they shouldn’t feel the need to upgrade any time soon just to improve their gaming. Having at least 4GB of RAM and an SSD for an OS or main drive would be more recommendable upgrades. Though we are now seeing more games making use of 3+ cores, so this is something else to consider.

    • TheMopeSquad says:

Speaking as a person who is still running off an AMD single-core processor, I have still been playing titles like Borderlands 2, Diablo 3, and Dishonored. The need to get only the top of the line feels more like a social stigma. If you were to opt for “sub-standard” performance with AMD, would it really affect your modern gaming that much, when these titles can still run on a paleolithic rig like mine?

    • SuicideKing says:

      Dude, i have a Core 2 Quad, and i can see i’m very clearly CPU bottlenecked in newer games.

  4. lucian says:

    Why would you not be excited about better branch prediction and scheduling? Besides caches, those are the sort of things that make a big difference, much more so than clocks.

    Most modern CPUs have peaked when it comes to gaming performance (and most other things). That’s great, one less thing to care about.

  5. Premium User Badge

    james.hancox says:

    Nice picture of the river Vishera there. :)

    EDIT: Do I get a prize?

    • sonofsanta says:

      Google Search-by-Image does make such contests rather trivial now, I fear :/

      • Premium User Badge

        james.hancox says:

        Hah, I didn’t even think of that, I just knew that AMD’s CPUs were named after rivers.

      • Lambchops says:

I thought it looked like somewhere in Scotland so I was all ready to make an idiot of myself!

      • yurusei says:

        I’m sure you meant some of the CPUs are named after rivers.

        Unless there are rivers named Bulldozer and Piledriver somewhere…

    • Jeremy Laird says:

Google image search ruse duly noted, but I’ll give you the benefit of the doubt on this occasion.

      What is your current CPU and motherboard?

      • Premium User Badge

        james.hancox says:

        Phenom II X4 965, on a crappy old Gigabyte AM2 board (M61PME-S2P). Trying to squeeze as much out of the old girl as I can!

        EDIT: I tell a lie, it’s the 960T.

      • Premium User Badge

        james.hancox says:

        Oh yeah, thanks for benefit of doubt! :)

  6. ArthurBarnhouse says:

    Since every high end video game I’ve played in recent memory simply recommends a “Quad Core Processor” rather than a clock speed, wouldn’t it be better to just buy an AMD Phenom II 965 Black Edition for about half the cost of an i5 Quad Core processor? You could take the extra $100 and buy a better graphics card.

    • Sparkasaurusmex says:

      You are a genius. This is how I like to build. Save a bundle going AMD and shift the savings over to graphics card.

    • Alastayr says:

      Not necessarily. Look here: link to anandtech.com

The chart speaks for itself. While the X4 965 BE is “good enough” (it’s currently powering my old gaming rig as well) that Intel lead will only increase as you ramp up the resolution to 1080p, 1200p or even 1440p. If you’re one of those “in the middle” gamers you may be satisfied with that AMD CPU, but it’s three years old now, architecturally a bit long in the tooth and nowhere near competitive.

      • ArthurBarnhouse says:

        Oh don’t misunderstand. I’m not arguing that all quad core CPUs are at parity. But if you had $300 to spend on a CPU and graphics card, surely a 965 BE and a gtx 660 is a better choice than an i5 3.6Ghz quad core and a Radeon 7770.

        • Alastayr says:

          Oh, totally. I was just elaborating on your point as to give less informed people a bit more perspective. At this point in the hardware cycle, it’s definitely GPU > CPU.

      • SkittleDiddler says:

        I’m just purely in Bitch Mode here, but I wish people would stop linking to Anandtech articles when making CPU comparisons. They’re heavily biased towards Intel products, and I’d go so far as to say their benchmarks are untrustworthy. There are certainly more reliable websites for those who are shopping around for a new (or old) CPU.

        • MacTheGeek says:

          Such as?

          I tend to use Anandtech for GPU comparisons rather than for CPUs, but I’m always open to finding new reviewers and methodologies.

          • Admiral Crunch says:

            I like Techreport, especially their 99th percentile frame rate charts. More useful metric than balls out max FPS averages IMO.

          • SkittleDiddler says:

            Tom’s or Overclockers for example.

          • Dahoon says:

Tom’s? I’ll admit I don’t go there, so I might be a few years or five behind the times, but since when did Tom’s Hardware, of all places, become a go-to site for trustworthiness and unbiased articles? I remember it as being for hardware reviews what GameSpot is for game reviews.

There’s even a forum thread on the other site you mention (Overclockers) about Tom’s, where a former writer from Tom’s agrees that it is biased. Without going there to look, I would bet a beer that anyone going there for buying advice will end up with an Intel CPU, an Nvidia GPU and whatever latest shiny thing Microsoft has spit out, no matter if it is a rig for gaming or e-mails.

            Bet on? (;

          • X_Bacon says:

            Actually, they recommend AMD’s GPUs frequently on their articles, and used to do the same with CPUs before Sandy Bridge and the Bulldozer flop. Their benchmarks are quite extensive and their reviews are pretty much consistent with other sites’ as far as I can tell, so I’d say they’re clean now.

If in doubt, there’s also TechPowerUp and Guru3D.

    • MrLebanon says:

      my Phenom II X6 1100T has been doing A-OK with all i throw at it (1080p high setting games.. lots of multitasking), with a good overclock it’s a breeeeeeeze

      • Casimir's Blake says:

        Scan currently have the 1045T for a pretty good price. Seems like an ideal cheap upgrade for AM3 systems still with 2 cores. Not sure it’ll be a huge change over a 4-core Phenom II for gaming, though. I have a 95W edition of the 1055T in my main PC, and only rendering in Blender heavily taxes it.

    • Vorphalack says:

      That’s exactly the thought process I went through when I did my last round of upgrades. Haven’t regretted it in the slightest, that processor is more than adequate for any current PC game.

    • Psihomodo says:

      I currently have an AMD X4 965 BE with a 5870 HD Vapor-X and to this day I play all the games that I can remember on high with more than 40 fps on 1920×1200, and that is quite enough for an old machine.

  7. AndiK says:

    Could someone please explain to me the connection between Iiyama offering a 120Hz panel and their users needing an Intel chip? I feel like I’m missing something there.

    • frightlever says:

      Well… I’m no expert but because AMD is only a viable option for cheap (as) chips, whereas Intel commands the higher performance bracket and 3D as in 3D gaming requires more CPU horsepower.

      So, in conclusion, brown horses are best.

    • MrLebanon says:

      i believe because a 120hz panel will look best at 120 FPS.

      • SkittleDiddler says:

        Are you implying that AMD chips can’t manage to get 120FPS? Because that’s bullshit.

        • MrLebanon says:

i wasn’t the one who wrote the article. I’m merely stating what I understood of the article. Jeremy recommended an Intel chip for a 120Hz monitor, this guy asked why. Knowing that a 120Hz monitor will look best at 120FPS, I gave my perceived reason why this was stated (e.g. Intel does a better job at hitting higher frames than AMD, according to the article’s conclusion).

          Cool off your CPU before jumpin’ at me.

          • SkittleDiddler says:

            Ok, so you were just parroting what you read without adding any context to your response. Gotcha.

    • Jeremy Laird says:

      Simple. Where games are CPU throttled, Intel will give you higher frame rates. There are times where even with an Intel CPU, you won’t be able to feed 120Hz. That scenario will only be more common with an AMD CPU – at least with anything they currently offer.

      Thus if you are buying a 120Hz monitor for gaming, you’ll get the best out of it with an Intel CPU.

      • Dahoon says:

But it isn’t that simple. Yes, in a world where money is home-made or hardware grows in your garden, maybe. But in this world it is very likely that -for most people’s budgets- buying an Intel CPU will mean you end up with a cheaper GPU, thus moving the bottleneck to a much higher-impacting component in the system and making it even worse.

I.e. in the end you have less FPS than with a cheaper AMD CPU and a better GPU. Saying Intel > AMD for 120Hz gaming is either not looking at the big picture or is biased. Well, that or it’s for rich kids.

        It’s almost like saying a Ferrari is better than a Ford to transport you quickly to work but in the end having to drive it real slow to afford the low gas mileage (not that a Ford is much better) (;

  8. Sparkasaurusmex says:

    So about how many graphics per dollar?

  9. Alastayr says:

    AMD is still generations away from delivering a competitive CPU with regards to power consumption though. Just look at the charts in Anand’s review located here: link to anandtech.com

    The FX-8350 is consistently slower than a years old i5 2500K and draws up to 80 Watts more power under load. 80 Watts! And both CPUs retail for about the same price here in Germany. Under a general usage scenario of 4 hours a day under full load, 360 days a year and estimating the price of a kWh at 0.25 € the AMD CPU is going to cost you at least 28.8 € more every year. That does not include the higher power consumption during idle. And it’s still a much slower CPU!
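The running-cost arithmetic above checks out, and can be sketched in a few lines (the 80W delta, 4 hours/day, 360 days/year and €0.25/kWh figures are the commenter’s estimates, not measured values; the function name is illustrative):

```python
# Rough extra annual electricity cost for a chip drawing `extra_watts`
# more under load. All inputs are the estimates quoted in the comment.
def annual_cost_delta(extra_watts, hours_per_day, days_per_year, eur_per_kwh):
    extra_kwh = extra_watts / 1000 * hours_per_day * days_per_year
    return extra_kwh * eur_per_kwh

# 80W more under load, 4h/day, 360 days/year, €0.25/kWh
print(round(annual_cost_delta(80, 4, 360, 0.25), 2))  # → 28.8
```

Note this still excludes the idle-power difference, as the comment says, so the real gap would be somewhat larger.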

    Balls. Really. I was hoping AMD could score a hit after Intel basically vacated the playing field and went dabbling in mobile, but AMD struggle to even stay relevant in the desktop environment. Hopefully their next architecture can redeem them, I’d love more competition in the CPU sector.

    • MacTheGeek says:

      AMD has also de-emphasized desktop CPUs, focusing instead on its APU line and on the low-power mobile and tablet markets.

      AMD is betting heavily on HSA rising up to supplant the traditional PC architecture.

  10. clem2k3 says:

    The thing that people always miss on these comparisons is the cost of the Motherboard.

AMD chips go in cheaper motherboards. It makes a massive difference in system cost which, as said above, can go on the GPU.

So yeah, it’s not as powerful as an Intel chip … but for a whole bunch less moolah I’m more than happy with my Phenom 1090T.

    • iainl says:

      On the other hand, when I last bought a motherboard and CPU, I went Intel because none of those cheap AMD boards seemed terribly reliable.

      • Vorphalack says:

        They only need to last long enough to reach the next hardware cycle. My previous mobo was a £50 budget special from AS Rock, built for a rig when I was a broke student that complained about the price of a tin of beans. Lasted 3 years, died just as I was considering its replacement, did its job admirably for the price.

    • Daedalus207 says:

      Only it doesn’t really. A good midrange AMD 990X motherboard (Say the Asus M5A99X EVO for $130USD) isn’t much cheaper than a good midrange H77 or Z77 Intel board (such as the Asus P8Z77-V LK for $140USD). When you consider the total cost of a major system upgrade, which in my case included motherboard, CPU, and RAM, the minor price difference between motherboards isn’t enough to sway a purchase decision, especially once you realize that the FX-8350 CPU and the i5-3570K cost the same.

    • fish99 says:

      Plenty of budget intel chipset boards around.

  11. Schrodinger says:

    The Problem the CPU and Graphics industry is suffering from at the moment is the fact that we are about to reach the limit of what Transistors can do without killing themselves…

New tech is being worked on, but until breakthroughs are made with either crystal, quantum or bio computing i don’t think we will see much other than small incremental jumps in performance and not much revolutionary.

    our current tech is limited by the laws of nature and physics…

Now as AMD vs Intel goes, AMD is cheap and works and i personally use one in my main rig, but for my girlfriend i am getting an Intel i5 because she has the money to spend on it and it is better than what AMD has right now, and she’s not an overclocker so the fact that the processor is locked down really does not matter…

    • Rise / Run says:

      I’m gonna go out on a limb here, but future tech (even bio, strangely enough) may also be limited by the laws of fyziks.

    • PoulWrist says:

IBM are working on a new technology that they’ve had some rather big breakthroughs in lately for carbon nanotube transistors, which could be a breakthrough once they can manufacture them at the same density, but they did just increase the previous maximum by a factor of 1000.

      • Schrodinger says:

        Yes all tech has its limits but right now we need to find something that moves us away from being on the edge of the limit…

Carbon nanotubes i forgot, but yes IBM and other big names are putting a lot of money into R&D right now because they know that we have to make some breakthrough or we won’t be able to improve CPUs, GPUs and RAM (not that much affected)

        What Tech will be the next generation i have no idea… Quantum seems promising but its complicated stuff. Carbon Nanotubes are very interesting…

        They do consider using “Crystal” as storage and no not like in Stargate and such… but they are having problems with the fact that they need a perfect crystal every time they synthesize one… and if the Crystaline Structure is off it won’t work as intended…

  12. Alastayr says:

    Yeah, their chipsets are really a bit wonky in places. My current 790 FX board from DFI has piss-poor CnQ support and USB performance is consistently worse than what I get on Intel machines (X58 and Macbook Pro). I maintain three AMD machines and I don’t think I’ve had a smooth ride on any of them.

  13. Premium User Badge

    Nathan says:

Is it still the i5 2500K that you’d recommend? Most other recommendations around for mid-range systems (the PCG rig, Anandtech’s guides) are pointing at the i5 3570K at the moment.

    • yourgrandma says:

Yes the 2500K is still an excellent processor. The Ivy Bridge version is only 6-7% faster i believe. Get a decent heat sink (the one that comes with the processor is a POS) and it’s easily oc’d to 4.3-4.5GHz and you have yourself a processor that will still last you years.

      • Oh Tyrone says:

        I’m ty

        • Oh Tyrone says:

          I’m typing on a 2500K machine right now, and it’s actually so fast it sent this comment before I typed it.

          EDIT: What I meant was that it sent that other comment before I typed it. By which I mean this one. They’re actually the same, only one got sent before it was typed.

          • MarloBrandon says:

            Thank you for that clarification. I’m on a 2500K machine myself, and initially I thought that my machine displayed your comment so fast that it didn’t get the chance to finish downloading, but it turns out it was actually your machine after all.

    • Naum says:

      They’re basically the same thing. 3570K is the direct successor to 2500K, and between the Sandy Bridge and Ivy Bridge generations barely anything about the core processor has changed. So for around 10€ more you get ever so slightly better performance and a much better integrated graphics unit on the 3570K, which I find reasonable.

      Edit: Ninja’d of course. As you see, no relevant increase in typing speed on my 3570K vs Tyrone’s 2500.

      • Daedalus207 says:

        My computer is located in a small bedroom and the TDP reduction down to 77w was my main motivation for going Ivy Bridge over Sandy.

    • SuicideKing says:

      Should you get a 3570K?
If you have a Sandy Bridge i5 then no; otherwise, for anything older, yes.

      unless you have low power requirements (or any other special requirements like PCIe 3.0) like Daedalus207.

  14. spleendamage says:

    120 Hz is the way to go, but with the new crop of 2560×1440 monitors coming out now, I am looking to move past standard HD in my 27″. Has anyone here used low price Korean monitors like the Yamakasi Catleap 27″, or Crossover 27Q for gaming?

  15. return0 says:

    This article came at a perfect time, but doesn’t really answer my questions about Piledriver. I’m wanting to build a budget gaming PC, and I feel like I may be looking in the “specific price points” where the article mentions that AMD starts to make sense. I’m looking for a CPU in the <$150 range, which makes both the X4 965 BE and FX-6300 look like very good options over their Intel equivalents. That being said, I don't want to be forced to OC straight out of the box to drive modern games (Dishonored, Borderlands 2, Bioshock Inf., etc.) at decent graphics settings. Does anyone have an opinion as to whether to go with the FX-6300, the X4 965 BE + $40 extra graphics card, an Intel price equivalent (i3, i think?), or budget graphics and splurge for an i5-2500k?

    • MacTheGeek says:

      The FX-6300 is $140 at Newegg… but the i5-2500K is $160 at MicroCenter. If the rest of the parts (motherboard, memory, CPU cooler) are the same price, I’d really be tempted to find a way to scrounge up that extra $20.

      (Actually, I made this choice almost a year ago. Went with the 2500K. But don’t start buying today — Black Friday and Cyber Monday just might be very very good to you.)

      • return0 says:

Thanks for the advice. What I may just do is plan everything but processor/motherboard and then just buy the best processor/mb that I can get for cheap when Black Friday rolls around. Hadn’t seen the 2500K for $160, that’s tempting as hell. Half of me feels that it may be worth putting more money into CPU than graphics card, purely because it’s almost guaranteed that the motherboard socket du jour will have changed when it becomes time for a CPU upgrade.

        EDIT: JKjoker, thank you for your advice as well.

  16. alilsneaky says:

    Amd bulldozer (and by extension the new ones) are unfit for gaming (as much as I dislike intel for their non-K proc and H mobo shenanigans).

    Mr Lairs should source from this article: link to techreport.com

    While higher end bulldozer cpus may on paper be on par with phenom II cpus (average framerate), they are unable to deliver a consistent framerate.
    They pump out way too many single slow frames, causing a 50 fps framerate to feel jittery and stuttery.
    This makes them completely worthless for gaming purposes.

    Even the 4 year old Phenom II does a lot better in this regard (they were absolutely stellar in value for years). Either keep your phenom II’s or switch to intel, there is simply no valid amd upgrade path.

    • MrLebanon says:

      I want a Phenom III. Phenom II kicks ass… bulldozer/zambezi/piledriver whatever is not doing so hot in its infancy

  17. fish99 says:

I would like to know what exactly makes 3D Vision ‘tat’ btw. If you don’t like 3D that’s one thing, but calling it tat implies it’s a shoddy badly made product, whereas both of my kits are still working 100%.

    • fish99 says:

      Hmmm…. no answer. It’s getting to the point where I barely even read the stories here on RPS, I just scan them for the actual news and then check the comments for the inevitable corrections by people who know better. The actual staff here seem prone to making some of the dumbest comments, like John (IIRC) with his “there’s evidence that piracy almost never leads to lost sales” and despite being challenged on the subject we never saw any such evidence.

      So 3D Vision is tat then. You know my PS3 actually broke, my 360 broke twice. Are those also tat?

  18. El_MUERkO says:

I wish AMD would allow dual-CPU-supporting mobos, two eight-core Piledriver CPUs would help performance and still cost less than a higher-end four-core Intel :)

  19. bill says:

    That was mostly greek to me. The naming and ranking of processors seems to have gone crazy since I last looked… it used to be easy: 12Mhz was slower than 16Mhz. 1GHz was slower than 2GHz. 2 Cores was better than 1. How the heck should I know if an i5-85672 is better than an i7-90882 FX?

    Is there any logic to these names at all? How do i know which chips in a new laptop are great/decent/useless?

    • SuicideKing says:

      Well, for Intel it’s like this:

      Pentium: Dual core
      Core i3: Dual Core + HyperThreading, addition of Intel HD2000/2500/3000/4000 graphics core
      Core i5: Quad Core, hardware accelerated AES encryption, i believe addition of some L3 cache, Turbo Boost.
      Core i7: Quad Core + HT, more L3 cache.
      i7 extremes: 6-core + HT, more L3.

      -T : very low voltage edition
      -S : Slightly lower voltage
      -K : Unlocked Multiplier
      -X : Extreme edition
      -M : Laptop CPU
      -QM: quad core laptop CPU

      There is some variation in core count, an i5 (usually the -T/S and -M models) may be dual core + HT, non -QM mobile i7s are dual core + HT too.

      The first digit after the family denotes the generation. So an i5-2500 is a Sandy Bridge chip and an i5-3550 is the third generation, i.e. Ivy Bridge.

      see ark.intel.com for more.

With AMD, i’m less familiar, but the A-series (A4/A6/A8/A10) are Accelerated Processing Units (GPU + CPU on the same die) (technically so are Intel’s Sandy and Ivy Bridge chips, but Intel doesn’t use the APU term like AMD does) while their FX-4000/6000/8000 series are pure CPUs.

Now the APU series has Llano and Trinity, Llano being based on the Stars arch from the Athlon II while Trinity is based on Piledriver.
Last year’s FX series is based on Bulldozer while the present 8300 series is based on Piledriver.

      Now Bulldozer/Piledriver isn’t a traditional core/thread setup, so go read about that on Tom’s Hardware and AnandTech.
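The Intel decoding rules listed above can be captured in a toy parser. This is a sketch of only what the comment describes (the generation and suffix tables are deliberately incomplete, and real Intel SKUs have more exceptions than this handles):

```python
# Toy decoder for Intel model strings of this era, e.g. "i5-3570K".
# Only covers the rules listed in the comment above, nothing more.
GENERATIONS = {"2": "Sandy Bridge", "3": "Ivy Bridge"}
SUFFIXES = {
    "T": "very low voltage",
    "S": "slightly lower voltage",
    "K": "unlocked multiplier",
    "X": "extreme edition",
    "M": "laptop CPU",
    "QM": "quad-core laptop CPU",
}

def decode(model):
    """Split e.g. 'i5-3570K' into family, generation and suffix meaning."""
    family, _, number = model.partition("-")
    # Peel trailing letters off the model number to find the suffix.
    suffix = ""
    while number and number[-1].isalpha():
        suffix = number[-1] + suffix
        number = number[:-1]
    return {
        "family": family,
        "generation": GENERATIONS.get(number[0], "unknown"),
        "suffix": SUFFIXES.get(suffix, "standard") if suffix else "standard",
    }

print(decode("i5-3570K"))
# → {'family': 'i5', 'generation': 'Ivy Bridge', 'suffix': 'unlocked multiplier'}
print(decode("i5-2500")["generation"])  # → Sandy Bridge
```

As the comment says, ark.intel.com is the authoritative source for any given chip.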


  20. SuicideKing says:

    Vishera CPUs are very competitive with i5s if you consider heavily threaded performance, though they aren’t as efficient.

    Games generally prefer higher IPC architectures, hence Intel has a distinct lead in games.

    Though if you look at this chart:
    link to media.bestofmicro.com

    You’ll see that in GPU limited games (or at GPU limited resolutions) the difference is pretty low. But that chart is from the single player campaign, the i5s/i7s would pull ahead in multiplayer games.

Another thing, if you’ve been following hardware news for a while, you’d know that the difference in 99% of games b/w an i5 2500/3570 and an i7 3960X is barely 5 fps at the most.