Week in Tech: The PC Is Dead, Long Live The PC

My desk drawer, yesterday

You know the one about the New Scientist editor and his philosophy for the magazine, right? Science is interesting and if you don’t agree you can bugger off? It comes second-hand via the shy, retiring figure that is Richard Dawkins and, for all I know, it’s probably apocryphal. But it’s at least in broadly the same ballpark as my feelings about the computer industry. It’s just had such a huge impact on the way we live. And nothing so much as the PC, even if the image of the poor old thing being devoured alive by a swarm of vicious mobile devices gets repeated so often that nobody really bothers to check if it’s true. And yes, we’ve been here before, kinda.

But in recent weeks it’s all become more baffling than ever. Try this for size: record revenues for good old Intel; AMD laying off staff while another bit of what used to be AMD is paid $1.5 billion to take away what’s left of IBM’s chip production facilities – deep breath – tablet sales tanking, PC sales taking up the slack, an Apple iPad chip with more transistors than an eight-core Intel PC processor, graphics chip vendors stuck on 28nm while Apple pinches all the 20nm production capacity… I’m not sure what to make of it all, especially in terms of, ya know, simply playing games on PCs. But one thing is for sure: it’s interesting. And if you don’t agree…

The ‘Intel in fine fettle versus AMD lurching from one disaster to another’ thing is a familiar refrain, I suppose. But for the record, Chipzilla recently shifted 100 million microprocessors in a single quarter (i.e. a three-month period) for the very first time.

The result was record revenues for Intel (though not, as far as I can tell, record profits, so the modern-day Gordon Gekkos who own all the shares no doubt remain unsatisfied). The good news, for what it’s worth, included a nine per cent year-on-year upswing in client PC chip sales (in other words, boxes for gamers and wage slaves alike, not server or embedded processors).

As for AMD, it’s got a new CEO to go along with a new round of job lay-offs and the ever-present sense of imminent demise. Meanwhile, the company that used to be the bit of AMD that actually made chips, Global Foundries, has just acquired what’s left of IBM’s computer chip production facilities.

The weird bit here is that IBM actually paid Global Foundries $1.5 billion to take the bloody stuff off its hands. That’s how hard chip production has got these days: a company like IBM had fallen behind so badly it had to pay billions to extract itself from the game.

More powerful than a PC? Nope, but that A8X chip is still impressive

Not that the fact manufacturers are rapidly approaching the physical limits of conventional chip production is a problem unique to the PC. But I can’t help noticing that Apple’s fancy new A8X chip in the new iPad Air is being made by none other than TSMC at 20nm (that’s a measure of the size of the tiny bits inside the chips, and for decades the whole computer industry has been predicated on it becoming ever smaller, allowing faster and cheaper chips – google Moore’s Law for, er, more).
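
If you want a feel for why that number matters, here’s the back-of-an-envelope version (and it’s very much idealised – node names are half marketing these days, and real-world scaling falls well short of this):

    # Idealised process scaling: transistor area shrinks with the square of the
    # feature size, so each node jump roughly doubles the transistors that fit
    # in the same die area. Real chips do worse than this.
    for old_nm, new_nm in [(28, 20), (20, 14)]:
        area_ratio = (new_nm / old_nm) ** 2
        print(f"{old_nm}nm -> {new_nm}nm: ~{area_ratio:.2f}x the area per transistor, "
              f"so ~{1 / area_ratio:.1f}x the transistors in the same space")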

It’s TSMC that makes pretty much all of AMD and Nvidia’s high-performance graphics chips, and the fact that they’ve been stuck for ages at 28nm waiting for TSMC to get its 20nm act together is something I’ve droned on about in posts passim.

But it turns out TSMC can make 20nm chips; it’s just that Apple seems to have snapped up all the available allocation, which you’d think reflects where TSMC sees its priorities and the future – i.e. not the PC. Or maybe it’s just the allure of making the chip for one of the fruity firm’s new devices.

And while we’re talking about Apple’s A8X, the bloody thing’s got three billion transistors. In a tablet chip. Yes, yes, I know it’s a system-on-a-chip with loads of non-CPU-and-GPU functionality (though the iPad has plenty of other chips, let’s be clear). And we could even get into a discussion about layout versus schematic transistor counts (honestly, if you don’t know, you don’t need to). But even so, that’s way, way more than Intel’s top mainstream quad-core CPU for desktop PCs, the 1.4 billion transistor Core i7-4790K.

14 billion transistors. Count ’em

The really sobering comparison is that even Intel’s mighty eight-core Core i7-5960X only has 2.6 billion transistors. That’s $1,000 / £750 for a CPU with fewer transistors than another, more complex chip that is just one part of an overall device – touchscreen, wireless comms, lithium battery pack and all the rest – which is yours for just $500 / £400.

Of course, the biggest ever PC graphics chip, Nvidia’s GK110, as found in the GTX 780, 780 Ti and various Titans, tears the A8X a new Lightning port with over seven billion transistors. So a full-power PC is still the daddy. But it does make for an interesting value comparison.

On the other hand, and just to square this circumlocutory circle, we have news from tech industry soothsayers Gartner that tablet sales are now flatlining. From what I understand, what’s broadly happened is that everyone who wants and can afford a tablet kinda now has one, and the tablets they have basically work fine, so the big growth period is done.

The numbers are a bit complicated (you can see them here, if you care), but Gartner’s elevator pitch is thus: “Consumers’ attention is slowly going back to PC purchases as tablet adoption peaked with mainstream consumers.” Who’d a thunk it?

Any alternative to bashing endless orc-bots?

Anyway, all of this is interconnected in a fascinating but hard to properly pick apart kind of way. What exactly it all means for we gamers, well, don’t ask me.

Actually, when I say ‘we gamers’ I feel a bit fraudulent. I’ve been neglecting my gaming duties of late, what with a few projects on the go, and in an act of penance I’ve, *cough*, acquired an AMD Radeon R9 290X and set myself a mission of getting properly back on the horse.

The question for you lot, then, is what you’d recommend to break in the new beast. I’ve got Shadow of Mordor and The Vanishing of Ethan Carter on my shortlist. What else should I be playing for a combination of middle-brow gaming pleasure and 1600p visual splendour? Actually, my current mood calls for something that falls bang in between Mordor and Ethan Carter in terms of the balance between bashing endless orc-bots to death and wandering around looking blankly at things while pulling the odd lever. Suggestions on a virtual postcard below, please.

61 Comments

  1. yogibbear says:

    Alien: Isolation, Civ: Beyond Earth & Legend of Grimrock 2.

    • A Gentleman and a Taffer says:

      Aye, Alien: Isolation would be spot on. It’s a AAA blockbuster played at walking pace, with many a lever to pull. It’s also the best linear AAA game since Half-Life. You should get that.

  2. schurem says:

    Hook up a Rift (I bet you can lay your hands on one, being the resident hardware boffin) and get lost in space. Elite: Dangerous. You want to. You need to. It is your destiny.

    Heck even without the rift it is jolly good.

    • steves says:

      Seconding this. Elite is so, so good – your jaw will ache from dropping.

      For the “middle-brow” (ok, kind of low-brow really) the new Borderlands pre-sequel is amazing if you have a 120Hz+ G-Sync monitor, and one of those pictured cards to drive it at a high enough framerate to appreciate things.

      The pre-sequel gimmick is a slow, floaty, low gravity jump thing, but you still aim & shoot with the usual frenetic FPS controls. Paired with stupid-smooth motion, it’s a glorious thing.

    • Armante says:

      Thirding Elite: Dangerous. Beta 3 launches on the 28th, and the current 2.06 is pretty stable. Yes, it’s not feature complete, but I don’t really care. It’s already so good and so much fun, plus it looks and sounds great. At 1600p with three monitors or a Rift you will want to play nothing else :)

    • Cinek says:

      Hehehe, nice try.

  3. seroto9 says:

    Watch Dogs can be big, detailed and very beautiful, with the minimum of tinkering.

    You’ll probably get bored with the actual game after a while though.

    • Diymho says:

      I really couldn’t get into Watch Dogs, just seemed lifeless.

  4. Diymho says:

    If you haven’t already, try Metro Redux, and I second Shadow of Mordor

    • Grizzly says:

      I second the Metro Redux. Seems like a perfect place for all that POOWWAARH.

  5. shinygerbil says:

    I read this article while eating a portion of chips.

    I should have played a game where, whenever I read the word chip, I ate a chip.

    Chip.

  6. bhauck says:

    Re: that title picture, I really hate that there’s just enough gold in old CPUs for people to buy tons of them to destroy. I just think they’re really cool, and I’d love to buy a few dozen as scrap, but the darn gold drives the price up.

    • battles_atlas says:

      I have the same problem with wedding rings

      • bhauck says:

        If processors were solid gold, it wouldn’t bother me as much. But you need six different, incredibly dangerous chemicals just to pull any gold out, and you get the tiniest amount, so it feels more like people are really going out of their way to destroy something pretty cool.

        • battles_atlas says:

          Sounds like they’re doing a great job of recycling a finite material.

          • froz says:

            Yes, they are doing a really good job:

            link to theguardian.com

            Google for more pictures to see what it looks like. It’s a really, really shameful thing. I wish RPS would write about that (maybe they already did and I didn’t see it?).

          • bhauck says:

            Again, I wish they didn’t have anything valuable in them, so that they’d only be valuable to people who thought they were cool.

  7. kwyjibo says:

    Biggest farce in PC tech is that Apple have just launched Retina displays on their iMacs.

    While PC monitors still have lower resolutions – not to mention lower pixel density – than fucking phones.

    • joa says:

      Would rather have a higher quality monitor than a higher resolution one. Monitors seem to be the one area in which things are continuously getting worse. Back in the day if you put a monitor on native resolution, it would look excellent and sharp, but monitors these days look blurry, because they don’t use real pixels.

      • untoreh says:

        Monitors look blurry because they are all LCDs. CRTs were not. If you do not want blur on your LCD, either use a strobing backlight on a monitor that has one, or wait for 500-1000fps monitors to match human eye blur

      • Czrly says:

        I agree whole-heartedly. As it is, my Alienware 17 has a pixel density that makes anti-aliasing rather unnecessary. I’d much rather have better colours and a darker black-point than more pixels.

    • steves says:

      Apple do their own thing:

      link to anandtech.com

      “This also means that since it isn’t multi-tile, Apple would need to drive it over a single DisplayPort connection, which is actually impossible with conventional DisplayPort HBR2. ”

      Actually impossible! Kind of sucks… will have to wait for DisplayPort 1.3 capable graphics cards for this sort of thing (quick sums below).

      “Would rather have a higher quality monitor than a higher resolution one”

      Less of your ‘rather’ there, please ;) I want both! Not at the same time – that’s a while off – but a 5K desktop monitor @ 60Hz, capable of running at half that resolution at twice the refresh, with not-horrible colours + response times, would be nice.

      Someone make this, I have the money…
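
      A quick back-of-envelope on that ‘actually impossible’ bit (the figures are the published link rates – HBR2 is four lanes at 5.4 Gbit/s with 8b/10b encoding, DP 1.3’s HBR3 raises that to 8.1 Gbit/s per lane – and blanking overhead is ignored, which only flatters HBR2):

      # Does 5K@60Hz fit down a single DisplayPort cable?
      w, h, hz, bpp = 5120, 2880, 60, 24      # 5K resolution, 60Hz, 24 bits per pixel
      need = w * h * hz * bpp                 # raw pixel data rate, bits per second
      hbr2 = 4 * 5.4e9 * 8 / 10               # DP 1.2 HBR2 usable rate after 8b/10b
      hbr3 = 4 * 8.1e9 * 8 / 10               # DP 1.3 HBR3 usable rate after 8b/10b
      for name, cap in [("HBR2", hbr2), ("HBR3", hbr3)]:
          verdict = "fits" if need < cap else "does not fit"
          print(f"5K@60Hz (~{need / 1e9:.1f} Gbit/s) {verdict} in {name} ({cap / 1e9:.2f} Gbit/s)")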

      • TacticalNuclearPenguin says:

        If you read the rest of the article it’s still unclear how they’re managing to get enough bandwidth for the job. There are a lot of tricks that can be used and not all of them are “honest”.

        Nvidia did something similar to enable 60Hz on HDMI for 4K, but they didn’t hide the needed compromise:

        link to tomshardware.com

        Now, I’m not saying Apple is necessarily using a lower-IQ trick and being dishonest about it, but it has happened before. They like to have people thinking they can do all sorts of magic, like when they released a “patch” that resolved the overheating problems of some MacBooks, when it actually turned out they got downclocked.

        Meanwhile some DIY nuts soon discovered that the real problem was a pathetic application of thermal paste, but fixing that yourself would kill the warranty, of course.
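
        (For the record, the compromise in Nvidia’s case was 4:2:0 chroma subsampling – colour sampled at a quarter of the resolution. Rough sums, taking HDMI 1.4’s 10.2 Gbit/s TMDS limit and 8b/10b coding:)

        # 4K@60Hz over HDMI 1.4: full colour vs 4:2:0 chroma subsampling
        w, h, hz = 3840, 2160, 60
        usable = 10.2e9 * 8 / 10                # ~8.16 Gbit/s of actual video data
        for name, bits_per_pixel in [("4:4:4 full colour", 24), ("4:2:0 subsampled", 12)]:
            rate = w * h * hz * bits_per_pixel
            verdict = "fits" if rate < usable else "does not fit"
            print(f"{name}: ~{rate / 1e9:.1f} Gbit/s – {verdict}")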

    • Stochastic says:

      I’m hoping that within five years I can upgrade to a color calibrated 4K OLED high refresh rate adaptive vsync monitor. One can dream, right?

    • tigerfort says:

      I’d imagine Apple are grabbing all the high-pixel-density displays the same way they’re grabbing all of TSMC’s 20nm production capacity: they have high profit margins and a huge cash pile, which means they can afford to outbid everyone else. Damn capitalism, spoiling everything again!

      • untoreh says:

        That’s not true, you can buy a shitload of Chinese phones with retina displays. I’m typing from a 7″ 1920×1200 display. Apple does not own everything (yet)

    • TacticalNuclearPenguin says:

      Pixel density doesn’t make a display “better” in every sense of the word, much like more megapixels on a camera are a problem if the sensor that houses them is not big enough, in which case they actually are a detriment to IQ, adding more noise.

      I’m not saying that the two things are directly comparable, but I wouldn’t be surprised to discover that on a too-crazy panel there are more colour spikes, driving issues like mad pixel inversion, and a lot of other possible problems that might not be obvious at first. Even if that wasn’t the case, it’s still true that I’ve never seen a mad-DPI panel that focused on colour quality or contrast rather than novelty.

      Apple displays have some decent factory calibration, which makes them better than the majority of screens you can buy, even those that declare themselves “calibrated”, while also providing decent uniformity and quality control. Still, they can’t touch the really professional offerings like Eizo and NEC, which are simply way ahead, and they never really push the density too far, focusing instead on the electronics behind the panel – an incredibly underrated thing nowadays, when people think you “just need the right panel” or can “buy any IPS at random”.

      Apple is doing this 5K thing on a 27-incher because they want to use their own form of double scaling, since that 5K thingy is exactly 4x the 1440p of their normal 27-incher. Sure, retina is cool if your OS has perfect support for vector content and so on, but I still think it’s a missed opportunity. Such a resolution would have a higher pixel density than a 27-incher @ 1440p even if it was stretched to 36 inches of glory. Imagine the visual porn.
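
      (A quick check of that last claim – pixel density is just diagonal pixels over diagonal inches:)

      # PPI = diagonal resolution / diagonal size in inches
      from math import hypot
      for name, w, h, inches in [("27in 1440p", 2560, 1440, 27),
                                 ("27in 5K", 5120, 2880, 27),
                                 ("36in 5K, hypothetical", 5120, 2880, 36)]:
          print(f"{name}: {hypot(w, h) / inches:.0f} PPI")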

      • Cinek says:

        “Pixel density doesn’t make a display ‘better’ in every sense of the word, much like more megapixels on a camera are a problem if the sensor that houses them is not big enough, in which case they actually are a detriment to IQ, adding more noise.” – that’s completely NOT how screens work.
        Many of the limits in cameras come from physics – desktop screens are not limited by physics, and won’t be for years if not decades to come. The second big limiter is the technology available for a given pixel density, but desktop monitors are nowhere near those limits either. Both of these are more of a problem on smartphones than anything else (usually Android-based smartphones, due to the DPI fetish they’ve got).
        The limit of quality on desktop monitors comes down to $$$. Yep. The only question is how expensive a technology, process and quality control they’re willing to pay for in a screen at a certain price point. And that’s basically where it ends. Even in 4K screens. That’s what decides dead pixels, calibration, how evenly lit the screen is, how it ages, quality spread, etc.
        Sorry, but your parallel with camera sensors is complete and utter BS.

        • TacticalNuclearPenguin says:

          Yeah, I agree with that part; it was mostly poor wording on my side, as I already mentioned my example might be a tad broken.

          What I really meant, and didn’t manage to convey, is that pushing for more pixels is just a modern craze, as you said, and most brands follow this route while disregarding everything else, which is detrimental to image quality.

          Either way, Apple doesn’t have an exclusive recipe for such DPI; it’s just that they’re the only ones pushing it in the desktop space. For as long as there are $10,000 DCI-compliant OLED displays that don’t give a damn about resolution and are 100% focused on colour grading, or for as long as Eizo and NEC do their thing while LG could easily supply them with far more DPI (there’s an LG phone that beats Apple on DPI) but they still choose the trusted high-quality panel, I’ll stand by this idea.

          It doesn’t matter if I’m wrong about the limits of pixel pitch, as long as there are other consequences that are not necessarily related to the pixels’ physical structure but more to all the rest behind them.

          Besides, whatever the reason, AMVA still can’t go over 100 DPI, IGZO technology was developed around the limitations of amorphous silicon for bigger, higher-resolution screens, and the list goes on. I’m pretty sure that whoever goes too crazy on DPI is cutting some other corner, regardless of how stupid my original analogy was.

  8. phelix says:

    I wonder what will happen if AMD truly goes.
    Will Intel abuse its freshly found monopoly? (probably yes)
    Will Intel eventually be pushed off its throne by ARM? And, by extension, is x86 reaching the end of its life?

    • Orillion says:

      They already abuse their near-monopoly, so yeah. I buy AMD out of a sense of duty, but it often feels like I’m the only one.

      • Captain Joyless says:

        No; I, too, buy AMD purely on principle.

      • disorder says:

        I try to think similarly, but this upgrade round, for me, AMD just aren’t performing well enough to suspend my disbelief at the ‘buy now’ box. I wish that wasn’t so, but this time I still went over. There are applications where AMD’s more-cores-for-the-$money_unit tactic can work out (probably), but games still aren’t it, and it’s a last resort, not a strategy of choice. It’s been ages since AMD’s only real hope of getting back on top was in somehow leveraging the ATI buy to really combine graphics into tier-1 performance CPU/GPU products like we all maybe hoped, but they just haven’t, and worse, don’t even seem to know how they’d go about doing that anymore* (if they ever did).

        Instead, from AMD we got some interesting mobile chips, which Intel still beat on the terms that matter (i.e. power) because graphics aren’t a big deal in that segment. Intel’s manufacturing process is definitely way better than anyone else’s, and that’s crucial, as their CPUs are class A /too/ – which also offers power wins, but also makes them practically unassailable. (Aside – you can bet power consumption is Apple’s reason for getting first in line for TSMC’s 20nm – and a further aside: it’s surprising and impressive to me that embedded x86 tablet/phone CPUs are still not competing with ARM in those sectors, even taking Intel’s process advantages into account; a situation I hope persists forever.)

        In the meantime, on the desktop it’s obvious Intel are doing little more than treading water. Why would they do anything else? Growth is in selling Atoms at below cost, to get anyone at all to shove x86 into tablets and phones.

        * think about the heat output from that per cm².

        • montorsi says:

          Indeed, I buy my CPUs on principle. That principle being I get value for my dollar.

          AMD have been underperforming for far too long to reward them for their behavior.

  9. nrvsNRG says:

    “but what if you’re wrong”
    link to youtube.com

    Btw, check out Lords of the Fallen (out in a few days).

    • Wowbagger says:

      I’ve kind of jumped on the hype train for Lords of the Fallen; I hope it lives up to my God-of-War-mashed-up-with-Dark-Souls fantasy.

    • Cinek says:

      I hope this game will have a demo or something… ’cause if it has Dark Souls controls (read: your character is completely locked while a combat animation is playing) then there is no way I’ll pay for it. But other than that worry, the game looks really great so far.

  10. RARARA says:

    I don’t like change! :(

  11. waltC says:

    Who gives a darn about the number of transistors – performance is what counts. The Intel silicon blows Apple’s stuff right off the map – it’s not even close in terms of performance.

    AMD is betting on its APUs, deciding, unlike Intel, to pretty much abandon the traditional PC – never mind that the PC is still by far the best buy going, is what put AMD on the map, and is putting money hand-over-fist into Intel’s pockets. AMD is making a mistake, imo, and needs to get back into the x86 PC space with gusto. The company’s lackluster performance in the PC space for the past several quarters is precisely because AMD has put *nothing new* into that space in quite some time (besides a couple of too-little-too-late Piledriver revisions – AMD should have shipped a new x86 chipset & a Steamroller x86 CPU this year!) – whereas Intel has put the pedal to the metal, and their super PC-chip sales last quarter prove it.

    Lots of people guessed wrong about the “demise” of the PC… Microsoft and AMD, to name a couple of notables. But AMD is no Microsoft and is rapidly proving it can no longer toe the line with Intel. I think that’s too bad and I hope the recent leadership change @ AMD will return the company to its roots. Intel proves every quarter just how much gold there is in them thar’ PC-chip markets… ;)

    • montorsi says:

      This is a significant issue for AMD. I’m looking to drop a couple grand on a gaming PC and they have literally nothing to offer me. Intel’s processors run faster, require less power and don’t cost so much more that it makes sense to even look at AMD. What’s more, when I am looking to give my rig a bit of a boost three years from now, I’ll be looking at Intel again because I won’t want to replace my motherboard and RAM. So at a minimum they’ve lost me as a customer for the next six years. I can’t fathom what the fk is going on over there, and I say this as someone who bought their hardware when they were even remotely competitive.

      The Nvidia 9 series GPUs are also problematic but we won’t get into that. AMD should count itself lucky that it’s powering consoles right now, otherwise… it’d be looking very very bleak.

      • Xzi says:

        +1 to all of that. Just ordered the components for my first new gaming PC since 2010. Re-using RAM, SSDs, HDDs, a PSU, and a DVD RW drive, but the new components are Intel/Nvidia. I’d built two machines previous to my current one, both of which were AMD/ATi (or AMD/AMD if you prefer), as is my current.

        I’m extremely excited to be making the switch, because I know from playing on friends’ machines how much of a performance boost I’m going to see. Intel has blazed ahead of AMD in the last four years. Which is also why I found it a bit puzzling when both next-gen consoles went with AMD APUs. I mean, yeah, they run DirectX 11. At the bare minimum of DirectX 11 hardware. They were already obsolete in comparison to any enthusiast’s gaming PC on the day the new consoles were released.

    • TacticalNuclearPenguin says:

      Besides, even if a certain brand doesn’t believe in PCs, it’s still rather stupid not to keep pushing the x86 space.

      You’ll always get money from the professional world with your professional chips, as a random example, but only if you’re the leader and your stuff is worth it.

      Even if you lose in that market too, there’s still Apple; if you prove to them your chip should stay in their iMacs, MacBooks, Mac Pros and so on, you get another big return.

      The only deal AMD has nowadays is the slave labor work they’re doing for the new consoles, which simply couldn’t refuse such an ultra-cheap offer. Oh, and before I get nitpicked to death: YES, I do understand AMD must have other deals elsewhere, but the ship is still sinking for one reason or another.

  12. caff says:

    Given the recent survival article on Skyrim and mods, I think it’s worth a trip back if you aren’t fatigued already.

    In terms of new tech, can you keep an eye on the next breed of 21:9 monitors? I know you were quite dismissive of using them for web browsing etc., but I’m genuinely interested from a gaming/film watching perspective. There are curved 34″ 1440p models appearing now and into early 2015 (AOC u3477Pqu, LG 34UC97, Dell U3415W, Samsung S34E790C).

  13. ansionnach says:

    One of the great things about the PC is how many buttons it has: a whole keyboard’s worth. Throw in a mouse and you’ve got far more control for games and beyond – you can do far more complex things with less frustration. You can also go back and play X-Wing with that old Gravis analogue joystick if you like (with some sort of gameport adapter or old soundcard). Sure, almost everything made today probably has more transistors than the old 486 in the attic, but only its successor has a better catalogue of games. For me, touch screens and game controllers just can’t cut it… and doing most tasks with them feels like trying to make a jam sandwich while playing Starcraft with no arms.

  14. untoreh says:

    The AMD layoffs are probably the only ones that make sense; as you said, the company has already changed CEO to one closer to the field, with knowledge of the technology. They rehired a lot of people from the golden era and it is totally understandable that they are restructuring their workforce. It seems they are pretty much going through the same shift Intel did with the i7 – a consistent change in their microarchitecture design path.

  15. cpt_freakout says:

    Why, Mount & Blade, of course

    • Cinek says:

      lol. Sorry, but M&B has next to nothing to offer when it comes to pushing the boundaries.
      It’s a good game, but good at something other than what we’re discussing here.

  16. Xzi says:

    It doesn’t really matter how small Apple’s tech is. It’s still always going to be overpriced and underpowered compared to the competition.

  17. SuicideKing says:

    You’ve noted it, though I think it needs to be emphasized more: the CPU and GPU components of the A8X are likely at most 2 billion transistors combined, most likely with the SRAM cache included. Intel’s 5960X is 2.6 billion with just the CPU and accompanying support logic alone.

    I’m also reading rumours that the A8X has a third CPU core, but I guess we’ll have to wait for AnandTech and Chipworks to get on that.

    The more apt comparison for the A8X will actually be Intel’s Core M.

    Anyway, I’ve just been playing a lot of Borderlands 2, lately.

    • TacticalNuclearPenguin says:

      It has to be this way, otherwise it’d be hard to explain the still enormous performance gap to conventional CPUs.

      Not that Apple is a stranger to this sort of marketing anyway, with their own little benchmarks on the website cherry-picking only the best possible result of some unspecified test, with lines like “50x MORE SPEED!”

      Anyway, yes, it’s a tri-core, and so far it’s admittedly a very solid offering in the mobile space, easily beating some mobile octa-cores that made god knows what kind of silly architectural compromises in order to squeeze so many cores in there.

  18. Cinek says:

    Star Citizen. Obviously. Easily the most beautiful game on the market, with enormous attention to detail, beauty, engineering, and immersion. Pretty much the best combination when you want an immersive world that at the same time looks stunningly beautiful.

  19. Metalhead9806 says:

    Recommendations:

    Metro Redux, Divinity Original Sin, Endless Legend & Dark Souls II.

    ^ those are the games that have taken most of my gaming time this year.

  20. marmot says:

    You got confused there or something?

    First, above the photo, you are writing about the i7-4790K, which has 1.4 billion transistors. The photo shows a GTX Titan Z (7.1 billion transistors) and its caption says “14 billion transistors. Count ’em”. Putting a caption regarding a CPU under the photo of a video card isn’t something I would expect from a tech review author.

    Second, the 4790k is definitely not a “mainstream” CPU, like you’ve called it. It’s in the high-end category, both price- and performance-wise.

    Third, the comparison between the mobile A8X and the i7-4790K is so much apples vs oranges that I imagine it could only be done by a person who has just counted 1.4 billion CPU transistors on a GPU.

    What is this, a gamers’ website or Cosmopolitan’s tech page?

    • Mechorpheus says:

      It might be the ‘gamers’ choice’ or whatever the phrase of choice is, but for Intel the i7-4790K is ‘Mainstream’. They reserve the ‘Enthusiast’ moniker for their LGA-2011 parts. Apparently you’re only an enthusiast if you demand more than 4 CPU cores (which won’t make a blasted bit of difference to the performance of any game, I shouldn’t imagine).

      • TacticalNuclearPenguin says:

        You also get far more PCI-E lanes, as a random example, and previous SB-E and IB-E offerings saw the ‘cheapest’ chip as a 4-core with a huge overclocking margin and, in the case of IB-E, a properly soldered heatspreader instead of the pathetic budget-oriented attempt of the mainstream version.

        In this light, an “Enthusiast” is simply one that demands a level of attention to detail and extra performance oriented features that would make little sense in the “Mainstream” platform.

        I understand the confusion, because the “Mainstream” platform is fast enough. Well, here’s the problem: “fast enough” is a dirty word for some people.

    • TacticalNuclearPenguin says:

      I don’t see the issue, really, since so many mobile producers like to spill numbers at random without a standard to go by, often summing the grand total of the whole platform.

      Jeremy did the same, only his numbers were bigger!

    • Dudeface says:

      The Titan Z has 2x GK110s at just over 7 billion transistors each – hence 14 billion transistors. And as has already been pointed out, the 4790K certainly is a mainstream CPU, not being an enthusiast Haswell-E CPU. Care to retract some of those condescending comments?

      • Diymho says:

        Damn, you got there first! Been peeing me off all day but I haven’t had the chance to reply to marmot’s silly comment!

  21. Mechorpheus says:

    I’d second the recommendation of Alien (although those bloody Androids do take the sheen off it somewhat), and, controversially, The Evil Within is worth a go as well. Sack off the framerate cap, run r_forceaspectratio 2 to shrink the black bars to a manageable size, and it’s a good time.

    Also check out GRID: Autosport if you fancy some racing action. It looks wonderful at high resolution, and is easy enough on the hardware for you to get 120fps.