Hard choices: The Week In Tech

By Jeremy Laird on November 29th, 2012 at 1:00 pm.

It's a barren, featureless desert out there, Darling: The future for the PC?

Couple of questions for you hardware freaks to ponder this week. Is it time to think the unthinkable, to do the undoable and ditch the hallowed keyboard n’ mouse control interface for PC gaming? Oh, and is the desktop PC dead? The former’s something I’ve wondered for a while in relation to PC interfaces in general, but now somebody is actually having a proper stab at bettering ye olde rodent and fiddlestick. The latter bombshell, meanwhile, follows rumours Intel will stop selling desktop CPUs in a little over a year. That sounds bad. Fortunately, the reality isn’t altogether catastrophic.

So, an email dropped into my inbox earlier this week regarding the WASDIO. Would I be interested in spreading the word was the gist of it. This little gadget is currently being Kickstarted and replaces the keyboard half of the equation, hence the name. WASDIO. Geddit?

WASDIO that?

Anyway, make of the WASDIO what you will, I’m clueless as to its particular merits. Apparently only one has been built and they didn’t fancy lending it to me. But its existence does beg the question of whether the keyboard and mouse can be improved upon for proper, grown-up gaming of the beard-stroking variety.

Is this the future of PC gaming?

Let me put it this way. The keyboard and mouse wasn’t designed for gaming. It was commandeered for gaming. Surely it would be highly serendipitous for it to just so happen to be the best tool for the job?

At the same time, touchscreens are beginning to eat away at the keyboard and mouse’s dominance over mainstream computing and that full-body-motion-sensing Kinect shizzle is all the rage in console land. Apparently. I don’t do consoles.

Anyway, touch or motion sensing could add an interesting new layer of interactivity. But is either likely to take over PC gaming? Or will it be something completely new like the WASDIO? Colour me a Luddite, but I’m dubious about the prospects of a new game controller or control paradigm replacing the keyboard and mouse any time soon.

Nothing I’ve yet experienced comes close. For me it all comes down to Counter-Strike. When I first started playing, it was the usual drill. I’d spawn. The round would start. And I’d be mown down in picoseconds. Pretty quickly, I was convinced half the map was infested with script kiddies touting the latest aimbot. They were too good, too precise, too consistent.

“People” gaming courtesy of Kinect. Please kill me, now

But eventually I got that good, that precise, that consistent. Well, on occasion. With a keyboard and mouse it’s like any other sport or act of physical dexterity. Playing tennis. Driving cars. You get in the zone and everything flows effortlessly. And I can’t imagine that ever happening with something ghastly like a game pad.

But then maybe I’m an anachronism. After all, I like thin-beam racquet frames with maximum feel and don’t do driver’s cars without manual gearboxes. Hell, I get upset by fly-by-wire throttles.

Call it a massive fluke, but maybe the mouse and keyboard can’t be bettered after all. What do you reckon?

The PC’s dead. Long live the PC

A terrifying little scare story popped up on red-top PC hardware rumour site SemiAccurate on Monday. The title read, “Intel kills off the desktop, PCs go with it.” Nice.

Now anyone who knows Charlie Demerjian knows he’s not shy of a little hyperbole. He’ll tell you straight up he’s happy to spin the angle of a story for dramatic effect. Not make things up, mind. But certainly play to the gallery. That was more or less what he once told me, anyway.

And that’s what he’s doing with this little story. I don’t dispute the claimed facts, though I can’t confirm them either. They go something like this. Broadwell is the successor architecture to next year’s Haswell CPUs from Intel. Right now, we’re on Ivy Bridge chips. But you remember that, right?

Anywho, Broadwell will be built on 14nm. And here’s the kicker. Broadwell will be ball-grid array only. That’s a type of chip package rather than a socket, and it means the chips will need to be soldered onto motherboards. The implication is pretty obvious. You won’t be able to buy CPUs separately. At best they’ll come soldered to motherboards.

Some Intel CPU socket madness from yesteryear

So, no more mixing and matching of motherboard and CPU. Personally, I’m not that bothered. CPUs are fast approaching good-enough status in terms of performance. It’s storage and graphics you really need to worry about. That will be especially true by 2014.

Anyway, the really bad stuff has already happened with Intel locking out overclocking from all but premium-priced chips. So soldering CPUs to motherboards is symbolically dramatic, but with AMD uncompetitive for as long as I can remember, the golden age of enthusiast x86 computing faded several years ago.

The only thing that really worries me is the prospect of Intel deciding to kill the third-party motherboard market, something it could easily do. It’s not obligated to sell CPUs to motherboard makers. I’ve never liked Intel motherboards. So that would be grim.

Given that Intel isn’t making the progress it would like in phones and tablets, it’s not hard to imagine it actually happening. The pressure for revenue growth may simply be too much.

Goodbyeeeee!

Keep that line of thinking going and you can also imagine Intel one day locking out third-party graphics. Let’s say AMD does die or at least completely gives up on high-performance CPUs, as seems a fairly realistic prospect. In that scenario, Intel would have the option of no longer hooking up its mainstream CPUs to a high-bandwidth PCI Express bus. You’d be forced to run Intel integrated graphics.

After all, by providing a PCI Express bus, Intel is enabling NVIDIA’s graphics business. If Intel ever feels like its integrated graphics is good enough for gaming, once again the temptation to squish a rival may be too much.

In reality I don’t see that happening in the next few years. It would be so bad for the health of the PC as a consumer platform. But five years or more from now? I wouldn’t rule it out entirely. It’s yet another reason to hope like hell that AMD somehow survives.

UPDATE:
In a recent statement to US-based Maximum PC, Intel has confirmed that it remains committed to delivering LGA sockets on the desktop for the “foreseeable future”, which is reassuring but, in truth, doesn’t mean a great deal.

__________________


124 Comments

  1. coffeetable says:

    AMD’s supplying the chips for the 720 and the PS4 isn’t it? And as much as those are low-margin contracts, it does mean it’s around for another few years at least.

    • Moraven says:

      Console video cards:

      Wii – AMD
      PS3 – Nvidia
      360 – AMD
      Wii U – AMD

      PS4 and Next Xbox are not confirmed to be Nvidia or AMD.

      So they had 2 of 3 last gen. Certainly helps, but the problem is getting their CPU business back up and competitive. Purchasing ATI is one of the last good things AMD has done. And even then it took a while to integrate them into their CPUs.

      • The Sombrero Kid says:

        They are confirmed not to be nVidia; both companies are very vocal about being treated like shit by nVidia.

    • Tyrone Slothrop. says:

      Or be bought out by a party which cuts their losses and makes them do little else. I’m not suggesting that’s probable but it is an alternative to consider.

  2. Xzi says:

    At some point, and I’d imagine it would be sooner rather than later, would anti-monopoly laws not come into play? Maybe not simply with the decision to have CPUs soldered to motherboards, but certainly if AMD were to call it quits on CPUs and Intel decided not to allow any other motherboard manufacturer access to their CPUs.

    If Microsoft can be called on it, then Intel better tread carefully.

    • tigerfort says:

      Because Microsoft were punished so effectively for their anti-competitive behaviour, presumably in some other world.

      • Xzi says:

        Well they had to be split up into what was essentially multiple companies. And each of those has clear competitors now. Apple in the OS department, and Sony in the console department.

        What’s described in the article, Intel knocking off all its competitors for mobos or GPUs in one fell swoop with brute-force tactics, would be even more blatantly illegal than what Microsoft was penalized for.

        • Tacroy says:

          Did I miss something? As far as I can remember, the whole “split MSFT up into different companies” thing was dropped as soon as Bush appointed a new Attorney General.

          The console division is still very much the same company as the OS division, which is how they managed to have a ~10 – 30% failure rate on first gen 360s (depending on who you believe) and not go completely bankrupt.

      • PoulWrist says:

        Well, Microsoft got smacked, but Apple can do whatever Apple pleases, or so it would seem. So maybe that whole thing about monopoly and such is a thing of the past.

        • Brun says:

          This.

          2000(ish) – Internet Explorer installed by default with Windows? Monopoly! Break up that greedy corporate scum!
          2007 – Safari installed by default with iOS and the app store prevents installation of 3rd party browsers? Totally fine!
          2008 – “Browser” installed by default with Android, but Google App Store allows for installation of 3rd party browsers, making this identical to the offense cited in the antitrust action vs. Microsoft? Totally fine!

          I get that a lot of the US/EU v. Microsoft case was based on how MS was charging OEMs, but it’s unbelievably stupid that Apple has not gotten dinged for some of the same charges Microsoft did almost 10 years ago, given that their closed ecosystem makes those same offenses even worse. Unless of course, the charges based on IE and WMP were stupid and flawed…

          • Xzi says:

            Ah, thanks for clearing that up. I had forgotten the details of the issue.

          • AngoraFish says:

            The IE case hinged on MS having >90% of the PC market, AND using that dominance to move into an existing market (browsers) and freeze out strong existing competitors (predominantly Netscape). In mobile platforms, there are at least three strong competing operating systems and competition is currently extreme. Today’s mobile browsers are only a niche market on those platforms, and in any case are actively promoted by the manufacturers’ app stores. The MS case would never have got up if MS had a smaller market share. Anything under about 80% dominance would likely have been enough for MS to dodge the wrath of the regulators, and no current mobile phone OS comes close to that.

          • El_Emmental says:

            Monopolies aren’t regarded as illegal if there is no abuse of the power gained from the monopoly. If you make a very good product and your competitors are doing a really shit job at providing better products/services – without any abusive barriers/obstacles – then you’re not likely to be split up or fined for that.

            Microsoft used its monopoly power (through its OEM business networks) to force its browser product on computers, preventing competitors from having a “fair” chance on the home computer market (because Windows was in a position of monopoly).

            -

            When Google does that on Android: first, there are competitors on the mobile OS market; second, custom Android versions can feature different pre-installed browsers; third, people install apps much more easily and often than back in the IE days (user behaviour is an important part of it).

            Regarding Apple and its iOS, given the fact their mobile OS is only installed on their own products, and the unity between the hardware (iPhone & such) and the software is what makes the value of that product, Apple can argue that separating Safari from iOS would not only break the unified “Apple experience”, it would also force them to support 3rd-party browsers, as their global reputation/value comes from the seamless user experience (something that can be broken by a faulty 3rd-party browser).

            As far as I know, it is not illegal to provide a “package” product, as long as that package isn’t using a monopoly over a “free” market to prevent competitors from having a fair chance. Apple hardware is the most closed market that ever existed, and you can’t force Apple to let 3rd-party companies sell their own Apple-hardware-compatible OSs; otherwise businesses could be forced to only ever do one single thing each, which is, in terms of business/economic efficiency, rather bad.

            If iOS was available on non-Apple products, then it would be different. Same if Microsoft had been bundling IE on their own hardware line back in 2000, fully integrated in the “Microsoft Experience”. This is also why, as far as I know, Google doesn’t prevent people from installing 3rd-party browsers, even in the custom Android builds made by OEMs, because Android is used on a “free” market (non-Apple smartphones).

        • Continuity says:

          Apple is a niche whereas Microsoft is (was) ubiquitous. Shady practice in a niche market, where people use your products by choice and not because they have no other choice, is rather different from what Microsoft’s position was, or Intel’s position will be.

          Mark my words, even if the US regulators go soft on Intel, the EU won’t.

    • Syra says:

      Not necessarily; in the CPU space, if competitors failed it could be considered a natural monopoly.

      I don’t think they can get away with crushing industries by withholding parts, though, given the low costs of switching from their self-soldered proprietary motherboards to third parties. Ultimately that would hurt their own economics anyway, particularly removing PCI ports.

    • Brun says:

      You’re forgetting about ARM. If AMD goes down, various ARM manufacturers will become Intel’s primary competition. They’ll rule the x86 market but if current trends continue ARM will start encroaching on x86.

      • yallyall1 says:

        Yep. ARM will probably be Intel’s biggest rival in the future (in the CPU market at least). Whilst I sincerely hope AMD pull through, their decision not to compete in the performance sector means Intel does have a monopoly. I believe a while back there was an article about the Steam users’ hardware survey, and it showed a 70-30 share. If Intel stop using the traditional LGA sockets (whilst keeping LGA for the high-end CPUs), AMD will get the ‘enthusiast’ market. Intel will get the ‘performance’ market and the mainstream market. They’d upset a lot of people, but they’re in the position to do it (if they can defend their actions). My main concern here is the motherboard manufacturers. They could be made redundant pretty quickly if Intel wanted, or have to produce a lot of motherboards very quickly, which could create reliability issues.

        With Apple *allegedly* putting their support behind ARM CPUs, recent Linux patches and Win8 RT, I can see a very bright future for ARM. It’s a while off now, but then so is ‘Broadwell’. It’s subject to revision, and I can’t see Intel carrying through with this.

  3. james.hancox says:

    Intel’s already laid out their idea of what the future of the desktop is: the Next Unit of Computing. Soldered CPU, no separate GPU, tiny footprint. Basically a Mac Mini.

  4. yogibbear says:

    Haswell will hold us out for a couple more years (2013 & 14)… I’m not too worried about BGA… so long as it means mobos are more reliable… but I do worry about the inventory issues this places on Gigabyte/Asus/Asrock/MSI/Biostar to accommodate BGA in their mobos… or maybe it will mean less flexibility in their offerings, i.e. not being able to put a cheaper CPU in a fancy mobo and vice versa. Though I’m more worried about what this means for overclocking a CPU and whether it will even be allowed on Broadwell. If AMD can release a half-decent x86 CPU then Intel will still have to release an enthusiast-grade CPU that is LGA based.

    • Subject 706 says:

      This. Intel’s board partners should go apeshit if this is true. Though that Semiaccurate article actually says that the successor to Broadwell – Sky Lake – will be socketed again.

    • El_Emmental says:

      “so long as it means mobos are more reliable”
      Well, if the CPU fries or is faulty, you have to change the whole thing.

      If a component, or just a port (RAM, PCI, PCI-E, etc), of the motherboard fries or is faulty, you also have to change the whole thing.

      If the thing is 10% more reliable but the overall maintenance cost doubles, then no, I prefer a separate CPU and MB.
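
      To put some numbers on that trade-off, here’s a quick back-of-envelope sketch in Python. Every figure below is invented purely to illustrate the argument, not measured from anywhere:

          p = 0.05                  # made-up lifetime failure chance per part
          cpu, mobo = 200, 100      # made-up replacement prices

          # Separate parts: a failure only costs whichever part died.
          separate = p * cpu + p * mobo            # expected cost: 15.0

          # Soldered combo: either sub-part failing writes off the whole
          # board, even granting the combo 10% fewer failures overall.
          combined = (2 * p * 0.9) * (cpu + mobo)  # expected cost: 27.0

          print(separate, combined)  # 15.0 27.0 -- nearly double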

  5. SuperNashwanPower says:

    In the year 2016:

    “RPS: Console gaming since 2016”

    PC gaming isn’t dead, but there’s a lot of greedy fuckers trying to kill it and put a walled shopping garden round its corpse

    • Hoaxfish says:

      don’t you mean “phone gaming since 1973”

      • dee says:

        don’t you mean “intra-cortex dopamine release software since our glorious corp-leader’s ascension”

        • SuperNashwanPower says:

          This. This is the one I meant.

          Also willy attachments

          • El_Emmental says:

            Even – hell, especially – for girls, since the successful worldwide ban on sexism of 2016.

          • Phantoon says:

            Mentioning gender is sexist in the future. Prepare to be fixed.

  6. killuminati says:

    Well, in many cases switching to a new CPU means changing the mobo too, so I don’t think it is a big deal. Next time I’m looking for a total upgrade of my PC I’ll be browsing one less window and have everything in the mobo section of a hardware site.
    I think there will be different versions of the same mobo with different CPUs on… well, we’ll see :)

    • Continuity says:

      I think the implication is that Intel might try to squeeze out its mobo competitors, so not simply a case of soldered or discrete CPUs but a case of enthusiast mobo manufacturers disappearing and us just being left with whatever Intel decide we need.

      Of course this is sensational nonsense, but then that’s Charlie for you.

    • Ragnar says:

      I doubt it; it would be way too many product lines. Just take a look at the latest Z77 chipset. Asus has at least 7 motherboards with that chipset. Say Intel puts out 7 processors. Would Asus / Intel give us 49 different mobo / CPU combinations? No, that’s way too much work for them.

      So what we’ll get is entry-level mobos affixed with entry-level CPUs, and high-end mobos affixed with high-end CPUs. Whereas now we pick a CPU and then find the mobo we want to go with it, in the future we’d have to settle on the pair. It would certainly limit choice, probably increase prices, and I don’t know if it would even simplify shopping at all.

      • rb2610 says:

        I imagine prices would remain similar: while the lack of competition would enable Intel to bump their prices, I expect that combining the CPU and mobo would, if anything, reduce production costs considerably.

        • Ragnar says:

          Lack of direct competition to AMD CPUs (since now you’re comparing CPU + Mobo, assuming AMD is still around) would lead to higher prices for CPUs, as you yourself said.

          Lack of direct competition between motherboards (since now they’re all bundled packages) would likely result in higher prices for those as well, since now instead of having countless motherboards to choose from to pair with your i5 3570K, you may be down to 1-2 models per vendor, so they no longer have to be priced competitively against cheaper motherboards that are paired with less desirable CPUs. Kind of a “You want the i5 3570K? Then you’re going to have to pay more for the premium Pro ++ motherboard that comes with it.” scenario.

          I don’t see anything that would reduce production costs. The CPUs still need to be made, the motherboards still need to be made, only now instead of throwing in a generic socket you have to purchase the CPU from Intel and solder it onto the mobo. Intel could price their CPU / mobo combos for less (while charging 3rd-party mobo companies a premium), but finding a silver lining does not dismiss the steadily darkening sky.

  7. James G says:

    Do not want. While on one level I’ve never replaced the CPU without also changing socket, and therefore needing a MB replacement, this can only serve to limit flexibility and choice. The cynical bastard in me wonders if Intel have specifically waited until AMD check into hospital with a DNR before deciding to take this route.

    • Continuity says:

      Of course, taking a step like this with an effective competitor around would be shooting yourself in the foot… but if you have a monopoly you can do what you want.

      That’s why monopolies are bad and are regulated against.

      • enobayram says:

        I don’t know how much of this is actually for a monopoly (maybe 100%) but all I want to say is that there might actually be technical reasons. For example, at today’s clock speeds and bandwidths, the contact resistances on the sockets might have started to become a limiting factor.

        On a side note, I don’t understand why AMD isn’t ruling the market. After their acquisition of ATI, they’ve obtained a unique position where they can provide an extremely tight coupling between a good CPU and a good GPU. AMD Fusion sounded so brilliant when I first heard about it. I thought that might be the end of Intel.

  8. Snids says:

    I like the look of the Wasdio. I’ve always wanted analog movement for my left hand. That’s one benefit that gamepad users have over K+M.

    • grundus says:

      I like the idea of the WASDIO but I can’t help but feel it’s a bit dull and dated already. It reminds me of stuff that was supposed to look futuristic back in 1998.

      • Gnoupi says:

        Like the Strategic Commander? http://www.activewin.com/reviews/hardware/joysticks/microsoft/stratcomm/index.shtml

        I tend to think the same. If you add this kind of obnoxious, large movement to replicate what was just 4 keys to press, it feels more tiring than anything. (And having owned a Strategic Commander and attempted to play some FPS with it, I can attest that while it works, it’s more effort than necessary, and hardly effective.)

        • cowardly says:

          I’ve got a Strategic Commander, using it for GW2 at the minute, and it’s great for that. Probably wouldn’t use it for FPS or anything you’d want precision movement for – I switch to WASD for jumping puzzles and such – but it’s great when you’re trying to move and fire off skills.

          Got the deadzone wound in so I don’t have to move it too far to activate a direction, and that seems like it would be an issue with the WASDIO: looks like you really have to crank it over.

    • Gnoupi says:

      The Logitech G13 could answer this need for you.

      It takes the WASD and its surroundings, and adds an analog joystick to it.

      • Tams80 says:

        Yeah the Wasdio is just a gamepad, kind of mixed with a mediocre joystick.

        The G13’s analogue stick isn’t great for movement, but the only real competitor is the Razer Nostromo (formerly Belkin) and that only has a digital stick that is hopeless for movement (though the device itself is quite good).

        • MrLebanon says:

          How is the Nostromo? I remember considering it once ’cause it was on sale, but it looks kinda uncomfortable.

          • Brun says:

            The Nostromo is nice and actually is more comfortable than using a keyboard like the G15. My biggest complaint about it, however, is that for many of the games I play it simply does not have enough buttons. Most games require the use of the entire left half of the keyboard, including the number row and Ctrl, Alt, Shift, and Tab. So that’s 5 rows of keys (beginning with 1, Q, A, Z, and Ctrl). The Nostromo only has 4 rows (Q, A, Z, and Ctrl) making it insufficient for many games. It’s particularly egregious in games like WoW as the lack of the number bar really limits your options as far as keybinding. So far I’ve only been able to really use it in two games – SWTOR (I played a class that only had a few hotkeys to hit) and GW2 (the relative simplicity of the control scheme meant that the Nostromo’s buttons were sufficient).

    • sonofsanta says:

      Likewise. I think the pudding that proves it is driving games. Driving games without analog is just daft. When I’m driving my car, I don’t rapidly jerk my steering wheel left-then-central when it’s only a shallow corner, I just turn it the correct amount – so having to tap a key to get the right amount of steering is painfully stupid.

      Likewise, for the majority of FPS/third person/actiony adventury games, I don’t need 90% of my keyboard. Remember when they did that to a controller? The Jaguar?

      I’m not sure the WASDIO is the right device, I think I’d prefer something Nunchuk style but with a few more buttons, but it’s certainly a path worth walking down IMO.

      • PedroBraz says:

        That’s interesting; the buses I ride every day seem to have no problem driving exactly like that. Near-accidents do seem to happen on a regular basis, though.

        • El_Emmental says:

          perhaps the guy is using a keyboard to drive the bus -and- play Desert Bus Extra Edition on facebook at the same time

  9. Shivoa says:

    The move to lower power consumption / laptop focus seems to be the larger issue. AMD may have given up on the x86 cutting edge but so, it seems, has Intel. They’re going to push down the power use and see where they can go in the 5-50W zone (designs generally have an order of magnitude of viable TDP levels in them; for example, a 500mW ARM chip doesn’t scale to 50W), but that means we aren’t seeing what performance characteristics can come from a 100W+ CPU. Maybe the server market will keep that side of things going (like the Sandy Bridge-E CPUs, which are also Xeons and give much higher memory bandwidth – but only for the select few who demand that, and games aren’t tailored to work well only on those niche systems at the cost of working badly on the mainstream machines, which have similar compute levels but much less memory bandwidth).

    • Baines says:

      I’m actually glad for the push for lower power consumption without sacrificing existing performance.

      It really bugged me that my P4 doubled as a space heater. Power usage (and the subsequent waste heat) was continually rising in the chase for better performance. I’m not a big laptop fan, but I feel the chase for better laptop support brought some sanity back to PC hardware.

      • Shivoa says:

        While my GPU is pushing easily north of 200W (up to the cap of 375W), it doesn’t really matter to me if my CPU is using 100W rather than 50W when I’m mains-powered with a big desktop case and very large, quiet fans. The push to remove wasteful power spend is great – faster switching to low-power states means dynamic clocks are viable (and so the Core architectures have decent single-threaded performance despite being quad-core designs), and some of the Haswell stuff is very interesting – but I’d still rather see what they could do with a 100W TDP cap on the top-end parts than have them move ever downwards with each die shrink. (Contrast the GPU designs, which use die shrinks to completely change the performance metrics and stabilise power consumption – they got a bit crazy for a couple of years and it looked like computers would need 800W+ PSUs, but they’re now back to sane thermals, and arguably the current crop don’t even push as hard as they should at the $500+ areas.)

  10. Persus-9 says:

    The WASDIO certainly looks like a lot better idea in terms of the future of gaming peripherals than most I’ve seen. It seems quite a modest idea really. Just replacing the WASD keys with a sort of squat joystick. I’m not sure how good it will be, since I’m not sure my left arm is as coordinated and quick to respond to my commands as my left fingers are, and it would seem to require larger movements, particularly when switching from hard left to hard right. I’d like to see this reach the market, but I’m not going to put down $100 for one when good old WASD, while perhaps not ideal, is certainly just fine for the sort of amateurish fun-seeking gaming I get up to, and the great thing is the keys double up as a keyboard for me to be able to type stuff like this.

  11. malkav11 says:

    I’m not against the idea of a new control scheme. But for me to adopt it, it needs to offer some benefit over what we’ve already got and as far as I can tell, neither touchscreens nor motion controls actually do (at least for gaming). At best, they’re “more intuitive”. Perhaps. But if they lose functionality (and so far, they invariably do), that’s unimpressive as an argument for them. People who are sufficiently interested in gaming obviously can and do learn existing control mechanisms. People who are put off by that necessity rarely transition to becoming “core” gamers even with the supposed gateway of those more intuitive controls.

    Oh, and the problem -any- control scheme that isn’t the default on the platform will suffer is a lack of support from software. So even if it’s a good idea, it’s unlikely to take off.

    • Persus-9 says:

      One of the neater things about the WASDIO is that it looks like it’ll act as a keyboard emulator (roughly sketched below), with the movement of the stick just producing combinations of key presses as far as the games are concerned.

      • Sparkasaurusmex says:

        I’d say that’s a knock against it, not a neater thing. It’d be great if it was a standard and had drivers for games, but emulation isn’t ideal. At least it could emulate a controller for the analog, but a lot of games would get confused if you were using analog and keyboard.

        I think a great joystick would be mounted sideways, so you can rest your arm on the desk while moving it.
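
        For the curious, the keyboard emulation being debated here boils down to threshold mapping. A minimal sketch in Python – the send_key callback and the deadzone/threshold values are invented for illustration, since the WASDIO’s actual mapping logic isn’t public:

            DEADZONE, THRESHOLD = 0.15, 0.5   # invented tuning values

            def stick_to_keys(x, y, send_key):
                """x, y are stick deflections in [-1.0, 1.0]; emits WASD events."""
                if abs(x) < DEADZONE: x = 0.0   # ignore wobble near centre
                if abs(y) < DEADZONE: y = 0.0
                send_key('w', pressed=y > THRESHOLD)
                send_key('s', pressed=y < -THRESHOLD)
                send_key('d', pressed=x > THRESHOLD)
                send_key('a', pressed=x < -THRESHOLD)

        Which is also the objection above in code form: everything past the threshold collapses into a plain key press, so the stick’s analogue precision is thrown away unless the game reads it natively.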

  12. Roz says:

    If CPUs are going to be soldered onto mobos, it looks like my i7 will be the last chip I get from Intel.

    Counter-Strike was always a KB+M game; no serious player, competing in ESEA or eSports, has ever used a gamepad/controller to play.

  13. wccrawford says:

    I’m very much looking forward to the WASDIO. I’d been trying to come up with a way to get analog movement controls while keeping the accuracy of the mouse, and I hadn’t come up with anything.

    This looks like a great way to do it, and it even supports games that can’t handle the mouse/analog combo by mapping it to keypresses, if needed. I did ask to confirm that it’ll act as a normal analog joystick as well, and they said it does.

    The only improvement I’d like to see would be being able to twist it to turn, instead of turning with the mouse. That’d let the mouse range over the screen freely while the left hand controls all the movement. Except looking up… Ugh. But with an Oculus Rift, the looking part would be handled, too.

    • Qwentle says:

      The only problem with the WASDIO though is that it is a joystick, with the associated downsides (and upsides). You get analogue movement, which is important, but you lose the ability to quickly switch between opposite directions as you have to go through a neutral state to get to it.

      • AmateurScience says:

        Also – I may be wrong – but moving the thing means you move your whole arm, instead of just a finger, which will take longer, probably? Also, some bias in terms of ease of moving it left and right (I’m trying to imagine using it and I think I would be slower moving it left than moving it right – I did break my wrist very badly a number of years ago though, so it might just be that).

        • Ragnar says:

          Agreed, it looks like a step backward from thumb joysticks. I remember using a joystick for movement in MechWarrior back in the day and it was cumbersome and slow – appropriate for a hulking mech, not so for an athletic person.

  14. GallonOfAlan says:

    WASD-mouse is unsurpassed for first-person shootings and pointy-clickers like adventures and top-down RTS, but there are better ways to control fitba games, racers, flight sims …

    • Bayonnaise says:

      This will probably go down badly, but I’ve never been able to get past the knowledge that being good at shooters with a mouse is basically just about clicking on people’s heads. It feels like web browsing. There’s a disconnect that I just don’t get with a controller, and I admit I’m probably alone in that…

      *ducks*

      • Tacroy says:

        Being good at shooters with a controller is basically clicking on people’s heads too, it’s just your controller is a lot harder to use.

        Though actually the motion usually ends up being a click-drag across heads, because the controller isn’t as precise.

        • Bayonnaise says:

          Yeah, I’m not saying a controller is a much more immersive solution, but at least it feels like navigating the crosshairs to the head is something I could get wrong as often as I get it right. Not a skill as such, but at least an action that requires thought. Mousing to a little man’s head doesn’t give me that.

      • Baines says:

        That’s one of the game design issues with mouse-aim. Sometimes pin-point accuracy can be a detriment.

        A standard joystick can support 8 directions (not counting neutral), but there were good games that let you fire in only four, two, or even one direction. But these days, if you support mouse, people will ask for mouse aim. Particularly if you have 4 or 8 direction shooting.

        I remember playing a made-for-PC overhead shooter years back (Meteor, maybe?). I particularly remember one stage where a room set-up had been carefully designed so that you had to make a difficult shot to get past the guard. For a moment, I wondered how to best make the shot. Then I remembered that I could just put the cursor on the target. Cursor on target, press fire button, bam, guy was sniped and killed with the basic pistol. Mouse aim completely killed the level design.

        Some games even add randomized inaccuracy to counter mouse accuracy. And I have to wonder at times whether that’s always really better than having limited fire directions that always behave exactly the same.

        Don’t forget that sometimes the controller is part of the experience. Playing a lightgun game with a mouse is just sad.

        I wrote all that without even getting into the issues with a keyboard, particularly my bugbear of when developers think “Well, there are 108 keys. I’m only using 57 of them. I should add some more stuff.” (Roguelikes were particularly guilty of that, where developers fought tooth and nail against the idea of a less cluttered control scheme, and the number of keys used seemed to be considered a measure of quality.)

        • Sparkasaurusmex says:

          Developers don’t look at keyboards and try to use more buttons. The only trend I’ve seen lately is to use LESS buttons. One button for everything! What are you supposed to do? Press Space and you will do it!

          • Dahoon says:

            Exactly. Like running and climbing in Ass Creed. It’s a hideous design choice made for console gaming.

            “Better hurry. I’ll run …oh, I’ll climb instead? Thank you for deciding that for me random wall.”

          • Brun says:

            Ah the contextual action button – yet another one of Ocarina of Time’s great contributions to video gaming.

          • Ragnar says:

            Honestly, after a certain point I start to get overwhelmed with buttons. I mean, the console controllers have 14 buttons, which is certainly a limiting factor, but right now I’m playing ME3 and I’m using 5 buttons on my mouse + 2 for scroll wheel + WASD + 12 buttons bound on the keyboard. That’s 19 not counting WASD. I could be using more buttons on the keyboard – there’s at least 4 additional buttons mapped by default, of which I can’t reach one and forget to use the other 3 – but I’m already hitting my limit at 19 buttons + movement + aiming.

        • Droopy The Dog says:

          My old analogue joystick laughs at your piddling 8 directions.

          I also doubt any successful game dev has just made up controls to fill up all the available keyboard bindings just for the hell of it.

          • Ragnar says:

            I’ve played flight sims where damn near every keyboard button was assigned to something. But that seems more purposeful / realistic than “just for the hell of it.”

            For that, you’d need to turn to WoW and its ilk, where whole peripherals are designed to give you access to more buttons. 2 thumb buttons? Ha! You need 12 thumb buttons! ATM the Hunter class has 51 unique spells, to which you could add potions, trinkets, other items, custom macros, etc. Makes me wonder how I ever managed to play it.

          • Baines says:

            I mentioned Roguelikes as an example. If you want to jump back in time a decade or two, there were people who did equate the number of keys used to some form of game quality when it came to Roguelikes. If you didn’t have two help screens worth of key commands, then you weren’t offering options. And people fought against the idea of simplifying the layouts, arguing that it was dumbing down the game.

            I’m not saying that they should have had a single all-purpose “action” key (nor a single “action menu” key), but you could find some really special case commands that could have been folded together without being weird and without harming gameplay. But the attitude was that the keys were there, so why not use them?

      • Ragnar says:

        Except that when you click on something with the mouse using basic OS UI, the mouse cursor moves while the rest of the screen remains stationary. When you’re aiming with the mouse, the whole screen moves while the cursor remains stationary.

        Personally, I get much more of a disconnect with aiming on a controller, as with the mouse everything feels smooth and fluid and natural, the same as if I moved my head to look around in real life, while with the controller I feel like I’m guiding the crosshairs to the target, or guiding the camera around, instead of just looking. I think the reason for this is that the mouse allows much finer control with infinitely varying amounts of acceleration and deceleration, which I can’t match on a joystick.
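
        The distinction being described shows up directly in the two code paths. A rough sketch (the sensitivity and clamp values are illustrative, not from any particular engine):

            SENSITIVITY = 0.1   # degrees per mouse count, illustrative

            def desktop_mouse(cursor, dx, dy):
                # OS UI: the deltas move a cursor; the view stays put.
                cursor.x += dx
                cursor.y += dy

            def mouselook(camera, dx, dy):
                # FPS aiming: the same deltas rotate the whole view instead.
                camera.yaw += dx * SENSITIVITY
                camera.pitch = max(-89.0, min(89.0, camera.pitch - dy * SENSITIVITY))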

  15. BarneyL says:

    Would there not be a serious monopoly case if Intel started blocking other GPU manufacturers from plugging in to PCs? It would be on a par with Microsoft blocking third-party software on Windows.
    Plus, unless they can produce a GPU that matches the performance/value AMD and Nvidia provide, you can expect a pretty swift comeback for non-Intel gaming platforms.

    • yogibbear says:

      This has nothing to do with the PCIe interface and GPUs. We’re just talking about Intel proposing to ditch the LGA socket. I.e. you buy a motherboard that has a CPU already soldered onto it, just like AMD-E and Intel Atom motherboards do currently.

      • All is Well says:

        I think you missed the bit at the end of the article where Mr. Laird explicitly mentions a scenario in which Intel cuts out PCIe support from their CPUs. Purely speculation, of course, but relevant none the less.
        Or did you perhaps mean to say that it’s so unlikely it isn’t even worthy of discussion?

        • yogibbear says:

          Oh, my bad, I stopped reading when I thought the article had made its point. Yeah if I did read it, I’d still consider it ridiculous.

  16. fredc says:

    I haven’t used an Intel CPU / mobo for about 10 years. The extra £100 or whatever for mobo and CPU would get me better benchmark performance, but each time I’ve had to look at a new board, Athlons have worked out to give me entirely adequate desktop and gaming performance for a lot less money.

    If Intel wants to sell crippled, proprietary, hyper-expensive shiny black boxes that BSOD every year (I’ve twice bought PSs specifically to play GT, both times losing them to RLOD within a year, and have heard similar from pals with Xboxes/PS3s), it would be a great opportunity for AMD.

  17. trjp says:

    Mmmmmerrrhhhhhhhhhh (however you spell that) darling…

    On the control issue tho – the PC has always offered OPTIONS – it’s not, and never has been, about ‘keyboard and mouse’.

    You use what works best for you and the game – you don’t demand “keyboard and mouse” for every game – you don’t look down on “joysticks” (we had them before consoles existed)…

    As for the desktop demise thing – it’s hyperbole but if AMD fail to find a way out of their hole, it’s exactly the sort of thing Intel ‘could’ do – because in a world with no serious competition, they can do anything they feel like…

  18. Jason Moyer says:

    “I like thin-beam racquet frames with maximum feel”

    That’s because they’re the best racquets for skilled players. Nothing beats the classic Prince Graphite Pro or the Wilson ProStaff (and many pros use those still, often repainted to look like whichever current model they’re supposed to be advertising).

    Oh, on topic, ideal controllers depend on the game. First and third person shooters? Keyboard and mouse (duh). Third-person melee games, sports games, platformers, etc? Gamepad. Flight sticks and FFB wheels are self-explanatory I suppose. The idea of there being an ideal interface for every type of game seems silly to me.

    In regards to Intel, they supposedly have an LGA chip in the pipeline as a successor to Broadwell, so I wouldn’t worry too much yet. I’m not convinced that BGA is such a bad thing though.

    • SuperNashwanPower says:

      I want to be able to play doom 3 with a daisy wheel printer.

    • Baines says:

      Yes, it isn’t really a question of the keyboard and mouse being bettered, nor is it that the mouse and keyboard cannot be bettered. Mouse and keyboard works for certain games. Other controllers work better for other games. Games built for a particular controller generally work best for that controller. (Which in turn I think leads to certain advocates. A diehard PC gamer, at least up until recently, was going to mostly play games designed for a keyboard and mouse. A diehard console gamer is going to mostly play games made for a controller. And both would complain when they played a game that wasn’t suited to their controller.)

      But on the subject of the mouse, haven’t there been people that have argued that a trackball is better? Of course, even if you have a trackball controller, the games you play are going to be designed for a mouse. (I’ve also seen people advocate replacing a game pad analog stick with a trackball. There, though, the “coded for a stick” issue really rears its head, as the two controllers produce different kinds of output even before you get to how the game then processes that output.)

      • Jason Moyer says:

        I’ve only felt comfortable with trackballs in a few games (Missile Command, Marble Madness…ummmm, probably something I can’t remember) but I suspect that might be more because I haven’t used one much. I think a trackball would be good for something that requires mouse-like control but also benefits from the inertia. I’m not sure what games that would apply to, though. Rock Of Ages, surely (and, actually, my big turnoff with RoA is the control scheme, something a trackball would fix I think).

        • Baines says:

          The arguments I can recall that favor a trackball over a mouse are smoother movement, smoother changes in movement, and the lack of need to “reset” your mouse when you move it too far. Some of those have been alleviated by improved mouse tech. (The trackball option has arguably lost the most ground, though, with the sheer increase in the number of buttons that mice can now support.)

          When it comes to consoles and analog thumbsticks, trackballs are more mouse-like than a thumbstick. A console FPS might benefit from trackball aim, for example, and controller mod attempts tend to focus on the right thumbstick.

  19. Goodtwist says:

    Nah, you’re exaggerating much too much.

    First, if Intel decided to effectively stall the evolution of gaming hardware, and any new hardware didn’t give the gamer a reasonable benefit, then nobody would care to buy such crippled hardware in the first place. People would either keep their old hardware or competition would arise in no time.

    Second, Intel could run into serious anti-competition legal trouble.

    • Sparkasaurusmex says:

      Exaggerating too much? Quite the contrary – he’s trying to tone down the exaggeration by that other site.

  20. Casimir's Blake says:

    There is an extensive discussion of the Broadwell / BGA issue over at TechReport.

    Chuckula posted the following, though I would encourage you to check his post in full:
    “Haswell continues on the desktop in a rev 2 version in 2014 in the standard LGA format we have come to expect, while Broadwell is exclusively targeted at mobile solutions where BGA packaging is already well established so nothing really changes. Could there be some all-in-one Broadwell solutions that make it into desktops? Sure, but they wouldn’t be any different than the all-in-one solutions using mobile chips that you see now, so once again: nothing changes.”

    In short, BGA-only Broadwell looks to be a mobile part. LGA will likely carry on as the standard for Intel’s desktop tech, at least in the short term. My opinion: Intel may be perceived to enjoy a bit of OEM strong-arming but they would be absolutely nuts to effectively dissolve the enthusiast PC sector that they are a huge part of, even if it becomes less relevant in future.

  21. kregg says:

    Clicking on the pictures breaks the entire blog page for me.

    BLOGFACE

  22. Steven Hutton says:

    I guess we’ll (and I hope we do) see an increase in people buying controllers for specific games and genres. People are certainly willing to do this (Guitar hero) and it works very well.

    My most treasured possession is and will forever remain my glorious, glorious arcade stick. I bought it for Super Street Fighter IV and have simply never looked back.

  23. AmateurScience says:

    Re: keyboards, the really great thing about my PC is it does just about everything I need a computer for: gaming, yes, but also typing, web surfing, stats, etc, and I do all of those (with the occasional segue into gamepad territory) using my mouse and keyboard.

    Whilst they might not *always* be the best tools for the job, they are sufficiently good that I don’t feel any urge to replace them with a device that only does one thing, and will sit around gathering dust the rest of the time (the gamepad is excused because when I use it it doesn’t need any space on my desk).

    Re: Intel and the dreaded soldering iron, they’ve always struck me as a bit consumer-hostile with their practice of a new socket for every tick (or is it tock?), so I’m not hugely surprised. I guess they’ve basically decided that the ‘enthusiast’ consumer is not worth chasing (I’m not sure how much of their business we represent in any case). It’s disappointing because, regardless of anything else, it will limit choice, and that is bad.

    • Bhazor says:

      Pretty much.

      There’s a reason we still have keyboards as opposed to any of the various ridiculous alternatives we’ve seen over the years.

  24. Bhazor says:

    Funnily I’d say Kinect and touch screens are on the way out. Again.
    Certainly the number of games built around them has dropped to next to nothing.
    Now it’s either an add-on/gimmick or because they simply don’t have any other choice (fondle slabs). Once you get over the novelty you’re just left with a whole lot less precision and a whole lot more typos.

    Aside: The day I’m forced to type with a touch screen is the day I buy a pen and notepad.

    • Dahoon says:

      “Once you get over the novelty you’re just left with a whole lot less precision and a whole lot more typos.”

      That goes for a gamepad too. At least in my case there are no games that are better with a gamepad (well, there are some, but that’s because of console controls remaining in the PC version). A gamepad for gaming is the same to me as using a Swiss pocketknife as the only tool in the kitchen. Possible? Yes. Anywhere near as good as proper tools? No.

      • Brun says:

        The reason the gamepad survives is due to its ergonomics rather than its functionality. Few people will deny that the keyboard is more functional, simply because it has more buttons. Try using a keyboard in a typical console use case (on the couch or floor, with no surface other than your lap) and it’s quite uncomfortable.

  25. fugo says:

    Surely the WASDIO is going to leave you with gorilla arm syndrome, with nothing to rest your arm on?

    The only keyboard replacement I’ve ever had any time for was the Nostromo (original Belkin version). That must be the way to go – long-term comfort, plenty of keys within reach, no time spent learning the device and an analogue thumbstick under your thumb for movement.

  26. Lev Astov says:

    The keyboard can definitely be improved upon for gaming input. At the very least, we need proportional keys, kinda like that Microsoft prototype keyboard with pressure-sensing keys. That way we could smoothly accelerate movement in games and still have tons of keys to choose from (see the sketch after this comment).

    As for that WASDIO thing, it’s a good idea, but I suspect it is a terrible implementation. Notice how he uses his whole lower arm to move the thing? I don’t buy their statement about gross motor control being better. I don’t even do that to steer my car, as I plant my elbows on the door and console and use my wrists and fingers to torque the wheel slightly.

    I’ve used a 3D mouse ( http://www.renderosity.com/mod/forumpro/media/folder_9/file_433775.JPG ) to control games before and it’s pretty nice, but definitely going to take some getting used to. If they made the WASDIO handle short and squat like that, it could be controlled with your wrist and fingertips.

    I’d heard that said about Intel before and I highly doubt it. I don’t believe any company with investors to placate would outright eliminate their core market. The ball-grid array thing just means they’ll be going to something like the Pentium II cartridge of yesteryear. Also, if they started shutting out their competition, there’d be an immediate antitrust crackdown by the US government. Especially considering how many PCs the government uses.
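
    On the proportional-keys idea above: a pressure-sensing W/S pair could behave like one signed gamepad axis. A minimal sketch, with the pressure readings and speed scale purely hypothetical:

        def move_axis(w_pressure, s_pressure):
            """Each pressure is 0.0-1.0 from a hypothetical analogue keyboard;
            returns a signed axis value, like a gamepad stick's."""
            return max(-1.0, min(1.0, w_pressure - s_pressure))

        speed = 6.0 * move_axis(0.8, 0.0)   # W at 80% pressure -> 4.8 units/s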

    • Dahoon says:

      More likely the EU would stop it. Since when has the US government actually done anything before it was way too late? The law says stuff like: if you own a domain with a country TLD, you are under that nation’s law. Do you see the government force Facebook to comply with that, or do they let it slide? (facebook.de is under German law, facebook.se under Swedish, etc.)

      Look at the Microsoft spectacle. They are still rolling like they want to and then say sorry afterwards and get a tiny slap on the wrist (compared to what they earned). Like when they took 15 months to fix the browser choice screen not showing up in Windows 7 as required by law. “Oh, sorry about that. We will fix it ASAP!” No, the government is always out-lawyered, out-lobbied and out-gunned. Put in one word: Apple Inc. (Okay, two then.)

      • Brun says:

        “Like when they took 15 months to fix the browser choice screen not showing up in Windows 7”

        I don’t blame Microsoft for fighting that judgement tooth and nail every step of the way, especially since Apple doesn’t have to provide the same screen on iOS.

        • El_Emmental says:

          W3C statistics:
          - 83.7% Windows OS
          - 9% Apple OS

          I hardly see how Apple is abusing monopoly power over the home computer OS market with its below-10-percent share of total users online.

          The browser choice menu was there because Microsoft -also- had more than 80% of the home computer OS market.

  27. ObiDamnKenobi says:

    Grammatical error near the top; you mean “raises the question”. A small oversight I’m sure.
    begthequestion.info

  28. mzlapq says:

    A different reading of the Intel story:
    Intel does not believe it can produce 60W 14nm processors during its “tick”. Since the architecture will not change, there will be no benefit to getting a desktop Broadwell over Haswell (that is, the only benefit will be lower power consumption, which will be negligible when using 60W+ CPUs).
    More importantly, where are the 8-core mainstream CPUs?

    • El_Emmental says:

      Where’s the software with full quad-core support? :P

  29. Simplisto says:

    The WASDIO device confuses me. Since it requires a program containing conversion profiles for each game, how is this any different to simply using an existing (and much cheaper) joystick/flightstick in the same way?

  30. Loz says:

    Can I just say thanks for putting the Blackadder screen grab in there?

    BAAAAAY!

    • SuffixTreeMonkey says:

      Intel: Only a madman would cripple the 3rd party motherboard business just to get some extra cash.
      Blackadder: So you won’t do that, Intel?
      Intel: BAAAAAAH!

  31. fish99 says:

    “Is this the future of PC gaming?”

    No.

    Btw, why would I need to worry about graphics horsepower in 2014? The next gen of consoles will probably be less powerful than my PC is today, as was true of the original Xbox, 360 and PS3 when they were released.

    The Intel story seems to be a misreading of the situation if they’re going to return to sockets the generation after.

  32. InternetBatman says:

    I don’t give a crap about overclocking, but I’ve had a CPU go bad and replacing the CPU is far easier than replacing the Motherboard. As the need to upgrade goes down, faulty parts are now my primary concern, and it seems like they’re making future repairs more costly for no reason.

    Also, I firmly believe that extra mouse buttons plus an analog stick nunchuck are better than keyboard mouse for shooters.

  33. MirzaGhalib says:

    The WASDIO is just a joystick with well-placed buttons. I remember trying this with a flight control joystick and a mouse in the original Team Fortress in the early 2000s, or maybe even before Y2K (XD), and it simply did not work well. Using gross muscle movement for both WASD movement and mouse aim just made the entire process too confusing.

  34. Tei says:

    Before Wikipedia you had Nupedia.
    Before the PC you had closed platforms.

    People that try to create walled gardens and combat the PC always lose in the end, because openness is the power of us all, while closedness is the power of only one. And more than that, you can’t win a war against the PC, because even if you win battles, it will just not die. The PC is an idea, freedom, and a group of people that want that idea. You must kill these people and kill an idea, to kill the PC.

  35. SuicideKing says:

    1) Broadwell will be 14nm not 22nm.

    2) I highly doubt this’ll happen until the 10nm stage. Even then, as CPUs move more towards the SoC category (they already have the CPU, GPU and memory controller on the same die), you’ll still have to plug them into a mobo that’ll probably only have the south bridge and I/O ports, fan headers, etc.

    3) This soldered thing might just be for mobiles/tablets and other embedded systems. Highly unlikely to make it into the desktop. Maybe laptops get it too.

    4) Killing off the third-party mobo/GPU market will simply open the doors to ARM-based platforms. Not a good move.

    5) They’ll never be able to substitute for the discrete GPU market, even if they kill off the entry level.

    Read more about Intel’s possible plans (and this isn’t a semi-accurate report, if you know what I mean): http://www.anandtech.com/show/6355/intels-haswell-architecture

    P.S. Also, how will Intel ever manage to make up for all the mobos produced by ALL third parties? Not feasible imo, and that’s not even their core market.

  36. Continuity says:

    Charlie Demerjian… I should have known. I stopped reading the Inquirer because of his sensationalist reporting.

  37. reyn78 says:

    I’m sorry, but doesn’t this whole Intel thing ignore economics? Intel simply does not have the production capacity to push mobo producers out of the market when it starts to integrate CPU and mobo. Thus it will have to strike a deal to cooperate with these companies; otherwise its CPU sales would be limited by the production capacity of its mobo unit.
    The most likely scenario would thus be that you’d have to buy a mobo-plus-CPU set. That is bad, as a cheap mobo plus powerful processor combo would be harder to achieve, but it’s far from a ‘PC is dead’ situation. Similar with GPUs: the market for PC gaming is too big for Intel to just call it quits. It would immediately create a niche, or rather a cavern, for companies to exploit – the market hates a vacuum. If millions of people globally vote with their money that they like PC gaming, no one sane will ignore it. If Intel is dumb enough to do so, it will only resurrect AMD (assuming it is dead), since even an inferior processor with an external GPU will be better than a built-in Intel card.

  38. LintMan says:

    OK, does anyone know where that screenshot at the top of the article is from? I know I should know this.

    • spleendamage says:

      It’s from Blackadder Goes Forth (the fourth and final season of the Blackadder television show on the BBC). I can’t recall which episode off the top of my head.

      • LintMan says:

        Ah, OK. It’s been 20+ years since I’ve seen any Blackadder, so my memories of the show are a bit cloudy, but it was definitely familiar. Thanks!

  39. zain3000 says:

    “God, it’s a barren, featureless desert out there!”
    “Other side, sir…”

  40. Hoaxfish says:

    Pricing for the Surface Pro has been announced:
    http://www.techradar.com/news/mobile-computing/tablets/microsoft-reveals-pricing-for-surface-with-windows-8-pro-due-in-january-1116221

    A 64GB tablet will go for US$899 while the 128GB nabs a US$999 price tag.

    • Brun says:

      No touch or type cover with either package, so getting the full-featured Surface Pro will set you back at least a grand.

      • fish99 says:

        Who at Microsoft thought these things would sell? I said this when they were announced, but one lacks an x86 CPU and therefore won’t run most Windows software (surely the main selling point of a Windows tab), and for the price of the other you could get an awesome laptop with way better specs. Who would want to spend that much on a tablet when the iPad is half that price and a Nexus 7 is a quarter of it?

  41. Jraptor59 says:

    The PC is not dead. I’m sitting here with uber hardware, like many other enthusiasts, waiting for something other than a console port for gaming. So, develop one! The market is ripe for exploitation; our PC hardware (i7/680) is waiting. Companies, use us to begin the next PC spec war. Or wait for the consoles to try and catch up (7-year cycle).

  42. Aardvarkk says:

    I’m excited for what this can do for computers and gaming:

    https://leapmotion.com/about

    (Sorry if someone already linked it and I missed it.)

  43. Nate says:

    “CPUs are fast approaching good-enough status in terms of performance. It’s storage and graphics you really need to worry about.”

    I don’t keep up with hardware enough to directly contradict you here– but really? Framerate hunters are notoriously CPU bound. I feel like we’re at the point where the only thing that faster video cards can give us are lazier graphics engines that don’t have to worry as much about overdraw.

    And storage? Are you talking about loading screens, or loading chunks? I think there’s some room for bus improvements, but that has a lot more to do with the CPU (architecture) than with storage. And in any case, the short-term solution is just more video memory, which is exactly what we’ve been seeing.

    For the future, it’s a lot easier to gobble up extra CPU than it is to gobble up extra video capability – and I would argue that games that rely more on CPU than on video are better games. To take advantage of better video, we’re either going to need to see a major output revolution that leads to a demand for >60 frames per second, or else a massive increase in the art budgets of video games. To take advantage of more CPU, all you have to do is apply realistic behaviors to more of the objects in your game. I know, “all” you have to do – but consider that realistic behavior is often the easier way to program things, and that quite a bit of games development is finding facsimiles for realistic behavior that don’t need too many cycles.

    Dwarf Fortress, as an example, is always going to need more CPU. Shogun II, the most recent game I’ve been playing, would see no benefit from increased video (the existing 7k polygons and 256^2 textures are a little wasted on a model you tend to see five pixels of at a time [EDIT: or whatever it is]), but increased CPU could allow pathfinding on an individual basis rather than a unit basis (rough numbers after this comment).

    Still, like I said, I haven’t been keeping up as much as I used to, so I’m sure there are some good reasons for the claim– I’m just wondering what they are.
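
    To put rough numbers on the pathfinding point above (every figure invented for illustration):

        units, soldiers_per_unit = 40, 100
        expansions_per_path = 2000            # say, A* node expansions per path

        per_unit = units * expansions_per_path                        #    80,000
        per_soldier = units * soldiers_per_unit * expansions_per_path # 8,000,000

        # A 100x jump in pathfinding work: the kind of load that soaks up
        # spare CPU without needing a single extra polygon.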