By Jeremy Laird on November 29th, 2012 at 1:00 pm.
Couple of questions for you hardware freaks to ponder this week. Is it time to think the unthinkable, to do the undoable and ditch the hallowed keyboard n’ mouse control interface for PC gaming? Oh, and is the desktop PC dead? The former’s something I’ve wondered for a while in relation to PC interfaces in general, but now somebody is actually having a proper stab at bettering ye olde rodent and fiddlestick. The latter bombshell, meanwhile, follows rumours Intel will stop selling desktop CPUs in a little over a year. That sounds bad. Fortunately, the reality isn’t altogether catastrophic.
So, an email dropped into my inbox earlier this week regarding the WASDIO. Would I be interested in spreading the word was the gist of it. This little gadget is currently being Kickstarted and replaces the keyboard half of the equation, hence the name. WASDIO. Geddit?
Anyway, make of the WASDIO what you will, I’m clueless as to its particular merits. Apparently only one has been built and they didn’t fancy lending it to me. But its existence does raise the question of whether the keyboard and mouse can be improved upon for proper, grown-up gaming of the beard-stroking variety.
Let me put it this way. The keyboard and mouse wasn’t designed for gaming. It was commandeered for gaming. Surely it would be highly serendipitous for it to just so happen to be the best tool for the job?
At the same time, touchscreens are beginning to eat away at the keyboard and mouse’s dominance over mainstream computing and that full-body-motion-sensing Kinect shizzle is all the rage in console land. Apparently. I don’t do consoles.
Anyway, touch or motion sensing could add an interesting new layer of interactivity. But is either likely to take over PC gaming? Or will it be something completely new like the WASDIO? Colour me a Luddite, but I’m dubious about the prospects of a new game controller or control paradigm replacing the keyboard and mouse any time soon.
Nothing I’ve yet experienced comes close. For me it all comes down to Counter-Strike. When I first started playing, it was the usual drill. I’d spawn. The round would start. And I’d be mown down in picoseconds. Pretty quickly, I was convinced half the map was infested with script kiddies touting the latest aimbot. They were too good, too precise, too consistent.
But eventually I got that good, that precise, that consistent. Well, on occasion. With a keyboard and mouse it’s like any other sport or act of physical dexterity. Playing tennis. Driving cars. You get in the zone and everything flows effortlessly. And I can’t imagine that ever happening with something ghastly like a game pad.
But then maybe I’m an anachronism. After all, I like thin-beam racquet frames with maximum feel and don’t do driver’s cars without manual gearboxes. Hell, I get upset by fly-by-wire throttles.
Call it a massive fluke, but maybe the mouse and keyboard can’t be bettered after all. What do you reckon?
The PC’s dead. Long live the PC
A terrifying little scare story popped up on red-top PC hardware rumour site SemiAccurate on Monday. The title read, “Intel kills off the desktop, PCs go with it.” Nice.
Now anyone who knows Charlie Demerjian knows he’s not shy of a little hyperbole. He’ll tell you straight up he’s happy to spin the angle of a story for dramatic effect. Not make things up, mind. But certainly play to the gallery. That was more or less what he once told me, anyway.
And that’s what he’s doing with this little story. I don’t dispute the claimed facts, though nor can I confirm them. They go something like this. Broadwell is the successor architecture to next year’s Haswell CPUs from Intel. Right now, we’re on Ivy Bridge chips. But you remember that, right?
Anywho, Broadwell will be built on 14nm. And here’s the kicker. Broadwell will be ball-grid array only. That’s a type of electrical connector and it means the chips will need to be soldered onto motherboards. The implication is pretty obvious. You won’t be able to buy CPUs separately. At best they’ll come soldered to motherboards.
So, no more mixing and matching of motherboard and CPU. Personally, I’m not that bothered. CPUs are fast approaching good-enough status in terms of performance. It’s storage and graphics you really need to worry about. That will be especially true by 2014.
Anyway, the really bad stuff has already happened with Intel locking out overclocking from all but premium-priced chips. So soldering CPUs to motherboards is symbolically dramatic, but with AMD uncompetitive for as long as I can remember, the golden age of enthusiast x86 computing faded several years ago.
The only thing that really worries me is the prospect of Intel deciding to kill the third-party motherboard market, something it could easily do. It’s not obligated to sell CPUs to motherboard makers. I’ve never liked Intel motherboards. So that would be grim.
Given that Intel isn’t making the progress it would like in phones and tablets, it’s not hard to imagine it actually happening. The pressure for revenue growth may simply be too much.
Keep that line of thinking going and you can also imagine Intel one day locking out third-party graphics. Let’s say AMD does die or at least completely gives up on high-performance CPUs, as seems a fairly realistic prospect. In that scenario, Intel would have the option of no longer hooking up its mainstream CPUs to a high-bandwidth PCI Express bus. You’d be forced to run Intel integrated graphics.
After all, by providing a PCI Express bus, Intel is enabling NVIDIA’s graphics business. If Intel ever feels like its integrated graphics is good enough for gaming, once again the temptation to squish a rival may be too much.
In reality I don’t see that happening in the next few years. It would be so bad for the health of the PC as a consumer platform. But five years or more from now? I wouldn’t rule it out entirely. It’s yet another reason to hope like hell that AMD somehow survives.
In a recent statement to US-based Maximum PC, Intel confirmed that it remains committed to delivering LGA sockets on the desktop for the “foreseeable future”, which is reassuring but, in truth, doesn’t mean a great deal.