Week in Tech: Intel Overclocking, Bonkers-Wide Screens

By Jeremy Laird on April 18th, 2013 at 6:00 pm.

Don’t sling your old CPU on eBay just yet. Too many Rumsfeldian known unknowns remain, never mind the unknown unknowns. But the known knowns suggest Intel is bringing back at least a sliver of overclocking action to its budget CPUs. It arrives with the imminent Haswell generation of Intel chips, and it might help restore a little fun to the budget CPU market, not to mention a little faith in Intel. Next up, local game streaming. Seems like a super idea to me. So, I’d like to know, well, what you’d like to know about streaming. Then I’ll get some answers for you. Meanwhile, game bundles, or bagging free games when you buy PC components. Do you care? I’ve also had a play with the latest bonkers-wide 21:9-aspect PC monitors…

Easier overclocking with Haswell?
So, Intel’s Haswell chips and overclocking. I’ve never been a fan of extreme overclocking, but it’s always made sense to me to overclock where it was easy, safe and gave you something tangible to suck on in terms of frame rates.

A few years ago, squeezing out an extra GHz or so from an Intel chip was pretty much the norm. You could buy a budget chip and have a high end experience. It was a no brainer. With its current chips, Intel has overclocking on lockdown. It is possible with some hardware combinations, but strictly on Intel’s terms.

Well, the latest news is that things are looking better with the new Haswell chips, due out on June 2nd. The elevator pitch is that it’s a new architecture rather than the same old bits shrunk smaller thanks to tinier transistors.

Anyway, with Haswell Intel is opening up the baseclock a bit. The details of how Intel has changed access to overclocking in recent years are complicated and boring. But the dumbed down version is that mucking about with the baseclock became problematic when Intel started putting more and more gubbins onto the CPU itself, like the PCI Express controller.

Things got worse when Intel more or less clock locked the whole chip, tying all the elements together. Overclocking the baseclock ramped up frequencies everywhere. Game over unless you bought a premium-priced chip with an unlocked multiplier.


The things I sit through for RPS…

For Haswell, it seems Intel is offering a selection of baseclocks. Along with the default 100MHz setting, 125MHz and 167MHz will also be on offer. These are divider-enabled numbers, so the rest of the chip is unaffected.

Admittedly, those are pretty coarse steps in frequency. And I’m not entirely clear whether this will be offered on literally all Haswell chips and whether you’ll need a fancy motherboard with a high end version of the upcoming 8 Series chipset to have your fun.

But it’s worth remembering that mid-range Intel chips have a modicum of multiplier overclocking access. And as before, there is actually a little scope for tweaking the whole chip by a few percentage points. Put all that together and it certainly seems like there’s hope for something resembling the good old days of giant-killing budget chips.
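
To put rough numbers on that, here’s a quick Python sketch of how a baseclock and multiplier combine into a final core clock. The multiplier values are purely illustrative, not confirmed Haswell figures.

    # Final core clock = baseclock x multiplier. The 100/125/167MHz
    # baseclocks are the mooted Haswell options; the multipliers here
    # are purely illustrative (in practice you'd pair a higher baseclock
    # with a lower multiplier to hit a similar target clock).
    baseclocks_mhz = [100, 125, 167]
    multipliers = [32, 36, 40]

    for bclk in baseclocks_mhz:
        for mult in multipliers:
            print(f"{bclk}MHz x {mult} = {bclk * mult / 1000:.2f}GHz")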

OK, it’s all a bit complicated and it would be infinitely preferable for Intel to just let us do what we want with the hardware we’ve paid good money for. But the end result might be pretty much what I’m after. A high end experience for relative chump change.

Good for graphics, too?
As it happens, Haswell is also interesting on the graphics side. Again, all the details aren’t out yet, but all the indications are that it’s a major step forward for integrated graphics, especially for mobile. The top version of the graphics core is expected to perform roughly on a par with Nvidia’s GeForce GT 650M mobile chip.


Die Nvidia GeForce 650M, die. That’s German for, “The Nvidia GeForce 650M, the.”

If so, that’s going to make budget gaming laptops pretty interesting. And cheap. Hopefully, anyway. Watch this space. I’ll dish the details on Haswell graphics as soon as the evil NDAs lift.

And so to streaming…
We touched on this briefly before, but the idea of having a single PC that can stream games to any device in your homestead sounds pretty damn sexy to me. But there are plenty of unknowns. One thing that concerns me is the risk of, for instance, Nvidia locking it down in some way. There’s talk of Nvidia requiring a Tegra device to enable streaming, which would be a pity.


Now we know why it’s called Shield

Fortunately, AMD is now making noises about getting in on the action and a little competition always helps keep things open. Anyway, put your thoughts, hopes and fears below and I’ll pump Nvidia and AMD thoroughly for information.

Bundle or bung?
Finally, a couple of side issues. Game bundles are something the graphics card makers have been giving me the heavy messaging treatment about of late. AMD recently extended its Never Settle Reloaded bundle to include Far Cry 3 Blood Dragon, for instance. The bundle kicks in a little further down the range, too. It’s even a retrospective deal. If you’ve previously cashed in a Never Settle Reloaded deal, you can pick up Blood Dragon gratis (full details here).

I suppose it depends on what you make of the titles on offer. But Bioshock Infinite is on the list for some cards, so it ain’t entirely shabby. But do you care? I’m told bundling is wonderful. You decide.

Stop that, it’s silly
I’ve also had a little play with one of the new generation of ultra-wide PC monitors. I’m talking 29 inches, 21:9 aspect ratio and 2,560 by 1,080 pixels. As far as I know, they’re all using the same IPS panel, at a guess one made by LG. So they should all look fairly similar.

The basic image quality of the Philips effort I reviewed is rather lovely as you’d expect from a modern IPS panel. And it’s hilariously, bizarrely wide. An obvious observation, but these things really are pretty dramatic.

They’re great for feature films, obviously. It’s nice to see all four corners of the display filled for once. But PC monitors, frankly, are piss poor value if watching movies is the main concern.

What they’re not great for is normal PC fare. Vertical res really counts for pedestrian stuff like browsing the web, and here you’ve got no more of it than a basic 22-inch panel. Instead you get loads of what is often superfluous width.
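
A bit of tape-measure maths shows why. Here’s a quick sketch of panel height from diagonal and aspect ratio, assuming nominal sizes:

    # Physical panel height from diagonal size and aspect ratio.
    # A 29-inch 21:9 panel is barely taller than a 22-inch 16:9 one.
    import math

    def panel_height(diagonal_in, aspect_w, aspect_h):
        return diagonal_in * aspect_h / math.hypot(aspect_w, aspect_h)

    print(f'29" 21:9: {panel_height(29, 21, 9):.1f} inches tall')  # ~11.4
    print(f'22" 16:9: {panel_height(22, 16, 9):.1f} inches tall')  # ~10.8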


Wider than two extremely wide things. In a pod.

These new 29-inchers also raise the question of what’s too wide for comfort. Actually reading text located at the extremities of the panel feels distinctly sub-optimal to me.

But whether 21:9 works for games is the really interesting question. Currently, I don’t have the answer. I thought it was a hell of a lot of fun. But I also fear the novelty factor, and wonder when it will wear off. There are practicality concerns when it comes to on-screen menus, too. It could work really well. It could be ridiculous, depending on how you like things set up and how configurable a given game is.

It’s also worth noting this brave new 29-inch gen is pretty aggressively priced. A little under £400 appears to be the norm. OK, that’s hardly throwaway money, but it’s just attainable enough to be interesting. I’ve probably gone mad and these things are a passing fad. But I can’t deny it. I found superwide gaming strangely compelling. Would I actually buy one? Nope.


34 Comments

  1. Sakkura says:

    Re. overclocking, it will also be interesting to see how well the unlocked chips will overclock. Ivy Bridge disappointed compared to Sandy Bridge, apparently due to heat building up between the actual silicon and the chip’s aluminium cap. Modders replaced the material between the two and achieved higher clocks.

    As for the increased baseclock leeway, that’s adopted from their high-end LGA 2011 platform. You know, the one where processor names would end in “Extreme” just to soften the blow to your nest egg. Glad to see that filtering down to mere mortals.

  2. Yargh says:

    The ultra-widescreens could be a possible replacement for dual monitor setups. Fewer cables to manage at least, and no bezel in the middle of your viewing area…

    • Wut The Melon says:

      …But really, isn’t QHD (2560×1440) a much more logical step forward? I mean, if these ultra-wide things were really that much cheaper, perhaps, but they’re still rather pricy.

      • LintMan says:

        @Wut The Melon:

        Yes, exactly. We really need to move on from 1080, please. Current gen smartphones are all 1080 now – a desktop monitor with 4x the diagonal needs to do better.

        As for the 21:9 ratio – I really don’t see any benefit. It’s not nearly as useful as dual widescreens, and if we’re just trying to get more screen area, taller would be better and give more pixels for the same diagonal length.

        • Andy_Panthro says:

          Why do we need to move on from 1080p? What does it offer for an average gamer? (genuinely interested, I’m not sure how much higher resolution is actually going to improve my experience).

          • ecat says:

            Why do we need to move on from 1080p?

            Move on from?

            My old CRTs from last century were 1200p. The 1080p letter boxes of today are thanks to ‘technological innovation’ aka cost cutting mixed with blind consumer acceptance. 1200p Accept Nothing Less (unless it brings OLED goodness… maybe)

          • LintMan says:

            Higher resolution is all about information density. In-game HUD or info panels can be smaller (in terms of actual screen real estate area % covered) but still just as clear, or if they keep the same screen %, they can pack it with more detail or information.

            Also, monitors are trending larger but resolution is not. I’m seeing 27″ monitors now that are still 1920×1080. Come on! This means that pixels are getting fatter/blockier. What looks great on a 20″ 1920×1080 monitor isn’t going to look as good on a 24″ or especially 27″ one.

            Also, I use my PC for more than games, and I want all the screen real estate I can get. I’ve had a 1200p monitor since 1999 or so (non-widescreen CRT back then). It galls me that about 14 years later, things have barely advanced, and monitors of this resolution or larger are still fairly uncommon.
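
            The density maths is simple enough to check – a quick sketch using the sizes above:

                # Pixels per inch (PPI) from resolution and diagonal size.
                import math

                def ppi(w_px, h_px, diagonal_in):
                    return math.hypot(w_px, h_px) / diagonal_in

                for size in (20, 24, 27):
                    print(f'1920x1080 at {size}": {ppi(1920, 1080, size):.0f} PPI')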

          • DonDrapersAcidTrip says:

            “Also, monitors are trending larger but resolution is not. I’m seeing 27″ monitors now that are still 1920×1080. Come on! This means that pixels are getting fatter/blockier. What looks great on a 20″ 1920×1080 monitor isn’t going to look as good on a 24″ or especially 27″ one.”

            1920×1080 resolution still looks great on my 50 inch plasma tv.

          • jezcentral says:

            @DonDrapersAcidTrip, but how much time do you spend looking at it from 2 feet away?

          • DonDrapersAcidTrip says:

            Is 1920×1200 seriously too blocky and pixellated for you guys on monitors? Just how close are you sitting to them? Do you have coke-bottle glasses or something?

          • kaffis says:

            When you get above 24″? Yeah, 1920×1080 is insufficient for a monitor at a 2-3′ viewing distance when it’s that big. It starts to look rough.

            I really don’t understand why manufacturers think they can/should push this 21:9 crap, though. Extreme widescreen gaming will always be a niche market, because when you get that wide, flat projections just get too distorted for most people to be satisfied with the way it looks. Curved displays with that kind of aspect ratio could catch on if games supported them with a different rendering path that provided an appropriate projection; but I suspect that they’re simply too late — we’ll have consumer VR penetration before anybody thinks to try to bring that to market.

            In the meantime, while we wait for Oculus Rift and whatever follows in its footsteps to compete with it to gain widespread support and acceptance, I’d rather display manufacturers focus on far more useful formats like 2560×1600 (or, if you must, 2560×1440, though why anybody would *prefer* a 16:9 ratio for PC work baffles me). Heck, I’ll continue to use my 30″ 2560×1600 display alongside my VR, because it’s fantastic for office and web applications.

  3. 11temporal says:

    Well, if I’m buying a graphics card, what’s bundled with it doesn’t even enter the picture as far as the decision of which card to buy goes.

    Streaming would be great if I could stream games to my gamewise-shitty laptops the way remote desktop works. I guess streaming to some dedicated gaming handhelds would also be nice, though it probably won’t be enough to make me buy such stuff unless it’s universal – working with almost every game.

  4. Zenicetus says:

    My first thought on that 21:9 monitor was that it would be perfect for flight simulators. But then I wondered what it would look like with TrackIR swinging the view around in an air combat sim, like Rise of Flight. I dunno… the ultra-wide aspect ratio might be nausea-inducing. Might be better for more static cockpit views like civilian airliner sims.

    The wide view could be good for certain strategy games. Maybe not something like Civ where you need equal attention in all directions of the map, but for the tactical battlefield in Total War games it could be very nice for keeping track of your army. Horizontal situational awareness is usually more important than front-to-back depth in that type of perspective view.

    Maybe not so good for an FPS? UI elements like health bars and mini-maps at the screen corners would be a long way from the central aim point. That could be distracting.

  5. Fred S. says:

    My 8800GTX video card came with Crysis. The card was great, ran fine for several years and finally got glitchy, so I replaced it with a GTX 550 Ti from Zotac. That card didn’t come with a game. I didn’t really care either way.

  6. SWKineo says:

    Am I the only one who thinks that “21:9” should be “7:3”?

    • Fred S. says:

      21:9 is easier to relate to 16:9.
      That’s also why 16:10 instead of 8:5.
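
      Strictly speaking, even 21:9 is an approximation. Reduce the actual pixel counts and you get:

          # Reduce a resolution to its simplest aspect ratio.
          from math import gcd

          def aspect(w, h):
              g = gcd(w, h)
              return f"{w // g}:{h // g}"

          print(aspect(2560, 1080))  # 64:27 -- "21:9" is an approximation
          print(aspect(1920, 1080))  # 16:9
          print(aspect(1920, 1200))  # 8:5 -- marketed as 16:10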

      • TechnicalBen says:

        I feel multiple shades of stupid for never noticing that.

  7. Moraven says:

    Last year I got two AMD cards, 79xx and 68xx. Both came with Dirt 3. I have no interest in racing games and gave them away in the RPS Kindness club. Some people like to sell these bundle games on eBay. I think Dirt 3 would have maybe sold for $10-$20.

    Now, the 7870 we just ordered is bundled with FC 3 Blood Dragon, Bioshock & Tomb Raider as a limited offer. Three current release games. If I wanted, I could possibly turn around and sell that for $60-$100. One way to get a cheaper card. I think AMD had a Hitman bundle before this one.

    I think I might sell them now and pick the games up later on a Steam Sale.

    By no means does the bundle drive any decision about which card I want. Not a bad deal for both sides: free games for the consumer, and the developer might get some word-of-mouth sales.

  8. iainl says:

    21:9 isn’t that far off having two 4:3 monitors side by side, but without an annoying join when you do want to game. Given the popularity of dual monitors I can see it working.

  9. Arglebargle says:

    Word I’ve heard from audio sites is that the early Haswell samples overclocked like champs, but the recent chips are very mixed.

    • chris1479 says:

      Who cares? The i5 2500K is a total monster, mine’s at 4.4GHz on air – unless you’re into extreme CAD software and the like there’s just no point upgrading right now.

  10. Cytrom says:

    I see absolutely zero chance that Intel’s top IGP would even come close to a dedicated Nvidia 650M.

    For one, while it may have some dedicated cache, it still ultimately relies on system memory to serve as virtual video RAM, which provides significantly less memory bandwidth than dedicated GDDR5 memory with even a humble frequency and interface (not to mention it leeches system memory).

    Second, the theoretical maximum raw processing power of the new IGP (based on the known specs), not considering any bandwidth issues, is still significantly weaker than that. The strongest Trinity APU (holding the current strongest IGP) is roughly 25-35% stronger than the current Intel HD 4000 (which has 16 processing units). The top AMD IGP is still only about as fast as, or slightly weaker than, a dedicated AMD 5570… which is pretty much worthless in the desktop range, and less than mediocre in a notebook (slightly below a dedicated 7670M). GT2 will have 20 of the mentioned processing units and early tests show that the performance gain is close to linear, thus it should score near Trinity in theory.
    Optimistically assuming that the GT3 will have twice the processing power of an HD 4000… or even more optimistically twice the power of the GT2, that’s still pretty damn far off from a 650M. The graphics core itself may come close to a 640M / AMD 8750… which are about half as powerful as the 650M, but then it’ll still be crippled by memory bandwidth, knocking it further down.
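
    Putting rough numbers on that argument (all of these figures are assumptions from above, not measurements):

        # Back-of-envelope IGP scaling, using the assumed figures above.
        hd4000 = 1.0            # baseline: Intel HD 4000, 16 units
        gt2 = hd4000 * 20 / 16  # 20 units, near-linear scaling -> 1.25x
        trinity = 1.3           # top Trinity IGP: ~25-35% above HD 4000
        gt3 = 2 * gt2           # optimistic case: twice GT2 -> 2.5x
        gt650m = 2 * gt3        # if GT3 is ~640M-class, a 650M is ~2x that

        print(f"GT2 ~{gt2:.2f}x, Trinity ~{trinity:.2f}x, "
              f"GT3 ~{gt3:.2f}x, 650M ~{gt650m:.2f}x HD 4000")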

    Even that kind of performance may come with serious graphics quality compromises.

    The strongest Intel IGP is still going to be total crap for any serious gamer. However, it could be strong enough to let ultrabooks without dedicated graphics play most games casually at crappy settings and low resolutions, without sacrificing the low profile and long battery life of an ultrabook. That’s an impressive improvement in itself, but not quite as impressive as Intel tries to make it look.

    • SuicideKing says:

      Well, Intel had a working demo at CES, with the Haswell GT3e (aka HD 5200) part running alongside a 650M. Saw the video, and it was pretty close. Not a qualitative benchmark, but we’ll have to wait for that.

      Also, it’s not for ultrabooks, it’s for laptops that currently use something 650M-class. This stuff should be far more efficient. You could actually game with that without being tied to a power plug.

      The on-package cache is supposed to be a 64MB DRAM cache, and apparently will act as an L4 for the CPU, but that’s not confirmed.

      Desktops won’t get it, so “serious” gamers can get their 680s and 7970s and their mobile counterparts.

  11. fish99 says:

    Not convinced about 21:9, given the amount of media in 16:9 format and the likely flaky support for that aspect ratio in games. There’s also the issue someone mentions above about things getting horizontally stretched at the edge of the screen (due to the divide-by-Z projection used in computer graphics, which isn’t how our eyes see the world), a problem which gets worse the wider the screen is.

  12. SuicideKing says:

    Yeah, should be interesting to see how the BCLK overclock works. From what I understood, it’s not a divider in the sense that it’s not decoupled from the rest of the controllers, just that the other controllers will have their clocks readjusted to maintain ratios and stuff.

    I wonder what it means for AMD, though. i3s offer serious gaming performance competition for anything AMD at stock clocks; wonder what’ll happen with this. Plus, slower i5s with locked multipliers but an open BCLK would be pretty damn sweet.

    Jeremy, surprised you didn’t mention the multiplier on K-series chips will allow values up to 80x, which effectively enables an 8GHz max clock speed. With LN2, of course…. :|

    On the streaming thing… well, I find it a lot more practical than cloud gaming and all that, because home networks are much faster and can be used to stream other content as well. A multi-purpose network setup that also lets you stream is pretty neat imo.

    Nvidia’s been going into professional local/personal remote server computing with GRID as well… I think the future lies here.

    Someone wrote on one of the forums I visit that in future each housing complex, locality or building could have its own game streaming server that could stream games to individual houses. Maybe all of this will lead there, eventually?

  13. Pootank says:

    Admittedly, those are pretty coarse steps in frequency. And I’m not entirely clear whether this will be offered on *literally* all Haswell chips and whether you’ll need a fancy motherboard with a high end version of the upcoming 8 Series chipset to have your fun.

    Please don’t become one of those sites that uses the word literally unnecessarily.

    I’m no grammar nazi but that shit is nasty.

  14. VelvetFistIronGlove says:

    21:9 in a single monitor? I’d say the world of Deus Ex monitors is not too far away: we just need to make them oddly non-rectangular and stupidly transparent!

    Until then, I’ll just go back to playing Dishonored in 48:9… being able to use peripheral vision in that game is very nice.

  15. Caiman says:

    The main application for those 21:9 monitors would seem to be having the convenience of a dual monitor setup in a single monitor: the ability to spread two or three working windows across the screen real estate (three windows is something you can’t easily manage with a dual monitor setup), and a much better working area if you’re doing any kind of NLE work (audio or video) or similar. For games and straightforward web browsing, though? Not so much. It’s a work asset, not a gaming one.

  16. Lev Astov says:

    Jeremy, I’ve been gaming on a super-wide, home-built projector array and the novelty has never worn off. It’s quite painful to go back to 16:9 when I’ve been gaming on 8:3 and more recently 16:5 for over five years now. Vertical res is definitely an issue, though, and is why I’ve recently switched to 1920×1200 projectors.

    Some games don’t work perfectly with super widescreen out of the box and take a bit of tweaking, but it’s always worth it. The widescreen gaming forum (http://www.wsgf.org) has many fixes available and tons of tips and tricks. Seeing new aspect ratios like 21:9 in off-the-shelf monitors really excites me because it means developers will start thinking of these things.

    For the 21:9 monitors to take off, I think they’ll really need 1200 vertical pixels, but we’ll see.

  17. belgand says:

    I just bought a 660 Ti a week or so ago and the included bundle of credit for some F2P games I’m probably not going to play felt terrible. To the degree that I lost some respect for NVIDIA for offering such a crappy bundle. Even worse is that the other day they announced Metro: Last Light as the new bundle option. So yay, the game I wanted is now the new thing that I didn’t get. It didn’t influence my decision, but I’m certainly far less happy because of the crap bundle.

    • fish99 says:

      You could sell those F2P codes for PS2/WoT credits for a decent amount of real money, probably enough to buy Metro LL. Either eBay them or get on the PS2/WoT forums.

  18. El_Emmental says:

    Bundles are a good idea for the average consumer, as he can see/understand what his graphics card is made for (playing demanding 3D games).

    Like with games coming with a computer (or just installed on it, more or less legally, by the manufacturer/distributor) back in the ’90s, it can let a novice (young or old) enjoy an impressive (in terms of graphics) game without having to shop around for a while (asking for recommendations among friends, or the clerk at the store). And I’ve mostly seen respectful bundles (with a few exceptions, sure): most were with okay-to-good games. I haven’t seen many “hey, we’ve got so many unsold copies of that crappy game, we’re giving it away for free!”

    I have plenty of friends who rarely search for information/articles about games by themselves, who enjoyed recommendations they noticed in a magazine or a “mainstream” gaming website, or got in these bundles.

    If the bundled game is correctly chosen by the vendor (= a good, accessible game that can run at medium-high settings on that graphics card), it’s added value for the uninformed consumer, and can be an excellent way to introduce that user to “serious” PC gaming (= not just WoW, CoD and whatever the social pressure got them to play).

    The ideal case is:
    - Joe/Jane (or his/her parents) just bought a computer at the store.
    - It’s basically an i3 with 4 or 8GB of RAM because they didn’t get the cheapest one (with its Sempron/Celeron and 2 or 3GB of RAM), but it comes with a GPU barely able to keep 20 fps in the latest CoD/BF3/Dota 2/LoL/etc at low settings.
    - Joe/Jane save up money and shell out £80 for a neat card (and hopefully their PSU can handle it, otherwise it’s approx £40 more).
    - “Hey look Joe/Jane, other cool games exist, it’s not just CoD/BF3/WoW !” and Joe/Jane found out about AssCreed, Far Cry 3/BD, Tomb Raider, Bioshock, etc, living happily ever after.

    But of course, for PC gamers who know how to ‘build’ a PC (or at least swap a PCI-E card) and buy their games during sales on Steam/GMG/etc, like most people reading RPS, such bundles are just “oh, there’s a game”.

    They already know if they’ll probably like it or not, when and at which price they’ll get it (or not), on which DD platform, etc – what matters most to them is how balanced it is with their CPU, the GPU’s performance in their favourite games, how much VRAM it has (especially if they’re rocking multiple monitors or anything bigger than 1080p), how many watts it eats up under load, how good the stock fan+rad cooling is, o/c potential, etc. Stuff the average consumer doesn’t know.

  19. cawt says:

    Regarding streaming, I have this silly dream of some sort of terminal that would be connected to the main PC via ethernet, relay image and sound to the TV via HDMI, and offer USB connectivity for various control methods.
    Is anyone planning to make this happen? Does it already exist?

  20. deadly.by.design says:

    I think I’d love one for graphic design, but that’s about it.

    Then again, I made my way through design school on a 1024×768 12″ Powerbook, so whatevs. Efficient software workflow can compensate.