Week in Tech: Ultrawide Monitors, DDR4 & New Intel CPUs

By Jeremy Laird on June 5th, 2014 at 9:00 pm.

Utterly pointless, but ooh it's purty: Asus's Hoth-spec mobo

Christ, is it Computex again? I can’t keep up. Surprisingly, we’ve never had a round-up from what remains the greatest show of PC hardware on earth. But let’s pretend we’re old hands and ponder what another 12 months has brought us, barring an incremental uptick in cynicism and one step closer to cold, infinite oblivion. Quicker, cheaper SSDs (yup, that again). Yay! A properly cheap and fully overclockable Intel CPU. Huzzah! The fastest optical mouse sensor ever. Haroo! Super-wide, beyond ultra-HD monitors. Argh! DDR4 memory that will revolutionise gaming (allegedly). Er, zorg?! And even a Hoth-spec tundra-camo motherboard. Mother. Of. God. Oh, and I’ve finally clapped eyes upon one of those cheap TN 4K panels and can confirm that they’re damn good and put an end to the need for anti-aliasing – yes, really. Ride your rodents for the round-up.

UPDATE:
Thanks to an AMD Freesync demo, the possibility, albeit still remote, of firmware Freesync updates for existing monitors has emerged…

SSDs. Again
So, quicker SSDs? Yup, a repetitive theme, so I’ll keep it short and sweet. The long-awaited SandForce SF3700 SSD controller is finally about to ship out.

The big news here is improved simultaneous or ‘mixed’ read/write performance as opposed to just pure read or write throughput. Apparently, a good drive today will achieve circa 250MB/s for an 80/20 read/write traffic mix.

If I’m reading things right, the new SandForce controller ups that to as much as 1.3GB/s. No doubt an ideal/cherry picked scenario, but mixed traffic is certainly more realistic than one-way traffic. Watch this space.
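
If you fancy a feel for why the mixed figure matters, here’s a quick back-of-envelope Python sketch. The drive numbers are made up for illustration (nothing SandForce has published), and the model naively assumes reads and writes simply time-share the drive; real drives fall well short of even that ceiling, which is the gap the SF3700 is claiming to close.

# Back-of-envelope mixed-throughput model. Assumes the workload is simply
# time-shared between reads and writes with no interference; real drives
# fall well short of this ceiling when traffic is mixed, which is exactly
# what the SF3700 claims to improve. Drive figures below are illustrative.

def mixed_throughput(read_mbs, write_mbs, read_fraction):
    """MB/s for a workload that is read_fraction reads by volume."""
    time_per_mb = read_fraction / read_mbs + (1 - read_fraction) / write_mbs
    return 1 / time_per_mb

# A hypothetical 500MB/s-read, 450MB/s-write SATA drive on an 80/20 mix:
print(round(mixed_throughput(500, 450, 0.8)))  # ~489 MB/s theoretical ceiling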

Inevitably, Computex has also seen a bunch of M.2 SSDs appear and the form factor looks to be increasingly the one to go for in terms of ye olde ‘enthusiast’ PC, which overlaps gaming rigs in the Venn diagram of PCdom. We’re in the realms of contrived consumer categories, of course. But my general feeling is that if you or I were to build a PC a year from now, we’d almost definitely choose M.2 for the boot drive rather than SATA Express.

Intel’s cheapo chip
Next up, Intel’s anniversary-special Pentium G3258 processor. We’ve touched on this before, but here are the key numbers:

- 3.2GHz baseclock
- Two cores
- Fully unlocked
- $72 (circa £60)

No HyperThreading. No Turbo. Who cares? There is one snag. It’s not getting the NGPTIM stuff, aka the decent thermal interface material that comes with the new Devil’s Canyon Core i7-4790K and i5-4690K chips. It’s getting the same crappy material that is currently holding back existing Intel Haswell-generation processors. Poo.

Tight-wad Intel apparently hasn’t given the Pentium G3258 the same improved thermal material as the other new Devil’s Canyon Haswell refresh chips. Bah.

Still, I hold out hope it will hit 4.5GHz, in which case it’ll probably be all the CPU you need for a cheap gaming rig at $72 or, I assume, about £60. Win.

Any colour so long as it’s white
Meanwhile, Asus has a whole hill of new stuff, but two things caught my eye. First, the ROG Gladius gaming mouse. You get the usual niceties, like a finger-tip DPI switch.

The newsflash is the 6,400dpi optical sensor capable of tracking at up to 200 inches per second and 50G acceleration. Are these numbers actually of any import? Does super-high DPI matter beyond a certain point? I have no fugging idea. But I like big numbers! What do you guys think?

But by far the snazziest Asus product is the limited-edition TUF Sabranco Z97 motherboard decked out in Empire Strikes Back tundra-white camo. Utter gimmickry. I’ll have three, thanks.

For all your gaming pleasure: Asus’s new Computex clobber

Super wide!
Then there’s Dell’s U3415W. Here we’re talking 34 inches, 21:9 aspect ratio, IPS and 3,440 by 1,440 pixels. It’ll be just one of a load of 34-inch 1440p panels and consensus is slowly building that it might just be an awesome solution for high-end gaming.

Critically, it adds up to five million pixels, which is a tonne less than the eight million of a 4K panel. So, materially less GPU load. And yet it’s still going to be seriously high detail (a 27-inch 2,560 by 1,440 panel gives you 3.6 million pixels) and gaming on a very grand scale. Interesting at the very least.
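
For anyone who wants to check the sums, here’s a quick Python snippet totting up the pixel counts mentioned above, crudely assuming GPU load scales with pixels rendered per frame:

# Pixel counts for the panels discussed, relative to a 27-inch 2,560x1,440
# screen. "Load" here crudely assumes GPU cost scales with pixels per frame.
panels = {
    "2560x1440 (27in)":      2560 * 1440,   # ~3.7 million
    "3440x1440 (34in 21:9)": 3440 * 1440,   # ~5.0 million
    "3840x2160 (4K UHD)":    3840 * 2160,   # ~8.3 million
}
base = panels["2560x1440 (27in)"]
for name, px in panels.items():
    print(f"{name}: {px / 1e6:.1f} Mpixels, {px / base:.2f}x the pixels of 1440p")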

As for pricing, no word as yet. But I reckon a good yardstick involves existing 29-inch 21:9 panels. I don’t think these 34-inch IPSers are going to be positioned as truly premium. They won’t be cheap, but a budget brand like AOC (there is indeed an AOC U3477Pqu incoming) might make them attainable. Dare I hope for about £400 / $500, tops?

I’ve been banging on about 4K, but maybe 21:9 super-wide is actually where it’s at for gaming?

DDR bore
Anywho, the usual memory-making suspects have been hawking their brave new DDR4 wares and promising big things for gaming. Well, they’ve got to say something. The truth is, DDR4 will be restricted to the high-end X99 platform and Intel’s Haswell-E rebadged server chips for the foreseeable.

DDR4 will be interesting when it transitions to Intel’s mainstream socket – integrated graphics could do with all the bandwidth it can get. But as far as I’m aware, the next-gen Broadwell chips are DDR3, so that’s several years away.
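
For a rough sense of what integrated graphics stands to gain, peak memory bandwidth is just transfer rate times channel width times channel count. A quick Python sketch; the speed grades are illustrative examples rather than confirmed launch specs:

# Peak bandwidth in GB/s = transfers per second * 8 bytes per 64-bit channel
# * number of channels. Speed grades below are illustrative examples.
def peak_bandwidth_gbs(mega_transfers, channels):
    return mega_transfers * 8 * channels / 1000

print(peak_bandwidth_gbs(1600, 2))  # DDR3-1600, dual channel: 25.6 GB/s
print(peak_bandwidth_gbs(2133, 2))  # DDR4-2133, dual channel: ~34.1 GB/s
print(peak_bandwidth_gbs(2133, 4))  # DDR4-2133, quad channel (X99): ~68.3 GB/s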

4K redux
And finally, I’ve had a quick look at one of the new 28-inch 4K screens using TN panel tech. For the most part, the new panel is seriously impressive. Save for the vertical viewing angle, it does a very good impression of an IPS panel. Nice colours. Good contrast.

For me personally, 28 inches is still too small for 4K in Windows. I don’t want to have to set the scaling beyond 100% because it just doesn’t work, and at 100% everything is simply too small.

On the upside, the ludicrous pixel density at 28 inches means you truly, honestly no longer need to use anti-aliasing, which helps with the GPU load. There’s virtually no aliasing visible without AA enabled. But despite my general enthusiasm for 4K – games do look stupidly good – I’m not sold on 4K in this form factor.
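
The density claim is easy to put numbers on: pixels per inch is just the diagonal pixel count over the diagonal size. A quick Python sketch:

# Pixels per inch: diagonal resolution divided by diagonal size in inches.
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    return hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 24)))  # ~92 ppi on a typical 24-inch 1080p panel
print(round(ppi(3840, 2160, 28)))  # ~157 ppi on the 28-inch 4K screens here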

STOP PRESS: Freesync firmware updates for existing monitors
Last minute addition. AMD’s been showing off a Freesync prototype monitor. Not huge news. Here’s the interesting bit. It was a standard retail monitor merely updated with firmware. In reality, I suspect few monitors will be upgradable in this way and that’s essentially AMD’s line. But a few may be if their manufacturers are feeling generous. For the lucky few, what a nice perk that will be. You’ll need a DisplayPort 1.2a-compliant graphics card, too, of course.

40 Comments »

  1. phelix says:

    21:9 good for gaming? I’d figure 21:9 is good for watching cinema-ratio’d films and sod-all else.
    Non-multimedia stuff like web browsing and office programs, and games as well, already feels needlessly clumsy in 16:9 on my laptop.

    Then again, I still think 16:10 should be the standard for all monitors, so maybe I’m just behind the times.

    • BTAxis says:

      Why, by the way, are we calling them 16:10 and 21:9, as opposed to 8:5 and 7:3?

    • LintMan says:

      @phelix: I agree – 16:10 (or 21:10) would give more useful screen area. Especially when most websites seem to limit their width to 1024 pixels.

      • bp_968 says:

        While I tend to agree that 16:10 is vastly superior to 16:9 I do think 16:9 *can* be tolerable when using the higher resolutions. 1920×1080 feels “cramped” vertically in windows, but moving up to 1440p almost completely eliminates the cramped feeling. So while I’d prefer 16:10 in general, moving up to the larger vertical resolution with 1440p (or more) seems to mostly correct that cramped feeling.

        I’m very curious to see how these 21:9 1440p panels “feel” with general gaming, surfing and development (having that much screen area without a bezel splitting it down the middle would be wonderful). For now I’ll stick with my two 16:10 IPS panels, but I’ll keep my eyes on these.

    • Silent_Thunder says:

      Well perhaps I’m biased as a simracer, but I’ve found 48:10 to be a great monitor ratio. Of course this is with triple screens, (3x 16:10) and the side monitors are angled. Not sure how well things would look on a single flat ultra-widescreen though.

      That consequently means that for most games that don’t support ultra-widescreen comfortably I can use the side monitors for other tasks, such as maybe a movie, or an article I’m reading during processing slowdowns in games such as Football Manager.

      Really wish more games that support ultra-widescreen resolutions actually rendered the three monitors separately, because otherwise it results in some weird issues with perspective. Of course in order for that to really work, we either need to standardize the angles we all put our monitors at, or have in-game settings for monitor angle. Otherwise again that little beast perspective rears its ugly head.

      However, for productivity at the least, especially in the workplace, ultra-wide single monitors are a godsend, but that’s not the gamer market in that case.

      • fish99 says:

        Yup, to me 3 screens rendering separately is a better solution than one very wide screen. The distortion at the edge of a 16:9 screen is already pretty bad.

        For people who don’t know what this looks like, load up any FPS and compare the width of character faces when they’re in the center of your screen, and at either edge. They basically become 30-50% fatter at the edge.

        • captain fitz says:

          That’s the way a 3d engine renders perspective. Regardless of how many monitors you’re using, the farther out from the center an object gets the more distorted it will be.
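
          To put rough numbers on that, here’s a minimal sketch assuming a standard rectilinear projection and a roughly ball-shaped head: an object at horizontal angle theta off the view axis renders about 1/cos(theta) times wider than it is tall.

          # Rough model of rectilinear-projection "stretch" for a spherical
          # object at horizontal angle theta from the view axis: it renders
          # about 1/cos(theta) times wider than tall. At the screen edge,
          # theta is half the horizontal field of view.
          from math import cos, radians

          def edge_stretch(horizontal_fov_deg):
              half_angle = radians(horizontal_fov_deg / 2)
              return 1 / cos(half_angle)

          for fov in (60, 75, 90, 106):
              extra = (edge_stretch(fov) - 1) * 100
              print(f"hFOV {fov} deg: ~{extra:.0f}% wider at the screen edge")
          # ~15%, ~26%, ~41%, ~66%, in line with the 30-50% figure at common FOVs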

    • kael13 says:

      21:9 is great for gaming. I own the LG 34UM95, LG’s version of the 3440×1440 panel mentioned above. It’s fan-bloody-tastic, but I did have to send it back as my particular monitor had some horrific backlight issues in one of the corners. Apparently most other people have had perfect or near perfect units, though.

      Can’t wait for my replacement, going back to 16:10 is weird and scary.

      • Shietar says:

        I own the same LG monitor, and so far my experiences are mixed at best. It’s great for modern strategy games that support the resolution, but sadly very few games do. More than half of the games I tested will refuse the ratio and just stick to 16:9, and some games completely fail to run in full-screen mode.

        So, I still like the ratio itself, but I’ll keep a second 16:9 monitor on my desk as a fall-back option for the not so supportive games.

    • onodera says:

      I agree. And 4K should mean 4096 horizontal pixels, not 3840.

  2. Stimpack says:

    Damn Intel and their crappy TIM. Ugh!

  3. heyricochet says:

    Hope the 21:9 is well priced, I’m a triple monitor gamer (48:9!) and I love it. Having the wide field of view is so much more immersive and great for a whole bunch of games. Having a lower priced wide screen option can only encourage more developers to include wider options in the base options rather than having to rely on the work of a few amazing modders in the ultra widescreen gaming community.

    • aperson4321 says:

      I have been using 3 Samsung 24-inch full HD monitors in a multi-monitor setup for some years now, but I recently got the cheapest 4K monitor from Samsung that does 60Hz over DisplayPort. I had to adjust every single screen setting on it to get the image right, and the viewing angle and stand are not that good.

      When I first used full HD in a game I got a “WOW!” effect, and 4K gave me the same feeling. It takes some time for the brain to “get it”, but when it happens the amount of extra detail is amazing; 4K is to multisampling AA what shitty FXAA is to multisampling AA, it’s just so detailed! It makes the fans of my Nvidia 780 EVGA Classified really loud. But for games that hate multi-monitor and have lots of small details it’s fantastic! (Still pricey as hell, it would have been wise for me to wait a year or two, but still tons of fun.)

  4. Neurotic says:

    So any one of those new G3258 CPUs would blow my trusty old Q6600 out of the water, right?

    • huldu says:

      I doubt it. You must realize that a lot of games nowadays use four cores, and that is where the dual core fails, since it only has two. You could sit on a single core at 3GHz and clock it up to 4GHz and you’d still have problems running modern games, since the GHz isn’t really what matters. Games make great use of the cores, far more than the GHz on your CPU.

      Just look at wildstar for example, minimum requirements are a dual core cpu. It’ll run like crap on a dual core compared to a quad. It’s just that simple. The ghz just won’t make any real difference sadly.

      I can see people adding that cpu into cheap work desktop or school computers at most. But for gaming rigs it’s just fail unless you consider running games that are over 5-6 years old cool.

      • Neurotic says:

        That’s good to know, thanks! I was looking at hardwarecompare yesterday, seeing how the Q6600 stacks up against a modern i5 and pondering a mobo+cpu upgrade. And then I read this piece and cheapness flashed in front of my eyes. :D

        • Shietar says:

          As a matter of fact, this CPU would make a rather decent gaming CPU. High frequency will give you better performance than a high core count; it’s just that you would need a 5.5GHz dual-core to keep up with a 3GHz quad-core in any game that can use 4 threads or more, so for most realistic cases the quad-core will be the better option.

          Regarding the comparison at hand, the modern core architecture gets significantly better performance at the same clocks, anywhere between +50% and +100% in comparison to the old Kentsfield. If you look at it this way, the C2Q6600 is giving you 4 x 2.4GHz = 9.6 gigacycles per second, while the new Pentium gets 2 x 3.2GHz = 6.4 raw gigacycles per second, but those can be expected to be worth as much as 9.6 to 12.8 gigacycles of a Kentsfield CPU.

          So, depending on the specific game, the Pentium would be just as good or up to 30% better than the C2Q6600. But that’s stock, and the Kentsfield is quite likely to be the better overclocking CPU in relative terms (if you can clock the Q6600 to 3.3GHz, the Pentium needs to hit 4.4GHz to keep up).

          So, if you go and buy this CPU and a new socket 1150 board, you probably end up with the same performance for almost $200 (but save a little on electricity). On the other hand, get the new board plus the cheapest Core i5, and you’ve easily doubled your CPU performance for $300. Makes a whole lot more sense, doesn’t it?
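
          For what it’s worth, here’s that back-of-envelope estimate as a short Python sketch; the 1.5x to 2x per-clock advantage is the assumption from the comment above, not a benchmark result.

          # cores * clock * relative IPC, as a crude throughput estimate. The
          # 1.5-2.0x per-clock advantage of Haswell over Kentsfield is the
          # assumption made above, not a measured figure.
          def gigacycles(cores, ghz, ipc_vs_kentsfield=1.0):
              return cores * ghz * ipc_vs_kentsfield

          q6600 = gigacycles(4, 2.4)             # 9.6 "Kentsfield gigacycles"
          pentium_low = gigacycles(2, 3.2, 1.5)  # 9.6
          pentium_high = gigacycles(2, 3.2, 2.0) # 12.8
          print(round(q6600, 1), round(pentium_low, 1), round(pentium_high, 1))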

      • Makaze says:

        This is… not entirely true.

        A lot (a whole lot) of games use a single central thread for the engine and just farm out things that lend themselves to asynchronicity, like physics and sounds, to a separate core. Dual cores make a massive difference, as background operating system tasks and those aforementioned secondary threads can be shifted over. The difference between dual and quad, though, is much less pronounced. The operating system is already on a separate core, and the secondary threads of a game engine will rarely be enough to peg a core.

        I’m not saying a quad won’t perform somewhat better. But we honestly haven’t gotten good enough at truly multithreaded programming in the game world to really make full use of more than a handful of CPUs.
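
        A minimal toy sketch of that pattern (not from any real engine, just the shape of it): the main thread runs the game loop and hands physics and audio jobs to a worker via a queue.

        # Toy version of "one main engine thread plus a worker for async jobs".
        import queue
        import threading

        jobs = queue.Queue()

        def worker():
            while True:
                job = jobs.get()
                if job is None:        # sentinel tells the worker to stop
                    break
                # ... do a physics step or mix audio here ...
                jobs.task_done()

        threading.Thread(target=worker, daemon=True).start()

        for frame in range(3):         # stand-in for the main game loop
            # input, game logic and rendering all stay on this one thread
            jobs.put("physics step")
            jobs.put("audio mix")
        jobs.join()                    # wait for the worker to drain the queue
        jobs.put(None)                 # shut the worker down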

    • Mittens89 says:

      I’m running a Q6600 too! Managed to push it up to 3.3GHz using an aftermarket cooler which I bought the other day. I love this processor.

      • Neurotic says:

        It’s a really faithful old doggy, isn’t it? My processor and motherboard combination is firmly glued into 2008, but they’ve never let me down yet! :D

    • fish99 says:

      I know when I upgraded from a Q9550 to an i5-760 and then to an i5-3570K I saw way more speed increase than you would expect from the frequency increase. So don’t assume that a Core 2 running at 3.5GHz (or whatever) is the same as a 1st, 2nd or 3rd gen i5 running at the same speed; the transistor counts go up massively and the designs are more efficient, so the chips do more work per cycle.

      i5-760 to i5-3570K alone picked up roughly 20% framerate in games across the board, even though the vast majority of games weren’t CPU limited (which barely even makes sense to me, but there you go). Of course that upgrade also included DDR2-to-DDR3.

  5. serioussgtstu says:

    Is there any word on when these Devil’s Canyon CPUs are going to be available?

  6. cloudnein says:

    I’m assuming by “4K” you mean “UHD”? Anyone doing 4096×2160? Shooting this resolution for a short film but can’t watch it full-res without plunking down $10k+US. (So I’m holding off, why spend $1k for UHD only to want to upgrade a year later to 4K?)

  7. DanMan says:

    The Freesync news got my spidey sense tingling. Why, I don’t know. I’m still wondering why I even got one.

    But as has been said, I also doubt that manufacturers will let you upgrade your monitors (for free).

  8. Syra says:

    Can someone explain why DDR4 is a big deal when I already have DDR5?

    • DanMan says:

      You’ve probably got GDDR5 on your graphics card. He’s talking about DDR4 RAM to put in your motherboard, and we’re still at DDR3 there.

    • Neurotic says:

      Is it not GDDR5? I think that’s something different from normal mainboard memory:

      “•DDR3 runs at a higher voltage than GDDR5 (typically 1.25-1.65V versus ~1V)
      •DDR3 uses a 64-bit memory controller per channel (so, 128-bit bus for dual channel, 256-bit for quad channel), whereas GDDR5 is paired with controllers of a nominal 32-bit (16 bit each for input and output), but whereas the CPU’s memory controller is 64-bit per channel, a GPU can utilise any number of 32-bit I/Os (at the cost of die size) depending upon application (2 for 64-bit bus, 4 for 128-bit, 6 for 192-bit, 8 for 256-bit, 12 for 384-bit etc…). The GDDR5 setup also allows for doubling or asymmetric memory configurations. Normally (using this generation of cards as example) GDDR5 memory uses 2Gbit memory chips for each 32-bit I/O (i.e. for a 256-bit bus/2GB card: 8 x 32-bit I/O each connected by a circuit to a 2Gbit IC = 8 x 2Gbit = 16Gbit = 2GB), but GDDR5 can also operate in what is known as clamshell mode, where the 32-bit I/O, instead of being connected to one IC, is split between two (one on each side of the PCB), allowing for a doubling up of memory capacity. Mixing the arrangement of 32-bit memory controllers, memory IC density, and memory circuit splitting allows for asymmetric configurations (192-bit, 2GB VRAM for example)
      •Physically, a GDDR5 controller/IC doubles the I/O of DDR3 – with DDR, I/O handles an input (written to memory) or output (read from memory), but not both on the same cycle. GDDR handles input and output on the same cycle.”

  9. GenBanks says:

    I have an irrational desire for one of those Ares graphics cards… Looks really nice… Would look beautiful in my new Z97 build using a Maximus VII Ranger… but I’m already broke :(

  10. tvcars says:

    “For me personally, 28 inches is still too small for 4K in Windows. I don’t want to have to set the scaling beyond 100% because it just doesn’t work, and at 100% everything is simply too small.”

    The exact reason my next monitor will have to be massive. If we’re quadrupling the pixel count then the monitor needs four times the area (double the diagonal) just to keep the same dot pitch. 1080p on 24in now is a pain in the rectum, and I don’t want to know how painful 4x the pixels on anything less than 40in would be, but I also can’t imagine having a 40in monitor in front of me for gaming and surfing the net. We need a proper 3D OS, built from the ground up in 3D, then you could just use z scaling to manage size and everything would work.
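
    For what it’s worth, keeping the 24in 1080p dot pitch at 4K works out at roughly double the diagonal; a quick sketch:

    # Diagonal needed to keep a reference panel's pixel density at a new
    # resolution: same ppi means the diagonal scales with diagonal pixel count.
    from math import hypot

    ref_ppi = hypot(1920, 1080) / 24      # ~92 ppi on a 24-inch 1080p panel
    needed = hypot(3840, 2160) / ref_ppi  # diagonal for 4K at the same ppi
    print(round(needed))                  # ~48 inches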

  11. Love Albatross says:

    21:9 monitors are nice…when you can get games to run at the correct resolution. Battlefield 4 looks fucking great on a 21:9, but other games have struggled to understand the aspect ratio and it results in various parts of the UI being chopped off. Even BF4 does that on the odd occasion.

  12. kyrieee says:

    28″ is already bigger than I want, I’d prefer something like 24″. Besides, you can always run windows in 1080p on a 4k monitor if the scaling is that bad. I don’t really care how crisp my folder icons look.

  13. Lord Byte says:

    I’ve found my preferred DPI @3200 and I’ve been in the loop since the Microsoft Intellimouse :P I’ve been gently moving it upwards over those years until I found the point at which higher DPI actually means less fine control for me.
