Week in Tech: Flicker-free Screens, AMD Noise, Nvidia 780 Ti

By Jeremy Laird on November 7th, 2013 at 8:00 pm.

Suffering from headaches, tired eyes and all-round gaming fatigue? Must be that flickering LCD monitor ripping up your retinas. No idea what I'm on about? BenQ would have you believe flickering LCD monitor backlights are the new evil and it has the solution. Flicker-free backlight tech. I've tried it and can reveal whether it's the next big thing after 120Hz-plus panels. It's not. Next! Graphics. AMD and Nvidia are currently squelching about and looking grumpy following one of their traditional pissing contests. An unpleasant image, but it's good news because it means things are very closely matched. Still, we need to tidy up a few details after all the new GPU launches and some last-minute changes, including AMD's Radeon R9 290 and its dodgy cooling, and final specs on the Nvidia GeForce GTX 780 Ti.

Flicker-free LCD screens, then.

The short version:
It’s mostly twaddle, ignore it. Skip to the graphics stuff below.

The long version:
BenQ is punting a new PC monitor with alleged flicker-free LED backlight properties. Can’t say I’ve ever had an issue with flicker on an LCD monitor. CRT screens, yup. LCDs, nope. Then again, there was a time I’d have scoffed at the benefit of going beyond 60Hz refresh on an LCD panel.

I still can’t entirely compute that, what with 48fps being good enough for HFR movies. But I was simply wrong. 120Hz-plus is lovely and has become a source of some woe. I love my 30-inch panels. But I want 120Hz pretty badly, too.

Anyway, the issue here involves backlight modulation. Run a typical monitor at full brightness and the backlight is simply on. No flicker. No opportunity for flicker. However, crank it down a few notches and the problems, allegedly, appear.

Low-Hz flicker used to be a fundamental issue

That’s because lower LED backlight brightness settings are usually achieved by pulsing the backlight on and off, a technique known as pulse-width modulation. The dimmer the setting, the more time it spends off. Say hello to flicker.
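For the numerically inclined, here's the trade-off in a nutshell. A minimal sketch in a few lines of Python, assuming a 200Hz PWM rate – my illustrative number, not BenQ's spec or any particular monitor's:

# How PWM dimming works: brightness is the fraction of each cycle the LED
# spends lit (the duty cycle), so dimmer settings mean longer dark gaps.
PWM_FREQUENCY_HZ = 200                 # assumed backlight PWM rate
PERIOD_MS = 1000 / PWM_FREQUENCY_HZ    # 5 ms per cycle at 200Hz

for brightness_pct in (100, 75, 50, 25):
    on_ms = PERIOD_MS * brightness_pct / 100
    off_ms = PERIOD_MS - on_ms
    print(f"{brightness_pct:3d}% brightness: lit for {on_ms:.2f} ms, "
          f"dark for {off_ms:.2f} ms per cycle")

At 100 per cent the backlight is simply on and there's nothing to flicker, which is exactly the full-brightness case above.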

So says BenQ, anyway. Frankly, I can’t tell the difference. This may be a subjective observation. Some people, for instance, are more sensitive to the rainbow effect from cheapo DLP projectors than others.

But whether it’s the rainbow effect, anti-glare sparkle, inverse ghosting, gradient banding, IPS glow – whatever – I tend to find myself towards the OCD end of the sensitivity spectrum when it comes to minor display technology flaws. And I really couldn’t sense the difference with flicker-free technology.

I suspect it's also quite telling that BenQ's bumpf tells you that the best way to spot the benefits of flicker-free technology is to put a large fan in front of both its screen and a conventional screen. At which point the flicker on the conventional screen presumably becomes apparent.

I am not making this up, it’s the best BenQ can come up with

Oh, BenQ also suggests you take a picture with a digital camera. You'll see the flicker in the form of banding in still images. If these two examples really are the best BenQ can come up with, I'm not sure much more needs to be said on the matter.
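The camera trick does at least have some arithmetic behind it. A rolling-shutter sensor reads out row by row over a few hundredths of a second, so each dark phase of the PWM cycle lands on a different strip of the image. A rough sketch, with both figures assumed rather than measured:

# Expected dark bands in a photo of a PWM-dimmed screen:
# roughly (sensor readout time) x (PWM frequency).
pwm_frequency_hz = 200     # assumed backlight PWM rate
readout_time_s = 1 / 30    # assumed rolling-shutter readout time

bands = pwm_frequency_hz * readout_time_s
print(f"Expect roughly {bands:.0f} dark bands across the frame")
# A backlight driven at constant current would show no banding at all.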

But if you want to know more or if you think you may be sensitive to backlight flicker, go here. The same website also has a flicker-free screen database here.

Also, for the record the screen I looked at is the BenQ GW2265HM. It’s actually a damn fine 22-inch 1080p screen for a whisker over £100 thanks to a VA panel that’s claimed to be good for 3,000-to-one static contrast (the blacks are bloody brilliant). So it’s definitely worth a look, just forget the flicker-free nonsense.
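If you're wondering what 3,000-to-one actually buys you, the sum is simple: black level equals white level divided by contrast ratio. A quick sketch, assuming a typical 250cd/m2 white level for a screen in this class (an assumption, not a measurement):

# Black level = white luminance / static contrast ratio.
white_cd_m2 = 250.0   # assumed white luminance

for panel, contrast in (("VA, 3000:1", 3000), ("typical IPS, 1000:1", 1000)):
    print(f"{panel}: black level ~{white_cd_m2 / contrast:.3f} cd/m2")
# 0.083 vs 0.250 cd/m2: the VA panel's blacks are three times darker
# at the same white level, hence "bloody brilliant".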

That graphics stuff
As for graphics, we’ve covered the major points in recent posts. But here’s what you need to know from the very latest developments:

AMD
There's something weird going on with the cooling on AMD's new R9 290 boards:

1. The second-rung R9 290 looks fabulous on paper, cranks out awesome numbers for £300-ish
2. But AMD has upped the fan speed at the last minute
3. This makes the performance even better
4. But it also makes an already noisy card trend towards cacophonous (a rough sense of why is sketched below)
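As for why a firmware fan-speed bump is such a big deal acoustically, a common rule of thumb from the fan affinity laws has sound level climbing at roughly 50 x log10 of the speed ratio, in dB. A minimal sketch with purely hypothetical RPM figures, not AMD's actual values:

# Rough fan-noise arithmetic from the fan affinity laws:
# change in sound level (dB) ~ 50 * log10(new_rpm / old_rpm).
import math

old_rpm, new_rpm = 2200, 2600   # hypothetical before/after fan speeds

delta_db = 50 * math.log10(new_rpm / old_rpm)
print(f"~{delta_db:.1f} dB louder")
# ~3.6 dB here; +10 dB reads to most ears as roughly "twice as loud",
# so even modest bumps are very audible on an already loud cooler.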

Nvidia
Full details on the GeForce GTX 780 Ti are out:

1. Yup, it’s the full 2,880-shader GK110 Monty
2. Memory is a ‘mere’ 3GB
3. It’s still faster than anything else, including AMD’s 290X and Titan
4. It’s stupid money (£550-ish)

Where does that leave us?
We need a bit more time for things to play out. At a little over £300 the AMD Radeon R9 290 blows everything else away at the high end for bang-for-buck and would be the obvious choice. Putting the noise to one side, I reckon it will give you a gaming experience that’s largely indistinguishable from a £550 780 Ti.
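To put the bang-for-buck point into numbers, here's the sum using the prices above. The fps figures are placeholders purely to show the arithmetic, not benchmark results:

# Frames per pound, with illustrative (not measured) average fps.
cards = {
    "Radeon R9 290":      {"price_gbp": 310, "avg_fps": 95},   # hypothetical
    "GeForce GTX 780 Ti": {"price_gbp": 550, "avg_fps": 105},  # hypothetical
}

for name, c in cards.items():
    print(f"{name}: {c['avg_fps'] / c['price_gbp']:.3f} fps per pound")
# Even granting the 780 Ti a ~10% performance lead, the 290 delivers
# roughly 60% more frames per pound spent.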

But if it’s as noisy as some say, that’s a problem. I haven’t heard it running with the shouty new fan firmware, because I’ve been too busy driving this:

A game changer with great graphics but makes the odd surprising noise

Which is quite the thing about town and happens to have a pretty nice line in graphics rendering and a few noise issues itself (two-pot range extender is, er, interesting). But isn’t going to help us get any nearer a final answer for this latest round of the GPU war.

My advice is to wait a bit for the dust to settle. AMD may make further revisions. Board makers will have their own cooling solutions, too, so any noise issues with the reference design from AMD may turn out to be moot a month from now.

Anyway, all that’s left to say for now is that only AMD could take what ought to be a winning position with the new Hawaii / R9 290 GPU and cast doubt on the whole enterprise courtesy of what is a pretty minor issue in the broad scheme of things, namely the cooler. If it was any other company, it would be a minor scandal. From AMD, it’s depressingly predictable.

__________________


52 Comments

  1. CookPassBabtridge says:

    I am in the market for spending some silly money on a top-end rig, simply because I always wanted to (read: have wanted something tippy-top for the last 10 years but never had the cash) and can at last afford it. I can't wait to see what comes out on top, though I want to wait until the manufacturers come out with some better cooling solutions. Hoping to go for either a 780 Ti or 290X, a 2560 x 1440 monitor, backed up with a Rift. No idea about processors though. What would be a good match for those components?

    • sebmojo says:

      Conversely, if I’m running a GTX 460 1gig, what’s my cheapest option for a noticeable upgrade? 650Ti? Is the 460 still close to the sweet spot?

      • Sheng-ji says:

        For a cheap, noticeable upgrade, the 650 Ti would be a great choice – seriously consider the extra few moneys for the Boost and the 2GB version though, they are worth the money!

        • nrvsNRG says:

          (Sorry, supposed to reply to the guy above.)
          The 650 Ti would be a completely pointless upgrade from your 460, seeing as how close they are in performance (even a 560 Ti is better than that, and they're dirt cheap).
          http://www.videocardbenchmark.net/high_end_gpus.html

          Unless of course you are talking about the 650 Ti Boost edition, which is still only a couple of steps above a 560 Ti.

          @sebmojo: in your case the 660 Ti is the sweet spot right now, especially with the recent price drops.

      • Unclepauly says:

        Listen to nrvsNRG. The 650 Ti would be so barely noticeable I'd be rightly pissed off at myself if I upgraded from a 460. My threshold for a noticeable upgrade is at least a 35% boost in performance, and preferably 50%. The 650 Ti does not deliver this. It is more like 0-20%. Yes, I said zero: the 460 matches the 650 Ti in some games :D. The 660 Ti is closer to that magical 50% boost that feels so good.

    • Sheng-ji says:

      i7 4770K cooled by a TC14PE, ASUS Z87 Deluxe, 8GB low-profile RAM, 256GB Samsung 840 Pro, Seasonic X-Series 850W.

      Alternative would be: i5 4670K cooled by a Silver Arrow, Gigabyte G1.Sniper 5, Plextor M5P SSD, XFX Pro 750.

      Feel free to mix and match from the two lists and use them as a jumping point for your own research!

      • CookPassBabtridge says:

        Hi there, thanks for the suggestions. Would you recommend keeping clear of AMD offerings, or do you just have an Intel preference?

        • Asdfreak says:

          I would rather look at Jeremy's earlier guide on CPUs. It is still valid as far as I know. You should also take an i5 rather than an i7 if you are going for gaming, because the chips are the same and the i7 only has extra virtual cores, which are suboptimal for gaming: two virtual cores might be emulated on the same physical core, which would actually be slower than an i5 with only the real cores. I think Jeremy explains that in the CPU guide. Also, he is probably advising Intel because they are faster at running applications with few threads, while AMD is faster with a lot of threads – but games usually only have a few at most, some even running everything on one thread. That's why Intel.

          • Joshua says:

            Frostbite 3 and some other new game engines reverse this trend, though, with full 8-core usage, making the AMD processors perform with a very interesting price/performance ratio (as in, much better than Intel). This change has only kicked in this year.

      • aequidens says:

        If you're going to get a Samsung SSD now, I'd suggest not the Pro but the Evo – RAPID mode is too great to pass up on.

    • Ragnar says:

      I would recommend an i5 3570K for the processor. It's basically as fast as the 4670K, but cheaper, with better overclocking potential. You can spend a lot on an aftermarket CPU cooler, but I've had great results with a CoolerMaster Hyper 212 Evo. It all depends on how much of an overclock you want to push.

      Also, if you really want to spend silly money, 3 monitors is awesome. I've got 3 23″ 1080p monitors on an Ergotech stand off a GTX 670, and enjoy the peripheral vision much more than a larger screen. Running a game on a single screen now feels like I've got blinders on.

  2. thestjohn says:

    Neither of the 290s are what I would call “cacophonous” under normal operation, but I’m not going on record saying they’re nice and quiet either. They’re of a very similar loudness to the reference HD6950/6970 from a couple of years ago.

    • jrodman says:

      Is normal operation running a videogame at maximum quality?

      • thestjohn says:

        Yup. I'm not claiming they're quiet – indeed the 780/780 Ti reference cards are quieter – but having played for quite a few hours on a system a few feet away, in a case with good airflow, it's not dramatically more audible than other reference GPUs I've used.

        Actually, thinking about it, the sound level may not bother me as much as it may others; different people obviously have different tolerance levels and/or have problems with different sound frequencies.

  3. Sakkura says:

    The GTX 780 Ti is significantly louder than the GTX 780 and Titan too.

    Anyway, the rule has always been to avoid reference coolers, particularly on the AMD side of things. So just wait for non-reference coolers to arrive. Problem solved.

    • lordcooper says:

      Any idea how long that’s likely to take? My 7750 is killing me.

      • MrEclectic says:

        IIRC end of this month. There have also been rumours that AMD was holding back manufacturers, to first see what the 780Ti performance would be like, before the release of custom coolers and factory OCed cards.

  4. FurryLippedSquid says:

    My lowly GTX 260 is bastard loud when the graphics are flowing, so I doubt the R9 290 would bother me in the slightest. Who hasn’t got a set of kick-ass speakers blaring at least enough to drown out fan noise anyway?

    • Sakkura says:

      Me. But then my headset is excellent at shutting out ambient noise, and can get proper loud if I really want it to.

    • phuzz says:

      It’s during the quiet bits that I notice the noise from my GPU, especially as my CPU is watercooled and dead quiet.
      I should really watercool my graphics card as well, but a water block is something like £80 that I could instead spend on getting a better graphics card. I fancy getting a new monitor as well. Decisions, decisions.

  5. rapchee says:

    You got an i3, or are you testing it?
    Or is this a joke I'm not getting?

    • phuzz says:

      I think he tests cars as well as computery bits, although with a name like i3 it’s hard to tell.

  6. Didden says:

    The 290 blows even the Titan out of the water, let alone the 780, for a third of the price, and has plenty of room for overclocking to boot. No doubt the manufacturers will put out quieter and OC'd versions in the next month or so. Either way, RPS should be screaming about this card as a victory for consumers rather than wallowing in the noise, as competition brings prices down across the board – and frankly nVidia have been taking the biscuit for a while now.

    • Sheng-ji says:

      Indeed, the GTX 480 was louder.

      • jrodman says:

        I am personally quite happy with my GT640 kitted out to be passively cooled.

    • Jeremy Laird says:

      Yeah, I think it’s probably a tiny bit early to scream victory for the 290. As I said in the post, on paper it blows everything else away. And I’m really pleased it exists for a little over £300.

      But there are concerns. And not just the noise. It runs very hot and it wouldn't be the first AMD card that ran hot and failed in numbers. You may be too young to remember the X800 XTX, young Padawan (OK, an ATI card). It may all be dandy, but like I said, a little caution to see how things play out is probably prudent. If it was my own money, I'd probably roll the dice on a 290. But recommending what others spend their hard-earned on means being a bit more careful.

      • CookPassBabtridge says:

        Hi Jeremy. I don’t know if I misunderstood, but in the review I read of the 290 vs the 290x they said part of the reason the 290 looks so good against it is because of the throttling that kicks in on the 290x when it gets hot, losing it a massive chunk of its processing speed and actually dropping it below the 290 (1000MHz down to sub 940MHz I think it said). If companies like ASUS come along and bolt on some more capable coolers that don’t need the throttling, would this not see the 290x very clearly back on top?

        Edit: Here is the page of the article that makes the reference, just below the final benchmarking table and bolded in red:

        http://www.anandtech.com/show/7481/the-amd-radeon-r9-290-review/4

        • GamesInquirer says:

          You misunderstood, they either meant on price/performance ratio or they’re just talking about the range in general.

          Few places put the 290 above the X in performance, because most of those hardware geeks (not an insult) have well-cooled setups (great cases with multiple fans, tidy cable arrangement, great CPU cooling, etc.) where the 290X isn't throttled as much as in that one instance. But yes, a better third-party cooling solution and a better overall environment should negate that too.

      • Didden says:

        Young Padawan! Oh, the insult, Jeremy! :) My first PC was a hand-built 386 DX25 that sat in an original 1981 IBM case, designed to survive the zombie apocalypse. A friend put it together for me, gave me the DOS disks and then said good luck and left me to it!

        So old enough to know we’ve been here before and will be again:

        http://anandtech.com/show/1062/3

        And let's not forget those full-length behemoths nVidia inflicted on the world that required a small reactor to power. So the 290 is in familiar territory, and as always, speed combined with price wins.

        PS. I think I owned a X800 of some variety, albeit briefly.

      • Sic says:

        Now you’re grasping at straws.

        You know as well as any of us that custom coolers are more than capable of handling this card, and that the X800 is too old to be of any useful comparison (both in engineering and cooling terms).

        It’s OK to like nVIDIA cards better. I do so myself. It’s not OK to like nVIDIA cards better and say something else, just because it looks better on paper to “officially” be “unbiased”. Per the performance tests, and the throttling, the card is a monster being reined in by the reference cooling. It could very well be a game changer. It would be pertinent to say as much, and not dismiss AMD/ATI outright, when they obviously aren’t to be dismissed at all (with this card). Nerds (in the Plottian/Stemkoskian sense) everywhere are rejoicing because of this card. Telling the opposite story, just because it’s an AMD/ATI card is dishonest (I’m referring to your last line).

  7. sandineyes says:

    PWM apparently only affects a small set of users, but BenQ is hardly doing something revolutionary here; many monitor makers are ditching PWM for alternatives, or cranking up the frequency high enough that it would not likely affect anyone at all.

    VA panels, however, are pretty interesting. They are supposed to offer color accuracy and viewing angles somewhere between TN and IPS/PLS panels, but with shockingly good static contrast ratios. Pixel response times, however, are apparently quite poor on them, so I'm not sure how they would work out for gaming. And they are only available in 1080p, which for people like me who are looking into 27-inch 1440p monitors is a bit annoying.

    Also, on the one hand, the R9 290 is quite amazingly priced, and with an open cooler design it might actually work. On the other, I'd probably pick up a 780 if they dropped the price again to match: if G-Sync is as good as it is supposed to be, and will come to 27-inch 1440p monitors, I don't want to be shut out waiting for a competing technology, or for Nvidia to stop being bastards and license it.

    • BrightCandle says:

      I do find the PWM ditching amusing. LCDs didn't originally come with PWM backlights; they were only introduced more recently because they save some cost. The problem is that more than a few people hate the effect it creates on the screen (once you have seen it, you always see it). It's kind of ironic that one of the companies that championed its introduction is now championing going back.

    • stahlwerk says:

      I wonder if the PWM is responsible for the high-pitched whine that some displays emit. If it is, then I'm glad it's on its way out, since I've had absurdly bad luck finding a quiet one in the past.

  8. Don Reba says:

    the blacks are bloody brilliant

    Is that a good thing or a bad thing? Having the blacks shine brightly would be the last thing I’d want.

    • CookPassBabtridge says:

      Oh my god that is well bad racism

      • Iron Ladyboy says:

        Yeah. Don’t put that on the back of the box unless you want to be a hate crimer. :/

  9. GamesInquirer says:

    Most people will end up with non-reference cards using customized cooling solutions, often the same for both brands – the usual Windforce or whatever each third party calls their own – which will yield even better performance and OC potential. I've not bought a reference card from AMD or Nvidia in ages, if ever, although most of my GPUs in the past have had Nvidia chips, the last being a GTX 285 (my current is a 7970 OC); it's almost always been Gigabyte, MSI or others, whatever felt like the best choice at the time. The lacklustre AMD cooling solution is a practical non-issue some people echo just to downplay the reality of great GPUs that match or surpass Nvidia's top cards at far lower price points, even post-price-cut. That means Nvidia's only remaining advantage is their software, like ShadowPlay and mainly PhysX-enabled games – not their Windows drivers, since neither company has been very problematic except odd cases like those overheating Nvidia GPUs, and outside Crossfire, which once again is for the few who buy multiple GPUs but is also improving. Though AMD's own Mantle is gaining bits of ground too, with the recent addition of supporting companies including Eidos Montreal (known for the last Tomb Raider, Deus Ex: Human Revolution and of course the next Thief game), Oxide (Stardock-fostered Civilization veterans whose new engine will likely be used in many if not all of the Stardock family's future games) and Cloud Imperium (Star Citizen, therefore possibly, eventually, CryTek/CryEngine in general), on top of EA/DICE.

    The heat and noise will only become a real issue if those third-party solutions arrive and still fail to fix it. Though I must say, even the cards running that hot clearly don't fry; they're within working limits. As undesirable as that might still be, it would only be a real issue in an already hot rig that's not maintained well, so that its fan eventually gets clogged by dust and it ends up frying for real. Not exactly your average enthusiast's rig, and not exactly the primary market for these higher-end GPUs. Only one card has ever fried on me, and that was my fault: a custom X800 XL with passive cooling (yep, no fan on an already hot range of GPUs) that I chose to shove into my barely cooled system of the time.

  10. hemmer says:

    Been running a BenQ VA panel for almost a year now, very happy. Better colors and contrast than TN panels, but response times still rather close to them. I find it's a good middle ground, and quite affordable as well.
    Bang for buck indeed.

    As to the new AMD cards, I suspect the noise issue will largely be eliminated by Sapphire and the like. I haven't bought a stock ATI/AMD card in ages; the price difference is usually minuscule to nonexistent.
    In any case, I certainly hope the two giants manage to match each other rather evenly. We can only profit from some healthy competition.

    • Sic says:

      That doesn't quite make sense. VA is traditionally the technology with the worst response time. TN has the best, IPS lags somewhat behind, and VA is forced to use various "overdrive" tech and similar just to keep up.

      Don’t get me wrong, I like VA panels myself, been using them for years for the black levels, contrast and colours, but they’re really not very fast at all.

  11. mrwout says:

    This is not the first time I've seen a car pop up in one of your posts… are you secretly Jeremy Clarkson?

  12. CaidKean says:

    About flicker in LCDs: read up on http://www.blurbusters.com/

    Basically, monitor makers have realized that they can use strobing backlights to eliminate the motion blur inherent to LCDs – a consequence of their sample-and-hold nature and the way human eyes track motion. However, the technique causes the LCD to flicker just as a CRT at the same Hz would.
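    A quick way to put numbers on the sample-and-hold maths, if you're curious (figures illustrative, not from Blur Busters):

    # Perceived motion blur when your eye tracks movement across the screen:
    # blur (px) ~ tracking speed (px/s) * time each frame stays lit (s).
    tracking_speed_px_s = 960   # e.g. an object crossing a 1920px screen in 2s

    for label, persistence_s in (("60Hz sample-and-hold", 1 / 60),
                                 ("120Hz sample-and-hold", 1 / 120),
                                 ("2ms strobed backlight", 0.002)):
        print(f"{label}: ~{tracking_speed_px_s * persistence_s:.1f} px of blur")
    # Strobing slashes the blur from ~16px to ~2px, but the short pulses
    # are exactly the kind of flicker a CRT produces.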

  13. remon says:

    Maybe you should check the new Eizo gaming monitor, the FG2421. It’s the first VA monitor with 120Hz (there’s a turbo mode that goes up to 240Hz), and 5000:1 contrast.

    • fish99 says:

      Can't find in the specs whether that's static or dynamic. VA panels have their own issues though, like the weird shadow-detail loss at just-off-straight viewing angles and generally inferior pixel response.

      I'd definitely be up for a 120Hz IPS when one comes around, assuming it's well reviewed for its 3D abilities. Pretty happy with my 120Hz Asus TN for now though: it's quick, the 3D is great (245 hrs of Skyrim in the last few months, all in stereo 3D) and the viewing angles are about as good as you can get on TN.

      • remon says:

        Of course we're talking about static contrast. Eizo advertises 5000:1; I've seen reviews where after calibration they hit 4,800:1, which is amazing. The drawback, being Eizo, is the price: about £480 in the UK.

    • JoeyJungle says:

      That sounds awesome! I keep on hearing about super cool new monitors with high res/refresh/response time/great colors. Now I just want all of the manufacturers to adopt G-sync and everything will be more than awesome.

  14. SuicideKing says:

    The 290X and 290 with custom coolers from board partners like MSI and Gigabyte will be the stuff to keep an eye on. Sadly they’re all MIA.

    • Low Life says:

      As crazy as it sounds, some tech site wrote that AMD is keeping exclusivity of the cards for a while. I guess they don’t want to make a good first impression?

      Or they can’t produce enough chips so they have to keep the sales volumes low.

    • Wedge says:

      Yeah, I was wondering what the hell was going on. Nobody ever uses the shitty reference coolers in practice, yet all the R9s are using one right now? Seems fucking ridiculous when the custom ones always outperform stock ones in every way at virtually no extra cost.

  15. Snakejuice says:

    While we're talking about monitors – does anyone know when the first decent 120Hz+ G-Sync monitor will be available?