Week in Tech: Cheap 4K, Adaptive-Sync, DP1.2a, Screens!

By Jeremy Laird on May 15th, 2014 at 9:00 pm.

Sammy's £500, 60Hz, 4K monster

4K, 6-bit, 8-bit and 10-bit panels, G-Sync n’ FreeSync n’ Adaptive-Sync, 120Hz-plus refresh, DisplayPort 1.2 and 1.2a, backlight modulation, multi-stream vs single-stream and IPS vs PLS. The PC display market is completely out of control. But in a good way. Things are developing faster now than at any time I can remember since getting into this game. And I am incredibly, astonishingly, implausibly old. The Atari 2600 was still on sale (just) when I achieved something approaching sentience. I still haven’t truly recovered from the 2600’s piss-poor Pac-Man port. Anywho, the last week or so has seen some really interesting developments in the monitor market, including the announcement that AMD’s FreeSync tech is moving into the mainstream courtesy of official VESA status and the appearance of a cheap Samsung 4K monitor with 60Hz support. High time, then, to pull together the state of play in PC monitors into something we can all understand. Well, hopefully.

Before we start, a heads up on the sections below. I’ve stuck some headers in so that you can quick-jump to the bits that interest you most. I realise not everyone is as anal about colour depth as I am. Well, sometimes, at least.

Also, this is not meant to be a definitive guide to the basics of LCD monitors. Luckily, I’ve got that in the bag already. You can read it here. Instead, this is more of a primer to help you keep up with the latest developments.

Driving 4K panels
First up, 4K. I’ve touched on 4K before, so I won’t get too granular with the basics other than to regurgitate the simple idea that it refers to screen resolutions with roughly 4,000 horizontal pixels. Totes amazeballs etc in terms of visual fidelity in games and elbowroom on the desktop. Problematical in a lot of other ways.

Prices for 4K panels have plummeted spectacularly since Asus wheeled out its 32-inch effort for £3,000 back in November

Obviously, you’re going to need one hell of a GPU to push out what amounts to four times the pixels of a common-or-garden 1080p panel. But more than that, it’s a problem driving 4K even in simple 2D mode. DVI, even in high-bandwidth dual-link mode, lacks the necessary pixel-pumping minerals, as do most versions of HDMI.

DisplayPort 1.2 and 4K
In fact, even DisplayPort 1.1 doesn’t support 4K as a single stream. Put simply, you need a graphics card with DisplayPort 1.2 support to run a 4K monitor as a single stream rather than in multi-stream mode, which is rather like running two monitors or desktops on a single panel and creates all kinds of issues.
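If you fancy the back-of-the-envelope sums, here’s a quick sketch. The link rates are nominal effective figures of my choosing (roughly 7.92Gbit/s for dual-link DVI, 17.28Gbit/s for DisplayPort 1.2) and I’m ignoring blanking overhead, so treat it as illustrative rather than gospel:

```python
# Rough active-pixel data rate for 4K at 60Hz, 8 bits per channel.
# The link capacities below are assumed nominal effective rates, not spec quotes.

def data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s, ignoring blanking intervals."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

DUAL_LINK_DVI = 7.92    # Gbit/s, assumed
DISPLAYPORT_12 = 17.28  # Gbit/s, assumed

uhd60 = data_rate_gbps(3840, 2160, 60)
print(f"4K @ 60Hz: ~{uhd60:.1f} Gbit/s of pixel data")
print("Dual-link DVI copes:", uhd60 < DUAL_LINK_DVI)     # False
print("DisplayPort 1.2 copes:", uhd60 < DISPLAYPORT_12)  # True
```

Run the same sum for 1080p at 60Hz and you get about 3Gbit/s, which is why even elderly interfaces handle it without complaint.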

Anyway, you’ll also need a monitor that supports DisplayPort 1.2 to do the single-stream thing and as it happens the affordable new Samsung U28D590 (a 28-inch 4K model currently available for pre-order here for £500) will do just that. It will also punch out 4K at a proper 60Hz.

DisplayPort 1.2a and Adaptive-Sync
However, what the Samsung U28D590 doesn’t have is support for DisplayPort 1.2a (yes, 1.2a), which isn’t a huge surprise, because 1.2a isn’t out yet. But it is a critical omission because with 1.2a comes support for what could be the killer gaming feature for the next few years, namely Adaptive-Sync.

It’s a new industry-wide standard from VESA and it’s basically AMD’s FreeSync tech made mainstream. FreeSync, of course, was AMD’s more open take on Nvidia’s proprietary G-Sync tech which involves dynamic refresh syncing between the GPU and display.

Nvidia’s G-Sync is truly a glory to behold. Just a pity it’s all so proprietary…

I’ve seen G-Sync in action and it’s genuinely revelatory. Kudos to Nvidia for that. Without G-Sync, Adaptive-Sync probably wouldn’t be happening. But an open standard that doesn’t restrict you to a particular GPU vendor is infinitely preferable. And now it’s happening. Yay.

High-refresh vs adaptive refresh
We’ve been waiting years for something to come along and improve upon good old 60Hz panels with v-sync enabled in terms of gaming smoothness. Suddenly two turn up at around the same time.

I give you super high refresh rates of 120Hz and beyond and dynamic syncing between the display and the GPU. But which is better, and are they antagonistic or complementary?

The first thing to grasp is that you need a very high performance GPU to get the full benefits of a high refresh display. That’s especially true if you are also running very high resolutions. It’s not much help having a 120Hz panel if your GPU is chugging along at 40fps or whatever.

In that sense, dynamically syncing the display with the GPU is my preference. It allows smooth rendering at much lower frame rates and that means more future proofing for your rig and less money spent.

Pac-Man failed to point up or down. Potentially damaging to the formative gaming mind…

Of course, dynamic syncing at higher frame rates is even better. But I reckon the biggest benefit for most of us is in smoothing things out at lower frame rates. So, I’d prioritise dynamic syncing over high refresh if forced to make a choice.
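To see why in numbers, here’s a toy model of a GPU churning out a frame every 25ms (a steady 40fps) on a 60Hz panel with v-sync. Everything about it is invented for this sketch:

```python
# Toy model: frames ready every 25ms must wait for the next 60Hz refresh tick,
# so the on-screen intervals snap to multiples of 16.7ms and judder results.
import math

REFRESH = 1000 / 60   # ms between refreshes on a 60Hz panel
FRAME_TIME = 25.0     # ms per GPU frame, i.e. a steady 40fps

def vsync_intervals(n_frames):
    """Gaps between displayed frames when locked to the refresh grid."""
    gaps, prev_tick = [], 0.0
    for i in range(1, n_frames + 1):
        ready = i * FRAME_TIME
        tick = math.ceil(ready / REFRESH) * REFRESH  # next refresh boundary
        gaps.append(tick - prev_tick)
        prev_tick = tick
    return gaps

print("v-sync:", [round(g, 1) for g in vsync_intervals(6)])
# alternates 33.3 / 16.7ms: visible judder despite a steady 40fps
print("adaptive:", [FRAME_TIME] * 6)
# the panel refreshes when the frame is ready: an even 25ms every time
```

Same GPU, same frame rate, but the fixed refresh grid turns an even 25ms cadence into a lurching 33.3/16.7ms stutter; syncing the panel to the GPU removes the grid entirely.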

PLS vs IPS (and VA)
This one is quick and easy. If you see PLS in the spec list of a monitor, that’s a good thing. It’s essentially Samsung’s take on IPS, which has emerged as the favoured LCD panel tech for high performance PC monitors.

IPS isn’t the best by every metric. TN panels have faster response. But overall, it’s the best compromise. A few other companies are also doing their own take on IPS, including AU Optronics’ AHVA (Advanced Hyper-Viewing Angle) tech, just to confuse things a bit more. Sorry. But with any luck, you’ll also see something like ‘IPS-type display’ in the specs, which will help guide you.

Without getting bogged down in a basic guide to display types (again, go here for the basics), a few VA monitors can still be had. Just be aware that VA is the slowest panel type in terms of pixel response. Overdrive technology can offset that, but usually at the price of nasty inverse-ghosting side effects.

Flicker-free backlights
We’ve done this in detail before, but the issue here involves the manner in which LCD monitor backlights are modulated for brightness. Most monitors do this by turning the backlight on and off. Leave the backlight on all the time for full brightness. Switch it on and off rapidly at various frequencies to achieve a scale of brightness.

For those of you who habitually place a fan betwixt self and display, flicker-free tech will be quite the boon

In theory, if the light is switched fast enough, the human eye simply sees something dimmer. But some monitor manufacturers claim this kind of modulation can induce flicker. A bit like the rainbow effect with DLP projectors (which I am personally quite sensitive to), the extent to which you will notice this varies from person to person. I’ve never detected flicker on an LCD panel, myself.
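For the record, the arithmetic behind PWM dimming is dead simple. A deliberately crude sketch (real backlights modulate at a few hundred Hz or more; the numbers here are mine, not any manufacturer’s):

```python
# Crude PWM model: the backlight is only ever fully on or fully off, and
# perceived brightness tracks the fraction of each cycle spent on (duty cycle).

def perceived_brightness(duty_cycle, samples=1000):
    """Average of an on/off waveform sampled over one PWM period."""
    on = int(samples * duty_cycle)
    waveform = [1.0] * on + [0.0] * (samples - on)
    return sum(waveform) / samples

for duty in (1.0, 0.5, 0.25):
    print(f"{duty:.0%} duty cycle -> ~{perceived_brightness(duty):.0%} perceived")
```

Which is also, as I understand it, why ‘flicker-free’ monitors typically dim by dropping the backlight’s drive current instead, holding it steadily on at a lower level rather than strobing it.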

Anyway, you can now buy affordable flicker-free monitors like the BenQ GW2265HM. So if you think this is something that bothers you, here’s a list of flicker-free panels.

Display colour depth
Finally, while we’re talking Samsung U28D590, the official specs claim 1 billion colours, which in turn implies 10-bit-per-channel colour depth. And yet this is a TN panel. Is this possible? What does any of it mean?

I’m going to slightly contradict myself here and dig a little deeper into basic principles. In really simple terms, colours on a digital display are generated using three primary-coloured subpixels – red, green and blue, hence ‘RGB’. And the total number of available colours is a function of how many intensity levels a display can achieve with each of those primaries. Combine those differing intensity levels in the three primaries and you have your full colour palette. That’s how colour perception works in this context.

Now those ‘intensity levels’ or brightness levels for each subpixel are achieved in discrete steps, with completely off at one end and max brightness at the other. In theory, there is no limit to how many steps you can take between off and on. The better the display, the smaller each step and the greater the overall number of steps per primary.

Now, a 6-bit display can do 64 intensity steps per primary (6 bits give 2 to the 6th power = 64 values). So that’s 64x64x64=262,144 colours. 8-bit panels offer 256 intensity levels per colour channel. Do the maths and you get 16.7 million colours.
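If you fancy checking my maths, a trivial snippet:

```python
# Colours available = (intensity levels per channel) cubed across R, G and B.

def palette_size(bits_per_channel):
    levels = 2 ** bits_per_channel  # intensity steps per primary
    return levels ** 3              # every R/G/B combination

for bits in (6, 8, 10):
    print(f"{bits}-bit: {2 ** bits:,} levels per channel, "
          f"{palette_size(bits):,} colours")
# 6-bit: 262,144 colours; 8-bit: 16,777,216; 10-bit: 1,073,741,824
```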

Stand approximately 600ft away and this collection of RGB subpixels will look like a white box. Possibly…

10-bit pushes that up to 1,024 intensity levels per channel and a little over a billion colours overall. Where things get complicated is the use of dithering to simulate greater colour depth. The idea here is to rapidly oscillate a given pixel between two colour states fast enough that the human eye is fooled into seeing something in between.

Nice in theory, and it suddenly allows your 6-bit panel to render at 8-bit levels. Except dithering is never quite as good as native colour depth and almost always introduces visual noise. Look closely at some colours on a dithered display and you’ll see them ‘fizzing’ away as they hop about between intensity levels.
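A toy version of that oscillation trick, for the curious. This isn’t how any particular panel’s FRC electronics actually work, just the time-averaging idea:

```python
# Temporal dithering (FRC) sketch: flip a subpixel between two adjacent native
# levels over successive frames so the time-average lands somewhere in between.

def dithered_level(low, high, high_fraction, frames=64):
    """Average level shown when 'high' appears for high_fraction of frames."""
    n_high = round(frames * high_fraction)
    shown = [high] * n_high + [low] * (frames - n_high)
    return sum(shown) / frames

# Fake the in-between level 10.25 from native levels 10 and 11:
print(dithered_level(10, 11, 0.25))  # 10.25
```

The eye averages it out to 10.25; up close, what you actually see is the pixel flickering between 10 and 11, which is the ‘fizzing’ mentioned above.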

Moral of the story? True colour depth matters. I haven’t seen the new Samsung TN panel and its alleged 8-bit native capability, but early reports suggest it’s very impressive. Here’s hoping.

The elevator pitch
In the meantime, what should you conclude from all this? It’s complicated, but I’d say that if you can hold out for a while, it’s worth waiting for all this DisplayPort 1.2a and Adaptive-Sync stuff to shake out.

I generally tend to upgrade displays less frequently than any other component. There’s a good chance that if you hang on for a month or six, you’ll end up with both a pixel count and refresh-tech support that will see you good for several years to come.

TL;DR?
4K and DisplayPort 1.2a.

Kthanksbai.


45 Comments

  1. Peter Radiator Full Pig says:

    So, I was wondering what the best way to buy a computer over time is? For instance, I have money put aside, but for now I can wait a while to buy it. The thing is I don’t know enough to know what’s a great deal, or what’s a good enough deal that I should buy that (let’s say a graphics card) over a slightly better one that I was going to put in instead.
    Not sure if I am making sense here.

    • The Dark One says:

      I like the Techreport’s quarterly System Guides (with the proviso that they’re a little too enthusiastic for low-end Corsair power supplies). They break each component down between budget, ‘sweet spot’ and high-end price ranges and give a few recommendations for each, depending on what you want to do with the machine.

      • All is Well says:

        Sadly, in the end, there really is no foolproof shortcut – to be perfectly sure you’re getting the system you need/want at the lowest price, you’ll have to do research. On the upside, PCs (desktop ones at least) are modular so you can easily find the optimal part for you in each product category and then just put those together.

        It’s a lot harder with tablets or laptops as you’ll have to evaluate those as a whole, weighing different qualities against each other – for example, “Should I buy the nicer looking one with a slower processor but more RAM, or should I get the one that’s not quite as good-looking with a Core i7, but without discrete graphics”?

        • TacticalNuclearPenguin says:

          You also have the added issue of wondering where the parts come from, and most of the time you really don’t wanna know.

      • TacticalNuclearPenguin says:

        I can’t comment on Corsair’s low end PSUs, but the high end ones used to be essentially a 100% rebrand of the best Seasonic units, with the exception that with Corsair they came with sexy all-black cable sleeving. There really was nothing better than that, at least 1-2 years ago when I did my homework. You either bought a Seasonic or a Corsair unit and called it a day.

        With the limited (or non-existent) knowledge I possess about their budget offering, I’d wager that if they have the same attention to good engineering (even if cheaper and not top end) the price/performance ratio should well justify TechReport’s hype.

        • Malcolm says:

          I eventually had to replace my Corsair PSU last year (which was one of the excellent Seasonic rebrands) and it appeared that they had expanded their list of OEMs to such a degree that it became a bit of a crap shoot as to whether you were getting something truly good or just so-so. So I went straight for a mid-range Seasonic G-series instead which is a nicely built piece of hardware without being stupid money. Obtaining Seasonic PSUs in the UK was rather more difficult than other brands though, as they are less commonly stocked.

    • tehfish says:

      To give my 2P:

      -For CPU/GPU: Avoid the high-end and also the low-end, aim for the mid or mid-high range somewhere. That way you get decent kit at roughly the best performance/price ratio.
      -RAM: have a look what amounts physical retailers are selling in PC builds (ie. PC world) and choose the higher-end of that as your minimum*, then get the fastest you can afford. Try to leave some slots free on the motherboard for future upgrades if you can.
      -peripherals: Here i would really splurge, as they may last several PC upgrades**.

      *at the moment they’re selling PCs with 4/6/8gb ram mostly, therefore get 8gb minimum
      **i’ve always brought mid-range PC components, but i got a £450+ screen and a £100+ mouse, both have been in use for a very long time. They may be expensive, but they’ll last and you’ll make up the money in the long run here.

      • nrvsNRG says:

        **i’ve always brought mid-range PC components.
        Where did you bring them?
        also…
        Avoid the high-end and also the low-end, aim for the mid or mid-high range somewhere.
        That’s….. brilliant advice, you should write in magazines and stuff.

        • All is Well says:

          What information did you hope to convey with this? What reaction did you hope to elicit? What value did you feel it would add to the discussion on this site?
          Peter Radiator Full Pig asked a question and tehfish tried giving some general advice, e.g. helping.
          You decided to be snide.

    • Saaz says:

      All the advice so far is good. At the risk of making a me-too post, I wanted to reinforce those suggestions and add a couple notes from my experience:

      Do a little research, try to catch up on current tech and what’s right around the corner. I usually fall behind on that kind of stuff, then read up on it for a few hours when I’m ready to upgrade. I like Anandtech and Tom’s Hardware. Tom’s has CPU and GPU hierarchy charts that can help make sense of the confusing naming conventions.

      I go for the mid-high range, as tehfish suggested. CPUs (and I think GPUs) seem to follow an exponential price/performance curve where, say you can spend another $10 and get the next higher CPU, then another $10 for the next, then suddenly it’s $100 for the next step up, then $500 for the next. I buy the last CPU before that $100 jump. I try to do the same with GPUs, though I may spend more if there’s a certain new game that I want to be able to play at high settings. (I’m looking at you, Elder Scrolls and Fallout.)

      Buy a good monitor, mouse, and keyboard. It’s worth spending a little more on your main interface into the computer, and those items can last for many years, through many GPUs and CPUs. I tend to upgrade motherboard/CPU about every 3 years, and GPU every 12-18 months. I’m typing this on an IBM Model M keyboard that’s 21 years old.

      Couple specific suggestions:
      At this point I would probably try to hold out and wait for a monitor with displayport 1.2a and adaptive sync.
      An SSD has an amazing effect on perceived performance. I can’t recommend highly enough that you at least get an SSD boot disk. 120G should handle OS, apps, and a few games.

    • SuicideKing says:

      Buy the best you can with ample room and forward compatibility, then iteratively upgrade and add on stuff you need to. Finally after 3-5 years change the motherboard and CPU if they’re a bottleneck. Usually GPUs are good for 2-3 years and you don’t upgrade your display. Otherwise just add other stuff (quieter cases and fans, an SSD or two, etc.) you need. This is what i do, at least.

      Usually some knowledge and foresight regarding the industry can help with decisions (like, is there a new kickass thing just around the corner? When will DDR4 become mainstream? How many threads will be good for games over the next few years?) and then of course “can it play Crysis?”.

      Bottom line: Buy the performance you need now, with some headroom for later. Upgrade if you can feel the slowdowns (like i can while playing Arma 3 or Rome II…not that i play Rome II anymore).

    • ScottTFrazer says:

      I’ll echo the others here but add another place NOT to skimp:

      Power supply and case.

      Interfaces between the motherboard and CPU change every couple of years, RAM slots change every five or so. Power supplies are in the 5-10 year range for new output types, and getting a good modular one today should last you quite a while.

      Similarly, the ATX standard was developed in 1995(!) and is still in use today. And all the variants created (FlexATX, MicroATX, etc.) will all mount to a bog-standard ATX case. Buy a good case and it will last you a lifetime.

    • bp_968 says:

      With gaming PCs it’s easy to play “chase the tech” and never actually buy anything while constantly waiting for the next level of tech, or waiting for the nearly next level tech to drop in price. “Future proofing” is a false economy. Buy what you need now, upgrade later if you feel the need. All the previous suggestions are good but I’ll give you another one related to the article: 2K+ monitors are lovely (2560×1440) and the extra vertical resolution helps offset the terrible switch to 16:9 from the vastly superior 16:10 we used to have (for a few years we actually went DOWN in pixel density thanks to that idiocy). Buy yourself a nice 1440p monitor or find a 1200p panel. In a couple years when the 4K panels with some fancy sync tech comes along at a good price buy it and make your current monitor a secondary display. Once you have two you’ll never go back. :)

  2. Hunam says:

    Without my glasses on, I see white/black lines on the last picture from about 3ft away from my screen :P

    • Luke says:

      It just makes my eyes water. Especially when reading the text just below the picture, and not actually looking ‘at’ it.

  3. d3vilsadvocate says:

    10bit is only useful in combination with a professional GPU (Firepro, etc.) and the corresponding software (Adobe Photoshop). It’s totally useless otherwise.

    Also, since I’m playing games @ 2560×1440 I sure as hell won’t upgrade to 4K anytime soon. The performance requirements are just too high, it’s out of proportion to be honest. I wouldn’t even wanna try playing a new, demanding game @4K without at least an SLI setup…

    • sandineyes says:

      I feel the same way. I’d rather take 1440p with variable refresh rates. I personally believe that multi-GPU setups have too many caveats to be worth the investment, and no reasonable single-GPU setup will suffice at 4k, even with variable refresh rates.

      • kaffis says:

        While I agree (about the 4k bit, not the multi-GPU; I rather like my SLI setup, though it’s aging enough by now that I wouldn’t want to throw 4k at it by any means), I’m quite glad that 4k is a thing, and is getting cheaper on some panel types. That means the competition will drive it better and cheaper (on the non-TN panels) by the time GPUs have caught up.

        This is actually a really good position to be in; usually, it’s the GPUs that are pointlessly powerful, and they’ve always been the faster end of the price-cutting equation, meaning that previously, bumping monitor resolution when you were ready for it was always REALLY expensive, if it was an option at all.

        That, and 4k represents finally breaking free of the shackles that flat panels have had to TV panels and video formats. How long did we stagnate at 1080p because BluRay hadn’t even conquered the world and nobody wanted to start a new format to go higher? Now, if only we can cut it out with this inferior 16:9 crap, 16:10 PC Master Race for life!

        Between affordable ultra-high resolutions coming down the pipe, and HMDs striving to compete as affordable and worthwhile gaming displays, this is an exciting time to be a gamer.

  4. caff says:

    Really glad to know that finally this monitor/graphics sync/refresh/pixel thing seems to be resolving itself… gradually.

    I can see clearly now the rain has gone…

  5. nrvsNRG says:

    For me it’s more important to have flicker-free, less blur and high refresh for gaming. I stare at my screen for 12+ hours a day and since I’ve had the new BenQ Z series monitor I’ve had no eye strain and no headaches. The fact it’s TN is not important, (with the right adjustments) it still looks great and I have a gpu that can get the 120+ frames. I’ll upgrade it to G-Sync as soon as the kit is available.

  6. Jabberslops says:

    None of these “new” V-Sync technologies are enough to motivate me to upgrade to a new monitor when my Asus 144Hz TN panel is good enough. Do I wish it had better color? Yes. I knew it would be worse than my previous monitor, but I decided the trade-off of higher refresh rate vs better color was worth it. That being said I do find 144Hz to be pretty useless when anything beyond 100Hz is almost the same to my eyes as 85Hz. Just going from 60 to 85Hz was a big change though with my GTX 780 ticking away under the “hood”. It even shut up a “friend” of mine who was one of those people who latched onto the idea about eyes not being able to see beyond 30Hz.

  7. DanMan says:

    It’s kthxbai u n00b L0L !!!!!!!!!!11111111111111111

    *cough*

  8. El_MUERkO says:

    For those interested in 4k gaming AU Optronics is releasing a new 32″ 4k panel in Q3 that’ll likely be picked up by Apple, Dell, Samsung etc… and support 1.2a Adaptive Sync.

  9. cloudnein says:

    So starting a project of …filming?… a short …film?… subject and we’re choosing to shoot in 4K. This is the true 4K (4096×2160), not UHD (3840×2160). Alas, looks like any “4K” monitor less than $10k is really UHD. Given that it’s only a measly addl .5MP strip of emitters, and that looking at 4K on UHD will mean scaling or cropping, I’m holding out and hoping true 4K displays drop to the same price bracket. But it’s looking like UHD will be the VHS to 4K’s Betamax. :/

    • TacticalNuclearPenguin says:

      Do you realize the difference is merely a matter of aspect ratio? I’ll take 3840 if that means being able to keep the 16:9 standard. You can’t put 4096*2160 on a 16:9 screen without rectangular pixels, and that’s why you see a lot of screens not supporting that, it has nothing to do with price.

      We could have the very same debate with 1920*1200 vs 1920*1080 and it would be just as meaningless. Also, if you research a little more into 4k you’ll notice that, especially in the movie department, a lot of slightly different resolutions have been tried, all falling under the marketing buzzword of 4k.

      You still have roughly 8 million pixels, more or less, regardless of what you want to call the tech. If it was up to me, the stupid terms like “HD” and “4K” wouldn’t even exist. The former is the worst offender though, teaching people to use the word “definition” in the wrong way.

  10. Joshua says:

    When can we expect to see those DisplayPort 1.2a monitors?
    Also, will every monitor that is 1.2a automatically include Adaptive sync?

  11. TacticalNuclearPenguin says:

    No mention of the incredibly important AMVA technology, in the rush of going “OMG IPS” few actually take the risk of purchasing the former to try it out, which is a pity.

    Essentially, AMVA is the ONLY panel technology that breaks the 800-1000:1 static contrast ratio of ALL current panel technologies and shifts it towards a flabbergasting 3000-5000, which let me tell you is an incredible, monstrous difference.

    Static contrast, in short, refers to the real, native panel capability of handling an actual scene without the backlighting and post processing tricks used by dynamic contrast ( which is marketing only and should be always OFF ). A high static contrast means the panel has a higher dynamic range, therefore providing something considerably less flat and more akin to the real world as seen by our eyes which have an even greater range.

    Furthermore, AMVA has the same focus on color quality as IPS does; if anything the latter is getting a little overhyped in that regard because it’s the choice of the professionals. But WHY? Because IPS ( and IPS-likes ) has ZERO color/contrast shift when viewed head on ( watch this on a non IPS: http://www.lagom.nl/lcd-test/viewing_angle.php ) and that makes it a perfect candidate for color critical work.

    Thing is, this particular issue is not very noticeable outside of the example I linked and it might not matter for some at all, which is why I’d suggest trying out new things. As of now, I can’t change my AMVA for anything else. Either way, the LCD world is full of assumptions and the only “serious” site I found is this: http://www.tftcentral.co.uk/articles/panel_technologies.htm

    Oh, it’s also cheaper.

    • remon says:

      AMVA isn’t the only panel type that goes above 1000:1 contrast. VA in general has been doing that for quite a while now (e.g. the Samsung F2380, which had around 3000:1, used a PVA panel).

      But as you say, the article is buying into all that IPS hype. Laird says the VA panels are the slowest but makes no mention of the sort for IPS. Contrary to that, there are VA monitors coming out with 120Hz support, like the Eizo FG2421, but no such IPS monitors.

      • TacticalNuclearPenguin says:

        Aye, that was a cPVA or something, not all VA versions manage that but yeah, I agree.

        Since we seem to be on the same page, ever found a 1440p panel with 3000+ contrast? I’m waiting for such a thing to make the jump but it doesn’t look like it’s happening.

        • remon says:

          I don’t think they have released a VA at that res, they are mostly IPS monitors. But, BenQ has just launched a 32″ 1440p monitor with 3000:1, the BL3200PT. TFTCentral should have a review up later today.

          http://www.tftcentral.co.uk/reviews/benq_bl3200pt.htm

          The problem is that it’s aimed towards CAD/CAM, and it costs 1k dollars.

    • gunny1993 says:

      Reminds me of the death of Plasma; as far as I recall it was generally vastly superior to LCD but was beaten out by bad marketing on the plasma side.

    • Jeremy Laird says:

      Actually, I did mention broader VA into which AMVA falls.

      I see AMVA panels regularly and they are nice but I maintain they are the slowest panels of all – subjectively much slower than IPS, and subjectively is what actually matters in the end – and in the context of a gaming website that makes them impossible for me to recommend even though I very much appreciate the heightened contrast. AMVA can also suffer from poor viewing angles.

      For the record my primary panel is a PVA mainly for the excellent contrast and colours (SAMSUNG XL30) and I prefer it on balance to IPS, but I am willing to put up with the poor response and inverse ghosting. That’s a personal thing that I certainly cannot impose on others, especially in a gaming context.

  12. MichaelGC says:

    lolnoob here. A bit of googling tells me I shouldn’t confuse Adaptive Sync with Nvidia’s Adaptive vSync. Righto then. I can certainly commit to not confusing those in principle. In practice I am compelled to confess to a considerable complement of continuing confusion.

    So, er, what’s the difference?

    • All is Well says:

      The difference lies mostly in that Nvidia’s solution requires an Nvidia GPU to function, and only GT 6XX and newer ones at that, if I’ve understood it correctly.

      EDIT: To be fair, Adaptive Sync probably won’t work with any old card, but the point is it’s less limiting than Nvidia’s solution.

      • MichaelGC says:

        Aha, I see. I was imagining massive, swingeing differences between them, but that makes much more sense. Thanks!

    • jinglin_geordie says:

      Adaptive Vsync is available in the Nvidia drivers now. It is not anything to do with this article, it is basically an attempt to iron out the frame rate fluctuations that can be caused by normal Vsync when the FPS drops below the monitor’s refresh rate. This is all done on the GPU, not the monitor, so it is all about modifying the GPU’s output.

      The real comparison is between Nvidia Gsync – which is a proprietary hardware solution to modify the monitor’s refresh rate to match the output of the graphics card, and which requires a Gsync-enabled monitor or one modified with the DIY kit – versus Adaptive Sync which is a new standard that is rolled into DisplayPort 1.2a and which will require a monitor and a GPU that support it, and which will also modify the monitor’s refresh rate to match the GPU output.

      • MichaelGC says:

        Aha again! Right, well I definitely won’t be confusing them in future, in that case. Many thanks!

  13. FFabian says:

    So if I want to play on a 4K display I better save up for a high-end GPU or three, right? I’m personally not a big fan of spending BIG MONEY on my PC, so I usually buy mid-range tech. I’m asking the experts here: Is there any chance that the average Joe could buy reasonably priced components to have a 4K gaming PC this year? Or do we have to wait a year or so before 4K is possible without spending upwards of 800€ just for a display and a new GPU?

    • Jeremy Laird says:

      Depends what you mean by reasonably priced and what your visual settings and frame rate preferences are.

      Would you view £500 for a screen and £300 for a GPU as reasonable? If so you could have a Samsung 4K screen plus an AMD 290 GPU and very likely run quite a few games at reasonable frame rates and image settings with AA disabled. But you’d have to accept that some games wouldn’t run great.

      Mainstream 4K is a little way out. But I tend to think you keep a display for much longer than a GPU, so if you bought today, you’d keep the screen for your next GPU upgrade and that’s when things would hit the sweet spot. In the meantime you’d have a glorious 4K panel. This approach will not be for everyone!

      • FFabian says:

        Thanks. Problem is if I buy a 4K Display now and continue to use my old GPU the only game I’m able to play in glorious 4K is Pacman.

    • phuzz says:

      You’ll need £££ GPUs to play modern games at 4k resolutions at high graphics settings. An older/cheaper GPU should be able to play older games at high res (if they support it), or a modern game set to lower graphics settings.
      Unfortunately if you want to play eg Bioshock Infinite at 4k resolution with all the graphics options set to High, then you’re going to need lots of money (think four Titans for playable framerates).

  14. phuzz says:

    I was going to buy a new monitor this year, but my budget for that went towards an Oculus Rift instead.
    Typically my second monitor is now starting to display dark patches :(
    Hopefully by the time I can afford a new monitor DP1.2a will be widespread and cheap.

  15. SuicideKing says:

    I personally think that 4K is coming a bit too soon (not enough content and/or lack of supporting tech). Adaptive sync is what should be pushed IMO.

    Though to be honest, i expect adaptive sync to be a crucial part of pushing 4K, and i really don’t expect DP1.2a or 1.3 to be common on both the display side or the GPU side before DX12 and associated GPUs hit the market.

    December 2015 will be really special for visuals, especially in games. DX12, adaptive sync, new GPUs, more current-gen games, more PC-first titles…