8K non-gaming and the importance of pixels

Philips' new 8K wonderscreen

Yes, yes, I know 8K gaming is an utter irrelevance. Frankly, 4K remains a niche gaming resolution. But hang with me. 8K monitors are popping up from major manufacturers and with them the build-it-and-they-will-come logic of gaming at a preposterous resolution of 7,680 by 4,320 pixels. The fact that gaming at 8K isn’t really viable with current hardware is, up to a point, a separate issue. The mere possibility of gaming at a resolution fully four times higher than 4K raises the question of how much resolution matters and indeed how much it matters compared to other factors including refresh rate, response, colour quality, panel size and more. How important, truly, are pixels?

To quickly bat the 8K gaming thing into touch, it’s a couple of 31.5-inch monitors from Dell and Philips that give rise to the notion. Dell, as it happens, cut the price of its 8K UltraSharp UP3218K to a mere $3,899 earlier this year. Both screens also require dual DisplayPort 1.3 connections, which is a bit of an ask.

Needless to say, running a modern game smoothly at 8K isn’t a goer with current graphics cards. Even the fastest current GPUs fail to nail every single game out there at 4K if all the eye candy is switched on. At fully 33 million pixels, 8K has four times the pixels of 4K and thus four times the load on your graphics subsystem. That’s too much heavy lifting even for the strongest hardware.
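
The arithmetic, for anyone who wants to check it (a quick sketch of my own, not from any spec sheet):

```python
# Pixel counts at common gaming resolutions.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

base = 3840 * 2160  # 4K as the point of comparison
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>10,} pixels ({pixels / base:.2f}x 4K)")

# 8K works out to 33,177,600 pixels: exactly four times 4K's 8,294,400.
```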

And yet when Philips wheeled out its 8K 328P8K screen at the recent IFA trade show in Germany, it got me thinking. In all candour, it first got me thinking because I adore high DPI when it comes to general computing. I love my 40-inch 4K monitor, but I also love the font rendering on my high-DPI laptop and phone screens, and they both make my desktop monitor look utterly clunky in that regard.

But it also got me thinking about gaming panels on the PC. My personal instinct has usually been that pixels come first. By default, I want more of them and of better quality. Up to a point it’s a little hard to unpick where pixels start and broader panel quality kicks in. Is pixel response a general panel attribute, or exclusively to do with pixels? Likewise viewing angles.

Broadly speaking it’s fair to say there’s a trade-off to be made between pixel count and other screen specs. The most obvious is refresh rate. Both pixel count (which you can also think of as the native resolution of an LCD panel) and refresh rate have implications when it comes to bandwidth. Every screen has a certain number of pixels and it takes a certain amount of data to describe the colour each pixel contains.

Is 1440p and 144Hz the sweet spot for most of us?

Every time the screen is refreshed, of course, that data is updated. Which is why, currently, you can’t have both a super-high refresh rate like 240Hz and a super-high resolution like 4K in the same screen. Existing video interfaces simply don’t have the bandwidth to refresh the eight million pixels of a 4K monitor 240 times a second. That would be nearly two billion pixels refreshed per second. And that’s just the display connector. Your video card has absolutely zero chance of cranking out two billion fully rendered, pixel shaded, bump mapped, anti-aliased – whatevered – modern game engine pixels every second. 4K at 60Hz is half a billion pixels per second and that’s too much for most GPUs.
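
For a back-of-the-envelope feel for those numbers, here’s a quick sketch (my own arithmetic, assuming uncompressed 24-bit colour and ignoring blanking intervals and display stream compression):

```python
# Rough throughput needed to feed a monitor, assuming uncompressed
# 24-bit colour and ignoring blanking intervals and stream compression.
def throughput(width, height, hz, bits_per_pixel=24):
    pixels_per_sec = width * height * hz
    gbit_per_sec = pixels_per_sec * bits_per_pixel / 1e9
    return pixels_per_sec, gbit_per_sec

for label, (w, h, hz) in [("4K @ 60Hz",  (3840, 2160, 60)),
                          ("4K @ 240Hz", (3840, 2160, 240)),
                          ("8K @ 60Hz",  (7680, 4320, 60))]:
    pps, gbps = throughput(w, h, hz)
    print(f"{label}: {pps / 1e9:.2f}bn pixels/s, ~{gbps:.0f} Gbit/s raw")

# 4K @ 240Hz and 8K @ 60Hz both land at ~2bn pixels/s (~48 Gbit/s raw),
# well past the ~25.9 Gbit/s of usable bandwidth on a single DisplayPort
# 1.3/1.4 link -- hence those dual-cable 8K monitors.
```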

All of this may seem like an acutely first world problem – the inability to have one’s 4K cake and eat it at 240Hz. But it’s something that causes me genuine, if fleeting and somewhat superficial, anguish. As it is, I choose 4K for the benefits it brings for all-round computing. I like having the sheer screen real estate. But I do miss the slick, buttery responsiveness of a high-refresh rate monitor. Every time I have to knock a game down from 4K to 1440p to get smooth frame rates, I’m also reminded that my monitor is far from the optimal gaming solution.

Then again, for those games where I can achieve smooth frame rates at 4K, the detail and scale of the image is truly a sight to behold, and I find myself reluctant to give up on that for a little more smoothness and response in other titles. You could argue that it depends what kind of game you’re playing. Hair trigger shooter? You’ll want the refresh. Heavy-duty RTS game? Go with the pixels.

Of course, most of us want to play all kinds of different games and that means trade-offs must be made. That said, there are other areas where I reckon those trade-offs are less vexing. Sure, I’d prefer the sharper response of a TN panel when playing games, but IPS and VA panels are good enough and I much prefer the colour, contrast and viewing angles. Likewise, while I appreciate the benefits of adaptive sync, I find the impact it has on gaming smoothness and response marginal when compared to running at lower resolutions and higher refresh rates.

As things stand, then, it primarily comes down to pixels versus refresh rates. Price plays a part, too. Upping either the refresh rate or the pixel count costs, there’s no avoiding that. Upping both while maintaining panel quality costs even more.

I suppose if my day-to-day life were a little more gamey and a little less worky, I’d lean 1440p and 144Hz on an IPS panel. That’s a pretty sweet spot to be in for all-round gaming. As it is I’m running 4K at 60Hz on a VA panel and I live with the numerous downsides in return for a few pretty spectacular upsides. But that’s me. What about you? Shout out below where you find your screen sweet spot and whether it’s pure preference or more a matter of price.

70 Comments

  1. Drib says:

    These things are so far above my pay grade that I feel vaguely offended even hearing about them.

    • Earl-Grey says:

      I cannot fathom how anyone with a mortgage, car, bills to pay and mouths to feed can ever justify spending that amount of money on a sodding monitor.

      I suppose I could buy a cheap RV and start making crystal meth in the middle of nowhere.

      • svge says:

        It’s pretty simple: don’t have children and buy what you want.

      • Ergates_Antius says:

        Serious answer: I guess people who have some professional/job-related reason for wanting a high-res monitor could justify it.

        For the rest of us – just wait a few more years until such things are mundane and cheap. (I remember when VGA came out…)

        • Sgt_Big_Bubbaloola says:

          What, during the time of the dinosaurs, Grandpa??? Har har!!

          I can’t talk by the way, I’m now 43 *weeps into his bag of Werther’s*

          • Darloth says:

            Those things are delicious, and I thought so when I was 7.

            (I’m not 7 anymore… unsure how I feel about that.)

      • LegendaryTeeth says:

        People spend more on less useful things. It depends where your priorities are.

        How much does smoking cost? Or eating out for lunch every day at work? Or buying a coffee every morning? Or golfing? There are lots of things lots of people do which they don’t have to but still cost thousands of dollars over the year. This is one thing, all up front, that you can use for years and years.

        • FriendlyFire says:

          It’s the same thing I tell people about an office chair when their eyes bulge at the $1500 sticker price of my favorite ergonomic chairs.

          Yeah, it’s a lot of money up front, but they all have 10+ year warranties and can easily last for double that if treated well. Less than $100/year for something you might sit in 10, 20, 30 hours a week? That’s a bargain, and my back/neck are worth far more than that.

          • daktaklakpak says:

            Oooh – what do you recommend for an ergonomic chair? I’ve got a chair in my office that is about to give up the ghost and needs replacing, and I can’t find most of the ergo chairs anywhere local to actually try sitting in first.

          • feedmeakitten says:

            My Neck, my back
            are worth more than that

          • agentghost says:

            That’s one expensive chair. Link?

          • hausser0815 says:

            I can recommend the Steelcase “Please” model. I hurt my back a few years ago when I helped a buddy carry his washing machine into his new flat, and sitting in that chair was by far the least painful thing I could do over the next few days.
            It doesn’t come in a “gaming” version with LEDs and racing stripes though.

        • pack.wolf says:

          *looks at his Star Wars Armada man-toys*
          sigh…

  2. Nelyeth says:

    1440p, 144Hz and 23 inches with G-sync, powered by a 1080 GPU. I’ve been playing on this set-up for around 6 months now, but before that, I had a 1080p, 60Hz, 17-inch laptop with an increasingly struggling GTX660M. The transition was, to say the least, brutal.

    As far as I’m concerned, I’m perfectly satisfied with what I have: I feel more would be too much, both size-wise and price-wise (my current rig is around 2000€ if I include everything, and it was already way more than I first considered). The thing is, I thought the same when I bought my laptop, and again before that (heck, I considered 20FPS smooth before I got my laptop), so perhaps it’s just that I haven’t tried going beyond 1440p. Perhaps I’ll think about it in a fair few years, when I have a decent amount of change to spend.

    • Imaginary Llamas says:

      Are you me? I had the same laptop setup before upgrading to a 1440p 165Hz 27-inch G-Sync screen with a GTX 1080. In terms of the display, I wouldn’t upgrade just for a higher resolution, and an even higher refresh rate would be pointless.

      Funnily enough I partially got the laptop so that I could play TW: Shogun 2 on something other than the lowest graphics, and the desktop so that I could do the same with TW: Warhammer.

      • Nelyeth says:

        I got TW:Warhammer right after I bought my desktop just so I could get huge armies fighting on my screen with absolutely no stuttering. I’m not into TW games, so it was a “one and done”, but damn, it was at that moment I realized my laptop was a fossil.

        Also, hi me.

  3. FLoJ says:

    Currently playing solely on a 144Hz 1080p BenQ with a 1060 because <3 Quake.

    Very close to buying a second monitor with higher DPI and better colour and contrast, and using that for general use.

    The have-your-cake-and-eat-it dual-monitor solution :)

  4. automatic says:

    8K is pointless because most games have crappy texture resolution. And the more you crank up texture res, the more complex the 3D model must be so the edges don’t look blocky. Then you will also need a better graphics card. And when you do, you go to the game store and realize the most beautiful games out there are in pixel art or a low-res piece of art like The Witness.

    8K is not for gaming, it’s for detailed desktop background images on huge monitors.

    • LewdPenguin says:

      This is pretty much my viewpoint, at least as regards games having (by default) poor texture resolution. Many of the jarring moments I have ingame come from things suddenly looking like crap: textures that don’t hold up to close viewing, noticeably poor implementation of shadows, or occasionally some other effect that suddenly becomes apparent in a scene. Objects being culled early whilst still visible at screen edge is another fairly common and at times glaring one, as is incredibly aggressive LODing that results in a blatant bubble 20′ from the player where all the graphical super-pretties end, and another in the mid distance where nearly everything swaps to 10-poly models.

      None of these issues would be fixed or addressed in the slightest by me spending a chunk of cash on something better than the (probably close to 10 years old) 1080p 60Hz monitor currently adorning my desk. And whilst there are a handful of games I play that might benefit from the higher resolutions on a shiny new monitor, the cost of potential improvements in such a small number of games really isn’t worth it for me. I generally play very few of the ‘triple A’ type games most likely to ship with ultra-high-res textures/resolution support, and probably over 90% of the stuff I play tops out at 1080p anyway.

      By the time the mid-budget games I play most of start routinely supporting high enough texture resolutions to make 1080p a notable handicap, I expect a 1440p or maybe even 4K monitor will have become rather more affordable and I’ll pick one up. For now, cranking everything up to max and running 60fps at 1080p is still pretty damn good.

    • davidgilbert says:

      I agree with this; it’s the reason why modern digital/landscape pictures aren’t necessarily “better” than classic oil paintings like The Hay Wain. It’s a different technique or way of representing something in art, but if it’s badly put together or too narrow in focus it doesn’t work. (I am aware art is viewed differently by different people, but the same can be said of games.)

  5. Siythe says:

    “All of this may seem like an acutely first world problem…”

    “…may seem…”

    Oh mate.

  6. James says:

    I got a 4K monitor and a GTX 1070 for a birthday (it was that or a cheap first car; I think I made the right choice) and I just can’t go back. Some games, especially poorly optimised ones (Fallout 4, I’m looking at you), remain at 1080p, but in games like Witcher 3 it’s JUST. SO. PRETTY.

  7. gi_ty says:

    For me, I’m saving and waiting for a good 34-inch ultrawide curved 1440p monitor with some form of adaptive sync. I plan on buying a new video card at the same time. I have been using a 26-inch 1080p monitor for like 5 years and I feel the cost upfront for an awesome monitor is worth it. I will likely be using it for at least five or six years. When I have the money in hand and can find a VA or IPS panel that fits these specs I will take the plunge.

    • apeshake says:

      I did this and finally got a Dell U3415W, which I’m really happy with – it’s more for work than games, but at the same time I was holding off playing Prey till I got it and it’s a perfect fit. I’m pretty new to PC gaming as well, so 60fps (on a 1070 at 3440×1440) seems blindingly fast after the PS3/PS4 experience.

  8. malkav11 says:

    4K is so, so nice. And while I was initially assuming I’d be mostly watching video content at that resolution and sticking to 1080p gaming (since it’s the same ratio with fewer pixels and 4K’s a hell of a load on a graphics card), it’s actually wound up being the opposite. With no 4K Blu-ray drives on PC (at least last time I checked) and Netflix 4K streaming only available on TVs or Kaby Lake-onwards Intel processors (I’m one generation behind – sigh), options for 4K video content are actually pretty limited. But I’ve had little trouble running most modern games at 4K (with stunning results) since I don’t really notice the difference with framerate unless it either stutters a great deal or is significantly below 30 FPS. If anything, older games have been the main issue, either not supporting 4K at all, or having trouble with scaling or other display bugs – e.g., Kingdoms of Amalur doesn’t display any quest log text at 4K, which renders it largely unplayable at that resolution.

    I think the most gorgeous games so far have been Witcher 3 and anything recent in EA’s Frostbite engine, particularly Dragon Age Inquisition and (for all that its visuals have been weirdly poorly received) Mass Effect Andromeda.

    • KenTWOu says:

      4K/UHD Blu-ray drives on PC do exist, but they have ridiculous DRM limitations as well, which make them absolutely pointless for most buyers. So if you want UHD playback, it’s better to buy an Xbox One S.

  9. dangermouse76 says:

    The weird thing is, here we are, and how much do we need to spend on PC to get 1080p 60fps on high settings in the latest games? It’s still not that cheap.

    Resolution still costs money. That’s where the value is for those that manufacture this stuff. Like printers and ink, or CPUs and performance gains. Formats and distribution are still locked to these ideas of resolution and fidelity.

    • ColonelFlanders says:

      To be fair you can pick up a GTX 970 for a packet of peanuts and a reasonably good drawing of a cow these days, and you can pick up the 2600K for a similarly bargain-bin price. So it really isn’t that expensive any more, if you can avoid falling for the bullshit Intel has been trying to feed us while trying to hide their stagnant development.

      I do agree though, resolution is in no way indicative of fidelity – my VHS copy of The Fifth Element still looks great. There will hopefully come a point where accessory manufacturers find something truly ground-breaking to work on, so they can finally stop trying to sell us fucking pixels and refresh rates.

  10. *Legion* says:

    I’m happy with my 1440p monitor, and will be for the foreseeable future.

    Where I want more pixels is in the screens that get pressed right up against your face in a VR headset.

    • Elric666 says:

      Word.

      Also, I’m reasonably confident that monitors will be a thing of the past in the not-at-all-distant future. Once VR and AR headsets and glasses get comfortable enough to wear for extended periods, and the resolution reaches the level where you can comfortably read fine text, monitors will become obsolete. The advantages of turning your head and having your computer all around you are too compelling.

      • Asurmen says:

        They will become more popular, but they’re never going to replace the monitor.

  11. steves says:

    “Is 1440p and 144Hz the sweet spot for most of us?”

    Yeah, probably. If ‘us’ is gaming nerds with too much money. And getting 144 FPS is overkill really, at least for me. 90-100 is about the sweet spot, combined with G-Sync.

    When an 1180Ti or whatever it is comes out, and can do that kind of framerate with reasonable settings, in anything I want to play, at 4K…well, that’s the time to get an 8K monitor.

    Just as long as it can run 4K at a decent refresh rate, and 8K at 60 for that print-quality text.

    I confidently predict I’ll be waiting years for this, and comfort myself with memories of just how mind-blowing getting a 19″ 1280px CRT monitor was back in the day.

  12. takfar says:

    Six-year-old, 1080p, 60Hz monitor here. GTX 970, so mostly 60fps on most games with most bells and whistles. It’s also my work monitor (and computer). It’s the best I can afford, and it’s fine by me.

  13. Lukasz says:

    With my new build last month, I decided that a 34-inch 1080p ultrawide is the most suitable monitor for my needs.
    I chose that over 1440p because of size and immersion, as well as how much horsepower I’d need. With my next GPU upgrade being 3-4 years away, and only assuming I’m making good money compared to my expenses, 1440p would be too much right now.
    Can’t imagine myself at this stage running a 4K monitor.

  14. ropeladder says:

    FWIW, I recently played through Portal 2 in 640×480 because I like the aesthetic…

    • Elric666 says:

      I’ve been playing Undertale on my $89,000 8K 240Hz display and it looks AMAZING!

    • floogles says:

      Ha! I do the same with Half-Life 1 – I think it’s nostalgia, but I try to tell myself “maybe it’s because it was made for this resolution that it looks better” when I inevitably turn the resolution up and then back down again.

  15. Raoul Duke says:

    What I find really frustrating about 4K (and, it seems, 8K) monitors is that they don’t seem to have sensible implementations of 1080p upscaling.

    There’s no logical reason why you couldn’t have a super-fast little subsystem in a 4K monitor that literally takes a 1080p signal and doubles the pixels in each direction (see the sketch at the end of this comment). But many of them don’t do this, for some inexplicable reason, turning 1080p into a blurry mess.

    Alternatively, Nvidia and AMD could get their shit together and enable this type of scaling on the GPU.

    Really, it should be possible to have 4K/8K/whatever for desktop use and games that aren’t demanding, and then a seamless ability to properly display 1080p for more demanding games, all in the same monitor.

    I was happy to read that the new 4K Panasonic OLED TVs do this really well and also have extremely low input lag for 1080p. Hopefully a sign of things to come in the monitor world.
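
    To illustrate, the scaler I mean is trivial. A minimal sketch in Python with numpy (purely illustrative, obviously not actual monitor firmware):

    ```python
    # Nearest-neighbour "pixel doubling": each 1080p pixel becomes a 2x2
    # block of identical 4K pixels, so nothing gets blurred by interpolation.
    import numpy as np

    def double_pixels(frame: np.ndarray) -> np.ndarray:
        return frame.repeat(2, axis=0).repeat(2, axis=1)

    frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)  # height x width x RGB
    frame_4k = double_pixels(frame_1080p)
    assert frame_4k.shape == (2160, 3840, 3)
    ```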

  16. Ragnar says:

    Three 23″, 1080p, 60Hz, IPS panels on a lowly RX 480 – translating to 1080p for recent games and glorious, triple-wide, 6040×1080 Eyefinity goodness for older games.

    I’d love to upgrade to a monitor with adaptive sync (and preferably 144Hz), as I hate screen tearing and stuttering, but I hate the exclusivity of the technologies. I don’t want my monitor to lock me into a video card vendor for the next decade.

  17. ZipD says:

    Been using Dell’s 29″ ultrawide for a good 4 years now; it gets decent frames on a 980Ti and is wonderful for movies. It’s really difficult not to consider another ultrawide for the next jump. 1440p + 144Hz is exactly what I’ve been waiting for, but it has to be G-Sync, HDR and ultrawide as well. Monitors last a lot of years and getting a good one for gaming is indeed a heavy investment.

    To actually have a GPU that can churn out 144fps is another matter altogether and may take a couple of generations to achieve. By that time, you’ll have new games that’ll demand even more.

  18. Don Reba says:

    I’ve been rocking a 27″ 4K IPS for about a year, and I only reduce the resolution as a last resort to get those 60 FPS. It is almost always better to reduce rendering quality, especially per-pixel shaders like screen-space reflections and ambient occlusion.

  19. celticdr says:

    Been using a Philips 4K 40″ monitor for over 2 years now thanks to Mr. Laird’s positive words about it – once you go 4K you can’t go back.

    Plus my AMD R9 290 runs 4K fine at 60Hz with AA turned off (which you don’t really notice at 4K anyway) on high to ultra settings in most games I’ve played.

    I’ve got some FPS shooters too, and more than 60Hz isn’t a noticeable change to me (though I do get that there are folks out there who can see above 60Hz; I am not one of those individuals, it seems).

    Next year will be GFX card upgrade time – hopefully I can get a 1080 Ti for cheap then ;)

  20. Tiax says:

    I’m currently rocking a 34-inch 21:9 panel that I really like.

    Although what I’m truly waiting for is a similarly sized OLED panel optimized for gaming.

  21. kwyjibo says:

    I don’t know why pixel densities on monitors are so shit. I thought we’d have stopped making low-density screens by now.

    Phones and even laptops now have high-density screens. But the smallest 1440p monitor is still 24″, and it has been like that for years.

    • Don Reba says:

      4K at 27″ is getting into acceptable range. Although, I wouldn’t say “no” to an 8K 27″ monitor.

    • OmNomNom says:

      Because you don’t normally hold your monitor up to your face

  22. fray_bentos says:

    In the last year I went from 27″ 1080p at 120Hz to 27″ 1440p at 165Hz. The main driver for the upgrade was spending more time working from home and text/image resolution being too clunky at 1080p/27″ compared to the silky smooth text on a 1440p/14″ laptop. After that, I couldn’t face my 10-year-old office monitor, so that also went to 27″/1440p (but 60Hz). For gaming I prioritise framerate over resolution. Yesterday, I went from emulating Zelda: Breath of the Wild at 1440p 30fps to Doom (2016) at 110-144fps (GTX 1070) and the difference was mindblowing (though both are fantastic games). Having had a 120+ Hz monitor for 5 years, 60fps looks like a slideshow, and I generally change settings to target framerates of at least 100fps, particularly for faster-paced FPS, racing and third-person games (anything with a sweeping or panning camera). I immediately spot framerate drops below 100fps, even with G-Sync, while I can just about, but not always reliably, tell the difference between 120/144/165fps.

  23. KikYu0 says:

    Acer Predator Z35
    2560 x 1080, G-Sync, 144Hz

    I love it all night long

    • Lomaxx says:

      Eizo FlexScan L557
      1280×1024, No Adaptive Sync, 60 Hz
      Something like 13 years old by now. I love it all night long. xD

      Yes, that’s still my main monitor (my second is even older, at 1024×768).

  24. Catchcart says:

    For my next laptop, I’m actually considering dropping down from QHD+ (3200 × 1800) to plain full HD (1920 × 1080). The desktop and native (GNOME) apps look nice and pixel-free in QHD, but it’s a battery draw and actively a drawback for gaming, because resolution switching is a hassle.

    Of course, had I stuck with full hd I wouldn’t have minded the way it looked. Now, I fear that I’m going to annoy myself by focusing on the rough edges.

    • fray_bentos says:

      Indeed, my laptop has a 14″ 1440p screen and I run it at 1080p most of the time, as the payoff in terms of battery, GPU performance, and avoidance of scaling issues outweighs having very slightly smoother fonts. In retrospect I feel that the resolution of the screen was really just a marketing gimmick.

  25. CdrJameson says:

    I was quite shocked the other day to find that my 1600×1200 monitor dates from July 2004.
    It’s, y’know, fine.
    60Hz, can view from any angle.
    Does portrait mode, which surprisingly works in a few games.
    £50 well spent, IIRC.

    • Elric666 says:

      A word of warning… once you go widescreen format, you can never go back.

      • CdrJameson says:

        Because if I rotated the screen it’d get wedged between the desk and ceiling?

      • Kefren says:

        The only reason you can’t go back from widescreen is because the whole industry expects it, so you’d have a headache adopting anything that wasn’t standard. I actually much preferred squarer screens such as 4:3 (great for games where the character is in the centre of the screen, with room for some status info on the side; 8-directional shooters; scenes with more verticality, rather than looking through a letterbox). I know experts say it is easier to look horizontally than vertically, but it feels like nonsense to me; when it comes to mm alterations I find them both equally easy.

  26. Kefren says:

    Having recently got into VR, which is a drop in resolution compared to my monitor, I’d say I prefer lower resolution but proper 3D and being “inside” the game over hi-res on my main computer monitor. Over the years, increases in pixels on my main PC haven’t corresponded to enjoying games any more. In fact, I’d found it harder and harder to get into them after 40 years of playing games. I’ve seen most of it before, and higher resolutions stopped being important to me after HD. Whereas VR, even in its infant and enthusiast-only state, has done more to make me excited about games and computers again than anything else in the last fifteen years. :-)

  27. KingFunk says:

    I’m kinda bringing a spoon to a nuke fight here, but I’m still rocking the PC I bought for Fallout 3. The processor is a Q9450 overclocked to 3.0GHz, because I don’t think the board can handle much more (and it’s more faff than replacing cards or RAM), but I doubled my RAM up to 8GB for about £30 and spent about £130 on a 1050Ti.

    My monitor is still my old BenQ 1680×1050 (which does 1080p nicely) and I can run everything I’ve tried with a level of compromise I find acceptable. For example, I’m currently playing Dragon’s Dogma, which I believe is more CPU-heavy than GPU-heavy, so it was chugging a bit due to my obvious bottleneck (and my use of an ENB). However, once I dropped it down to 720p and took a couple of settings down a notch, playing it at a higher frame rate made it much more fun.

    I usually aim for 30fps for everything else (cos mostly I can’t hit a solid 60) but DD is the first game that really felt significantly better when I lowered the eye candy to increase my FPS – it’s not like it’s the prettiest game anyway…

  28. MajorLag says:

    I don’t believe I’ve ever even seen a 4k monitor in person, so this comment is assuredly coming from a place of ignorance, but I can’t help but think the only point to resolutions that high is e-peen waggling.

    Does cramming more pixels into a space really give you more screen real-estate? In my experience everyone pretty much runs whatever they’re currently working on in fullscreen. If they want to do more things at once, they get more monitors. Scaling up the DPI means scaling up the fonts so you can read them, and probably the icons and other graphics too, so what did you really gain from all that extra detail? I’m sure there are applications where that higher resolution can be put to good use, but they’re probably few and far between.

    It seems to me that if you have that kind of money it would be better spent on higher refresh rates, increased color depth, and/or more monitors. But like I said, I’ve never seen or used one, so this is entirely speculation.

  29. OmNomNom says:

    A big factor here is the screen size.
    Personally I believe if you’re at 24″ then 1080p is fine.
    27″ is probably the max for 1440p.
    4K is hardly worth it below 32″.
    8K I can’t imagine is worth it even at 40″.

    So then you’re in the realm of what fits with your room / desk as well as the issue of power and game support.

    My current personal holy grail would be an ultrawide that has 1440 pixels in the vertical and does at least 120Hz with blur reduction (preferably TN or IPS).

  30. Carra says:

    Bought a Korean 27″ 1440p IPS screen five years ago. It still runs, and it’s miles ahead of my previous 22″ 1050p non-IPS screen.

    And 1440p seems to be the sweet spot where my 1060 can run games at high detail. So I’m probably not gonna upgrade any time soon.

    • Asurmen says:

      I assume it’s overclockable? I bought an American screen with the same specs nearly 4 years ago, but between the slow death of DVI (I’m getting the Asus Vega 64 just because it has a DVI port) and the upcoming HDR/FreeSync 2 screens next year, I’m thinking mine will be on eBay some time over the next year.

  31. Foosnark says:

    $3,899

    I was going to say something like “that’s a lot of avocado toast” or “I won’t even spend that much on the eye surgery required to be able to see the difference between 1080p and 4K” or “my monitor and graphics card together cost about 1/10 of that.”

    And then I looked at the estimate of what I’ve spent on my modular synthesizer and, well. Different priorities I guess.

    • sabby says:

      I regularly spend more than the cost of a 1080 Ti on my eurorack, so I’m right there with you.

      Hobbies are expensive!

  32. criskywalker says:

    I’ve been stuck with a 1080p monitor for a few years and do you know what? I think that’s good enough. I mostly don’t notice pixels and if something bothers me I activate anti-aliasing.

    But having changed my graphics card recently, I have finally been able to enter the 60fps world, and it’s such a big difference that I wonder how I could play before that! And now I want a 144Hz monitor, and I don’t care much about higher resolutions. Would it be nice to have some more pixel density? Sure. But I care about fps the most, and now I’m curious about HDR, since it seems to be a bigger game-changer than 4K and 8K.

    • Asurmen says:

      I’m waiting for a 1440p, high-refresh, HDR FreeSync 2 monitor. A resolution that’s better than 1080p without breaking your GPU, at which you can still get more than 60 frames with a good GPU, combined with HDR and a sync technology making the pixels ‘better’.

      Seems like a win to me.

  33. mactier says:

    Coming from the picture basics, I want 2,000:1 or, better yet, 3,000:1 real contrast, the full AdobeRGB colour space, a good greyscale (12-bit LUT, maybe HDR, although I don’t know how they correspond, since traditionally they seem independent), and, since it’s available, of course 144Hz (maybe 120Hz, but 60Hz is just genuinely not enough). Then any resolution beyond 1080p would be fine with me (but also a requirement, since 1080p is also genuinely not quite enough, but on the lower spectrum of genuinely enough – at least on any “close” screens or monitors and in multi-tasking or gaming).

    For more realistic, everyday requirements, sRGB and close to 2,000:1 contrast would also be fine.