CES 2016: OLED PC Screens Are Coming

It's $5K. It's 4K. It's Dell's OLED display.

Didn’t I tell you 2016 was going to be great for PC gamers? Well, it’s started. The greatness, that is. And 2016, too. In fact it’s all so fantastic even the orgasm of capitalism, technological futility and conspicuous consumption that is the Consumer Electronics Show (CES) in Las Vegas could not resist and duly served up an uncharacteristically compelling collection of intriguing new PC stuff. Stuff like OLED displays, silly-fast SSDs, graphics boxes for laptops, VR all over the shop and, well, other things that want your money.

A virtual elephant

First, allow me to escort the elephant in this post to a metaphorical holding pen. Virtual reality hardware was a big theme at CES this year, the major punctuating points of which m’learned colleagues have already covered.

Personally, I don’t yet have a firm feeling for how fast or to what extent VR is likely to become a compelling display tech for PCs. But with the likes of Oculus Rift and HTC’s Vive imminent in final retail form, the simple fact of availability will soon be reality. Which raises a fairly obvious question. Can your PC handle it?

Nvidia has wasted no time in wheeling out its own GeForce GTX VR Ready initiative. And then there’s the Oculus Rift compatibility tool. But this is potentially quite a complex question, the matter of how well a given PC config will handle VR, so I’ll cover that soon in a separate post. We have a few months before the headsets arrive, after all.

OLED is coming

OLED PC displays, then. The critical thing to understand about OLED is that it’s a much better idea than LCD. That’s because each pixel is essentially a tiny little stand-alone light (each pixel is actually three tiny little lights, red, green and blue. Well, unless it’s one of those skanky pentile efforts you see in smartphones, but I digress).

That matters because when a pixel is off, it’s just that. Off. No light at all. And that means effectively infinite contrast compared to a pixel that is on. What’s more, with each pixel being its own light source, the whole viewing angle problem disappears. Yup, that’s perfect contrast and perfect viewing angles.

This is in distinct contrast, pun intended, with LCD technology, which is made up of a grid of tiny shutters that attempts, imperfectly, to control the transmission of light from a rear-firing backlight. There is always some leakage. Worse, that leakage varies depending on your vantage point, causing all kinds of viewing angle issues.

The upshot is that LCD is a fundamentally bad idea for a full-colour computer display. It’s a testament to the ingenuity of the world’s engineers that they have made something so unsuitable quite so workable.

You’ve seen the front, here’s the rear of Dell’s 30-inch OLED beast

Oh, and OLED also has far superior response times to LCD. On paper, it really is the killer display tech we’ve been waiting for. At CES, Dell rolled out its UltraSharp UP3017Q. It’s the full OLED monty in 30 glorious inches, complete with a 3,840 by 2,160 4K / UHD pixel grid and 0.1ms response times.

Yep, 0.1ms response. Now, quoted response times aren’t to be entirely trusted. But on the face of it, that’s one tenth the time it takes the best LCDs to respond. However you slice it, it seems OLED could effectively put an end to pixel response as an issue that even needs to be thought about.
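If you want a feel for those numbers, here’s a quick back-of-envelope sketch in Python. The 1ms and 0.1ms figures are the quoted spec-sheet claims (for the fastest LCDs and Dell’s OLED respectively), not measurements:

```python
# How much of each frame is spent waiting on pixel transitions, at
# common refresh rates. Spec-sheet figures, not measurements:
# ~1ms for the fastest LCDs, 0.1ms for Dell's quoted OLED response.

def frame_time_ms(refresh_hz):
    """Time budget for one frame, in milliseconds."""
    return 1000.0 / refresh_hz

def response_share(response_ms, refresh_hz):
    """Fraction of the frame eaten by pixel response."""
    return response_ms / frame_time_ms(refresh_hz)

for hz in (60, 144, 240):
    print(f"{hz}Hz: frame = {frame_time_ms(hz):.2f}ms, "
          f"LCD eats {response_share(1.0, hz):.0%} of it, "
          f"OLED {response_share(0.1, hz):.0%}")
```

At 240Hz a 1ms LCD transition swallows roughly a quarter of the frame; the quoted OLED figure barely registers even there.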

If all that sounds spectacular, the $4,999 list price is pretty eye-catching, too. It’s also worth noting that the new UltraSharp implements a number of countermeasures to offset the big OLED bogey, which involves pixel degradation in various forms. It’s a hint that OLED is still something of a work in progress.

But if the Dell’s pricing renders it rather hypothetical for most of us, the good news is that a bunch of much more attainable OLED-powered laptops also debuted at CES. Most notable for us gamers is the tweaked Alienware 13 laptop. The highlights are Nvidia GeForce GTX 960M graphics and a 13-inch 2,560 by 1,440 pixel OLED display.

A number of other less gaming oriented OLED laptops were announced and I can’t comment on the quality of any of them. Up to a point, that’s moot. What matters is that OLED is beginning to happen for the PC. Hurrah.

In the meantime, one intriguing offering in the conventional LCD column is a new 34-inch 3,440 by 1,440 pixel curved IPS monitor from Monoprice for $499 (I don’t think the Monoprice brand operates in the UK, but a search for Achieva Shimian might turn up the goods and I’m hoping for a price of £400 maximum). That’s cheaper by far than I’ve seen for that particular form factor and it runs at 75Hz, which is nice, though it does lack frame syncing of any kind. Could be worth a look.

100Hz and 34-inch uber wide? Ermeferkinggerd…

Oh, and Asus took the wraps off its new ROG Swift PG348Q, which sports the same 34-inch uberwide and curved specs but ups the ante to 100Hz refresh. No word on price, but it won’t be cheap. And yet it might still be the most desirable gaming monitor on the market for now.

Speedy SSDs

If there was a non-surprise, albeit a welcome one, at CES it was the generous supply of fast solid-state drives. For instance, there’s the OCZ RevoDrive 400, good for 2.6GB/s reads and 1.6GB/s writes and random access operations in the 140,000 to 210,000 range. With those figures, it can only really be an M.2 drive with NVMe support. Which it is, funnily enough. Pricing hasn’t been announced, I don’t think.

Or how about Zotac’s new PCI Express drive? It’s a straight adapter card that drops into a x4 PCIe slot and notches up 2.4GB/s reads and 1.2GB/s writes. Zotac hasn’t announced IOPS specs, but the drive’s Phison PS5007-E7 controller is good for crazy numbers – 350,000 random reads per second and 250,000 random writes. Those kinds of numbers should make an absolutely tangible difference to how snappy a PC feels.

Zotac’s PCIe SSD looks like a monster

Then there’s Patriot’s new line of Hellfire drives. Patriot is doing both M.2 drives and add-in cards. The add-in version is claimed good for 3GB/s reads and 2.2GB/s writes. Again, no prices as yet.

These are all dizzying numbers compared to a typical SSD of the conventional SATA flavour, which tops out at around 550MB/s and 100,000 random access operations per second. The only snag is that motherboard support varies. The add-in cards will work in pretty much any PC as a data drive, but BIOS-level NVMe support is needed to boot from them. In other words, school up on your motherboard before you pull any triggers.
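For a sense of what those sequential figures mean in practice, here’s a purely illustrative Python sketch: the best-case time to read a made-up 20GB pile of game assets flat out at each drive’s quoted read speed. Real game loads mix in random access and CPU work, so treat these as ceilings, not predictions:

```python
# Best-case time to read 20GB of assets at the quoted sequential
# read speeds. The 20GB install size is made up for illustration;
# real loads are rarely purely sequential.

quoted_read_gbps = {
    "SATA SSD (typical)": 0.55,
    "OCZ RevoDrive 400":  2.6,
    "Patriot Hellfire":   3.0,
}

assets_gb = 20.0
for name, speed in quoted_read_gbps.items():
    print(f"{name:20s} {assets_gb / speed:5.1f}s")
```

Roughly half a minute for SATA versus under ten seconds for the PCIe lot, on paper at least.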

Bits and PCs

Elsewhere, we’re into the individual items of interest. AMD was at CES bigging up its graphics and CPU plans. For the most part there was nothing we haven’t already covered, but AMD did confirm plans to unify all its CPUs and APUs into a single socket. Yay. Goodbye AM3+ and FM2+. Hello AM4.

One nice little item was Razer’s Core, a break-out graphics box for laptops. Having a separate box containing a desktop graphics card, designed to boost the gaming performance of a laptop, is not a new idea.

Is wider compatibility for break-out graphics boxes imminent?

Previously, such boxes had been tied to a specific laptop. The Core’s USP is that it’s compatible with any laptop with a Thunderbolt 3 port. At least, that was the initial buzz, before things were a little muddied by inferences that the Core ‘could be’ instead of ‘will be’ compatible with any Thunderbolt 3 port.

I’m not clear on the official status, but I think generally the direction of these boxes is towards more openness and cross compatibility. Of course, it’s still no go with your knackered two-year-old lappie. But it’s setting things up nicely for the future of laptops that don’t run out of gaming puff quite so quickly, methinks. Whatever the eventual outcome, it’s another CES reveal that’s so fresh out of the oven that pricing hasn’t been set.

I quite like the sound of the new version of the Intel Compute Stick, a PC-in-an-HDMI-dongle thing with a proper Core M processor, and some new quad-core NUC mini PCs are also incoming. But let’s be honest, neither will be gameable.

Who can really put a price on lasers and robots?

And finally… In Win’s ridiculous motorised, smartphone-app controlled transforming H-Tower PC case got a price. $2,399 or roughly £2,000. Silly for a PC case? Yup. But then it has a laser perimeter demarcation system, so frankly all bets are off. As am I. Off, that is. Until next time. Goodbyeee.


54 Comments

  1. Infinitron says:

    Post is a bit long.

  2. dontnormally says:

    (psst, there is no break, whole article appearing on front page)

  3. gritz says:

    How does OLED fit into the current trend of maximizing and synchronizing refresh rates?

    • roothorick says:

      Quite a lot… it totally changes the game.

      OLED transition time comes entirely from capacitance in the circuit — the OLED emitter itself transitions at literally the speed of electron flow (a domain in which nanoseconds is entirely too coarse of a measurement). The arms race to make faster pixels is over — the actual pixels are orders of magnitude faster than the human eye.

      Refresh rate therefore becomes all about the circuitry driving the panel, and here things get a bit complicated. Since the pixels themselves produce the light, the current load of producing that light goes through the circuits that are actually changing those pixels. The ICs that need to handle each refresh need to be beefy enough to handle that current, which means bigger circuits, which means more capacitance…

      I don’t know how the exact numbers work out, but I’d be rather surprised if the adoption of OLED doesn’t bring with it refresh rates upwards of 200Hz becoming the standard.

      • caff says:

        My main concern is the pricing – this is not exactly brand new tech, but it’s still very expensive?

        • Bing_oh says:

          It’s very new tech for large applications like monitors and TV’s. For those applications, it’s been in the experimental stages for awhile but we’re still not talking about common tech, here. I’d expect to see the prices drop over time as it becomes more widespread, but this is really the birth of a new generation of monitors and TV’s we’re seeing right now.

      • Crocobutt says:

        I personally can’t wait for better PC monitors. Ever since CRTs got phased out of the market by inferior LCDs (the only advantages they had were being lightweight and consuming little power) there hasn’t been anything that provides great colors, high refresh rates and no ghosting. Love the fact that OLEDs have really good color reproduction and fast pixel response, alas… their lifespans are still very short. The blue pixels degrade a lot faster than the others. There’s hope for quantum dot displays, which are still at least a decade away from any real application in consumer products.

        • Eleven says:

          OLEDs aren’t entirely free of issues though. The current generation of screens have problems switching on a pixel from its completely-off black state, as the pixel takes tens of microseconds to warm up and start working. This leads to a ‘black smear’ effect where moving black objects appear to have an ugly black shadow following them around, as the pixels can’t quite turn on fast enough.

          You can get around it by never switching the pixels fully off, and showing a dark grey instead of black, but then you reduce the contrast of the image which is otherwise one of the key features of OLED.

          • Crocobutt says:

            That’s a shame.
            Ways to go before we get any decent monitor tech free of most pressing issues. I’m still bitter about CRTs getting completely eliminated from the display ecosystem.

          • Clavus says:

            I remember the black smear from my Oculus Rift (which uses an OLED screen). However, they do appear to have fixed this effect in the consumer version of the Rift, so it’s probably a solvable issue for monitors too.

  4. Nasarius says:

    I hope the curved display thing goes away soon. Aside from anything else, they’re terribly awkward to use in any multi-monitor setup.

    • Gibs says:

      Isn’t the extra-wide format meant to substitute for multi-monitor..?

      • TacticalNuclearPenguin says:

        Extra wide doesn’t necessarily have to mean curved, and indeed the first 21:9 monitors weren’t.

        Curved has a point as long as it’s perfectly tailored to your viewing distance and position relative to display size with the right curve radius to match, but any effort has been rather random so far.

        The even funnier bit is that LCD is not even a good tech for this kind of crap; OLED is a better fit (it can even be made foldable), but I hope this is not an excuse for this thing to grow in popularity.

        • OmNomNom says:

          Curved actually seems to make sense with these larger screens. I was trying out an Acer Predator X34 and it is 34 inches wide; without the curve some of the peripheral screen would look too far away tbh

    • Christo4 says:

      I actually like curved screens…
      I got a simple 27″ one for gaming, but i returned it because of some bad pixels and then i got a 23″ curved samsung and honestly, imo the curve makes it seem better, because the edges of the screen don’t seem to go away from your vision as much as it did on the 27″ one, the curve makes it a lot better.
      Granted, it may also be because i wear glasses or something, but i found it to be interesting. Not a WOW factor, but in the future i plan on getting a curved ultrawide and i’m really curious how it will turn out.

  5. Xzi says:

    “$4,999 list price”

    Funny enough, the Oculus Rift is also using an OLED display. $600 seems pretty reasonable by comparison now.

    • TacticalNuclearPenguin says:

      Some smartphones too, and this only tells us one thing, that there are many ways to make OLEDs now and almost none of them are the real proper deal.

      OLED TVs as well are not that expensive considering the new tech, but any product that is at least semi-professional uses a very different kind of OLED technology that costs 4-5 times more.

      As of now if you get a bargain OLED monitor you’re going to get a piece of crap. Many people here probably remember me as an OLED evangelist, and I still am, but if you don’t want to get shafted you need to wait another couple of years, when there will be less segmentation and more value.

      We’ve waited a decade already; it’s not that much after all.

      • Xzi says:

        I mean, a lot of those smartphones also cost around $600 or more, and they’re much smaller screens. I wouldn’t really consider that bargain-priced, either. I got a 2K IPS panel for $350, and it’s the best monitor I’ve ever owned. So I guess it depends on the brand and whether it’s a product honestly on a good sale or if it’s just cheap.

        • TacticalNuclearPenguin says:

          That’s true, but smartphone pricing can go anywhere and that’s often almost irrespective of the screen used.

          OLED TVs as well are not exactly a “bargain” in the absolute sense of the word, of course, but at panel sizes similar to this Dell they don’t cost anywhere near 5k. At the same time I don’t really trust their current tech, and going back to smartphones, there are various OLED techs that started being used years ago without the price moving from the LCD iPhone ballpark.

          Your 350-buck screen is a real bargain, but it’s an honest one; LCDs are perfectly mature and by now a good sale is a good sale, not necessarily a suspicious one. When OLED is a little more stable, there will be less risk of wild compromises in the various price ranges.

        • OmNomNom says:

          Yeah OLED is nowhere near good enough for the gaming mainstream. Screens are most of the cost of these phone handsets and still would not be nearly good enough for gaming.
          This flagship Galaxy S6 Edge I’m using now still shows horrid movement blur (compared to a gaming monitor) if you scroll a page quickly.
          Also, compared to the price of phones the Dell almost seems like a bargain

      • carewolf says:

        No, it is just hard to make large OLED screens. Making small ones is a solved problem, but making large ones has been a huge one, which is why we are only starting to see them now, and at high prices.

  6. TechnicalBen says:

    I agree with your description of the OLEDs. Quite nice to have the level headed pros and cons laid out fairly.

    I’ve seen some big OLED TVs (LG) and have an OLED phone (Note 3). While both use the cost saving lower subpixel method, both look stunning compared to LCD. Both work wonderfully in any lighting condition, angle and content. While I’ve not seen any burn in yet, it is a worry.

    Though as you say, the current cons, possible missing subpixels and possible burn in, are less a worry on TVs and mobile phones, it’s an absolute no no for monitors. I’d hate to have either happen, so will wait out till the tech is more mature. Did the same with LCD and my current panel, though cheap, has lasted and does a fantastic job. :P

    • Asurmen says:

      That degradation is why I want electroluminescent quantum dots to come out. Two competing standards with the same strengths: consumers benefit.

  7. TimRobbins says:

    I hate being a living room PC gamer sometimes. We’re still waiting for a TV that can handle >60fps inputs, much less stuff like g-sync and oled.

  8. JamesTheNumberless says:

    Getting pretty pleased that this is the year I’ve decided to upgrade my graphics card, get a more modern monitor, upgrade all my drives, and totally avoid anything remotely VR related… I just wish those OLED screens were a bit cheaper :| Well… A lot cheaper actually.

  9. Morte66 says:

    A couple of questions…

    Does extra SSD bandwidth (over SATA rates) actually make an appreciable difference in games?

    And do these non-SATA drives actually deliver better performance to single users, or are they enterprise solutions which get their bandwidth from parallelism and can only deliver it on a server supporting many clients?

    • gunny1993 says:

      For SSDs and games, past a certain point, no; theoretically faster load times but that’s about it.

      And M.2 and PCIe drives give faster read/write times, which is good for application load times and file copying. The best use of it I can think of for the average user is having your OS on it for crazy load times.

    • TacticalNuclearPenguin says:

      It all comes down to what you mean with better gaming performance.

      People are quick to mention load times or Windows boot, let alone linear speeds, but that’s not the real essence of an SSD and especially not what NVMe is about.

      The real thing is IOPS and random performance in general. It is obvious in games that use heavy amounts of texture streaming, as stuttering is severely reduced. It’s not about linear speed only; there’s the need to retrieve various data on the fly and put it there, and while SSDs helped a TON, NVMe is even better.

      To actually answer your question, no, i don’t think an upgrade is needed as of now if you already have a very good SATA one, but prices are already dropping even on the new tech and as such it’s only a good thing that they’re pushing boundaries in such an extreme way. If you don’t have any SSD now, get whichever fits your budget and you’ll be fine, if you can wait however this tech is going to be decently affordable and very very hot in a not so distant future.

      • Geebs says:

        I think any improvement in apparent texture streaming would be due to a combination of the same PCI bus bottleneck being in play either way and purchase-associated bias from the high cost of the SSD.

        16 gigs of system RAM for caching textures, plus a common-or-garden SSD is a much cheaper investment than a swanky PCI SSD for a little while, I reckon.

        • Geebs says:

          (* PCI bus bottleneck to VRAM, that is)

        • Unclepauly says:

          Don’t forget that if you have 16 or more GB of ram you can run a ram disk which is even faster than both options.

    • Person of Interest says:

      In a word: no. You need an exotic SSD for desktop use like you need a 16-core CPU to play retro games.

      I heard Geoff Gasior, former staff writer of The Tech Report, say on a podcast that none of the SSD performance improvements of the past five years have made a noticeable impact on the desktop experience: drives are fast enough that they are no longer the bottleneck for most desktop tasks. Here’s his review comparing Intel’s latest PCI-E SSD with a SATA one from 2009: he measured near-identical load times for everything he tested. File transfers are faster, although sometimes not by as much as you’d think.

      His advice, which I agree with, is to take any money you’ve budgeted towards a premium-performance SSD and instead put it towards getting the biggest SSD you can afford, so you can avoid using a spinning hard disk as much as possible.

      SATA SSD’s have a typical read latency of under a millisecond, and a write latency that doesn’t matter because you should have write-behind caching, so you’re basically always writing to a RAM disk. Any speculation you read about new SSD tech (NVMe, U.2, etc.) improving frame stutter or resource streaming is, to my knowledge, totally unfounded. I’ve read reviews from every English-language tech site I can find, and several non-English ones, and none have measured and reported game-impacting latencies caused by SSDs.

      And why should they? Games are made to (mostly) work on HDD’s, and SSD’s have orders-of-magnitude lower latencies. Of course they can stream game assets easily.

      Similarly, nearly all desktop workloads are sequential-access. The high-end SSD offerings are most useful in servers with highly parallel workloads. That’s why you’ll see hardware enthusiast sites use increasingly contorted metrics to try and differentiate these products, like IOMeter with 64 simultaneous transfers, or disk traces that defeat the normal flushing and cleanup lifecycle by running full-tilt for an hour straight.

      I think Jeremy’s evangelism for this new tech is a consequence of his proximity to the hardware enthusiast scene, and his admirable overall geekiness. But none of it will matter to us in practice until there’s a massive shift in how desktop apps take advantage of the parallel access patterns that the newest SSD’s need for peak performance.

      • alms says:

        @PoI amazing comment, thanks for perspective.

      • TacticalNuclearPenguin says:

        I can agree with that, but let’s also not forget how hard it was to get the idea across to the masses that an SSD could improve streaming-heavy games.

        Now that the idea is finally ingrained, do we need another couple of years to accept NVMe as well?

        I too agree that an SSD is already enough. No one should spend wild money now, but the tech is going forward incredibly fast and the prices are getting better, in a couple of years people with a brand new motherboard would be pretty stupid not to opt for a good M.2 NVME thing if they have to replace their old SSD or even start using their first one.

      • Morte66 says:

        Thank you, POI, that’s pretty much what I suspected.

      • Smoky_the_Bear says:

        “Games are made to (mostly) work on HDD’s, and SSD’s have orders-of-magnitude lower latencies.”

        This is essentially the key when it comes to gaming. If things were noticeably stuttering on a SATA SSD they would be completely unplayable on a HDD. Games aren’t designed that way, the only benefit is load times because after that pretty much everything is on the RAM anyway. Load times on current SSDs are so fast that I honestly do not see the need for it to be faster. Reducing 3 second load times to 2 seconds is not something many people will pay hundreds extra for tbh.

        • Unclepauly says:

          My above post was to Person of Interest; this one’s to Smoky the Bear. SSD’s GREATLY improve gameplay in certain genres of games. If you play MMO’s, it’s a vastly different experience on an SSD vs HDD. I could go on but basically I’ll just say this: any game that is constantly hitting your hard drive for info is going to be improved vastly. Even a heavily modded Skyrim is a different experience.

      • Unclepauly says:

        At 1st I was thinking you meant that SSD’s were overkill until I re-read it. I got you now that you are meaning SSD’s are enough. My question is, since SSD’s have brought up our baseline performance on loading performance, what is now the bottleneck? Is it bus speeds? I can put my ram, cpu, gpu, everything in my system at different speeds yet my game still loads in 10 seconds off the SSD. Why can’t it be instantaneous?

      • OmNomNom says:

        You will occasionally find a multiplayer game where you are at a disadvantage running from an HDD: if rounds start immediately, load time matters for being on an equal footing, and of course streaming new assets helps avoid stutter.

    • Addie says:

      I’ve recently maxed out my RAM to 64GB (big datasets for work) and can run Witcher 3, Fallout 4 etc from a RAM disk. Doesn’t improve load times much compared to the Samsung SSD I was using before; suspect most games spend a lot of time decompressing assets, compiling shaders etc. when loading, which the fastest disk in the world wouldn’t solve. As such, save your money on fast SSDs (unless you’ve a specific read-limited workload that would benefit) and just buy a big SSD instead.

      • OmNomNom says:

        Instead of a ramdisk try something like primocache and heavily cache your reads and writes, helps a lot if you don’t have SSD but have memory to burn

    • Tsumei says:

      To put SSD load times in perspective, we have PCI-E SSD’s for our 3d rendering server park. That’s kind of the one big place where I can imagine you want one.

      So if you render from 3d a lot, I can see an argument for getting one; but for gaming you’re not really going to notice.

  10. mattevansc3 says:

    With the Compute Sticks it’s all going to come down to price. Realistically they are screenless tablets, which would give you a more advanced Steam Link but without the controller interface.

    Pair it up with a Logitech K400 keyboard and you should have a nifty discrete multimedia centre that streams games.

  11. Cassius Clayman says:

    As always, your articles are a great pleasure to read, Mr. Laird. Hope you won’t be off for too long!

  12. Banks says:

    OLED surely is the future but the technology is still not perfect and it will take a while before it gets cheap enough.

    • shoptroll says:

      I’m not sold on VR at all right now, but I can definitely get behind picking up an OLED display in a few years. Even if it costs more it’s guaranteed to get significantly more use than a headset I’m likely only going to want to use for a handful of select activities.

  13. OmNomNom says:

    The 34 inch IPS 100hz Asus screen is basically the same as the Acer X34, same panel. Also has been available for many months now.

    Great aspect ratio but unfortunately still not the best screen for gaming, IPS blur is very evident. I have one under my desk now that I am returning tomorrow. Also the X34 is £1000, every indication is that the Asus model will be even more. Together they are the best gaming ultrawides on the market, but they are nowhere near good enough yet, at least not for me.

  14. Rhodokasaurus says:

    Thanks for the tech updates. I find these really helpful!

  15. Retne says:

    Interesting. Very interesting

    Now I need to figure out how long would be a sensible time before I upgrade.

    I was waiting for USB 3 (check), faster SSDs (check, but perhaps still a bit pricy), knowledge of VR requirements (check, also still a bit pricy, although this is for speculative purposes as I can’t justify a headset).

  16. Unsheep says:

    New tech is always great.
    However the question is how much value you get initially, since it takes time for the gaming industry to adapt. So buying this new stuff fresh at launch seems like a waste of money to me.
    I’d rather wait until I can derive the maximum value from the least investment.

    • OmNomNom says:

      Nice in theory but this never actually happens until tech is obsolete, in which case it is often sold at no profit. If you want to ride the curve you have to take the hit.

  17. Curundir says:

    I find it strange that OLED gets such widespread praise as a superior technology. A few years ago, everyone was concerned about the durability of OLED panels – at the time they didn’t reach 10,000 hours. I can’t seem to find any current numbers on the durability of the newest panels, but I don’t think it would be very far off from 10,000 hours. Now, that might still seem like a whole lot of hours, but in my case, using my screen daily for about 12 hours, I would get 833 days of use before it breaks. That’s only 2 years and 3 months!
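    That back-of-envelope sum does check out, for what it’s worth; a quick Python sketch, taking the 10,000-hour lifetime as the commenter’s assumption rather than a manufacturer spec:

    ```python
    # Panel lifespan at heavy daily use. The 10,000-hour lifetime is
    # an assumption from the comment above, not a quoted spec.

    panel_hours = 10_000
    hours_per_day = 12

    days = panel_hours / hours_per_day   # ~833 days
    years = days / 365.25                # ~2.3 years, i.e. 2 years 3 months
    print(f"{days:.0f} days, about {years:.1f} years")
    ```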

    Also, I’m really sad that SED displays never became a thing. From what I read about them, it could have been the ultimate screen technology. But maybe that’s exactly why they never came to market?

  18. james___uk says:

    The OLED screen is the reason I’m glad to have an original PS Vita over the new one with its LCD