Week in Tech: Bent Screens, Reversible USB, AMD SSDs

By Jeremy Laird on August 21st, 2014 at 9:00 pm.

I've been dreading this moment for some time. But inevitably, inexorably, irresistibly it's happened. LG has announced a curved LCD monitor. Specifically, we're talking 34 inches of bent IPS panel in the super-wide 21:9 form factor that had me gushing like an idiot the other week. Admittedly I haven't seen it first hand. But curved HDTVs are an appalling gimmick conceived to exploit the most base consumerist tendencies. I suspect bent PC monitors will be just as bad. Meanwhile, you might think the requirement for correct orientation of USB connectors upon insertion is hardly the most onerous threat to humanity's collective well-being. But the finalisation of USB Type-C looks set to put an end to it, regardless. Oh, and I have a little – but only a little – more on the Intel Haswell-E uber platform I mentioned last week, FreeSync monitors are said to be coming soon and, whaddya know, AMD is doing SSDs…

Bent monitors, then. I can barely be bothered, but duty-bound, here goes on LG's new 34UC97. In most regards it's just like the other 34-inch panels that have popped up in the past month or two. IPS panel, 21:9 aspect ratio and 3,440 by 1,440 pixels. Noice. But it's a bit curved.

Now I could be wrong. But all my experience to date tells me this is a dubious-going-on-blatantly-cynical development. I say that having seen a flat 34-inch 21:9 panel looking just great. I say that having seen curved HDTVs. And I say that having regularly used a projector at home to generate images far, far larger than 34 inches. On a flat surface. I understand the theoretical benefit of curved screens. But the practical reality is that it’s bollocks.

Arguably they're even sillier for HDTVs than PCs, since there's more chance you'll be sitting in the right spot with a monitor than a telly. But whatever, it's a bit depressing to think of the skilled engineering man-hours that presumably went into developing curved LCDs.

As for pricing, no word as yet but it won’t be cheap given the novel panel tech. Oh, and the inclusion of a Thunderbolt interface. LG’s own new flat 34-inch 21:9 panel was over £800 last time I looked, so this curved thing could easily bust £1,000 and goodness knows how many of your American dollars.

In that context, isn’t reversible USB Type-C also just another gimmick? To moan about the need to orient a USB connector correctly before you shove it in surely rates as a distinctly first-world problem.

But if you've used Apple's similarly reversible Lightning connector, you'll know that not having to inspect either port or connector before plugging in is actually a tangible, worthwhile improvement. If you are frequently plugging peripherals into difficult-to-access ports on the back of a PC or monitor, I'm pretty sure you'll be genuinely grateful for an interface that doesn't care which way you stick things in.

Oh and Type-C will also be compatible with USB 3.1, which bumps bandwidth from 5Gbps to 10Gbps. That’s still only half the 20Gbps offered by Thunderbolt, if you’re counting, but what the hey.

Of course, this is a physical change to the connector, which breaks backwards compatibility unless you use an adapter, which in turn somewhat defeats the object of the whole thing. So, the fact you’ll need a new PC, a new monitor and a new, well, whatever it is you plug things into to get the full benefit of USB Type-C is admittedly a teensy bit of a downside. Still, we can all look forward to fumble-free peripherals over the next five years or so.

Next up, AMD Radeon SSDs. There’s not a great deal to say here. If you didn’t already know, you won’t be surprised to learn AMD isn’t actually making SSDs. Rather it’s rebranding them. They’re OCZ Vector 150 drives. Apparently even the customer support for these drives is being handled by OCZ.

Or should that be Toshiba now that the Japanese giant owns OCZ? AMD Radeon SSD! (By OCZ!) (Via Toshiba!). Convoluted to say the least.

Anyway, pricing at $100 for the 120GB model, $164 for the 240GB and $299 for the 480GB is hardly exceptional, so you might wonder what the point is. As far as I can tell, it’s a simple matter of AMD cashing in on the brand equity of ‘Radeon’. Some people will buy these things simply because they say ‘AMD Radeon’ on the box and that’s a good enough reason to sell them.
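
If you want to put numbers on 'hardly exceptional', the per-gigabyte arithmetic is trivial. A throwaway Python sketch using the launch prices quoted above (US dollars, before tax and shipping):

    # Price per gigabyte for the three launch prices quoted above
    for price_usd, capacity_gb in [(100, 120), (164, 240), (299, 480)]:
        print(f"{capacity_gb}GB: ${price_usd / capacity_gb:.2f}/GB")
    # 120GB: $0.83/GB, 240GB: $0.68/GB, 480GB: $0.62/GB

In other words, roughly in line with ordinary consumer drives of the moment rather than anything aggressive.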

In other news, AMD’s official foghorn Richard Huddy has been interviewised to the effect that FreeSync will be sampling (ie samples will be going out to various interested parties) next month with retail monitors due early in 2015.

Huddy reckons adding FreeSync will only bump up the bill of materials for a monitor by around $10-$20. As ever, however, the retail impact will be a significant multiple of that figure. But they should still be cheaper than the first Nvidia G-Sync enabled screens and the overall effect should be more competition and lower prices all round.

Finally, the chatter around Intel’s new high-end Haswell-E CPU and X99 platform is building, what with leaks, deliddings, motherboard announcements and the rest. Like I said before, normally I’d file this stuff in the pointlessly-expensive bin. But what I’ve seen so far makes Haswell-E look both genuinely exciting and more relevant than recent LGA2011 kit. Remember when an overclocked Core i7-920 D0 stepping was the weapon of choice? Those days look set to return.

The new refreshed LGA2011 platform (yes, Sakkura, LGA2011v3, I hadn’t forgotten last time) and the chips that go with it look like they’ll offer a genuine value proposition over LGA1150 clobber. Death by a thousand lawyers if I say anything more, but I’m genuinely excited by this new kit right now.


65 Comments »

  1. FurryLippedSquid says:

    Ha, a fairly jaded article this week.

    lol

  2. rexx.sabotage says:

    I’m pretty sure you’ll be genuinely grateful for an interface that doesn’t care which way you stick things in.

    Rule #34 of Interwebbing decrees that all liberties must be taken with the context of this statement.

  3. J. Cosmo Cohen says:

    I haven’t seen a curved TV or monitor in real life, but I did play with a curved phone and personally, I found the difference to be incredible. It creates a real sense of depth to what is being viewed, and the colors seemed to pop.

    I don’t know if that will translate well to a much larger screen, but I do know that I thought curved screens were stupid until I used one for myself.

    • Vandelay says:

      The problem with curved TVs that I see (from a complete layman, no-technical-knowledge perspective) is that you surely need to be looking at it from a very specific angle. Much like the way 3D televisions are even more pointless than 3D cinema, as soon as you have more than two people watching the TV some people are going to get a sub-par experience.

      That is less of an issue with a computer monitor where, generally, only one person is going to be viewing it whilst sitting at a desk. Whether the curvey-ness actually makes for a better picture, I have no idea.

      • Koozer says:

        I’ve seen bendy TVs on display in a couple of places (Costco maybe?). Viewing angle didn’t seem to make any difference to the picture. Neither did the bendyness. The experience was exactly as exciting as looking at a normal TV.

      • TacticalNuclearPenguin says:

        Of all panel technologies, IPS is the most resistant to this issue, and that's why it's generally used for color-critical work: with other technologies you'll get unwanted color/gamma shifts at the corners even when viewing straight on, or "black crush" on VA panels when looking straight at them (though the newer AMVA+ panels are getting really good at avoiding both of those, which brings them back to relevancy thanks to their unmatched contrast ratio).

        The only problem might be "IPS glow" being made worse; it again happens at the corners, is sometimes uneven, and can have either a pale tint or a purplish one depending on which sub-category of IPS is used.

        Long story short: all LCDs suck in some way, but they're all we have now (because SED never took off), and even if someone invents the perfect LCD panel there is still the insurmountable problem that it needs to operate with a backlight. The future is in each pixel emitting its own luminance (or the total lack of it when needed) without it being a plasma panel.

        I don’t envision OLEDs and similar technologies being viable in the near future either, sadly.

    • DelrueOfDetroit says:

      The Galaxy Nexus has a curved screen on it, and I do remember it would create an illusion of a 3D effect when you were viewing certain colours. (Vibrant red lines would always seem to pop)

  4. CookPassBabtridge says:

    I wish NVidia would hurry up with those fuggen 880s.

    • TacticalNuclearPenguin says:

      I kind of look forward to them too, but I'll probably try my best to avoid them. The keyword is "try", of course.

      Long story short, I'm a little put off by the fact that they'll pull another 680 < 780: same base architecture, but no longer the crippled, unwanted son.

  5. Chuckleluck says:

    Yes, yes, I know it's a first-world problem, but it's about time we got double-sided USBs. Same thing with batteries – I thought Microsoft was doing something with them in their Xbox controllers, but maybe I misheard.

    And maybe I’m not an SSD freak, but I’m basing my purchase on price and online shopping reviews, not some stupid AMD sticker.

    • Leb says:

      Yes please. I love to tease my wife that my phone is better than her iPhone 5 in every way… and then she sticks in her funky charger that fits in either way and a tear drops from my eye. Freaking USB and USB-OTG.

    • Harlander says:

      Calling it a ‘first-world problem’ seems extraordinarily asinine to me. Of course it is, just like everything this column could possibly talk about.

      Maybe I’m just experiencing a sense-of-humour failure this morning.

  6. goertzenator says:

    I’m a bit near-sighted and the corners of my 27″ monitor are just outside of what is comfortable for me. I am very interested in curved monitors because it would enable me to use a much larger monitor.

    And consider multi-monitor setups: they are almost always “curved”. I want *that* without the bezels.

    • nrvsNRG says:

      My thoughts exactly. From a strictly gaming point of view, this would be a great monitor to have.

    • Sleepy Will says:

      That doesn’t sound like “a bit near sighted”, it sounds like a lot near sighted!!! You could probably solve that with some kind of corrective lens held by some means in front of your eye, the bonus being that you will be able to drive as well!

  7. Koozer says:

    Come back to me when we have hemisphere monitors we can lower over our heads like Darth Vader’s helmet chamber and we’ll talk.

  8. piphil says:

    I agree curved TVs are definitely a gimmick, as a TV is designed to be viewed by different people at different angles. The curve just makes it difficult for people at more oblique angles.

    PC monitors are generally, for gaming anyway, used with a single person viewing. Perhaps a wrap around screen, if used at the right distance, might slightly increase immersion?

    The latest in screen technology will allow for flexible screens that could be delivered rolled up in a tube. This would allow the curve of the screen to be customised for each user on the fly?

    It all comes down to whether or not the screen curve produces a more natural image?

    • TacticalNuclearPenguin says:

      If you want more people to look at it “straight”, then you’d have to curve it the other way around.

      This IS designed for a single person sitting in the right spot. I agree that it still makes little sense, but in my opinion that's because they should simply have targeted triple-monitor setups, and thus made it far wider.

      I don't know about you, but if I were ever to consider gaming on 3 monitors, I'd probably want that setup combined on a single panel, without too many bezels in my way and with a smoother curve rather than a couple of steep angles. It would also be easier to solve the problem of viewing angles if each part of the monitor is curved precisely as needed for you to see each pixel perpendicularly as you move your eyes/head.

      And remember the biggest issue (at least for me): 3 monitors of the same brand and model will never perform identically, not even if you use calibration tools. So yeah, if this monitor makes no sense it's simply because it's targeting the wrong audience.

      • Smoky_the_Bear says:

        I think the point is that monitors are getting bigger and more affordable, I’m sure they would sell something like you described if they could develop it in a cost effective way that didn’t require them to sell it for $10,000. You aren’t targeting the correct audience at that price either because a triple monitor setup can be achieved for <1000, i.e. less than the price of this curved monitor. Making it even bigger (and hence twice as curved because most 3 monitor setups have the outer monitors set at around a 30 degree angle) would make the cost of this thing skyrocket to unaffordable amounts.

  9. Lawful Evil says:

    AMD is still alive? Why?

    • remon says:

      Because it’s the best bang for the buck? Plus they have the strongest GPUs right now, plus some more stuff, but I’m sure that you don’t care, you just want to troll.

      • TacticalNuclearPenguin says:

        Strongest GPUs? I hope you’re still talking about bang-for-buck.

        No, really, PLEASE tell me you’re not talking about Mantle.

        • remon says:

          Well, there’s no need to talk about Mantle. I’m not talking about bang for buck either.

          • TacticalNuclearPenguin says:

            Yeah well, I'm sorry then, I guess there was a misunderstanding, as I never really take dual-GPU cards into account.

          • remon says:

            I didn’t talk about dual gpu either.

        • Sakkura says:

          Radeon R9 295X2 I guess. The Titan Z is slightly slower than 2x GTX 780 Ti (SLI), which in turn is slightly slower than the R9 295X2.

          (yeah yeah, the R9 295X2 is strictly speaking a graphics card, not a GPU)

    • Sakkura says:

      Because they are doing things the PC way, unlike Nvidia who love proprietary everything. AMD also provides better value most of the time, and their existence means we have competition to keep Nvidia from price gouging (unfortunately they’re not providing adequate competition for Intel, which shows in what they’re offering and in their pricing).

      • TacticalNuclearPenguin says:

        Mantle can theoretically be used with Nvidia too, just as PhysX was supposed to be exactly the same case, but ATI never wanted to accept such a thing, for the same reason Nvidia is trying to shrug off Mantle in the hope it dies.

        They're really not that different; it's just that AMD cries a little more about stuff like GameWorks closing them out of optimization, while their flagship Mantle example (BF4), perfectly optimized for AMD and Mantle, still runs worse than it does on a 780 Ti with a proper CPU, a realistic high-end setup.

        Then there’s G-sync. AMD can try all they want to make everyone believe that FreeSync is the same and Nvidia are scumbags as usual, but it is not.

        G-sync has a dedicated monitor module that talks back and forth with the GPU in order to create perfect synchronization without any theoretical latency, while FreeSync simply predicts the right VBLANK timings with a software abstraction (possibly with its own overhead) that is based on a technology that has nothing to do with the needs of gaming, but was actually created for power-saving reasons (avoiding pointless monitor refreshes when they're not needed).

        Then there’s the price of all that work that is rightfully a factor.

        • remon says:

          So, everything AMD does is bad.

        • Sakkura says:

          LOL @ blaming AMD for having PhysX kept proprietary.

          As for Freesync, yes it DOES do the same thing as G-sync. But in Nvidia's defense, they got to market first, and AMD probably wouldn't have pushed Freesync if Nvidia hadn't gotten the ball rolling with G-sync. And AMD's marketing on this has been quite snarky. Firstly, the name Freesync was a snide comment on the proprietary nature of G-sync. Then they renamed it Adaptive Sync, which sounds a lot like an earlier Nvidia feature, Adaptive Vsync.

          As for Mantle, I never mentioned it. But I guess we can get into that too. AMD is basically doing the same thing there as Nvidia did with G-sync – pushing things forward, making everyone else scramble to catch up. And just like with G-sync, Mantle will probably end up somewhat pointless. But only because everyone else got their shit together and implemented matching solutions (in this case, adding many of the same improvements to DirectX 12 and even OpenGL).

          • iniudan says:

            Actually, the Khronos Group has been given free, unrestricted access to the Mantle API and the right to integrate as much of it as they want into the next-generation OpenGL, which is a complete rebuild of the code base and thus will break backward compatibility with previous versions – a good thing anyway, as developers have long complained that OpenGL needs to be sanitized.

            Intel also asked to look at the spec, but they were rebuffed, as AMD has no intention of making the spec public until version 1.0 is released, which is supposed to be by the end of the year, if AMD keeps to its schedule.

          • TacticalNuclearPenguin says:

            I'm not blaming AMD for that, I'm merely pointing out that it was possible for them to support it at GPU level, but they didn't want to depend on Nvidia and they weren't even interested in such a thing anyway.

            Nvidia is using the same approach with Mantle, but I'm actually grateful for that, as they're improving DX performance across the board, which helps the most titles. I'm also thankful to AMD, since they managed to wake up good ol' Microsoft (wishful thinking, I know) and push them to do something worthwhile.

            Either way, G-sync and FreeSync are the same thing only in what they try to achieve; the important bit is HOW they do it. We'll have to see the comparison tests to really get an idea of how much this difference matters, but please also remember that I'm not a G-sync-only fanboy, because I really don't want to buy a gaming monitor just for that.

            While the above troll suggests that I think AMD does everything wrong, I'm merely against their PR crusades and their constant moaning.

            The GameWorks debate was the worst offender, because it's not like they're without issues in various other "neutral" titles. Crap like "Game X has some problems on AMD GPUs" and "Game Y doesn't appreciate AMD CPU cores" is incredibly common across the board, and most of the time it's not even driver related.

            Actually, the talk about crap drivers is getting old, unless we're talking about CrossFire, of course.

    • RaveTurned says:

      In addition to the above – because they struck deals to supply graphics hardware for both next gen consoles, guaranteeing them business for the next ten years or so.

    • Lawful Evil says:

      Very well. I concede.

      I have just been so disappointed in AMD for failing to make anything but weak (though admittedly cheap) CPUs, while they had no problem advertising them as "faster than Intel's" *rollseyes*

  10. Zenicetus says:

    That monitor might not seem as gimmicky for flight sims, or EliteD and Star Citizen, since it somewhat mimics a cockpit windscreen or canopy. Some flight simmers already use three monitors, with the outer two slightly angled, and this gets rid of the bezel lines.

    So it might be cool for flight sims, but I’ll probably throw my money at the consumer Oculus Rift instead, if they can work out the resolution and small text display issues.

  11. Sidewinder says:

    Given that aspect ratios wider than 4:3 are already nothing more than a cheap marketing gimmick expanding on a half-century old cheap marketing gimmick, what’s a little curvature on top?

  12. remon says:

    I think you got it backwards there with the curved monitors. Curvature should be more useful on monitors than on TVs, since you're much closer to one. Because of that, the pixels at the edges of the monitor are proportionally much farther away from your eyes than the pixels in the center. On a TV, since you're sitting 5-15 feet away, the pixels at the edges are at about the same distance as any other pixel.
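
    That distance argument is easy to sanity-check with a bit of school trigonometry. Here's a rough, illustrative Python sketch (the viewing distances and screen widths are assumed example figures, not anything measured):

        import math

        def edge_to_centre_ratio(viewing_distance_cm, screen_width_cm):
            # Straight-line distance from the eye to an edge pixel of a flat screen,
            # relative to the distance to the centre pixel.
            return math.hypot(viewing_distance_cm, screen_width_cm / 2) / viewing_distance_cm

        # Assumed figures: a ~34" ultrawide (~80 cm wide) viewed from 60 cm at a desk,
        # versus a ~55" TV (~122 cm wide) viewed from 3 m across the room.
        print(f"Monitor edge pixels: {edge_to_centre_ratio(60, 80):.0%} of centre distance")   # ~120%
        print(f"TV edge pixels:      {edge_to_centre_ratio(300, 122):.0%} of centre distance") # ~102%

    On those assumed numbers the edge of a flat desk monitor ends up around a fifth further from your eye than the centre, while on a TV across the room the difference is a couple of percent – which is the point being made above.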

  13. sinister agent says:

    Is there a monitor for someone who just wants to be able to see the stuff their computer and/or funsquare is doing? For less than the approximate cost of a tiny child’s kidney?

  14. Sakkura says:

    That AMD/OCZ SSD uses Toshiba NAND anyway, so yeah it definitely has a lot of Toshiba DNA in it.

  15. shaydeeadi says:

    Will we start to see more available 16:10 monitors? Preferably 24 inch? I was struggling to find a 2560×1600 before and would love a 24/25 inch 3840×2400 in a year or 2. Everything is 16:9 (or 21:9 now, which I don't really fancy). Why are they so unpopular?

    • Sakkura says:

      16:10 is practically dead. Sorry.

      • Iainn says:

        Sadly, this is very true. I'm slowly resigning myself to the fact my next monitor will be 16:9. There's no way I can afford a 16:10 monitor with a resolution above 1920×1200, such as the Dell U3014. Such a shame that 16:10 and 4:3 have been pushed to the side by 16:9 when 16:9 is the worst of the 3.

  16. nu1mlock says:

    Apparently the AMD SSDs aren't simply OCZ Vector 150s. Now, I'm absolutely no expert when it comes to this, but they apparently have the "latest NAND flash tech & firmware, only same controller". Don't take my word for it, here's what OCZ said:

    https://twitter.com/nu1mlock/status/502567493928763393

    • Sakkura says:

      The controller is the same (and at the same clocks). The NAND used is slightly different. Both are produced by Toshiba. Both are at 19nm. Both are MLC (2 bits per cell). But the Vector 150 uses the original 19nm NAND, while the R7 SSD uses the A19nm NAND. The firmware has also been altered, if only to accommodate the new NAND.

      So it’s not a perfect clone, but it is a very close relative of the Vector 150.

    • The Dark One says:

      Like Sakkura said, it’s not exactly the Vector 150. In fact, if you look at the market segmentation for the current OCZ lineup, the AMD-branded drive fits in nicely. It has the same controller, but newer NAND. It’s cheaper and offers basically the same performance as the 150, but comes with a stingier warranty (four years at 30GB/day vs five years at 50GB/day).

  17. caff says:

    The whole G-sync vs. Freesync thing really annoys me right now. This is clearly revolutionary display technology whose path to universal adoption is being slowed and hampered by the usual A vs. B technology war. It seems there's a huge lack of interest from mainstream display manufacturers, who are either unwilling to come down on either side of the fence, or feel it only relates to 1440p/4K or above.

    I'm a person who would gladly adopt this technology in budget form – 1080p on a standard 32″ LCD TV would be fine for me.

    The PC would gain a huge advantage in visual fidelity and smoothness over consoles, if there were more monitor hardware choice out there right now. Maybe in a year or two I’ll have a solution that works for me.

  18. cylentstorm says:

    Sure. I’ll just get one for every room in the mansion and even try to squeeze one into the Ferrari. Neat toys, but still impractical.

  19. Stephen Roberts says:

    Regarding plugging in USB cables and other cables “into difficult to access ports on the back of a PC or monitor” I’ve found that a small mirror from, say, a Christmas cracker pretty much solves the awkward access problem. No need for fancy pants cables.

    Regarding SSDs: are they still doing that awful thing of saying they are 120GB but there's only 96GB there when you connect it up? My SSD is '256GB', down to 238. Why don't they just put 238 on the box so I'm not pissed off?

    • The Dark One says:

      Semantics. The drive manufacturers offer you 256 billion bytes. Why doesn't this show up as 256 gigabytes in Windows? Because, while we use metric prefixes in our daily lives, the operating system uses binary prefixes. If you want to be super accurate, you'd call the values you see on your C: drive kibibytes, mebibytes, gibibytes and tebibytes. Divide 256,000,000,000 by 1024 three times and you'll arrive at roughly 238.

      Other space can be lost to file system overhead and hidden boot and recovery partitions.
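
      If you want to check that arithmetic yourself, a couple of lines will do it; a minimal Python sketch (the 256 GB figure is just the example above):

          advertised_bytes = 256_000_000_000        # "256 GB" on the box, counted in decimal gigabytes
          gib = advertised_bytes / 1024 ** 3        # the same bytes counted in binary units (GiB)
          print(f"{gib:.1f} GiB")                   # ~238.4, the figure Windows reports for the drive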

    • redmund says:

      120 to 96? Sounds like something is a bit off there. I have a 120GB Vertex 3 that gives me 111GB of formatted space.

    • TacticalNuclearPenguin says:

      It's not just a problem of semantics; it's a matter of "over-provisioning".

      Some SSDs reserve more space than others in an attempt to improve performance. Mind you, by "performance" I'm not talking about linear speeds, but the overall "agility" of the drive, which is the REAL difference between an SSD and an HDD. Even an SSD bottlenecked by the older SATA2 would still be 20 times more "agile" regardless of sequential read/write speeds, or the HDD's RAID setup and RPM. Electrons instead of moving parts is a big difference.

      You're also improving its lifespan, because the more space you have "reserved", the less you need to delete and rewrite blocks to accommodate new data; instead you can dump the old stuff in its own reserved space, to be trimmed later at no performance cost. Fewer redundant operations means more performance and more reliability.

      If you're really performance-minded, 20% over-provisioning is a sweet spot for "enthusiasts", but it's not really recommended on a small drive, of course. What I mean by this is that software like Samsung Magician lets you choose how much OP you want.

      • The Dark One says:

        No, the over-provisioning is what takes the 128 or 256 GB of NAND chips to the advertised 120 or 240 GB on the packaging. This is a big reason why the lower-capacity drive doesn't have quite the same write performance. It doesn't have quite as much room for caching purposes.
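
        To put rough numbers on that (purely illustrative figures, not the exact spec of any of these drives), a drive built from 256 GiB of raw NAND but sold as "240 GB" keeps back around 14% of its flash for the controller to play with:

            raw_nand_bytes = 256 * 1024 ** 3      # 256 GiB of flash chips on the board
            advertised_bytes = 240 * 1000 ** 3    # 240 decimal GB printed on the box
            reserved = raw_nand_bytes - advertised_bytes
            print(f"Set aside for over-provisioning: {reserved / advertised_bytes:.1%}")  # ~14.5%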

  20. iainl says:

    Since I’ve yet to see anyone do a multi-monitor setup without a slight angle between the screens, a single giant monitor that wide having a curve on it doesn’t seem outlandish, no. I wouldn’t pay a huge premium for the privilege, however.
