Week in Tech: The end of graphics, and other stuff

By Jeremy Laird on February 22nd, 2013 at 5:00 pm.

Bit late with this installment due to a near-death experience with pharyngitis. But that’s actually allowed a few things to shake out into the public domain. And the overall upshot is that PC hardware took an odd turn this week. The launch of a new high end GPU from NVIDIA really only served to confirm that 2013 looks odds-on for slim to no progress in graphics technology. More than that, it’s basically the end of the graphics refresh cycle as we know it. Sounds grim, but it actually means now is a great time to buy a new pixel pumper. Meanwhile, Dell wheels out a crazy 21:9 aspect ultra-wide display and ye olde graphics benchmarks get an update.

When I wrote this the first time around, I was due to attend a conference call with AMD for an update on its graphics kit and in fear of being left bound, gagged and writhing in an NDA dead zone.

As it happened, AMD merely confirmed the rumours that 2013 will be a case of everything staying the same. No new graphics. The soonest we’ll see any new AMD chips is right at the end of the year.


Read my lips: No new AMD graphics

Why it’s happening is an interesting question. AMD was previously expected to unload its next-gen Sea Islands GPUs, otherwise known as the Radeon HD 8000 series, no later than early 2013. Personally I doubt the delay indicates any problems at AMD’s end.

Instead, it very much smells like we’ve moved into a new era for PC graphics. We simply won’t be getting an epic new graphics chip every year. Both sides of the graphics war seem pretty comfortable with this. AMD actually has a very strong GPU line-up currently. It’s super competitive at all points of the market and arguably has its nose in front as regards overall bang for buck.

Somehow, however, NVIDIA seems to be increasing market share of late. The numbers vary according to who’s doing the telling and whether you’re talking all PC graphics or just add-in boards.

But the point is that it looks like NVIDIA is doing just fine and has no plans to squeeze out a real-world replacement for the GeForce 600 series any time soon. AMD’s recent revelation, then, is very likely a response to that.

If you’re wondering about NVIDIA’s new GeForce Titan, that’s easy. It’s the long-awaited desktop version of NVIDIA’s mystical GK110 chip. It’s been on sale for a while as a Tesla board for industrial compute applications and it’s completely bonkers.

Bonkers as in seven billion transistors, or double the number of transistors of the GK104 GPU found in the GeForce GTX 680. It’s easy to be blasé about the march of technology, but just ponder that for a moment. Seven billion.


£800. That is all.

Despite that, it’s not twice as fast as a 680. A fair few of those transistors are expended on compute-related rather than graphics-specific functionality.

Consequently, GK110 and thus Titan doesn’t have double the 1,536 shader count of the 680 and makes do with a mere 2,688 functional shaders. It doesn’t clock as high as a 680, either, so the net result is more in the region of 50 to 60 per cent more performance than a 680.
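
If you want to sanity-check that figure, here’s a minimal back-of-envelope sketch in Python. The shader counts are the ones quoted above; the clock speeds are approximate reference figures and should be treated as assumptions, and theoretical shader throughput (shaders times clock) is only a rough proxy for real-world frame rates.

    # Rough, assumption-laden sketch: theoretical shader throughput = shaders x clock
    gtx680_shaders, gtx680_clock_mhz = 1536, 1006   # GTX 680 (GK104), approx. base clock
    titan_shaders, titan_clock_mhz = 2688, 837      # GeForce Titan (GK110), approx. base clock

    gtx680 = gtx680_shaders * gtx680_clock_mhz
    titan = titan_shaders * titan_clock_mhz
    print(f"Titan vs GTX 680, on paper: {titan / gtx680:.2f}x")
    # ~1.46x theoretical, in the same ballpark as the 50-60 per cent real-world figure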

Still fugging quick, you say? Yup, but you’ll pay for it to the tune of 1,000 bucks. Factor in the VAT man’s pound of flesh in Blighty and you’re looking at over £800.
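
For the curious, the sums work out roughly like this. The sketch below assumes an early-2013 exchange rate and a modest retail uplift; both of those numbers are my assumptions, not quoted figures.

    # Minimal sketch of how the '1,000 bucks' above becomes 'over £800' in the UK
    usd_price = 999              # the US price tag, give or take
    usd_per_gbp = 1.55           # assumed early-2013 exchange rate
    vat = 0.20                   # UK VAT
    retail_uplift = 0.05         # assumed retailer/distributor margin

    gbp = usd_price / usd_per_gbp * (1 + vat) * (1 + retail_uplift)
    print(f"~£{gbp:.0f}")        # lands a little over £800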

In my book, that makes it irrelevant, much as I’m pleased to see GK110 finally make its way into PCs. In fact, it’s not really part of the usual GPU cycle. NVIDIA confirms that the replacement for the GTX 680 won’t be faster than the Titan. It’s not part of the conventional product line up.

And thus the good news that comes out of all this is that now is a great time to buy a new graphics card. You can be confident it won’t be immediately usurped by something with added spangliness. AMD isn’t planning anything soon. GeForce Titan is a one-off.


How wide is too wide?

Meanwhile…Dell has pulled the covers off the Ultrasharp U2913WM. It’s a 29-inch freak of a monitor with a 21:9 aspect ratio and 2,560 by 1,080 pixels.

I’m no fan of super-wide aspects for desktop computing, but the gadget whore in me would love to give it a go in games. Getting review samples out of Dell is a Byzantine process that’s about as much fun as a tax return, so my hopes of getting one in aren’t high.

For the record, it’s an IPS panel with an LED backlight and sports the latest styling vogue in the form of a pseudo no-bezel design. Admittedly, it’s pretty niche, but it’s not actually as expensive as you might think. It can be yours for a whisker under £500.

I’ve got soft spots for displays with crazy specs, and particularly for high-end Dells. Money no object, I’d have one for giggles.


Pointless. But fun.

And finally…Unigine and Futuremark have both rolled out new graphics benchmarks. They’re really anachronisms from a bygone age when benchmark scores generated big internet traffic. Unigine Heaven 4.0 is more a case of a bit of graphical spit and polish than a new benchmark, while Futuremark’s 3DMark is all new and crammed with all manner of shimmery, tessellated visual shizzle.

Pointless gimmickery? Perhaps. But I’ve always enjoyed perving over the true potential of PC hardware, freed from its console shackles.

P.S. After a little delay rounding up the bits, our gaming rig giveaway prizes are now on the way to our lucky winners. Thanks to David Domínguez of Argentina and Peter Bruffell of York for their patience. Hopefully we’ll catch up with them a little later to see just how much their gaming lives have been shaken up. Until then, enjoy the clobber!


109 Comments

  1. jrodman says:

    But which is the quietest?

    • ChromeBallz says:

      In my experience, the NVidia’s are usually the quietest, with the 400 series being an exception. I plan to get an MSI GTX 680 Twin Frozr at some point because of this, my current 6970 is REALLY loud under load.

      • DClark says:

        I find Asus’ Direct CU series and Gigabyte’s Windforce series are really nice GPU coolers regardless of whether they’re cooling an AMD or nVidia graphics core.

      • lijenstina says:

        8800 gt says hi :D

      • rockman29 says:

        Oooo, I want to get a Twin Frozr (or similar) card too…

        I’ve never had a GPU with two fans, I really want to see how quiet one is :)

    • Sic says:

      The 600 series is pretty quiet when you’re not gaming.

      Get the ASUS Direct CU stuff, and you can remove the stock fans (and all the plastic) and replace them with basically anything. I’m running inaudible 120 mm fans on a 670 right now at 600 RPM, and at full load it’s something like 15 degrees C cooler than stock.

      • TheSplund says:

        Very interesting – I’ve got two DirectCUs and would seriously consider that even though they are pretty quiet already.

    • Stochastic says:

      The noise of a GPU will often vary depending on which board partner manufactures it (EVGA, Sapphire, MSI, etc.). I always try and read reviews that measure GPU fan noise for this reason. Typically, cards with dual slot coolers are a little quieter.

    • Strabo says:

      Default cooling is always far too loud, but usually Nvidia is less deafening than AMD with their high-end offerings. Really, though, it is worth paying a few bucks more for a twin or triple fan offering, which is often less than half as noisy – bearable even if you don’t drown yourself in weapons fire.

    • phuzz says:

      The one where you pull the OEM cooler off and replace it with water cooling :)
      Now if only one of my hard drives didn’t keep making the whole case vibrate my computer would be almost silent.

  2. Sakkura says:

    Actually, AMD were only referring to the first three quarters of 2013. It seems likely that they have something new up their sleeve for the Christmas sale.

    • Sami Hamlaoui says:

      The far more likely answer is that AMD are focusing all of their manufacturing efforts on the PS4’s CPU and GPU.

      As for the monitor, meh. It might be wide, but it’s still stuck with 1080 vertical pixels.

      • amateurviking says:

        And the nextbox too, both are running the same APU.

      • Sakkura says:

        AMD doesn’t have manufacturing efforts to focus on, they don’t manufacture anything anymore.
        They’ve been working on improving the HD 7000 series drivers and are in no hurry to swap to something new. Understandable, since the 7000s are still improving and Nvidia isn’t launching a new generation of cards either.

      • wild_quinine says:

        No, I doubt it’s manufacturing that has recentered them. I think it’s likely that the security of that deal has given them some breathing room, though.

      • Jeremy Laird says:

        I think the two console deals basically involve the sale of IP. AMD has no manufacturing capability anymore, that’s for sure. I’m not sure exactly who is putting the APU together, but the AMD bits are essentially off the shelf. Don’t think doing the APU will be taking up that much of AMD’s time, anyway.

    • kwyjibo says:

      Games aren’t really going to move forward graphically until next-gen comes out. Nothing is pushing for a new card.

      AMD are more interested in getting their console design wins out. Nvidia is making more noise about the Tegra than anything for the PC.

      • Sakkura says:

        Games ARE moving forward. Crysis 3 has just been released, and it’s considerably more demanding than Crysis 2.

        • kwyjibo says:

          That’s not games, that’s game. And as it’s not setting the critics alight like the original, it’s not going to give that great a push for top end graphics cards.

          The biggest movement in the graphics space in the last 2 years has been Intel’s incorporation of a graphics chip into their Core range.

      • mutopia says:

        And why should it? Consoles mean that those publishers which have the resources to make Big Graphics are only interested in decade old tech (or years old tech, for the upcoming consoles) and meanwhile indie devs are finding they have more than plenty at their disposal.

        And GPUs have finally started becoming somewhat less power-consuming, too, at least during idle.

        Also, 2012 was the year that stereoscopic 3D died, again, and as a result that doubling of power won’t be needed. Oculus Rift may or may not change that, but even if it succeeds I very much doubt it’ll gain widespread acceptance fast enough for nVidia to change their minds, not least because we’re in a massive economic meltdown (which people tend to forget in these discussions).

  3. Citrus says:

    My 660GTX Ti will serve me well for the next 3-4 years. After 3-4 years, I will upgrade when people are bitching about consoles holding PC gaming back and those “PC developers” who left to make money on new consoles will come back to PC claiming their love for the master race.

    Time to play some shitty overhyped games.

    *installs Crysis 3 and Half Life 2*

    PS: The only reason next year won’t see many GPU releases: AMD/nVidia have deals with console manufacturers to make their shitty consoles look superior for at least a year.

  4. Lord Custard Smingleigh says:

    So we’ve finally run out of graphics. I hope this means that eventually the continual refinement of existing tech drives decent graphics into the realm of the onboard display adapter. Then every PC would be a gaming PC…. BWU-HAHAHAHAHA!

    • Gap Gen says:

      Has anyone told Crysis 3 that there are no graphics left? I guess Crysis 4 will have to be a text adventure. You can count every dust mote, every hair, as exhaustively and tediously recounted in the text.

      • Lord Custard Smingleigh says:

        Are you kidding? Crysis 3 used all the graphics, that’s why there’s none left!

        • Kamos says:

          It has long been prophesied that this day would come. It was our *greed* that led us to use all the graphics. And now there are none for our children.

          • Vorphalack says:

            The downfall of humanity is at hand. Without graphics to power our entertainment, future generations will be reduced to drawing everything by hand. Once they have consumed all the paper, the walls of our houses, the tarmac of our roads and the very rock of the earth’s crust will be covered in scribbles. Eventually, after a few centuries of frenzied, futile animating, the only thing left to draw on will be each other.

            Perhaps millions of years from now, the next inheritors of the Earth will uncover the remains of human civilization, buried between the shale layers, fossilised in ink, and will wonder to themselves what hubris could have led to such an end.

          • Josh W says:

            Chris Crawford will be very happy, games of meaning and process and huge blocks of text!

        • MentatYP says:

          Dare I say there’s a graphics shortage… crysis?

          Yes, I do dare apparently.

      • lokimotive says:

        It’ll be like a Robbe-Grillet novel.

    • czerro says:

      It’s crazy, my aunt made 8,375 graphics last month selling pixels from her home!

  5. Njordsk says:

    Is that desert ingame? Looks more real than reality !

  6. golem09 says:

    This will only be relevant for me once oculus comes out.
    I wonder if there will even be a way to get those stereoscopic 60fps out of any hardware.
    Maybe with two graphic cards, one for each eye?

    • GettCouped says:

      Indeed, here’s to hoping the Oculus train keeps on moving.

    • Low Life says:

      Oculus Rift (at least the dev kit) has such a low resolution that this shouldn’t be a problem at all. The way OR works is that it displays the image for each eye on their half of the panel, so you don’t need to double the FPS (but naturally you only get half the resolution). Besides, 120 FPS (60 FPS stereoscopic) isn’t really that hard to reach in most games with a decent computer, especially if you’re willing to drop from max settings, and people have been playing games like this for years.
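
      (A quick sketch of the arithmetic in that comment, in Python. The 1,280 by 800 dev-kit panel size is the widely reported figure and is an assumption here, not something stated above.)

          # Each eye gets half the panel horizontally, so one rendered frame
          # already contains both eye views - no need to double the frame rate.
          panel_w, panel_h = 1280, 800          # assumed Rift dev-kit panel
          eye_w, eye_h = panel_w // 2, panel_h
          print(f"Per-eye view: {eye_w}x{eye_h}")
          print(f"Pixels per frame: {panel_w * panel_h} (same as one flat 1280x800 frame)")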

  7. Brun says:

    Interesting that AMD isn’t focusing more on its GPU business as their video cards are the one product that is arguably competitive with those of their rivals in the same space (their CPUs are getting bulldozed by Intel at the moment).

    • meatshit says:

      Maybe they’ll funnel some of that R&D money into better driver support.

    • Joshua says:

      Actually, their lower-end processors are doing rather well now (piledriver is very decent, although it can’t beat the i5 at the high end).

    • Wedge says:

      Well, I think they’re just confident they can continue to sell what they have, which, being dead-on competitive with what nVidia has right now, is a fair assessment. Just keep doing driver updates and promotional deals and they should be fine. It’s not as if there are going to be any games coming out this year that will likely push a need for new GPUs anyway. May as well just wait until the new consoles hit, so you can easily tout a still-superior experience over them.

    • godwin says:

      “No new graphics” may be a bit misleading; there probably will be new product releases, just no new GPUs:
      http://www.anandtech.com/show/6751/amd-reiterates-2013-gpu-plans-sea-islands-beyond

      A good thing anyhow, I think, if the companies were to move to a biennial schedule and we see larger improvements per generation.

      Timely too, given that my PC had just stopped working and the 9600GT really is showing its age with the games these days. I’ve been researching a lot to prepare for an upgrade, and I’m actually for the first time set on getting an AMD card; would anyone recommend against a HD7870? Specifically the TwinFrozr from MSI. My only concerns would be driver reliability/compatibility, and the apparent financial troubles that AMD is facing – inconsequential and overstated?

      • PostieDoc says:

        I have the 7970 and have had no driver issues to date.
        I don’t think AMD are anywhere near as bad with their drivers as they used to be, the rep has just stuck with some people.
        The 7870 is a very nice card but an overclocked 7850 is fine if you are counting pennies.

  8. Shantara says:

    “Unigine Heaven 4.0 is more a case of a bit of graphical spit and polish than a new benchmark”

    They also released Unigine Valley, which is an all-new benchmark.

  9. Eddy9000 says:

    might see if my workhorse AMD 5770 can keep up at all with the stock coming out with the new console generation and wait for a price drop with the end of year announcements that sounds to be on the cards. I hope it can, I mean I earn enough to get a better card right now, but my 5770 has lasted so well I’m loath to retire it even though it isn’t the best, like my winged spear in Dark Souls. It doesn’t sound like my PC’s i5 is going to be over-taxed this generation either. Be nice to spend the saved money on a decent widescreen HDTV to run the whole thing through, finally give steam’s big picture a decent test out.

    • Inverselaw says:

      I too am rocking the 5770 and what a wonderful card it is. I will be sad the day that I replace it, but it does feel like this year is the year to do it. Planetside 2 is the only game I don’t play on max settings, which is pretty impressive for such an old card.

  10. mikmanner says:

    Does anyone think that getting a 680 to replace my 570 would be a worthwhile upgrade? I’m going to replace my 1st gen i7 with a 3rd gen one from last year but keep my other components the same, ram, ssd etc

    • Wedge says:

      First gen i7s use a different socket than the second and third gen models, so good luck with that. Also neither upgrade is notably worth it unless you have some kind of super high res monitor/multi monitor setup. Then again you must have a good amount of excess money to even be considering a 680.

      • mikmanner says:

        Yeah I’d upgrade the mobo too. I’m running a Hazro 2560 x 1440 monitor, but with my current set-up I can only get 60fps in Crysis 3 at medium settings -sounds stupid because it’s not even worth it from a gameplay perspective – but I have some extra dosh this month and I want to spank it like the responsible man I am.

        • Wedge says:

          Yeah, for running at that res it could be worth it, though I don’t recall the 680 being at all worth the price premium it has over a nice 670, especially if it’s OC’ed a little.

          • mikmanner says:

            I’m somewhat okay with overclocking my CPU but GPU makes me nervous, considering my 570 runs at 80 something degrees most the time.

        • theleif says:

          I seem to remember that when running games on high resolutions, one should go for the AMD 7970 series, as they handle the higher resolution better. You should check it out anyway before buying.
          I’m very happy with my HIS Radeon 7970 Ghz. Not only very fast, it’s also very silent even under high load.
          Cheers

    • Brun says:

      My guess is no. I have a first-gen i7 (overclocked by nearly 100%) and I haven’t had the slightest desire to upgrade it. The same goes for my 480 – I had considered a 680 or similar 6-series card (mainly for power and noise reduction) but since I use headphones noise is not terribly bothersome for me. The 480 – now a 3-year-old card – is more than capable of running modern games at 1080p. That may change once the 8th generation consoles hit, but for the moment it’s more than acceptable.

    • Low Life says:

      Here’s some benchmarks of Crysis 3 on a variety of graphics cards: http://www.guru3d.com/articles_pages/crysis_3_graphics_performance_review_benchmark,6.html

      (Following assumes 2560×1600 resolution)
      They don’t have the 570, but the 580 gets 28 FPS and the 480 is at 24 FPS, so I’d assume the 570 would be somewhere in between. The 680 is at 42 FPS, so it is a fairly substantial (over 50%) increase in performance.

      The 7970 GHz edition, however, seems to have equal performance to the 680, as AMD seems to scale a bit better at higher resolutions. I don’t know the global pricing, but around here you can get it for ~50 euros cheaper than the 680, so unless you for some reason hate AMD it’s definitely worth considering.

  11. Feferuco says:

    I put off buying a new graphics card right now because of next gen consoles. My GTS 250 died, I wanted to hold on to it till next year maybe and get something new. But it died so I thought “hey I’ll get a new one right now then”.

    Then I checked the rumored, but accurate, specs for the PS4 and potentially accurate specs for the Durango. I realized a 7850 just wouldn’t do it, and anything more powerful than that was just too expensive for me.

    Is it really a good idea to get a new graphics card during a time of console transition? Isn’t it better to just wait it out, and get a cheaper GPU that’s guaranteed significantly more powerful than next gen consoles?

    I settled for the significantly cheaper 7750 and then in the future I get something for about the same price as the 7750 is now, instead of getting something that’s twice its price but that won’t handle whatever’s coming next.

    • Reapy says:

      I’m teetering on the edge of a pc upgrade, I felt like I just got this last one, but…well looks like I got the ‘best bang for the buck’ pc about 3.5 years ago. I was thinking of holding off until mid/late year, obviously the longer you go the better, but basically it’ll be time when I want to play a new game and it just doesn’t handle it.

      Maybe I’ll actually start saving now so I can go nuts and get a nice new monitor to boot without feeling too guilty. Ahhh, dangerous hardware.

      • Feferuco says:

        I’d save in your situation. Whatever comes out this year your PC will surely handle it because consoles will have to handle it. I say there’ll still be some good current gen releases even in 2014. Then it is better to later get something that will be considerably better than consoles.

  12. smoke.tetsu says:

    21:9 is about as wide as the cinema. So one can watch 2.39:1 or 2.40:1 with almost no letterboxing if any. Of course the tradeoff is pillarboxing for 16:9 and thicker pillarboxing for 4:3 or inbetween. But at least all videos would be the same height pretty much. Except for cinerama movies of course.
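
    (For anyone who wants the numbers behind that, here’s a small Python sketch using the 2,560 by 1,080 panel from the article; the content aspect ratios are just the usual standards.)

        # How different content fits on a 2560x1080 (21:9) panel
        panel_w, panel_h = 2560, 1080

        def fitted(aspect):
            # Fit content inside the panel while preserving its aspect ratio
            w = min(panel_w, round(panel_h * aspect))
            h = min(panel_h, round(panel_w / aspect))
            return w, h

        for name, aspect in [("2.39:1 film", 2.39), ("16:9 video", 16 / 9), ("4:3", 4 / 3)]:
            w, h = fitted(aspect)
            print(f"{name}: {w}x{h}, pillarbox {panel_w - w} px, letterbox {panel_h - h} px")
        # 2.39:1 gets a sliver of letterbox; 16:9 and 4:3 get progressively thicker pillars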

    • Reapy says:

      Is 16:10 dead? I’m still clinging to my 16:10 monitor, but is the world 16:9 now?

      • Ragnar says:

        Yeah, pretty much. All the new displays are 16:9, 21:9, and, in the case of the iPad, a bizarre throwback to 4:3. Maybe the iPad 8 will make the move to 16:10 and make it relevant again?

    • Randomer says:

      21:9 – What the hell is wrong with monitor manufacturers? Let’s just keep making our monitors wider and wider and shorter and shorter. Fast forward ten years and they’ll all be the shape of metersticks.

      In a world where websites and text files scroll vertically, am I the only one who thinks having a taller monitor might be useful?

      • Phantoon says:

        My monitor can be rotated 90 degrees for this.

      • Baines says:

        Maybe they’ve decided that the rise of dual screen setups means that what people really want are super-wide displays.

        Or they’ve decided that the next big thing will be watching two separate 4:3 ratio programs simultaneously.

      • Ragnar says:

        I guess it doesn’t help that I have 3 16:9 displays on my desk?

        Honestly, though, text scrolls vertically, but there’s no reason to display more than a page on the screen vertically, since, as you said, you can scroll. There is a lot of use for having multiple pages open at the same time, which means you need to fit the width of the page on a screen while still having it legible. 16:9 certainly lets you do it at 1080p, but 21:9 gives you even more.

        And for games, since they’re all Hor+, wider monitors (or more monitors) gives your peripheral vision something to look at, which lets you see more of the world and makes for more immersive gaming.

        • GravitySmacked says:

          I picked one of these up last weekend and it’s amazing for games; Planetside 2, BF3 etc have never looked so good. Having the extra width really adds to the gaming experience; recommended.

  13. Shooop says:

    Good. GOOD.

    We don’t need a new video card every year. Not when they’re just marginal leapfrogs over each other.

    Taking more time to do something really revolutionary would mean a lot more to the tech world and its consumers than just cranking out minimally tweaked products every year like cars.

  14. Duke Nukem says:

    If my customs office allows me to get the parts I’m going to see a big upgrade with the 670, since I’m currently running on a 6770 and that’s more than a bit old.

    By the way, thanks Jeremy!

  15. roryok says:

    Factor in the VAT man’s pound of flesh in Blighty and you’re looking at over £800.

    They should have called it the Gorilla

    Right?

    right?

  16. Gap Gen says:

    I guess at some point we’ll hit a limit where screen resolution, even huge screens, isn’t worth expanding on. The latest Apple machines claim to have reached that limit, I guess.

  17. Innovacious says:

    I still prefer to stick to 16:10. At least until this monitor dies. High resolution (see: higher than “full” HD) 16:10 monitors seem to be so much more expensive than 16:9 these days.

  18. Megakoresh says:

    This is very much correct. I think it’s going to be more about optimization rather than raw race for power from now on.

    And nothing wrong with that, by the way.

  19. Velko says:

    Ah, good. Just a couple of weeks ago I replaced my entire rig with a new one, and in the process jumped from a 5750 to a 7950. Of course it feels absolutely wonderful, but I’ve been a little worried that I upgraded at the wrong moment. Now it seems that waiting a few months wouldn’t have done anything.

    • Sakkura says:

      Well if you’re sad because of your lack of buyer’s remorse, I could always tell you about the 7870 XT that almost matches the 7950 but is quite a bit cheaper.

  20. Hoaxfish says:

    There was a fairly neat article by PC Gamer about how to build a rough equivalent of the announced PS4: http://www.pcgamer.com/2013/02/21/pc-gamer-vs-playstation-4-theres-only-ever-going-to-be-one-winner-right/

    and for expensive laptops; The Google “Pixel” Chromebook. Over £1000/$1000 in price for a cloudy laptop. The main point is the touch-enabled, high pixel-density 3:2 screen: http://www.theverge.com/2013/2/21/4013932/chromebook-pixel-hands-on-video-and-impressions

    • Baines says:

      The PC Gamer article seems to cheat a bit.

      He doesn’t like the PS4’s 8-core processor, so he goes with a 6-core that he does like. He describes the GDDR5 RAM as overkill, so he goes with DDR3. He completely excludes the Blu-ray drive because “optical media’s dead, man.”

      Then there is that magical “The rest – $50”. This seems to pop up in articles where the writer has already decided their price point, or just doesn’t want to drive it higher by getting into details. All that work on getting quality parts, and he skimps out at the end? I don’t recall the last time I saw a specific case mentioned in a “build a PC” article that cost $50, much less a case and power supply and all the other combined cheap bits and bobs. And that’s without even getting into a controller. If you claim that you wouldn’t want a gamepad, then you need to include keyboard and mouse as a PC alternative. Even getting equipment that no serious PC builder would use even in a budget game machine, you aren’t getting all of it for $50.

      EDIT: I see some of the commenters at PC Gamer had the same response as I did, that the guy writing the article hid around $250 in additional costs with his “The rest – $50” section. Enough to put his comparative PC build at double the cost of a PS4.

      • Strabo says:

        GDDR5 is only useful for the GPU; the CPU doesn’t need it at all, since it isn’t limited by memory bandwidth (because it drowns in super-fast cache anyway). It’s more useful for integrated graphics, since then you have something that can profit from the increased memory bandwidth (the GPU) – but he uses a dedicated GPU anyway.
        The Jaguar 8-core is the equivalent of two Atom Z2670s (maybe a bit faster according to AMD), a tablet-class CPU, the ones which make Windows run like shit on Windows 8 Pro tablets (but they have an equal battery life to ARM CPUs and are a chunk faster to boot). Even the lowest and slowest i3 dual-core blows it out of the water. The FX-6300 is several times faster than what a Jaguar 8-core would manage.
        Bringing a hypothetical 8-core FX-6300 or i7 to the comparison would be like comparing ye olde dual-core Celeron with P4-Netburst-architecture from 2005 to a 2012 i3 3200 (performance-wise. I’m sure the Jaguar is a lot more efficient with its 25-Watt-TDP design).

        • Baines says:

          I wouldn’t complain about the processor and RAM bits on their own (and the RAM was more an issue of not properly explaining why the change was made, which underplays the difference). But they come combined with the other voiced (Blu-ray player) and silent (controller, possible other bits) omissions, and with a dismissive “$50 will cover everything else” claim that hides items that will cost quite a bit more than $50.

          I cannot help but feel the article writer wanted to assemble a better PC for about the same price as the PS4, but when he saw his PC price rising, he decided to fudge his purchases. I simply cannot see a PC builder just dismissively saying $50 will cover a case, PSU, and everything else, particularly not when PC builders so often argue against spending that little on a case+PSU even for a budget build. (And it isn’t fair even as a theoretical price comparison.)

  21. biz says:

    why are there no actual sources in this ‘article’?

    all the old news indicates gtx 700s early this year… no recent announcement probably means a delay, but why is it unlikely they come out this year?

  22. Smuckers says:

    So, I built my first pc last summer, and I love it! This is my first jump back into pc gaming since brood war came out, so I’ve been playing a ton of catch-up. However, I do have a question for RPS readers. When should I get a new video card, and what should I be looking at? I’m currently running a non overclocked 560 with a gig of RAM (on the card not the system), and while it’s a fine card, it can’t really hang with the best out there right now. My board also supports crossfire, but not sli, and I’ve been wary of AMD simply because I’ve heard about issues with drivers and such. So my question is should I think about upgrading this year? Or, should I wait for the next-gen graphics cards, and is crossfire worth it to switch to AMD? Thoughts?

    edit: I should mention i’m not trying to start an AMD v. Nvidia flame war, just curious about what people think.

    • PostieDoc says:

      I would personally choose a powerful single card over crossfire.
      AMD are much better with their driver support than they used to be.
      Depending on your budget the 660Ti or 7850 are very good cards or a 670 or 7950/ 7970.

    • All is Well says:

      PostieDoc is absolutely correct in their recommendations. I bought a HD7950 just a couple of months ago (coming from a GTX 460) and I’m very pleased with it.
      What you should keep in mind when buying a card is A) What performance you actually want/need, B) how much money you are willing to spend, and C) What card gives you the required performance at the lowest price.

      As for the first point, this is a very rough approximation of how the current cards relate to each other, performance-wise:
      GTX 660 = HD 7850 < HD 7870 < GTX 660 Ti < HD 7950 < GTX 670 < HD 7970 < GTX 680.

    • Smuckers says:

      Cool, thanks for the recommendations guys. I think i’ll probably hold off for a little while at least, and see if nvidia decides to release a 700 line of cards. Plus, I still need to save up some cash till I can make a purchase.

  23. Desmolas says:

    Good news I say. As far as I was concerned, my next GPU would definitely have been an nVidia chip. I’ve been getting quite sick of AMD’s janky driver releases for the past few years. So it’s good to see AMD is focusing more on their software rather than the hardware – which has always been top notch frankly and held back by its drivers. I’ll reconsider my stance come winter 2013.

  24. Delusibeta says:

    Judging by the lack of WIDEFACE posts, I guess the THINGFACE and WARSOMETHING memes have died a death.

    Good.

  25. newprince says:

    I still have my HD 6950 2GB, and I’m sort of sad now that I didn’t wait for the 7950. But honestly, I have no problems matched with an i5 2500 and 8GB RAM. So basically it just means I can wait some more time for a system upgrade.

    My only concern is this will be a self-fulfilling prophecy for manufacturers regarding the PC market. There’s no compelling reasons to upgrade because there’s nothing compelling being released.

  26. Boarnoah says:

    Where are all the giveaways/contests taking place anyway? I seem to miss all of them =(

  27. Bob says:

    No new graphics technology for this year is a kinda blessing for me as my Gigabyte 660 ti (2gigs) will have to do me for at least two years anyway. So I think it should handle most, if not all, of the games released in the foreseeable future. I could do with a better cooler for my CPU though.

  28. DClark says:

    Scanned the replies and didn’t see any mention of it, so I figured I’d note that AMD will be releasing a new card, the Radeon 7790, sometime in April. It should slot in between the 7770 and 7850 in terms of performance.

    While it’s not earth-shattering news or anything, it should be a nice performer for the budget gamer at a reasonable price point.

    • Demontain says:

      From what I’ve gathered from the articles I read so far, AMD will release something new by the end of the year. So far we don’t know how many GPUs or in what segment they fall into. Until then they’ll keep supporting their 7000 series with driver updates and all that fancy stuff.

      For now I’m still rocking an HD6850 and I’ll most likely just wait for the new HDx800 successor before upgrading it.

  29. Jambe says:

    wrt display aspect ratio: 16:10 (e.g. 1920×1200, 2560×1600) is closest to the Golden Ratio and is, to my mind, the most aesthetically pleasing aspect ratio (although other aspect ratios may lend themselves to specific aesthetic concerns, e.g. wider for awe-inspiring vistas, narrower for close-up faces or tall shots).
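
    (A tiny Python sketch of that golden-ratio point; the ratios below are just the common display standards.)

        # How close common aspect ratios come to the golden ratio (~1.618)
        golden = (1 + 5 ** 0.5) / 2
        ratios = {"16:10": 16 / 10, "16:9": 16 / 9, "21:9": 21 / 9, "4:3": 4 / 3}
        for name, r in sorted(ratios.items(), key=lambda kv: abs(kv[1] - golden)):
            print(f"{name}: {r:.3f} (off by {abs(r - golden):.3f})")
        # 16:10 (1.600) is comfortably the closest of the bunch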

    • Strabo says:

      True, 16:10 is the ideal and my choice too, especially for office work (that’s why I write this on a Macbook Pro instead of a Windows Notebook). However, 16:9 is cheaper to produce because it has fewer pixels for the same size and allows for more cutting waste. And as always in life, cheap beats better for most people.

      • jrodman says:

        Really, for office work? Mostly I work on documents, whether letters, emails, source codes.. and all of those seem much better in 4:3 or even a 16:x turned sideways.

  30. Rian Snuff says:

    This is great news to me. I won’t be getting any calls anytime soon from the people I’ve built budget gaming rigs for, and my -promises- that those machines will continue to deliver for as long as I assumed they would will only be extended beyond my reasonable predictions really.

    Sub 400$ rigs that are playing Skyrim at 50+ fps on mainly high settings..
    Yes, I’m cool with this. It’s only going to help the PC gaming industry grow.

    I do get giddy when a game really pushes the boundaries and makes my computer cry.
    But I also appreciate it when devs actually make use of the current hardware we have.
    The -leap- between new generations of cards is very pathetic really; it’s counter-productive.

    My dual 6950s (after-market cooling, unlocked to 6970s and overclocked as high as possible and still running stable as can be this whole time) have been going STRONG for nearly three years now and I’m still enjoying 95% of games on the highest settings possible. Shy a few with terrible coding/support or still awaiting driver updates. With this news here (not that I couldn’t already assume, knowing the next gen of consoles can’t much compare) these beasts will easily be useful for years to come. That’s really amazing to me and I’m proud the days of “upgrading every year durrr” are very obviously just a myth.

  31. D3xter says:

    I still got my trusty 460GTX and I’m waiting for NVIDIA to roll out their 770GTX series to upgrade and get ready for the “Next Generation” of consoles, the Oculus Rift and 4K resolutions, as well as play some of the graphics-intensive games I skipped over like Metro 2033 or Crysis 3 because I wanted to upgrade first. 760GTX might also be fine but those are usually delayed; I expect something for July/August or so.

  32. SuicideKing says:

    1) Sea Islands is their Southern Islands refresh, which has made its way into laptops.
    2) AMD’s right in the fact that the 8000 series hasn’t been announced yet, so technically they’re not delayed.
    3) The reason you’re not seeing new cards isn’t because PC graphics is slowing down or whatever, it’s just that there’s not much point before the new consoles come out (because there aren’t many AAA titles that require stronger GPUs than the current gen). Nvidia’s Maxwell GPUs are due to be out next year, and rumours have been flying about a holiday season launch. I’m willing to bet we’ll see a paper launch by the end of this year, with general availability in January.
    4) The GTX 680 should cost Nvidia less than the GTX 560 Ti, so you can see what’s happening there to Nvidia’s profits (going up). Now Nvidia knows it can get away with selling a x104 GPU at x110 prices, so I guess they’ll stick to that trend, though they might also NOT do that (if AMD manages to deliver Titan’s performance at half the price), and well, they should be able to deliver the 680’s performance for the 560’s launch price. I mean, they can, they’re just not.

    AMD’s APU strategy is paying off, though, so I won’t be surprised if they decide to merge the FX brand with the Radeon brand in the coming year or two. The FX-HD8970, yeah baby.
