  1. #1
    Network Hub
    Join Date
    May 2013
    Posts
    116

    Nvidia GTX 750 Ti - a good surprise from Nvidia.

    Only 60 W! And not a bad performer for an entry-level gaming card. I've been with AMD these last few years due to their much better power usage, but this new board from Nvidia has caught my interest. A couple of reviews here:

    http://www.tomshardware.com/reviews/...view,3750.html

    http://www.anandtech.com/show/7764/t...review-maxwell

  2. #2
    Secondary Hivemind Nexus Sakkura's Avatar
    Join Date
    Jul 2012
    Location
    Denmark
    Posts
    1,420
    Pricing is a bit meh, but the performance per watt is excellent. Everything you could hope for from a new architecture.

    By the way, Kepler was/is also slightly more efficient than GCN, at the cost of compute performance.

  3. #3
    Secondary Hivemind Nexus Heliocentric's Avatar
    Join Date
    Jun 2011
    Posts
    10,285
    All media centre PCs are going to be cooler, quieter and cheaper to run if this delivers all it promises. Of course, if Intel manage to make graphics cards redundant with that CPU I can't remember the name of, then this is all going to be irrelevant.
    I'm failing to write a blog, specifically about playing games the wrong way
    http://playingitwrong.wordpress.com/

  4. #4
    Network Hub
    Join Date
    Jun 2011
    Posts
    455
    But HTPCs haven't needed a separate GPU since the first-gen i3s (I know because I use one), and the modern AMD APUs are many times more powerful than that. I guess if you want to play some undemanding games on your HTPC, but nothing that would require a good GPU, then this might be useful; otherwise, meh.

    For a desktop card I don't really care how much power it draws, only that it has 'all the flops', which this most definitely does not. Some of the power efficiency also comes from sharing texture units between more shader units, so it's not entirely clear that this architecture will scale up to a top-class GPU, though with modern shader-heavy engines that might not be too bad a bottleneck. Only time will tell.
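    For a rough sense of the flops gap: a back-of-the-envelope peak-throughput sketch in Python, using the published core counts and boost clocks (and the usual convention that one fused multiply-add counts as two FLOPs). These are theoretical ceilings, not game performance:

        # Theoretical single-precision peak: cores * 2 ops/clock (FMA) * clock.
        def peak_gflops(cuda_cores, clock_mhz):
            """Peak single-precision GFLOPS, counting 1 FMA as 2 FLOPs."""
            return cuda_cores * 2 * clock_mhz / 1000.0

        cards = {
            "GTX 750 Ti (Maxwell)": (640, 1085),   # 5 SMMs x 128 cores
            "GTX 780 Ti (Kepler)":  (2880, 928),   # 15 SMXs x 192 cores
        }

        for name, (cores, mhz) in cards.items():
            print("%s: ~%.0f GFLOPS" % (name, peak_gflops(cores, mhz)))

    That works out to roughly 1.4 TFLOPS against 5.3 TFLOPS, i.e. about a quarter of the peak throughput of the big Kepler card.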

  5. #5
    Lesser Hivemind Node Bobtree's Avatar
    Join Date
    Sep 2011
    Posts
    618
    Quote Originally Posted by Heliocentric View Post
    if Intel manage to make graphics cards redundant with that CPU I can't remember the name of
    They won't. Larrabee as a GPU alternative was cancelled in 2010.

  6. #6
    Secondary Hivemind Nexus rockman29's Avatar
    Join Date
    Jul 2013
    Posts
    1,975
    I remember the days when so many people were trumpeting Larrabee as the be-all and end-all solution for high-performance graphics...

  7. #7
    Secondary Hivemind Nexus Grizzly's Avatar
    Join Date
    Jun 2011
    Location
    The Bishopric of Utrecht
    Posts
    2,176
    I really like this GTX 750 Ti thing. Instead of continually hearing people lament how expensive gaming PCs are, we can just tell them to get a GTX 750 Ti and slot it into an office computer (which are often i3s to start with, and reasonably gaming-capable) and presto, instant gaming PC!

  8. #8
    Network Hub
    Join Date
    Dec 2013
    Posts
    340
    Halving power consumption in a single generation is pretty much a tacit admission that efficiency was absolutely nowhere on their priority list before this. And why would it be, when increasing efficiency translates to lower PSU requirements and less beastly cooling solutions, i.e. smaller e-peens all round. Gamers really are the Harley Davidson riders of the PC world.

  9. #9
    Network Hub
    Join Date
    Jun 2011
    Posts
    178
    I'm pretty solidly an AMD guy, since they tend to be a bit more open with their tech than nVidia has ever been (look at nVidia's treatment of PhysX versus AMD's handling of stuff like TressFX and Mantle). Unfortunately, AMD cards don't work with GPU acceleration in Blender's Cycles renderer, so I'm very much intrigued by the possibility of throwing in a Maxwell-based card purely as a render card once CUDA 5.0 support gets completely ironed out (and maybe using the hack that lets you use PhysX when there's an AMD card in the system).
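    In case it helps anyone with the same plan: switching Cycles onto the CUDA card is scriptable through Blender's Python API. A minimal sketch using the 2.7-era property names (these have moved between Blender releases, so treat the exact paths as an assumption and check your build):

        import bpy

        # Use CUDA as the compute backend (2.7-era property location;
        # later builds move this under the Cycles add-on preferences).
        bpy.context.user_preferences.system.compute_device_type = 'CUDA'

        # Render this scene on the GPU instead of the CPU.
        bpy.context.scene.cycles.device = 'GPU'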

  10. #10
    Secondary Hivemind Nexus Sakkura's Avatar
    Join Date
    Jul 2012
    Location
    Denmark
    Posts
    1,420
    Quote Originally Posted by Lethe View Post
    Halving power consumption in a single generation is pretty much a tacit admission that efficiency was absolutely nowhere on their priority list before this. And why would it be, when increasing efficiency translates to lower PSU requirements and less beastly cooling solutions, i.e. smaller e-peens all round. Gamers really are the Harley Davidson riders of the PC world.
    Bullshit. Go compare the GTX 480 with the GTX 680. The latter has much better performance and uses significantly less power. Doubled efficiency isn't far off. That's what you can get with a new architecture.

    The problems with the GTX 480 specifically seem to be the reason Nvidia started to really focus on efficiency.
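    Quick numbers, using the published TDPs (250 W for the GTX 480, 195 W for the GTX 680) and a ballpark ~1.5x average performance uplift from reviews of the time (that uplift figure is my assumption, not a measured average):

        # Rough perf-per-watt comparison, GTX 680 vs GTX 480.
        gtx_480 = {"tdp_w": 250, "rel_perf": 1.0}
        gtx_680 = {"tdp_w": 195, "rel_perf": 1.5}   # ballpark, varies by game

        def perf_per_watt(card):
            return card["rel_perf"] / card["tdp_w"]

        ratio = perf_per_watt(gtx_680) / perf_per_watt(gtx_480)
        print("GTX 680 perf/W vs GTX 480: ~%.1fx" % ratio)   # ~1.9x

    So "doubled efficiency isn't far off" checks out on those assumptions.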
