  1. #21
    Secondary Hivemind Nexus mashakos
    Join Date: Jun 2011
    Posts: 1,255
    Quote Originally Posted by corbain View Post
    It's not quite the same thing, but illustrates how scale changes how big a step something seems.
    In my understanding, the leap in 2006 was a milestone. It's not as big a milestone as PCs moving from the command line to GUI interfaces, but it is significant in that there was a clear cutoff: everything released even a month before that point was made obsolete.
    Steam profile
    PC Specs: I have a big e-peen

  2. #22
    Secondary Hivemind Nexus Boris
    Join Date: Apr 2012
    Location: Netherlands
    Posts: 1,368
    Quote Originally Posted by Heliocentric View Post
    Does that first graph look like an acceleration curve to anyone else? As in, a suggestion that things are not indeed tapering off?
    Things might still be tapering off. The basic rule of computing is Moore's Law. You've probably heard of it. The number of transistors in an IC doubles about every 2 years.

    This means you start with 1. Then 2. Then 4. 8. 16. 32. It goes up exponentially.

    Now slow that growth to about 1.9 times per 2 years. The curve still goes up really fast: 1, 1.9, 3.6, 6.9, 13, 24.8.

    There's still a big jump every generation, but after about 5 generations the hardware is "only" around 25 times as fast as opposed to 32 times. In other words, it's still increasing, but also tapering off.

    There's a great plot of this on http://en.wikipedia.org/wiki/Moore's_law, and it shows the right way to plot such data: on a logarithmic scale.
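    To make the arithmetic concrete, here is a minimal sketch (mine, not from the thread) comparing a 2.0x and a 1.9x growth factor per generation; the 1.9 factor and the 5-generation horizon are simply the numbers used above.

        # Minimal sketch: compare doubling (classic Moore's Law) with a slightly
        # slower 1.9x growth factor per ~2-year generation, as described above.

        GENERATIONS = 5  # each generation is roughly 2 years

        def growth(factor, generations):
            """Relative transistor count after each generation, starting from 1."""
            return [factor ** g for g in range(generations + 1)]

        doubling = growth(2.0, GENERATIONS)  # 1, 2, 4, 8, 16, 32
        slower = growth(1.9, GENERATIONS)    # roughly 1, 1.9, 3.6, 6.9, 13.0, 24.8

        for g, (d, s) in enumerate(zip(doubling, slower)):
            print(f"gen {g}: 2.0x -> {d:5.1f}   1.9x -> {s:5.1f}")

        # On a linear y-axis both curves shoot up; on a logarithmic y-axis (as in
        # the Wikipedia Moore's Law chart) each becomes a straight line, and the
        # slower growth shows up as a shallower slope rather than a sharp flattening.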

  3. #23
    Hey, I'll probably have enough cash by then to build a cutting-edge machine that will last me a few years because of the new console generation. It's good to know when the new toys that actually matter will arrive (for those of us who can't buy a new card every time one comes out), so thanks.

    I wonder what kind of computers we will be using by 2020. The only thing that makes me sad about dying is that I won't get to see what technology will bring. I mean, people imagined we would be living on Mars by now ... but nobody foresaw smartphones.

  4. #24
    Lesser Hivemind Node
    Join Date: Jun 2012
    Posts: 900
    Quote Originally Posted by Heliocentric View Post
    Does that first graph look like an acceleration curve to anyone else? As in, a suggestion that things are not indeed tapering off?
    Nvidia graphs are never to be taken seriously. According to them, Tegra 2 was way faster than Tegra and Tegra 3 would shit on it from above the clouds, but performance was nothing like that in the final product.

    According to Nvidia's 2004 GPU chart, the RSX in the PS3 was going to be a 2-teraflop GPU (hahahahahaha).

    Their graphs are literally completely worthless.

    @OP, I wouldn't hold my breath for any real value increases in 2013-2014; AMD and Nvidia are currently choosing not to compete, neither on performance nor on price.
    A GTX 580 cost about 120 dollars to produce in 2010 and was selling for 500 at retail. That was a giant 520 mm^2 die with low yields in the factory and a respectable 384-bit memory bus.

    Now they are selling the spiritual successor (GK110, 560 mm^2 die, 384-bit bus) for 1000 dollars and marketing their mid-range, 260 mm^2, 256-bit-bus, cheap-as-hell-to-produce product as the "high end" with a matching 500-dollar price tag.

    The price per unit of die area has roughly doubled; there is no value to be had anymore.
    AMD has gone and delayed its 8-series cards by an entire year (into late 2013 / early 2014) since it is profiting from giant mark-ups on its current cards as well.
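    As a quick sanity check on that claim, here is a back-of-the-envelope sketch using only the die sizes and retail prices quoted above (the poster's figures, not verified specs; the card labels are just for illustration).

        # Rough check of the "price per die area has doubled" claim, using the
        # figures quoted in the post above (not independently verified).

        cards = [
            ("GTX 580, 2010",             520, 500),   # die mm^2, retail USD
            ("GK110 big die, 2013",       560, 1000),
            ("260 mm^2 'high end', 2013", 260, 500),
        ]

        for name, die_mm2, retail_usd in cards:
            print(f"{name:26s} {retail_usd / die_mm2:5.2f} USD per mm^2 of die")

        # The 2010 part works out to about 0.96 USD/mm^2, while both 2013 parts
        # land around 1.8-1.9 USD/mm^2, i.e. roughly double, which is the post's point.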

    I expect AMD, Intel and Nvidia to just keep marketing around performance per watt and being green instead of performance per dollar and being fast.
    Expect a 10 percent performance tick for Intel's Haswell later this year, along with another drop in power consumption (selling low-end, low-power parts instead of scaling up transistor counts for performance) and another price increase, since AMD has nothing to compete with.
    Leaked Intel roadmaps already show a brand-name shift for Haswell, calling their (only) unlocked quad-core CPU the 4670K instead of the 4500K, which signals another price increase (the same thing AMD did with its GPUs).

    As for Maxwell, Nvidia and AMD have both shifted to releasing their mid-range (now rebranded as low-end) cards first, at a hefty premium for people eager to benefit from increased performance per watt, and only start selling the larger mid-range-die, lower-yield cards 6 months later at the 500-600 dollar premium price.
    Maxwell will be on the 20nm process, so I expect the exact same thing to happen:
    GTX 760 first at 400 dollars in early 2014 with performance similar to a GTX 680, then a mid-range 780 half a year later at full premium, with the actual high-end large-die card being released in their new Titan brand lineup.

    I agree with OP that it's best to wait for the Haswell and Maxwell releases to upgrade, since the current gen is approaching 1.5 years of age without any real price drops in the high end (mid-range in reality), but if you are expecting another big performance boost you are going to be sorely disappointed.

    What you'll get is lower power consumption and smaller, cheaper dies sold at bigger premiums under higher model numbers.

    I'm not being negative, I'm being realistic. The performance race is over for both companies in the duopoly; the PC market is considered to have stalled (which in real-people terms means it is fucking huge, but in marketing terms means they have no incentive to try to attract more people with good value; ah, the joys of the American take on capitalism).
    Consumers have proven willing to shell out high premiums for desktop parts since 2010, and since there are no third players in the HDD, CPU and GPU markets who can swoop in for a piece of the pie, they don't have to compete and have a price war anymore.
    You can quote me on all of this in a year and laugh at me if any of it turns out differently; I would love to be laughed at, since that would mean consumers weren't getting fucked up the ass.

    Neither AMD nor Nvidia has any reason to lower prices when they can shift lower volumes at 3-4x the profit margin without having to worry about yields.
    Production surpluses either get castrated, in the form of a crippled memory bus like the GTX 660 Ti, making it a no-AA, 1080p-only card (go look it up if you don't know about it), or 1GB versions of the HD 7850,
    or cherry-picked as OC editions for an additional premium.
    Last edited by Finicky; 10-03-2013 at 01:00 AM.

  5. #25
    Secondary Hivemind Nexus mashakos
    Join Date: Jun 2011
    Posts: 1,255
    ^ You could be right, but I think things are going to be shaken up very significantly in the next 2-3 years. I'm sure Intel, AMD and Nvidia have seen the success of media players and internet-connected consoles in the living room and are thinking to themselves, "we can do better than this". We'll see.
    Steam profile
    PC Specs: I have a big e-peen
