Sounds Like ‘Dirty Underwear’ To Me

By Alec Meer on January 7th, 2011 at 7:22 pm.

I mean, “Sandy Bridge?” That’s a euphemism for unwashed undercrackers if ever I heard one. Intel seems to think it’s an appropriate codename for its latest generation of processors, however. Processors apparently so good that they prompted Gabe Newell to say they’re “a game-changer” and will “bring a console-like experience to PC.” This is, apparently, because the CPU includes built-in graphics processing that’s actually up to the job of modern games.

Hmm.

The statements, as statements by successful and important men do, made their way into news stories across the web. In many cases, this prompted some concern at a) the news outlets and b) Gabe, given that early reviews (here’s Anandtech’s) had rather made a mockery of Intel’s claim that Sandy Bridge’s built-in graphics processing could outdo some 40 to 50% of current graphics cards.

Capable integrated graphics has long seemed the future of the PC, but there just hasn’t been anything out yet to free us from the tyranny/pleasure of installing separate 3D cards. Apparently Sandy Bridge barely competes with a contemporary $50-100 board, so Gabe’s comments sound pretty fishy.

Then VentureBeat posted a video of Portal 2 running on a Sandy Bridge chip. And whaddayaknow? It looks like it runs okay.

Crucial, I guess, is that Valve have specifically optimised their game for Sandy Bridge’s integrated graphics. Whether others follow suit will probably depend on the chip’s adoption rate, and on the complexity/hassle of converting titles also designed for consoles. Or is the above video just a case of bullshot? Guess we’ll find out when Portal 2 launches.

By all accounts, however, Sandy Bridge makes an excellent gaming processor when paired with a decent graphics card.

98 Comments

  1. patricij says:

    “a console-like experience”? Do not want.

    • westyfield says:

      Intel Sandy Bridge: now available for £200 plus £40 per year!

    • RQH says:

      I think it clearly depends on what is meant by “console-like experience.” If it means having some sort of affordable graphical standard that removes the ambiguity, extra expense, and hassle from the rapid-paced upgrade cycle of PC gaming, I’m all for it. If it means the platform is owned by a single platform-owner and all content must pass through their gates, and that the games must be played with a paddle instead of the myriad wonderful input devices gamers currently have to choose from (and I don’t see how it would), then obviously no.

      I realize that some people enjoy the fiddliness of PC gaming, and for their sakes, I certainly hope this works more toward creating a minimum standard than a fixed maximum, as with consoles. But for my part, I find myself declaring “never again” about once every three months, when I realize that game X doesn’t run well on the computer I paid more for than all three current consoles combined, or that part Y is failing and will need to be replaced for the cost of a console. I would gladly settle for a guaranteed minimum supported spec if it made things simpler. The graphical advantages afforded by the most top-end PCs are out of my price range, so for me PC gaming is about game experiences I can’t have elsewhere.

      But I’d say the increasingly draconian DRM schemes (and even Steam) are more of a threat to the openness of the PC platform than this chip.

    • Keukeu says:

      @RQH you’re doing something wrong; I haven’t bought any computer parts in 3 years and everything is running fine.

    • Goomich says:

      SB has 2 console-like experiences: non-upgradable graphics, which unfortunately we already have (Crysis 2), and built-in DRM.

    • Centy says:

      @RQH Yeah dude, if you need to upgrade your PC every 3 months and yet don’t like the fiddliness, you really should be playing on a console anyway.

    • RQH says:

      I didn’t say I needed to upgrade every three months. I said every three months something doesn’t work right, whether it’s software related or hardware related. Most of the time I have been able to fix it without an upgrade. But that’s time spent not playing the game I paid for and not having fun. I have had to replace a DVD drive and a motherboard in roughly two years. Maybe I didn’t do a good job assembling my computer, or maybe I’ve had terribly bad luck this round. My previous computer was not nearly so powerful, even when it was new, but I never had to replace anything in it until it was so old the whole thing needed to be replaced. But then, I was in high school when I built it and had all the time in the world to research and compare parts, and I didn’t have silly little things like bills to pay or food to buy with my money.

      Also, that’s elitist bullshit to say that someone should have the time and resources to troubleshoot their software and hardware problems in order to play the types of games that appear on PC. I happen to like fiddling in my games–like deep RPGs and strategy games, for example. These are games that either aren’t on consoles or can’t be played nearly as well on consoles, and I fail to see the precise correlation between skillset A–fiddling with hardware–and skillset B–enjoying intelligent, well-designed video games. One is fun and relaxing for me, the other is headache-inducing.

      Edit: @Goomich: The DRM business is undeniably a bad thing, and I would certainly trade convenience for keeping the platform open, if that’s the cost.

    • Tommo says:

      I don’t see the problem with upgrading PCs. It is now 2 weeks short of 2 years since I built this PC (i7 920, GTX 295, 6 gig RAM), which cost me $3400AU at the time and STILL plays every game at full, including ARMA 2 which is my main game.
      So $3400 sounded like a lot at the time, but 2 years’ use out of it is excellent; it is something that I’ve used every day, and I’ve got no plans to upgrade for quite a while, maybe when ARMA 3 comes out…

    • Hoaxfish says:

      In 2007, I bought a prebuilt “office work” machine (onboard Intel graphics) for about £300. Slotted in a couple of sticks of RAM. Mucked about on low-end games (the ones with a GeForce 5 series min spec), Flash games, free indie games, etc.

      For £24, bought an Xbox 360 wired controller to play platformers, etc.

      After 3~4 years, I’m spending £60 to get a DirectX 11/OpenGL 4 compatible card and another stick of RAM. Might splash out on the “top graphics” games I couldn’t play in that gap that are now on sale.

      A lot of this seems like a side-step development related to onboard graphics, rather than something “proper” machines are going to want. The more integrated things become, the larger the piece you have to throw away when an individual piece goes wrong.

      PCs require patience if you want to take full advantage of their backwards compatibility.

    • kutkh says:

      Just wanted to add that I thought RQH’s original post was very even-handed and astute, and the speed with which old PC gamer stereotypes were brought to bear is frankly a bit disappointing.

    • pupsikaso says:

      Tommo, that’s an absurd price tag for ANY PC. Not to mention the parts you bought for such a price aren’t even that great. I built my own PC around three years ago for less than a thousand, and yes, it played ARMA 2 on full as well.
      It’s people like you, who blindly spend thousands and thousands on overpriced, inefficient parts and then brag to the world about it, that give PCs such a bad name.
      If you make smart purchases, a PC (and its upgrades) won’t cost you any more than a thousand dollars every 3 years.
      And since all games being made now are for consoles and don’t require a lot of power anyway, in the past three years and even still now you can run all games on full without any problems on a 3-year-old computer like I do.

    • Tommo says:

      @pupsikaso, did you see the bit where I said $3400 AUSTRALIAN dollars? Parts here are more expensive and our dollar wasn’t as strong then. 2 years ago they were the latest and greatest parts, and I built the thing myself, I didn’t get it retail; it would’ve been AU$4000 retail. The GTX 295 had only just come out.

    • DrazharLn says:

      I built my PC in 2006. After a few weeks I replaced my PSU on warranty because it was faulty and I put a 1TB HDD in it last August. Other than that I have made no changes to the hardware.

      The parts cost me about £850 (not incl. peripherals) and I built it over a lazy weekend. I can play pretty much any game out now for the PC on it (I can’t play DX10+ games because I’m on XP and MS are money grabbing bastards).

      Either you’ve been spectacularly unlucky with your parts or vendor choice (some vendors, especially for PSUs are simply not reliable) or perhaps you simply made a mistake somewhere.

      I was once at a loss as to why my PC failed to boot for about a week. Turned out I was shorting the motherboard by plugging the keyboard PS/2 input into the (very similar) mouse input. Swapped them round and it worked fine.

    • Archonsod says:

      “Also, that’s elitist bullshit to say that someone should have the time and resources to troubleshoot their software and hardware problems in order to play the types of games that appear on PC.”

      Erm, by that token it’s elitist bullshit to say someone should have the time and resources to learn to play an instrument if they want to make music. If you haven’t got time to learn how to use a tool you have no right to complain when you can’t use it.

      And I think the point there is that the entire purpose of the console was to allow people to play videogames without having to know how computers work, or in other words the part of the market you’ve identified yourself as belonging to. It’s not exactly the nineties anymore; you’ll find 90% of all strategy and RPG games available on the PC are likewise available on the Xbox these days.

    • Muzman says:

      You should never ever buy the latest stuff. They charge a premium for the very fact that it’s new. It’ll all be available for a third of the price in under six months!

    • bjohndooh says:

      @RQH – So you’d give up convenience for the sake of openness when it comes to software, but you’re not willing to make the same concessions when it comes to dealing with hardware?

      The price of being able to assemble a computer from parts of your choosing is that when it fails you have to troubleshoot it yourself and/or replace parts yourself.

      If you’d like to trade in the openness of the PC platform and would pay extra for convenience – I recommend a Mac.

      Seriously, the truth is that no matter how much you research your parts and how perfectly your computer is built, there is always the potential for problems to arise.

      The bottom line is that you will eventually have to solve any such problems yourself, with either time or money.

    • littlewilly91 says:

      So RQH wished things were a little simpler and cheaper but still open, and everyone who is proud of overcoming fiddliness attacked him for it? RPS forums: inspiring and constructive, and then utterly disparaging in the same breath.

      Would be great to have better-researched recommended and minimum specs with games. A sort of ubiquitous benchmark, so you just have one thing to remember about your PC. Like gas mark and temperature on ovens – alternatives. Ain’t that a possibility? I think RQH was just pining for things like this. Isn’t it pessimistic to presume the PC can’t get any simpler, with fewer sharp bits or whatever, without massive sacrifice?

      Hearing that things are fine just the way they are and all change is rubbish is like meeting Gideon Osborne.

    • Consumatopia says:

      @bjohndooh:

      So you’d give up convenience for the sake of openness when it comes to software, but you’re not willing to make the same concessions when it comes to dealing with hardware?

      I can’t speak for RQH, but when you put it in those terms I have to say heck, yes! That is exactly the trade off I am willing to make, and I find it bizarre anyone would want things otherwise.

      Hardware is the means. Software is the end. I don’t purchase expensive hardware just to have expensive hardware sitting around; I purchase expensive hardware in order to execute my software. And the whole point of computers, dating back to the Turing machine, is universality – i.e. my hardware should be able to execute any program. So I greatly welcome general-purpose CPUs that can rival special-purpose hardware, or at least make it less important. Special-purpose hardware is the opposite of openness.

      And deep down, you care more about software, too. After all, special purpose graphics hardware is not going away–graphics professionals will always require it. What worries you isn’t that special-purpose hardware is going away, it’s that games will no longer take advantage of it.

      And yes, it would represent something of a loss if games no longer took advantage of high-end graphics cards. It’s nice having the option to spend money to make things shinier! But the potential gain here could dwarf that loss. By making the PC a convenient, accessible, yet still open platform, it could make the gaming market as a whole more open–more open where it counts, in the actual games being written.

    • bjohndooh says:

      I don’t mean to say you can simply trade in openness for convenience when it comes to hardware – it usually comes with around a $1000 initial premium, and when you experience problems, diagnostic fees, overpriced parts, and ridiculous installation fees.

    • tim7168 says:

      @Archonsod

      Erm, by that token it’s elitist bullshit to say someone should have the time and resources to learn to play an instrument if they want to make music. If you haven’t got time to learn how to use a tool you have no right to complain when you can’t use it.

      No it isn’t. To use the music analogy, you’d be suggesting that someone should have to know how to build the instrument in order to play it… which would be elitist bullshit. They ARE different skillsets.

      @kutkh

      Just wanted to add that I thought RQH’s original post was very even-handed and astute, and the speed with which old PC gamer stereotypes were brought to bear is frankly a bit disappointing.

      I couldn’t agree more.

    • psyk says:

      pfft double post

    • psyk says:

      “And I think the point there is that the entire purpose of the console was to allow people to play videogames without having to know how computers work, or in other words the part of the market you’ve identified yourself as belonging to.”

      It wouldn’t have anything to do with no upgrading and any game put out for the system just working? Of course not, it’s because people can’t use one of these http://img267.imageshack.us/img267/11/getagriptrianglewoodens.jpg and can’t read… get over yourself.

      EDIT – wtf rps

      Regex ID: 124496 (PRADA) appears to be an invalid regex string! Please fix it in the Blacklist control panel.
      Regex ID: 124494 (http://wwwdotb2bjordansdotcom) appears to be an invalid regex string! Please fix it in the Blacklist control panel.
      Regex ID: 124471 (http://wwwdotaeooedotcom) appears to be an invalid regex string! Please fix it in the Blacklist control panel.
      Regex ID: 124538 (w w wdot a e o o e dot c o m) appears to be an invalid regex string! Please fix it in the Blacklist control panel.
      Regex ID: 124470 (PYAPAL) appears to be an invalid regex string! Please fix it in the Blacklist control panel.

  2. Decimae says:

    I still think the Sandy Bridge IGPs are pointless, and the improvements are incremental. I’m waiting for the AMD Bulldozer cores to really make an impact. They’ll probably have more decent IGPs, too.

    • skinlo says:

      It’s predicted that Bulldozer will perform like the old i7s, so Intel will still have the performance advantage. Hopefully AMD will have the value advantage.

      My interpretation of a console-like experience was one with a ‘base’ level of performance, where PC game developers will know, give or take, the least amount of power computers have, and work from there. If Intel’s and AMD’s new combined chips do take hold, and a majority of new computers have them in, it will mean a widening of the PC game market, where someone who knows little about computers will still have enough power to run games on lowish settings, as opposed to now, where an integrated card can barely run 5-year-old games.

    • Decimae says:

      But who on earth uses the IGP if they have such an expensive CPU? Only if it breaks down or you’re not supposed to game, I think. On the lower end, these IGPs would make sense. Also, of course AMD isn’t going to get the performance crown, but will probably regain the excellent price/performance level.

    • James Allen says:

      You’d be surprised how dumb the average consumer is. They walk into Best Buy and get a generic computer with no idea what’s inside or how to upgrade it. There are tons of people running super-fast processors with crap integrated graphics.

    • Snargelfargen says:

      @James Allen – The box store computers with multi-core processors and crap graphics aren’t meant for gaming. They make a lot of sense for anything involving spreadsheets, converting file types and other not-so-fun things.

      Edit: Hmm, I guess the mainstream gaming market would open up as well, if workstation computers had adequate integrated graphics.

    • bill says:

      @James Allen:

      That’s not dumb. That’s normal. If you go into a shop and buy a modern machine, you expect it to work. You don’t expect to have to do huge amounts of research into obscurely named components to buy consumer electronics.

      Almost every game-specific forum I’ve visited has loads of threads with people asking why the game they bought doesn’t run well. And it’s often down to their laptop not having a decent graphics card – but why would they know that? And why wouldn’t they expect their shiny new laptop to play the game they just heard about?

      Not everyone can be an expert on everything. People don’t care and don’t have the time. I might know about PC hardware, but I don’t know about cars, electronics, etc… but if I buy one, I expect it to work.

    • Barnaby says:

      I think Skinlo hit the nail on the head regarding the console comments. Not so sure it was a great choice of words on Gabe’s part though, without further clarification at least.

    • drewski says:

      @ bill – I went down to the local street race in my brand new Suzuki Swift the other day and some idiot in an AMG Mercedes beat me in a race. Why can’t my brand new car beat his?

      Then I tried piling 7 people into it and towing a boat. I couldn’t figure out how to attach the trailer, so I just opened a window and tied it to a passenger. When the police pulled me over I was doing 2mph on the freeway. Apparently my Swift isn’t licensed for 7 passengers and doesn’t have a “towbar”. It’s a new car! I expect it to work.

  3. Stephen Roberts says:

    Names are strange things. Kinect. WAVI Xtion. Thi4f. Sandy Bridge. Really?

    It sounds like progress has a weird name. But shitty portmanteaus are the future.

    • Ricc says:

      Another good one: the iGUGU Gamecore

    • Wilson says:

      Sandy Bridge? Really? Is that some kind of American reference that I don’t get as a Brit? I can’t imagine how that became a name for anything other than a raised bank of sand that provides passage across a body of water. As a place name, it’s fine. It’s not a processor name.

    • Devan says:

      @Wilson
      I don’t think it’s completely arbitrary. “Bridge” at least is an established term in computer and network architecture.

    • Oak says:

      It’s a code name. Code names don’t have to make sense.

    • Hoaxfish says:

      But is it better or worse than sandy balls?

    • Wulf says:

      I honestly believe WAVI Xtion was a pun John (was it John?) was making based on a pun I made in a comments thread, where I pointed out how an article said that Kinect was a more eloquent name than that of the ASUS device – PrimeSense. So it’s called PrimeSense. And I was dumbfounded by this, since Kinect is the clumsiest portmanteau of kinetic and connect imaginable; I actually prefer PrimeSense.

      And then I pointed out that if those kinds of names are what people want, then they could just call it the ASUS Activitalus. The next day, an article calls it the WAVI Xtion (wavy action). >_> Admittedly far cleverer than Activitalus (which would indubitably get them sued by Activision).

    • fatchap says:

      It is probably named after a river in the US. Most of Intel’s code names come from those. MS used towns in the NW.

    • Jhoosier says:

      What fatchap said. Apple’s codenames for their OS versions are large cats (Snow Leopard); MS uses a mishmash of bizarrities: http://en.wikipedia.org/wiki/List_of_Microsoft_codenames

      Intel uses “geographical names of towns, rivers or mountains near the location of the Intel facility”, but it sounds like they’ve been running out of names around Oregon. Type “List of *** codenames” in Wikipedia, and you’ll find some interesting stuff.

  4. neolith says:

    “bring a console-like experience to PC.”

    I might only speak for myself, but if there’s ONE thing I don’t want, it’s a console-like experience on the PC. I want a PC-like experience on the PC! If I wanted a console-like experience, I’d be playing on consoles in the first place, dammit.

    I wish people would stop trying to turn the PC into something else… :(

    • skinlo says:

      Pretty sure it’s talking about minimum specs and optimisation for a base platform, not console controls etc.

    • Shagittarius says:

      So when AMD and Nvidia also come out with their on-chip GPU variants, and you have 3 competing companies and different revisions of the GPU part of the chip, is that really gonna make things better?

      I think not.

  5. Miker says:

    Is the footage of Portal 2 running on Sandy Bridge without a discrete graphics card? Has that been confirmed?

  6. skinlo says:

    And anyway, a $50 graphics card performs around the same as a console.

  7. DHP says:

    “By all accounts, Sandy Bridge makes an excellent gaming processor when it is paired with a decent graphics card, however.”

    Hey, Sandy, show us your (North) bridge!

  8. Vandelay says:

    I’m personally very much looking forward to getting hold of one of these chips alongside a nice graphics card when I decide to build a new system in the not too distant future. They are meant to be astonishingly good.

    And I think many people are getting a bit unnecessarily pissed off about the “console-like” comment. They mean that people will actually be able to play games without much fiddling, and that developers will benefit, which is a good thing. Not sure I really buy that these new chips will let them do that, but it is a step in the right direction.

    • Archonsod says:

      Which is silly. I don’t fiddle with the hardware very often, particularly not to play a game. I do fiddle with the software quite a lot though.

  9. mandrill says:

    We need the context of Gabe’s ‘console like’ comment.

    I highly doubt that he meant it in the sense that all PC games will become pale shadows of what they once were and will be reduced to sequels and linear games that do not require higher brain functions. But without context it’s hard to dismiss the possibility out of hand.

  10. KBKarma says:

    The Sandy Bridge also has a remote kill-switch that can be activated at any time. I also read that it has a built-in anti-piracy system, though it seems to be more a system that allows you to decrypt legitimate 1080p videos while streaming them rather than any kind of hardware lock.

    I’ll probably be getting an AMD, though; they seem cheaper.

    • frymaster says:

      You have to be careful making comments like “kill-switch” – people assume that means the manufacturer can disable it, when it’s actually so the owner can, in case of theft. As regards Insider, I think it’s mainly just a popularisation of the TPM stuff that Intel business chipsets have had for ages.

      The kill switch only really makes sense in the context of laptops; the main thing in Sandy Bridge is that it can now be activated over 3G, if the laptop has it (again, the kill switch has been included in current and previous business chipsets).

    • KBKarma says:

      Amusingly, I got the phrase from several online sources. They called it a kill switch, but, until now, I don’t think any of them actually said what its purpose was. I’m slightly relieved, though; as one site pointed out, it doesn’t really prevent anyone from stealing your data, it just means they need to swap your drive into another machine. It does, of course, mean that, if they were planning on selling it, they just got a rather expensive paperweight. Unless they have a backup processor handy, of course.

      Still, I’m not particularly enamoured with it, especially since it’ll most likely start at around €300 or so. And, as someone mentioned to me on another site, Intel chips aren’t really backwards-compatible, requiring occasional motherboard switching, while AMD don’t tend to change their architecture or pin placement wildly enough to require this (disclaimer: I have no idea if this is true).

    • drewski says:

      AMD fiddle their bits occasionally, but not as often as Intel.

  11. Rich says:

    If it kills the graphics card market (it probably won’t) then we’ll have to buy a new processor and, probably, motherboard whenever we want better graphics.

    Right now, if your CPU isn’t up to much, you can buy a good graphics card that’ll take the load. That’s what I’ve done. It’s not great but it does the job.

  12. Big Murray says:

    Gabe pulling a Molyneux. By which I mean getting over-excited and saying stuff that he’ll later regret.

    Source engine games have never needed much graphical power to run smoothly. I remember playing Episode 2 on a ridiculously outdated graphics card and still getting nice framerates, while the same card would buckle under any other modern game.

  13. Kadayi says:

    I guess GabeN’s way of thinking is probably that while these IGPs aren’t necessarily going to set the gaming scene alight for the hardcore gamer, who’s probably still going to go down the Nvidia/ATI route of hardware e-peen, it is going to make it a lot easier for people with base-level laptops/towers etc to play more sophisticated games.

    If Johnny Highstreet (your traditional console-only gamer) can suddenly buy a £300 laptop that will play MW2 at a respectable frame rate, they might well go for that option (instead of, or on top of, a console) given a laptop possesses a lot more functionality (and yes, by functionality I mean the ability to look up ladies getting nekkid).

    Right now (save a few exceptions, MMOs & Sims) the bulk of AAA titles are with the console market, and subsequently Microsoft and Sony dictate the beat and the PC has to follow (even though on paper at least there’s quite a large PC user base). Developers are always going to build for the audience they expect to make the most sales from, so if the PC starts to come back as a real force in terms of market footprint for AAA titles, perhaps developers might start shifting towards building for the PC first and the consoles second. In a way DA:O was a demonstration of that in effect, given how distinct the PC version was vs the console release.

    It’ll be interesting to see whether we do see a shift a few years down the road, but by then we might be onto a new console hardware cycle.

    • Archonsod says:

      If CoD and similar are all the ‘AAA mainstream’ are capable of these days I fail to see how their absence from the PC is in any way detrimental.

      In fact it occurs to me if you wiped out Sony, Activision et al you’d have improved console gaming by the kind of leap not seen since they started adding hard drives.

  14. lurkalisk says:

    This was probably inevitable.

  15. Starky says:

    Sandy Bridge is a game changer, but it’s a first-generation game changer that will need 2-3 more generations to realize its potential.

    Right now, a Sandy Bridge CPU will beat almost any integrated GPU and most cheapo discrete GPUs – this isn’t a big deal for gamers, most of us have mid-to-high spec discrete cards…
    But it is a big deal for average users, HTPC builds and such.

    But… give it 5 years and 80% of PC gamers won’t need more than their CPU for gaming, with discrete graphics only required by people using MASSIVE resolutions (multi-monitor setups).

    The days of discrete graphics are dying, and AMD and Nvidia know it.

    Game graphics exist on an exponential curve x=y^2, where x is the noticeable graphical improvement and y is the processing power required to achieve that.

    In the past, a new generation of graphics cards (with a doubling of transistor count) achieved about a 20-25% noticeable improvement on the x scale – but now we’ve got to the point where a 20% increase in GPU power doesn’t really make any noticeable impact on visual fidelity.

    Ironically, it is one of the reasons people bitch about DX10/11 not looking better than DX9c – DX11 could look much better, but we still don’t have the GPU processing power to run it at decent FPS.

    To put it in perspective: for most modern games, moving from High settings with 4xAA and no ambient occlusion at a mid-high resolution (say 1680×1050) to 16xAA and AO on uses about double the graphical processing. In other words, roughly 50% of your graphics card’s horsepower is being used JUST for that 16xAA and AO.
    And I challenge anyone to really notice the difference between 4xAA with no AO and 16xAA with it on in game – it’s only noticeable in a screenshot, when paying close attention to detail.
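
    To make that arithmetic concrete, here’s a minimal Python sketch of it (the cost figures are the rough estimates above, not measured benchmarks):

        # A minimal sketch of the diminishing-returns arithmetic above,
        # using rough estimates rather than measured benchmarks.

        base_cost = 1.0                # relative GPU cost: High, 4xAA, no AO
        maxed_cost = 2.0 * base_cost   # ~double with 16xAA + ambient occlusion

        # Share of the card's horsepower spent on 16xAA + AO alone:
        aa_ao_share = (maxed_cost - base_cost) / maxed_cost
        print(f"GPU power spent just on 16xAA+AO: {aa_ao_share:.0%}")   # -> 50%

        # The equivalent frame-rate hit: a card managing 60 fps at 4xAA/no AO
        # would drop to ~30 fps with everything maxed.
        print(f"Maxed-out fps estimate: {60 * base_cost / maxed_cost:.0f}")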

    So yes, in a few generations every PC will be able to run what we now think of as high/ultra-level graphics out of the box.
    In 5 years PC games won’t have “medium, high, ultra” settings – because the difference between medium and ultra will be unnoticeable to human observation.

    THAT is what Gabe means by “A console like experience”:
    Every PC capable of playing any game released for the platform.
    Devs no longer needing to make scalable graphics, textures or other such assets.
    No more fiddly options, tweaking or worrying if you meet spec.

    CPU speed will be the main bottleneck, and systems which rely on the CPU are much less noticeable when scaled back (AI, physics accuracy, model detail and so on)…

    TL;DR version: in a few generations, the difference between medium graphics (will run on an integrated GPU) and ultra graphics (needs 2+ GPUs eating 400 watts of power) will be so marginal as to be meaningless, freeing developers from a LOT of work, difficulty and quality-control nightmares.

    • Diziet Sma says:

      Thanks for saying that. I don’t think I could’ve typed a reply that well, you just saved me either:

      - Frustration through wanting to say that but not being arsed to.
      - Having to edit and edit my reply until it said what I wanted and I was satisfied it was complete and robust.

    • Captain Kirk says:

      I don’t want to break the temporal prime directive, so I can’t tell you exactly what will happen in the future, but I do want to remind you that there are physical limitations in graphics and CPU architecture design. Take for instance the newest Nvidia GTX 580, which has a die area of 520 mm^2 and a transistor count of 3 billion. The newest top-end Sandy Bridge processors weigh in at 216 mm^2 and just under 1 billion transistors. Size is not a direct metric of performance, but the greater the number of transistors – and therefore die area – that can be dedicated to graphics, the greater the performance in nearly all 3D games**.

      Since Intel’s CPUs only dedicate a small amount of logic to the actual graphics processing part (don’t forget that Intel’s CPUs have a lot more die space dedicated to flow control, and have very large caches compared to a GPU), dedicated GPUs will always – for longer than most of these companies will survive – provide far superior performance.

      Someone might then ask, “Why doesn’t Intel just increase the size of the die indefinitely until they can match performance?” The problem is that yields on large-die CPUs and GPUs are horrible. Often below 1% (yield = number of viable chips / total number on a wafer). Also, very large chips are thermally limited; everyone knows how hot some of the new GPUs can get. CPUs with built-in GPUs would suffer the same fate.
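
      To make the yield point concrete, here’s a back-of-the-envelope Python sketch using the standard exponential defect model (yield ≈ exp(−defect density × die area)); the defect density below is an invented illustrative figure, not real process data, and the die areas are the ones quoted above:

          # Rough illustration of why big dies yield badly, using the simple
          # Poisson defect model: yield = exp(-defect_density * die_area).
          # The defect density is an invented, illustrative number.
          import math

          WAFER_AREA = math.pi * (300 / 2) ** 2   # 300 mm wafer, in mm^2 (ignores edge loss)
          DEFECTS_PER_MM2 = 0.002                 # assumed defect density

          for name, die_area in [("GTX 580 (~520 mm^2)", 520),
                                 ("Sandy Bridge (~216 mm^2)", 216)]:
              candidates = int(WAFER_AREA // die_area)        # dies that fit per wafer
              yield_frac = math.exp(-DEFECTS_PER_MM2 * die_area)
              print(f"{name}: {candidates} dies/wafer, ~{yield_frac:.0%} usable")

      The bigger die loses twice over: fewer candidates fit on the wafer, and each one is more likely to catch a defect.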

      As a general rule, I will say that incorporating the CPU onto the GPU is the way of the future. Intel has it slightly backwards – graphics is by far the most difficult computational task that most computers have to handle. Nvidia and AMD both have plans to fight Intel’s encroachment on their markets. However, the boutique high-end discrete GPUs simply cannot be replaced if we are going to maintain superior graphical fidelity on the PC. Unless there is a paradigm shift and those companies simply stop making high-end parts, or go out of business for one reason or another, discrete will be with us for a while.

      There are other considerations as well. Notably, Intel is famous for substandard graphics. This is because their graphics cores basically use a subset of the x86 architecture. x86 is bloated, ancient, power-inefficient, etc. They will NEVER get as much performance per transistor as NVIDIA’s and AMD’s graphics solutions. Given Intel’s total resistance to leaving x86 behind, I don’t expect this to change.

      I don’t think Sandy Bridge is a game changer for a decent subset of PC gamers. Its graphics performance will be subpar compared to a lot of even $30 or $40 discrete graphics cards. Despite being incorporated on the die, basic physical limitations will prevent it from ever achieving the level of fidelity present in even a console GPU. (What I mean is that if you took a discrete version of what is present in either the X360 or the PS3, you would get a graphics processor that is still somewhat better than the top-of-the-line graphics performance in Sandy Bridge.)

      ** The Source engine being a notable example – it has been around for a while and is very optimised for use on relatively weak GPUs. It also is not nearly as graphics-intensive as many more modern engines.

    • Aninhumer says:

      >Game graphics exist on an exponential curve x=y^2, where x is the noticeable graphical improvement and y is the processing power required to achieve that.

      Not to be a pedant but:
      a) That’s quadratic not exponential
      b) You have your variables the wrong way around

      Sorry that just bugged me, carry on. :P
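
      (For anyone who wants to see the gap between the two kinds of curve, a quick Python check:

          # Quadratic (x^2) versus genuinely exponential (2^x) growth: the gap
          # is enormous, which is the substance of point (a) above.
          for x in (2, 4, 8, 16, 32):
              print(f"x={x:2d}   x^2 = {x**2:5d}   2^x = {2**x:12d}")

      By x=32 the quadratic sits at 1,024 while the exponential has passed 4 billion.)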

    • Starky says:

      I knew that I was technically wrong calling it exponential, but that is the term most non-math nerds use for a growth curve, so I went with it; “x-squared curve” just doesn’t have the same ring. “Geometric curve” is sexier, though.
      I blame the mixing up of my x’s and y’s on beer, though.
      It is, after all, the cause of and solution to all problems.

    • Starky says:

      Kirk, when I say that Sandy Bridge is a game changer, I don’t necessarily mean that Intel will be the one who’ll win the game.
      As you say, AMD (with their ATI expertise) might manage a better CPU/GPU hybrid.

      Obviously no one knows the future; some breakthrough in photonic processing or quantum computing may make this entire discussion moot, because at that point we’ll have enough CPU power to render games photorealistically with raytracing rather than rasterization.

      Still, I don’t think that graphical fidelity is the battleground of the future; I also don’t think it is really needed – every generation so far, the gap between medium graphics and maximum graphics in PC games has diminished – so much so that with many games you could put medium and maximum side by side on a 1080p TV and the only clue to which was the 5770 and which was the HD 5970 would be the FPS counter in the top corner.
      At this point it’s still noticeable, but it’s no longer the massive deal it once was. 16xAA vs 4xAA, tessellation, ambient occlusion: all processes that use massive amounts of GPU power for marginal fidelity gain.

      Take Metro 2033, the most recent high-end PC-focused game, for example: very high will bring anything but a massive SLI setup to its knees, while medium chugs away happily on my HTPC (Q6600 @ 3GHz, 4GB DDR2, and a 5770) at 50ish FPS @ 1080p, with a few settings on high/v.high too – so my medium-spec PC looks better than the following screenshots, mainly thanks to using high-quality textures.
      http://www.legionhardware.com/pic.php?image=images/review/Metro_2033_Performance_Guide/VeryHigh_01.jpg
      http://www.legionhardware.com/pic.php?image=images/review/Metro_2033_Performance_Guide/Normal_01.jpg
      http://www.legionhardware.com/pic.php?image=images/review/Metro_2033_Performance_Guide/VeryHigh_02.jpg

      Or 4xAA vs 16xAA or even higher:
      http://i43.tinypic.com/2egbe6f.jpg
      Found this from Battlefield: Bad Company 2 – not the highest graphical fidelity in a game, but it shows that the differences at higher AA levels are marginal.
      Heck, with increased screen pixels and modern displays having a smaller dot pitch, you don’t need as much anti-aliasing anyway, as more and smaller pixels mean less aliasing in the first place.

    • Archonsod says:

      Erm, I’m running Metro 2033 on very high with only an 8800GTX and getting no chug whatsoever. In fact, the frame rate stays between 80-100 throughout the game, and the rest of the system isn’t exactly cutting edge (2.3GHz quad, 4GB RAM and Vista 32).

      The only real problem with your argument is that it would require a shift in developers too. A specialised graphics chip is always going to outperform a generalised CPU for obvious reasons, so the dedicated graphics card won’t become obsolete just yet. And of course if the graphics capacity is there, then devs tend to try to use it no matter how little it actually does. The final problem being that utilising whatever the latest graphic tech is tends to be the reason a PC game gets mainstream attention in the first place.

    • RegisteredUser says:

      Lal @ people who can already see the end of the visual cycle when we haven’t even remotely approached photorealism and proper, true physics and lighting and water yet.

      Yea, we are so totally going to reach an indiscernible middle ground of visual quality in the foreseeable future than any integrated thingie can do, suuuuuuuuuuuuuure..

      Where have you BEEN the last 20 years?

      When we get high-definition movies on the level of Final Fantasy (pick your favourite omg-I-can’t-believe-this-is-CGI) et al as simple compressed script / rendering instruction files rather than pre-rendered, because we now have the power to render them all ourselves in realtime as standard desktop equipment, then we can talk.

    • Starky says:

      @Archon, what resolution are you running it at? I highly doubt you are running it at very high on a big screen. I’m running at 1920×1080, and I have an 8800GT OC (about the same speed as the standard GTX) in my old machine, and it cannot run Metro 2033 on very high at all unless you’re only running at a small resolution (1440×900 or less).
      My 5770, as I said, runs it at about medium-high on most settings, probably more towards high – some things are down on medium only because I can’t notice any difference between the settings.

      @RegUser… Erm, are you even following the same conversation?
      Let me spell it out for you: no one is predicting the end of visual improvements in games – but it IS slowing down, and this isn’t because of consoles holding PC hardware back (as much as some people would like to believe) but because a 20% per-generation increase in actual processing speed is no longer enough to make massive leaps in visual quality.

      So unless there is a massive paradigm shift or a massive leap in processing power, things are going to stay roughly as good as they are now for the foreseeable future.

      When we DO have enough processing power for what you describe, it WILL NOT be running off a GPU anyway – it will be rendered directly by your CPU (say some 32-core, 16nm future CPU). That CPU may be some hybrid of the GPUs and CPUs we know today, but it won’t require a CPU and a separate discrete GPU.
      It will probably be raytraced and highly procedural – both CPU operations.

      Essentially at that point, as with audio, sending the pixels to the display will only be a tiny overhead in the operation, and needing a dedicated card will be redundant.
      Just as sound cards are redundant now (for anyone except audiophiles who need a lot of I/Os).

    • Captain Kirk says:

      Even NVIDIA has an answer (a dual-core ARM Cortex-A9 with built-in GPU – and Windows 8 will support the ARM architecture – was just announced at CES).

      I am not trying to outright disagree with you, Starky, but the situation is very dynamic. One of the main reasons we have not seen graphics dramatically improve in many titles over the past half-dozen years is that the consoles have not been updated. As the hardware design and lifecycle of consoles gets longer for monetary reasons (it gets harder for hardware manufacturers to recoup after billions of dollars spent on design and fabrication), the graphics fidelity target stays fixed. Most big-budget titles with wowzer graphics of course target both the 360 and the PC, and therefore don’t target our far superior hardware.

      However, there will be new consoles in a few years or so. And their target will be non-upscaled 1080p, 60 fps games (with 0 to 4x AA). Sandy Bridge’s siblings are not going to be able to touch these numbers. Sandy Bridge is probably not even comparable to the graphics processors in 6-year-old consoles. That’s how large the gap is. Given Intel’s total inability to design proper graphics (they have tried and failed for 20 years now), ATI’s lack of a competitive CPU, and NVIDIA’s lack of an x86 processor at all, even a skunkworks CPU, everything seems to point to the status quo for at least the end of this console life cycle and well into the next. Sandy Bridge’s graphics processor uses fixed-function hardware!!!! NVIDIA and AMD moved away from that 5 years ago!

      This is assuming that we stay with rasterized graphics in the near future. Ray tracing is also something that can be implemented on the GPU for tremendous speed gains, and CPUs simply don’t and won’t have the execution hardware to ray trace in real time in the next 5-10 years. The situation at much lower resolutions is different, but not for large 1080p+ resolutions.

      Considering the history of the computer architecture world, it seems unlikely that Intel is suddenly going to make a competitive 3D processor. It also seems unlikely that either company (AMD or Intel) is ever going to dedicate enough die space to the execution units required to keep pace with console graphics over the next console life cycle. Eventually, yeah, when the CPU parts become trivial in size and Intel figures out how to do graphics, or AMD figures out how to do even better processors, we will see what you are suggesting.

      So in short, end game, I agree. We will see CPU+GPU combinations where a majority of the die is dedicated to the GPU, with the periphery dedicated to fixed-function hardware and the CPU. But I don’t think it will be important in the short term (<5 years) and quite possibly not in the near-long term (<10 years, i.e. the next console life cycle).

      It will cannibalize the cheapest of the cheap, the <$25 graphics cards and some of the worst IGPs, but the better IGPs and nearly all discrete cards will remain better. It might kill off some of the low end, but that is not because it is better – more because of Intel’s dominant market position.

      Lastly, don’t forget that any integrated processor has to share main system memory. This bottleneck, at least in the short term, imposes insurmountable performance limitations. It also increases the cost of the packaging on the motherboard, because it requires many more pins to keep the processor fed with information.

  16. A-Scale says:

    People are complaining about the integrated GPUs, but as someone who usually only has one graphics card at a time, this will let me RMA my card while still having a machine to use.

    • DigitalSignalX says:

      The only reason I’m reading this right now is because I’m 240 bucks in the hole ordering a new card to replace my current faulty one, so I don’t have 2 weeks of no PC when I RMA it.

      When buying a new MB this Christmas, I had to go out of my way to not get an integrated GPU, because I thought it was a waste of board resources. Fool am I. Some of those integrated GPUs are better than the one in the gaming computer I’m replacing >.<

  17. Navagon says:

    There are many things I wouldn’t trust Intel with and graphics processing is definitely one of them.

  18. vash47 says:

    They forgot to mention the part where Sandy Bridge comes with hardware DRM: http://www.computerworld.com/s/article/9202961/Intel_s_upcoming_Core_chips_to_secure_streaming_movies?taxonomyId=142

  19. vash47 says:

    They forgot to mention the part where

  20. vash47 says:

    They forgot to mention the part where
    Sandy Bridge comes with hardware DRM.

    • Navagon says:

      The edit link is your friend.

    • Starky says:

      So? It includes on-chip DRM to prevent you copying streamed 1080p content from specific Intel partners.

      It won’t do anything to your own content, won’t do anything to stop you ripping your own Blu-rays or Netflix, or any other content unless Intel partners with them; it’ll just stop (for about a week, until someone cracks it) the specific Intel-partnered content.

      It’s no different from the fact that graphics cards also have DRM built into them (HDCP), which has likewise been utterly meaningless as a form of DRM (all it does is stop legitimate buyers from watching their discs); pirates ripped away happily, and now it is fully cracked anyway.

    • Urael says:

      “The edit link is your friend.”

      No it bloody isn’t. I haven’t managed to use it once without being judged Spam and it eating my original comment.

  21. Eclipse says:

    Tests say Sandy Bridge runs today’s games poorly, posting 15-30 fps even at mid-low resolutions (1280x*)… So that’s the console experience, right?

  22. Snargelfargen says:

    I don’t think integrated graphics are going to make a big difference for several years yet. It’s still possible to play just about anything out there with a top-of-the-line GPU from 3 or 4 years ago, or something cheap purchased today.
    Once a new generation of consoles is finally released, there’ll be a jump in graphics quality on the PC as well. At that point, buying a chip with integrated graphics will make a lot of sense, especially for people who are trying to upgrade an older system.
    I happened to buy one of the last pre-DX11 cards (ATI HD4870) and I can’t even justify getting a new one in the next year. It’s 2 1/2 years old, but anything better would be a waste of money.

  23. pupsikaso says:

    He means to say it will bring a PC-like experience to next-gen consoles, surely? PCs have never lacked for either processing or GPU power, obviously. If this chip is to be used in next-gen consoles, however, then I can see how it could make them both more powerful and cheaper to produce.

    Otherwise this is so backwards that it makes zero sense to me. Why would I use a sub-standard chip for two things at once when I can just plug in a dedicated graphics card instead? Or maybe this is for laptops? I’m so confused.

    • drewski says:

      I suspect what he’s saying is that when games optimised for this chip make it to market (or their patches do) people will have a “put in and play” experience.

      You won’t need to muck around with making sure your video card has blah blah blah, and blah blah of blah blah, with blah blah; you just put the game in, install it, and it works. That’s the “console like” experience Valve (and, I suspect, a lot of PC users) want.

      It’s not for me, but I can see the appeal.

  24. Mac says:

    What is a console-like experience???

    Is it RROD?

  25. kwyjibo says:

    “The anti-gravity gun” – That’s what the presenter calls the Portal gun.

    Integrated graphics have always held PC gaming back from the mainstream. The masses do not have dedicated graphics cards; the masses don’t even have a non-Intel graphics chip in their laptops. This has always meant that users were afraid of the system specifications box on PC games. It’s meant frustrating, laggy experiences. It’s meant that the only truly mainstream PC games are oldies such as WoW, or Farmville.

    And so with Sandy Bridge, and with AMD Fusion, we’re going to see graphics capability built into the CPU. But I don’t think this is what will bring console-like ease of use to the PC space – it’s only a symptom of the cause.

    The real cause has been the stagnation of the console space. The 360 is 5 years old, the PS3 is largely the same, and the Wii can be emulated on a laptop (probably). It’s because consoles have stood still technologically, and because this generation will be the longest of them all, that integrated graphics have been allowed to catch up.

    Do you think integrated graphics will give you a console-like experience when the next gen comes out? Will they fuck.

  26. phenom_x8 says:

    The problem is that the price of the motherboard alone for this procie is almost twice the price of an HD6850 or a GTX 460 768MB.
    Also it isn’t upgradeable, due to the ridiculous number of mutually incompatible sockets. This problem also held me back from Intel when I decided to build a new rig 2 years ago; I don’t even know what socket this processor is intended for. Truly a console-like experience!

  27. Spacewalk says:

    It’s a good thing that sandy bridges aren’t flammable.

  28. noobnob says:

    It’s an improvement, but it’s hardly impressive. I was disappointed with game performance to be honest, but at least Sandy Bridge’s IGP is powerful enough so that it can be used for more than just games (such as transcoding). Don’t know if that justifies buying a new motherboard with yet another new Intel socket, though. Here’s hoping that AMD’s Fusion won’t disappoint…

  29. RC-1290'Dreadnought' says:

    Portal gun with a Gmod physgun-like upgrade!

  30. Corrupt_Tiki says:

    You dirty, dirty man, Gabe…

    On a side note, I will probably get this (soon), as I am in dire need of a new CPU and RAM upgrade, so it’ll be time to move my faithful Core 2 Duo on to better pastures.

    And I’d like to second a comment I read further up, by Rich? I would rather replace my outdated GPU than replace my outdated CPU, mobo, and (sometimes) RAM. Any day.

  31. bill says:

    It’ll be a game changer because suddenly all the millions of people who can play Facebook games but can’t play PC games will be able to do so – at that point the potential PC game market will dwarf consoles.

  32. Mr Chug says:

    The tech behind Sandy Bridge is impressive, but why it needs an entirely different socket from the current i5/i7 generation is beyond me, as is why they’ve locked down over-clocking so much and used an absolutely bonkers numbers-with-suffix system to describe what the chip can do (can be overclocked, low voltage, etc). It smacks of unnecessary cheap marketing and money grabbing.

    Also, I heard through the lithography pipeline that Intel are planning to get Ivy Bridge (the 22nm version of Sandy Bridge) out by this time next year, including DX11 and supposedly double the overall graphics grunt, so I’m prepared to wait for that before judging prices for an upgrade.

  33. MadTinkerer says:

    After a bit of thought, I realized that what Newell meant by a “console-like experience” was really “back to the way PCs were”.

    Once upon a time, the processor in your computer mattered far more than any peripheral in determining the graphical capability of your box. Can’t run Ultima Underworld at full settings on your 386? Upgrade to a 486! 486s could have different hard drive sizes and memory, but a 486 was a 486. Sound cards existed for the audiophiles, but relying on a peripheral for graphics was less than laughable; it was inconceivable.

    Then the Pentium comes out, Quake comes out, and Carmack makes Quake play nice with someone’s experimental “3D graphics accelerator”. Now the average user doesn’t even know the difference between one processor and another, but they do know the latest ATI gives them more polygons. (Actually, I lie: the average user doesn’t know what a polygon is, even if they stare at them playing WoW all day.) So our wallets are all fucked for sixteen years until the graphical plateau is reached where people finally say “No, I’m not buying ANOTHER graphics card yet, the games on my machine look just fine!” en masse.

    In the meantime, I’ve actually never upgraded a graphics card in any of my machines, but I’m one of the odd cases that couldn’t afford to upgrade during the early 00s and then stuck to laptops since.

    So if a 486 is a 486 again, and it can play Portal 2, then I know what I’m getting next.

    • bill says:

      Bring back QEMM and boot disks!

      But I agree with you. And I’ve been on laptops for a while now too. As, in fact, are all of my friends and family. I don’t know a single person who has a desktop these days.

      (side note: while shopping in Tokyo for a new laptop, in a 6-storey electronics store with more laptops than you’ve ever seen in your life, I found 5 possible laptops that had decent, non-integrated graphics. So if I was less informed I’d have had a 0.01% chance of buying a laptop that could play most games.)

      I think that the recent massive success of the iphone/ipad app store has shown that a lot of people just want easily accessible software that works. See something that looks good – get it. don’t have to check the requirements, google the graphics cards, check the DRM limits, etc…

  34. LoopyDood says:

    The future is Fusion, you say?

  35. RegisteredUser says:

    The only console experience I want brought to the PC is games that should have been made for the PC in the first place and were then made “xxx exclusives”.

    Port those with anytime saves and non-clunky controls, and remove the “press x to win!!11” mechanics, and I’d be all aflutter, as opposed to this whole hardware merging, which will just lead to more single-component dependence.

  36. rocketman71 says:

    I know hats are Valve’s thing, but… who told that guy to wear THAT hat? It looks ridiculous.

    (yeah, yeah, I know he’s not from Valve)

  37. rocketman71 says:

    As for Sandy Bridge, if this can end the notebook tyranny of the shitty GMA 3100 in cheap notebooks (someone should go to jail for that POS), I say good for everybody.

    As for the Hollywood DRM shit they are putting in the chip, that I’m not so keen on, so I’ll cautiously wait for AMD’s answer.