At Sony’s interminable live conference revealing the PS4, it was announced that the console is built on an x86 CPU and includes an “enhanced PC GPU”, along with a local hard drive (although no mention of an SSD). And it comes with 8GB of GDDR5 memory. Also announced were other features with that nagging sense of familiarity, including live streaming of games and the ability to stream yourself playing games. It’s a PC!

So, we say welcome Sony! Glad you’ve decided to join us.


  1. kwyjibo says:

    Hopefully it’ll mean even fewer third party console exclusives given that porting costs will be minimal. Shame that The Witness is getting timed exclusivity though.

    • Tridae says:

      This would also make a PS4 emulator easier to create.

      Has no one picked up on the total silence regarding discs? There’s a huge emphasis on downloading content and games but not a single mention of discs. They stressed being able to download before you buy or partial download as you play.

      I’m calling it now, no disc drive in the PS4. I think the console wasn’t shown partly for this reason.

      • kwyjibo says:

        Yeah, I’m sure we’ll get an emulator for the PS4 and XboxNext before the generation is over.

        They didn’t mention discs once, but I’d be surprised if it didn’t support a Blu-Ray drive. Sony have invested too much into that to sunset the tech already – has Blu-Ray even overtaken DVDs yet?

        EDIT – Specifications released, includes Blu-Ray – link to

        • Hoaxfish says:

          I think it’s probably too early for a 100% digital download console… there are still “first world” countries where bandwidth costs, stability, etc. would cause issues. That’s before you get into other markets.

          But they did talk up stuff like downloading a game (one they think you might buy) before you’ve actually bought it, a dedicated onboard chip for downloading, and downloading while the system is “turned off” (i.e. the old tactic of pretending your system’s boot time is fast because the system is never technically “off as in not drawing power from the mains” off)

          • Malibu Stacey says:

            (i.e. the old tactic of pretending your system’s boot time is fast because the system is never technically “off as in not drawing power from the mains” off)

            What, like the PC standard ATX which has been in use since 1995?

        • toejam316 says:

          As much as I’d like to think so, I’m doubtful. We still don’t have a fully functional Xbox emulator which was rocking a Pentium 3 and a GeForce 4. And I really do mean it was rocking a Pentium 3 – there are tutorials out there to upgrade the CPU. On the plus side, it’s less like an emulator and more like WINE – an abstraction layer translating the calls into a valid call for that OS.
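          toejam316’s WINE comparison can be sketched as a toy call-translation layer: rather than emulating the hardware, each guest API call is dispatched to an equivalent host routine. Every function and call name below is hypothetical, purely to illustrate the idea:

```python
# Toy sketch of WINE-style high-level call translation: instead of
# emulating hardware, each guest API call is mapped to an equivalent
# host implementation. All names here are made up for illustration.

def host_draw_text(text: str) -> str:
    # Host-side stand-in for a real graphics call.
    return f"[host render] {text}"

def host_play_sound(name: str) -> str:
    # Host-side stand-in for a real audio call.
    return f"[host audio] {name}"

# Translation table: guest API name -> host function.
CALL_TABLE = {
    "XDrawText": host_draw_text,
    "XPlaySound": host_play_sound,
}

def translate_call(guest_call: str, *args):
    """Dispatch a guest API call to the matching host routine."""
    try:
        return CALL_TABLE[guest_call](*args)
    except KeyError:
        raise NotImplementedError(f"no host shim for {guest_call}")

print(translate_call("XDrawText", "Hello"))  # prints "[host render] Hello"
```

          The practical upshot is the same one toejam316 notes: coverage is the hard part, since every guest call needs a hand-written host shim.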

          • mckertis says:

            To be fair, nobody really cared for xbox180, neither then, nor now. How many exclusives of any interest does it have ? Two ?

          • roryok says:

            I certainly cared for how moddable it was. I still occasionally use it!

          • Hahaha says:

            mckertis, that is highly amusing seeing as you could do so much with it; being able to rip games with it was a nice touch ;)

        • roryok says:

          On the spammers issue, I don’t know if it’s been mentioned before, but there’s a thing called SURBL checking, where you can check any link against a blacklist of spammy URLs or link-shortening services. If RPS were to implement such a check for links in the comments section, it would certainly cut down on these link-spam bots.
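          For the curious, a SURBL check is just a DNS lookup: the suspect domain is prefixed to the blacklist zone and resolved; getting an answer means “listed”, NXDOMAIN means clean. A minimal sketch, assuming the public multi.surbl.org aggregate list:

```python
# Minimal sketch of a SURBL-style lookup. The domain is prepended to
# the list's DNS zone and resolved; an A-record answer means the
# domain is on the blacklist, NXDOMAIN means it is not.
import socket

SURBL_ZONE = "multi.surbl.org"

def surbl_query_name(domain: str) -> str:
    """Build the DNS name that gets queried against the blacklist zone."""
    return f"{domain}.{SURBL_ZONE}"

def is_listed(domain: str) -> bool:
    try:
        socket.gethostbyname(surbl_query_name(domain))
        return True       # got an answer: domain is listed
    except socket.gaierror:
        return False      # NXDOMAIN: not listed
```

          A comment filter would extract each domain from a posted link and reject the comment if `is_listed()` returns True.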

          • Diving Duck says:

            But where else could I find such quality links to satiate my cut price tat yearnings, or just as importantly, find how to apply for a job that Susan’s sister’s dog’s fleas have which earns 30 bajillionty dollars per minute?!?

          • roryok says:

            ebay and bumper stickers. next question!

        • CoreWolf says:

          Yeah, we’re not ready for 100% digital download yet… at least not in the UK. People on Infinity or cable would probably welcome it, but there are still large areas of the UK where the best you can get is 2-3 Mbps.

          I would know as I live in such an area. They have a plan to improve speeds by 2017. Lucky us, lol.

          • Malibu Stacey says:

            I live in Glasgow, like 10 minutes drive from the city centre & the best I can get on DSL is 2-3 Mbit & that’s when it stays connected as it can drop constantly for hours as the cable between the exchange & the street cabinet is so ancient & decrepit (2 different BT OpenReach engineers have described it in that fashion).
            Thankfully OpenReach say our exchange will be upgraded to optic fibre by June this year along with a few others but there’s tons of others a short distance away which don’t even have an expected date of upgrade.

      • Al__S says:

        Emulator? I’m almost wondering if all it would take is a Steam-esque programme manager and integrated DRM system.

      • Drake Sigar says:

        Yeah, it’s too early to go full digital. Apart from the stability issues in various countries, there’s also the fact that digital sales haven’t been especially promoted in the console market outside of arcade games or old classics, and what modern games are available are often priced the same as the boxed version. Not to mention it’ll cut out the retailers, who arguably bring a lot of cash to the table. Plus it might not be wise to go fully digital when the battle for digital consumer rights (the right to resell, for example) hasn’t started properly yet, and there’s no guarantee as to its outcome.

      • Solidstate89 says:

        You would think so, and yet there’s still no good Xbox emulator despite the first-gen using an Intel x86 CPU and an off-the-shelf nVidia GPU.

      • Mo6eB says:

        About emulating the PS4 – it won’t be that easy. Remember, the Xbox was a PC and we still don’t have a good emulator for that. The biggest issue is all the complex hardware which games can touch directly – PC games are basically written against an imaginary high-level machine, which every hardware vendor then implements pieces of with drivers. On the other hand, you wouldn’t need to interpret machine instructions, except again – specialised ones present only on the PS4. Then there’s the issue of the unified memory architecture, which could necessitate tons of back-and-forth copying between video and main RAM on PCs.
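        The “interpret machine instructions” part really is the easy bit – a fetch-decode-execute loop fits in a few lines. A toy sketch with a made-up two-instruction ISA (everything here is hypothetical; the hard parts, like modelling the GPU and unified memory, are deliberately absent):

```python
# Minimal fetch-decode-execute loop for a made-up two-instruction
# accumulator machine. This is the "easy" interpretation part;
# the surrounding hardware model is what makes emulators hard.

def run(program):
    acc, pc = 0, 0
    while pc < len(program):
        op, arg = program[pc]       # fetch + decode
        if op == "ADD":             # execute
            acc += arg
        elif op == "MUL":
            acc *= arg
        else:
            raise ValueError(f"unknown opcode {op}")
        pc += 1
    return acc

print(run([("ADD", 2), ("MUL", 10), ("ADD", 1)]))  # prints 21
```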

        Still, I’m prepared to be blown away by a genius high-level emulator including a static code analyser to complement a brilliant memory manager; whose author we’ll all hail as the next messiah, at least until SONY’s flying demonic ape-lawyers swoop down from the sky and carry him or her to the top of the SONY tower in the middle of SONYdor to be judged by SONYron as guilty for all the crimes of mankind and sentenced to being thrown in the lava lake of mount SONYdoom.*

        * – any/all of PS4, SONY, “SONY’s flying demonic ape-lawyers”, “SONY tower”, SONYdor, SONYron, SONYdoom might or might not be registered trademarks, patents or copyrights of SONY corporation.

    • rje says:

      It’s only a console exclusive, not a system exclusive. So a PC release should still be okay.

      source: link to

    • kwyjibo says:

      Look at it the other way around – how fast can we get this thing hacked and turned into a Steambox?

    • jrpatton says:

      There will certainly be fewer hardware mandated exclusives. Most console exclusives from here on will be because of back-room deals, or first party games.

      On the plus side, games made for one console that won’t be made on the other may still be released on the PC on a delay. Game devs still don’t see the PC as a competing gaming platform to the consoles, so there are fewer reservations on releasing “exclusives” on the PC.

      Also, maybe we’ll even get a PS4 emulator in 5-10 years.

  2. pupsikaso says:

    Is this real?

    • Phantoon says:

      Well, it’s words that came out of Sony people’s faceholes, so yes. I would venture that they did not lie about the graphics.

    • jrpatton says:

      I assume you mean, “Is it literally a PC?” Technically yes, but it’s not like you can use it as a general-purpose computer.

      What RPS is trying to get at is that the PS3 was a unique piece of hardware, with the Cell’s sort-of-8-core-but-not-really processor. The PS4, on the other hand, has a much more traditional PC hardware architecture, with RAM, an APU, and a GPU. Granted, APUs are new, but it will be interesting to see how they are utilized.

    • zbmott says:

      I’m sorry, Ms. Jackson.

  3. Brun says:

    Quickly! We must suppress this “PS4”, lest the unwashed console peasant masses gain gaming literacy and threaten the landed gentry!

    • bstard says:

      Oh noos, someone repress these serfs and put them back in their place. The aristocracy should remain pure!

      • Tams80 says:

        A revolt! These ruffians are threatening one’s pure gaming parlour!

  4. jellydonut says:

    It’s a laptop with a lot of memory in a console box.

    On the one hand, great, PC ports will be more plentiful and suck less.

    On the other hand, the bar is now so low that we’re going to be stuck at a plateau even faster than after the release of the previous generation of boxes.

    • eclipse mattaru says:

      PC ports will be more plentiful and suck less

      You seem to underestimate the power of platform-exclusivity stupidity, my good man.

      Though ports should suck less, that much is true. Then again, I’m sure plenty of developers will find a way to cock them up somehow.

    • Mrs Columbo says:

      The traditional, cloying sensation of buying a new PC only to realise that it’s instantly obsolete has, for me, subsided in recent years. No doubt that’s because of the levelling-out of system requirements which are a by-product of console development.

      I reckon, on balance, it’s a good thing: I acknowledge that some game genres can become homogenised partly because of the influence of the console market, but the games that work best on PC continue to do so.

      • sinister agent says:

        The “plateau” of not having to constantly piss money up a wall just to have playable games is the best thing to happen to the PC for a decade.

    • Lemming says:

      I admit, I lol’d when, during the conference, he referred to it as a ‘supercharged PC’ with the picture behind him showing an x86 processor.

      The forced social stuff, while expected with this gen, still made me retch. The whole marketing spiel was built around the premise that we, the consumer, had demanded more social connectivity in games. Did I miss the meeting? Gamers demanded nothing of the sort!

      Also, that streaming old games thing sounds good in theory, but most of the planet are connected to the net via ISPs that throttle and/or have download limits. That kind of service is completely overreaching. We aren’t in any position to be doing that kind of thing yet.

      • Thants says:

        I, as a gamer, demand less social connectivity in games.

      • FriendlyFire says:

        The funny thing is that rumor has it the CPU’s actually a Jaguar-based AMD processor. That’s AMD’s latest mobile architecture, which is designed to fight… Intel’s Atom.

        There’s a reason why they’re just quoting the number of cores and not much else.

        • Adekan says:

          To be fair, that’s kind of a hallmark of AMD processors in general. Look at all the cores! Never mind the fact that they’ve been years behind Intel in the consumer market since Sandy Bridge launched.

          Even in server boards, a Xeon is going to give you more bang for your buck than an Opteron. I can’t fathom why they didn’t use the new low power consumption Ivy Bridge I series.

          • c1515252 says:

            Not that Ben is doing great right now either.


          • roryok says:

            I would guess AMD gave them some kind of licensing deal that Intel would not.

          • Low Life says:

            AMD gives them both the CPU and GPU, and what’s better, they might even do that in one cheap chip (this is how the Wii U is handled). This gives AMD a huge advantage in negotiations vs. asking Intel for a CPU and Nvidia for a GPU.

            The latest info on the next Xbox is that it’s going to have the same solution from AMD in it.

          • Tams80 says:

            The AMD processors are probably cheaper and Jaguar is an APU, so will have better GPU performance. Sony may be using hybrid crossfire with the AMD GPU they are using (they won’t be using a top end AMD GPU, so it should be possible). As a mobile processor, it also requires less power.

            I suspect the main reason is cost and low power consumption. Intel know they generally have better processors and charge accordingly. Ivy Bridge would likely also be too power-hungry. Consoles really need to keep costs and power consumption down, as the last/current generation showed.

          • SuicideKing says:

            Because Intel wants the MONIES!

            Seriously, Intel would want a profit margin, and then the GPU would have to be sourced separately. Using a dual-core, HT-enabled Ivy model would cost at least $100, and the iGPU would be crap.

            If Intel had quad core Atoms based on the 22nm process and GT3e HD4600 graphics, it might have been the better option.

            Though honestly I doubt power consumption is an issue here; it’ll be plugged into the wall.

          • BarneyL says:

            Power consumption can be an issue, remember power ultimately ends up as heat and that’s caused plenty of issues for the last round of consoles.

        • mattevansc3 says:

          That rumour seemed to be based on a “credible source”.

          Not taking into account that Eurogamer’s hardware articles are, for lack of a better word, pants, it’s one of those rumours that people ran with even though it doesn’t stand up to scrutiny.

          The Jaguar core is a quad-core upgrade to the Bobcat core, which has performance equivalent to an Athlon X2 (which was released around a year after the Xbox 360, to put it in some perspective). It’s designed for netbooks, phones, tablets and all-in-one units. This comes direct from AMD’s own press release.

          Eurogamer claimed it’s “customised” to handle 8 cores and is a “low power high performance core” (even though AMD state otherwise), that it’s a CPU (the Jaguar and Bobcat cores are designed as APUs) and that the PS4 has a separate Northbridge (APUs don’t). This completely ignores the fact that AMD already has a natural high-performance 8-core design that works as a CPU, uses a Northbridge, is on the market and is heading for a die shrink: the Piledriver core.

          I’d put no stock in the PS4 being powered by a Jaguar based “CPU”.

          Re-edit to delete edit as mis-read press release.

          • Unruly says:

            Well, the Sony press release that Lemming linked confirms that it is, in fact, a custom 8-core Jaguar chip. It also confirms an optical drive (6x BD/8x DVD) and just about all the general stuff, aside from the exact chipset of the GPU. What I’m interested in is whether the HDD is going to be like it is with the PS3, easily replaced by the end-user with any retail HDD, or if they’re going to go the MS route and switch to some proprietary crap. I mean, it is Sony we’re talking about. They not only love their proprietary crap (looking at you, Memory Stick!) but they also love to screw customers over (rootkits, OtherOS, etc). Doubly so if they can do it after the customer has bought the item in question.

      • kibble-n-bullets says:

        I get the vibe that executives believe that young people are unwilling to be socially disconnected from their phones at any point. Shame. There’s obviously a few brains working upstairs at Sony though: I saw a pushing of existing and successful social networking platforms. They’re not going to top smartphones (BBM, whatever an iPhone uses) with any proprietary bullshit.

        • Lemming says:

          Oh, there is definitely some genius brains behind the marketing. The most cunning bit was when they showed streaming to a Vita, clearly in an effort to shift more units of the ailing handheld.

      • Milky1985 says:

        I just realised, they said x86 architecture, didn’t they, not x86-64. I know we can assume it is, because otherwise that 8 gigs of RAM is useless, but they probably should have given us that info :P

        • Joe-Gamer says:

          32-bit Windows can’t use 8GB of RAM, but a console, where devs have direct access to the hardware, should have no problem using it all.

          • Malibu Stacey says:

            Wrong. 32-bit Windows (XP & later) can use Physical Address Extension, which means a 36-bit physical address space, allowing for substantially more than the 4GB limit.
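            The arithmetic behind the 36-bit claim, for reference: each extra physical address bit doubles the addressable memory.

```python
# Each extra physical address bit doubles addressable memory:
# 32 bits -> 4 GB, PAE's 36 bits -> 64 GB.
def max_addressable_gb(address_bits: int) -> int:
    return 2 ** address_bits // 2 ** 30  # bytes -> GiB

print(max_addressable_gb(32))  # 4
print(max_addressable_gb(36))  # 64
```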

    • Nesetalis says:

      Don’t forget the fact that it’s still a walled garden. The US government has ruled that you are not allowed to modify consoles or DVD players or anything of the sort without permission from the manufacturer… and a few other countries have gone that way as well. No modding, no 3rd-party software – no, it’s not a personal computer.

    • stupid_mcgee says:

      More specifically, it will open up the possibility of easier Linux ports, what with Sony using OpenGL, even if it is a proprietary version.

      Given how much Valve has worked with Sony over the past few years (Steam is on PS3 and Sony let them do cross-platform with Portal 2), I’m starting to wonder if this news of the PS4 architecture might have been part of the reason for Valve’s foray into Linux. It could also bode well for Sony’s computer division, since they could sell gaming laptops and PCs with Ubuntu instead of Windows.

  5. The Sombrero Kid says:

    the announcement was like watching E3 2012 except this time I’m underwhelmed

    • TheIronSky says:

      You mean you’re not usually underwhelmed by E3 announcements?

      • rb2610 says:

        2010 I think was pretty whelming: there was the Assassin’s Creed 2 announcement, which had so many new features compared to AC1 that it was truly impressive at the time.

        Also there was the Kinect, Move and Wiimote+ all announced in one year. While they turned out to be underwhelming as products, it was still quite exciting.

    • Deadly Sinner says:

      One bit of good news out of the whole thing was that million-object Havok physics demo running on an AMD GPU. Looks like Nvidia’s proprietary PhysX crap has been rendered obsolete.

      • Tams80 says:

        Good. PhysX can go jump into a fiery pit (fuelled by Fermi GPUs).

  6. Kein says:

    But but but… muh share button!

    • Hoaxfish says:

      Do you like Facebook? Despite an absence of Twitter?
      Do you want people to watch you over the Internet?
      Do you want people to play your games, on your console, and complete your games, without them even being in the room?

      The PS4 is for you!

      • Milky1985 says:

        Do you want the console to decide what games it thinks you will like and download them for you?
        Do you want your console to track your play to come up with suggestions for new games?
        Do you want your console to start having a go at you when you don’t play it for a few weeks, complaining that it’s being neglected?

        Then the PS4 is for you!

    • OfficerMeatbeef says:

      I am more resistant to and/or completely uninterested in most “social media connectivity!” bullshit than anyone else, and you can bet my eyes rolled so hard on that “Share button” reveal they actually bored the sockets a bit wider. But I pushed back my cynicism and gave it a chance, and they actually TOTALLY WON ME OVER.

      Basically all the “social” stuff they presented was actually really cool and relevant to video games and some of the stuff that makes them great. Constantly recording play so when something awesome happens you can go back, trim it out into clip form, and share it online? Brilliant. Why would anyone ever want that? I would absolutely not be surprised if something like that shows up in Steam sometime in the future, particularly as they already implemented that screenshot-sharing feature a good while ago.

      Built-in system-level support for Ustream? Excellent, that’s something we’re already starting to see on a per-game basis on PC. Planetside 2 and The Showdown Effect come to mind. Who would ever want to watch people play games or broadcast their own playing? Well, only hundreds of thousands of people. Hell, I have a friend who never bought or played anything on OnLive but used it constantly just to spectate games.

      Being able to help/affect other people’s games? Yeah, that’s not necessarily for “us”, but surely we all have friends who gave up on a great game because they couldn’t clear some section or just couldn’t grasp the mechanics, or maybe you have a parent or sibling who has become interested in playing games but doesn’t quite have the skill level to deal with certain more complex situations? Plus there is a potentially more “hardcore gamer” angle at play there too, like allowing your friends to essentially jump in and act as a dungeon master for a game, etc.

      Yes, this stuff can generally be done on a PC already (well, that last one is a stretch), albeit generally through the use of third-party applications etc. That doesn’t in any way compromise the fact that these are all very smart, very cool, forward thinking ideas they’re implementing, ones that will be immediately accessible and easily usable to anyone using the system and that have infinitely more value and benefit to the actual experience of playing and sharing games than any previous “we have Facebook and Twitter integration so you’ll annoy everyone when it auto-tweets that you just beat a level!!” nonsense.

      • Reapy says:

        Movie clip/stream integration, I thought, was a good idea for the reasons you mentioned. I can get my games up on YouTube, but I have to have Fraps, enough hard drive space and a video editing program to string it all together and upload.

        Granted this still won’t have that power, but I have had moments where I wished the game was recording and I could go back and snag that moment.

        Still, I’m sure it’s one more API for developers to account for when making the game. There really is no reason for most games not to have this already – it’s just that Sony can come down and say YE SHALL HAVE THIS FEATURE.

        Which in a way is why the next gen consoles coming around is good, finally AAA can unleash their vast content generating monies and push games so we can actually use our PC hardware to its full extent.

  7. TheIronSky says:

    Who would’ve guessed I might end up owning a Sony machine? Weird and exciting! Now if only they’d stop making such God-Awful exclusives like Twisted Metal and whatever other crap they tried to force on us in “The Tester.”

    Let’s be honest: anything with a PC-grade processor deserves more than a glorified demolition derby game.

    • ulix says:

      While you can surely be cynical about many of Sony’s exclusives, many of them are actually stellar in a lot of ways. The art design in God of War III and Uncharted 2 & 3 is, without a doubt, better than anything else in the industry (whatever you may think of the rest of these games).

      Also Sony supports a lot of great indie games. And I actually liked Heavy Rain…

      And while GT5 might not be better than Forza – although it took them three times as long to develop as any individual Forza game – I do hope that Polyphony gets their shit together soon.
      And also: Drive Club. That looked cool.

      • GreatGreyBeast says:

        Don’t forget Team Ico.

        • Fiatil says:

          Team Ico’s done a good job forgetting about themselves I think. The Last Guardian is still vaporware and it was announced 4 years ago.

        • dmoe says:

          The same PS3 launch title Team Ico which now looks like it will be PS4 launch title Team Ico.

    • ulix says:

      Let me clarify:
      While I own a PC perfectly capable of running Battlefield 3, Witcher 2 or Crysis 2 in high or “ultra” settings, I still think that Uncharted 2 and God of War 3 are graphically superior, even though they run on 7 year old hardware.

      Sure, they run in 720p, without MSAA and at 30 frames. But if they actually ran at 1080p, at unlocked framerates and with MSAA, there wouldn’t even be any discussion.

      The artisanship Naughty Dog or Sony Santa Monica show in their design is just above anything else, by far. The same was true of Retro (Metroid Prime) during the 6th generation.

      • kharnevil says:

        Sir, I believe you should see an Optometrist.

      • TheIronSky says:

        I politely disagree – I thought God of War was essentially a button-masher stuck in an uninspired bit of fiction and that Uncharted was just Tomb Raider with more story and QTEs. Not that either of them are bad games, and they both certainly have their places (my time spent with them is fairly limited, having only completed God of War 3 and Uncharted 2), but they seem to be entirely uninspired and slightly regurgitated. I would never match Santa Monica or Naughty Dog with the likes of CD Projekt RED (as you mentioned) or some other great PC developer like, uh, I don’t know… Ion Storm? Origin Systems? I’m drawing a blank on the specific example I wanted to use.

        BUT I DIGRESS!

        Seems we just have a fundamental difference of opinions. I look at the gameplay presented in God of War, Uncharted and other Santa Monica games as the de-evolution of gaming, whereas you see their “artisanship” and artistic design prowess as shining pillars of the industry.

        I’d rather they focus on implementing some interesting gameplay elements instead of trying to force whatever entirely contrived and unoriginal story down my throat through QTEs and shiny explosions. But that’s just me. And I’m certainly not implying that they can’t make good games – The Last of Us is on my ‘watch’ list – but I do not see them as the paragons of design that you see them as.

        • ulix says:

          You didn’t get my point.

          I explicitly stated that you can think whatever you like about the rest of these games, apart from their artistic, purely visual aesthetic appeal.

          I greatly prefer CD Projekt games in terms of gameplay over, let’s say, Naughty Dog or Sony Santa Monica games (although those ARE in fact amazing action games).

          But to argue that, from a purely “visual artistry” standpoint, The Witcher 2 (or BF3, or Crysis 2, or whatever) is even remotely close to God of War III or Uncharted 2 (and 3) is just laughably silly.

          In terms of “art design” these two (three, four…) games just stand high above the rest of the industry, just like God of War 1 & 2 and Metroid Prime 1 & 2 stood far above the rest during the 6th generation of consoles (explicitly including PC games).

          I repeat again: I am not talking about gameplay, not at all.

          • TheIronSky says:

            I think kharnevil was right and you should go get your eyes checked. Or whatever astral visage, spectral analyzer, or prehensile tentacle you use to look at things.

          • ulix says:

            Sorry guys, there’s just no discussion to be had here.

            The fact that God of War III & Uncharted 2 (& 3) have superior art design is indisputable. I do love The Witcher 2’s art design. It’s great, don’t get me wrong.
            In fact it’s far above industry standard – just not nearly as far above it as Naughty Dog’s or Sony Santa Monica’s art design.

            But even from a purely technical standpoint these games are incredibly impressive. Uncharted 2 still has the most convincing (meaning “realistic”) snow and ice graphics in ANY game to date, and the same goes for Uncharted 3’s sand/desert graphics.


          • tangoliber says:

            I agree with you for the most part. Art design and artistic detail are very important, and for that reason Uncharted 2 looks better than Crysis 3 no matter how good the PC is. It doesn’t matter how impressive the engine and hardware are if you don’t have the art budget to put unique detail into the environment.
            I can’t talk about The Witcher 2 though, because I thought it looked pretty good but couldn’t stand more than an hour of it, since it wasn’t letting me participate in any gameplay.

          • roryok says:

            Wow, I was seriously impressed with how pretty that looked. I hardly think it’s down to the platform though – just a bunch of talented guys tied to an exclusivity contract and given infinity+ money. Also, that guy can’t aim an RPG for shite.

          • anark10n says:

            You still haven’t said what’s so special about the art design in either, or why it’s so much better. You’ve just kept stating that it is, and that’s it – made more apparent by the fact that you’re arguing a subjective point. That video you linked to didn’t help your point as much as you thought it would.

            And since they don’t actually run at 1080p with MSAA, they can’t even be compared to any of the PC games you’ve mentioned from a technical standpoint.

          • JakobBloch says:

            To be honest, Ulix, that video was nice, but if that is your best example of graphical design from Uncharted, I fear they just can’t compete with The Witcher 2 (if it is not the best example, then… maybe). The clip had the impressive locale as the ONLY impressive bit. The rest was fairly boring and uninspired (while the action was blood-poundingly exciting). In many cases the graphical design in this sequence was actually held back by the graphical capabilities of the system. The horses, the vehicles, the enemies were all very boring. Even the truck with the crane looked flat and boring (though it was definitely more interesting than the rest).

            Now take The Witcher 2.
            In that game you have strict attention to detail in all things. The guards of the first little town are gloriously crafted: stained tabards, rusted chainmail, chubby, with bad posture. They are not just slapped-together cronies; they are people with a certain… lifestyle. Interesting, in other words. The town likewise – you can almost smell the refuse in the streets. The plank walls and the well-crafted doors (the bane of the game) give you the feel of dilapidation; you can feel the moisture gnawing at the wood. And let us not forget the main characters. Geralt is the picture of a ravaged professional, with all that entails (scars as well as equipment), and no one can dispute the loving care that was put into Triss. I could go on about Vernon and Iorveth, but I would rather move on to the trolls… I mean, there we have some cool character design, and even in different versions, with one type decorating themselves with branches and the like while the other is more of a… boozer.
            I could go into a big thing about the environments, but let’s cut it down to them being varied and very evocative (for the most part… some parts of chapter 2 are fairly boring)

            The problem I see here is equating graphical design with bombastic locations. The only part of the clip you showed that was actually impressive was the locale. The thing is that there is much more to it than that, and even so, other games do it better as well – even other console exclusives (Red Dead Redemption). The Uncharted games are good games and pretty games as well, but saying they are at the top of graphical design is like saying Subway is healthy fast food because it is healthier than McDonalds.

    • Daoler says:

      I freaking love demolition derby games! Don’t diss them!

    • rockman29 says:

      Twisted Metal was my favourite game of last year…. >.>

  8. AraxisHT says:

    Isn’t that every console?

  9. Sokurah says:

    Great, cool news. I’ll buy one in 10 years when it goes for nothing.

    • Greggh says:

      HAHAHAAH I did this with the PS2 – BEST DEAL EVER! I got to play all PS2 exclusives :D

      • SominiTheCommenter says:

        I installed an emulator on my PC a few years ago. I get all the good and none of the bad.

        • Greggh says:

          Not only do you have no grammar skills, you have no sense of ethics :'(

          • Supahewok says:

            Honestly, when a system and its games are no longer supported I don’t blame anyone for using emulators for them. Sony/Nintendo/Microsoft aren’t going to get their share anyway, and a bunch of games get tough to track down. In some cases, they become ridiculously expensive. (my brother bought Final Fantasy 7 for around a hundred dollars, way back before it was rereleased on PSN) I consider them to be akin to abandonware, although I’m sure that that’s not legally correct.

          • aliksy says:

            Emulators offer a better experience than the actual console (assuming you don’t get a performance hit). You get save state, fast forward and whatever input device you fancy.

            If console makers had half a brain, they would sell working emulators and games for current-gen systems (at reasonable prices).

          • MattM says:

            You can use emulators with your own games. I prefer PCSX2 over my PS2 for the higher resolution and texture filtering. FF X looks a whole lot better.

  10. SlyDave says:

    The 8GB system memory isn’t GDDR5. It’ll be DDR3.
    The 2.2GB video memory is GDDR5. The article states this incorrectly.


    [Edit: I am wrong, it appears all the memory in the PS4, system and video, will be GDDR5 – good for bandwidth, bad for latency; given it’s a console, this seems like a really good decision]

  11. mehteh says:

    Thanks again to consoles: yet another supposed-to-be PC-only title gets delayed in order to be exclusive to a console for a short period. Thank you for making me wait for The Witness.

  12. Zanpa says:

    It’s so much like a PC, it’s even Facebook.
    Social everything. What, you want to play video games? No, be social, man. Talk with your friends instead. See, we even put a dedicated “Share” button on the controller!

    • GiantPotato says:

      Well, give them credit for drawing a line in the sand at least. As of today, people are going to know whether they want a PS4 or not.

  13. Tusque D'Ivoire says:

    Also Kotaku reports that Jonathan Blow’s The Witness will also be a PS4 exclusive (or at least a debut). Sad to hear he hasn’t learned from the experience of striking a deal with MS for Braid. But still, I have high hopes for a speedy PC release

    • RockinRanger says:

      I think he said the PS4 was the only console it would be on at launch, so it could be on PC then as well.

      EDIT: link to

      At about 2:40 he says, “At the release window the Playstation 4 will be the only console that The Witness is on.”

    • skinlo says:


  14. GamerOS says:

    In other news, they announced Destiny is a PS4 exclusive; now we all know why they said what they said.
    I hate it when I’m right…

  15. Paul says:

    Not a single interesting game (to me personally) presented… OK, Watch Dogs could have potential, but Ubi will surely fuck it up.
    I am underwhelmed.

  16. WoundedBum says:

    8GB GDDR5. It’d be nice if we could have that in PCs too.

    Anyway, I thought most of it looked good! Watch Dogs and Destiny.

    • Sakkura says:

      A typical gamer PC will have 8 GB DDR3 and 1-2 GB GDDR5.

      • WoundedBum says:

        According to Steam’s hardware survey, only 20-ish percent have 8GB.

        And this is a console, with faster RAM, so it will probably be put to better use.

        • PopeRatzo says:

          By the time this is in stores and there are sufficient games for it, the PC specs will increase, too.

          Every Christmas, the PC specs go up. When was the Steam chart made?

          • WoundedBum says:

            Jan 2013.

            I know the PC will obviously be more powerful, but it’s still a good start for the PS4.

        • Sakkura says:

          Steam includes old PCs. This is a new console, it’s only fair to compare it to new PCs.

          Oh, and having faster RAM won’t really matter. You only need 1-2 GB fast RAM for the GPU; the CPU doesn’t benefit as much from faster RAM, let alone GDDR5 which is specifically optimized for GPU usage.

          • WoundedBum says:

            But you said a typical PC gamer. What is that? Someone using Steam? Someone who only built it this year?

          • Sakkura says:

            A typical gamer PC, not a typical PC gamer. I hope they go together though, they’d both be terribly unhappy otherwise. :,(

            Anyway, I feel it is only fair to compare the new console to new PCs. If we mix in all manner of older PCs we might as well mix in old consoles and make a grand old mess of things.

          • mckertis says:

            That’s a typical gamer PC among the people who took part in the survey, not the average PC of the entire world, or even of all Steam users.

        • El_Emmental says:

          GDDR5, based on the DDR3 design (but changing some design decisions to fit GPUs’ goals), has “better” speed (it can read/write more per second – see it like cargo capacity), but its timings (the time between a chip asking to read or write and the action actually being done) are much worse.

          It’s perfectly fine for GPUs, as they can move on to other calculations if the timings make the initial calculation take more cycles than expected, while it will slow down CPUs (which have a more linear way of doing calculations) whenever the timings cause a problem.

          The PS4 will probably have a little bit of extra dedicated memory as a buffer for the CPU, while its AMD Jaguar CPU will probably get GDDR5-friendly memory controller/cycle management, but it doesn’t change a thing regarding the GDDR5: it’s like DDR3, but it favours capacity per second over timings.

          Regarding the console vs PC “war”, you can easily compensate for the lesser “cargo capacity” by getting a 2x8 GB kit (goes for $80/£50 for gaming Corsair series), while the worse timings of GDDR5 can’t be compensated for (even if the platform using it allows adding more RAM).

          Also, DDR4: the JEDEC wrote the final specifications in September 2012, so expect it in your rig 12 months later.
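
          The “cargo capacity” comparison can be made concrete with the standard peak-bandwidth formula: bus width in bytes times effective transfer rate. A rough sketch, assuming the widely reported (not officially confirmed) 256-bit/5.5 GT/s PS4 configuration against a typical dual-channel DDR3-1600 desktop:

```python
# Rough peak-bandwidth sketch: bytes per transfer * transfers per second.
# The PS4 figures are the widely reported ones, not official Sony specs.

def peak_bandwidth_gb_s(bus_width_bits: int, transfer_rate_gt_s: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return (bus_width_bits / 8) * transfer_rate_gt_s

# Reported PS4 setup: 256-bit GDDR5 at 5.5 GT/s
ps4_gddr5 = peak_bandwidth_gb_s(256, 5.5)   # 176.0 GB/s

# Typical 2013 desktop: dual-channel (128-bit) DDR3-1600 (1.6 GT/s)
pc_ddr3 = peak_bandwidth_gb_s(128, 1.6)     # 25.6 GB/s

print(f"GDDR5: {ps4_gddr5:.1f} GB/s, DDR3: {pc_ddr3:.1f} GB/s")
```

          On raw throughput the GDDR5 setup is several times ahead; the latency trade-off described above is the separate cost.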

      • -Spooky- says:

        Wait… are you saying… I’m totally OP with my 16 GB DDR3… *shocking*

    • marsilainen says:

      Having GDDR5 as main RAM is not actually a good idea. It is not directly comparable to normal DDR memory, as it is designed to work with embedded systems like graphics cards. The five does not mean that it’s two generations ahead; it’s actually a totally different technology.

      • WoundedBum says:

        I don’t think it would be a good idea in a PC as it is now, but I think it’ll work fine in a console, because despite its similarities to a PC it is still different.

        • marsilainen says:

          It “works” on a console because the CPU and GPU share the memory. Sony wants the GPU to benefit more from the memory, but it comes at a cost for the CPU and other systems. If MS uses DDR3/eSRAM/whatever, their overall performance might be slightly better, but it might show in slower GPU speed.

          In PCs, it’s better to have dedicated memory for the system and the GPU, and not share the same type for everything.

      • jalf says:

        how/why would DDR3 perform better?

        GDDR does have limitations, yes, but it is *fast*.

        The main difference is that DDR3 connects to a shared memory bus, with a memory controller arbitrating access between a number of devices.

        GDDR is point-to-point. Each memory module is wired straight up to its own dedicated controller. The processor effectively has a direct link to RAM, which cuts out a lot of latency and allows for frankly absurd bandwidth. Don’t underestimate that.

        Having dedicated fast memory close to the processor (which is what GDDR gives you) is a pretty big deal.

        I’ve been unable to find any hard information on just *how* bad the latency of GDDR5 is (do you have any hard info on this?), but (1) the memory bandwidth is *vastly* better, (2) the latency can be partially mitigated by the CPU’s cache, and (3) the PC setup suffers a lot of latency because of how physically distant the RAM is from the CPU.

        And finally, there’s a lot to be gained by having CPU and GPU on the same chip (and sharing the same memory — as long as there’s sufficient memory bandwidth, which is what GDDR5 makes possible)

        Honestly, it looks like a pretty attractive setup to me. The main sacrifice compared to a PC is not performance but upgradeability. And in a console, that was never really an option in the first place.
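
        A simple first-order model shows why the answer depends on the access pattern: time ≈ latency + size/bandwidth. The latency figures below are illustrative assumptions (hard GDDR5 latency numbers are, as noted, scarce), not measured values:

```python
# First-order memory access model: total time = latency + size / bandwidth.
# Latency numbers here are illustrative assumptions for the comparison only.

def access_time_ns(size_bytes: int, latency_ns: float, bandwidth_gb_s: float) -> float:
    """Estimated time to complete one memory access, in nanoseconds."""
    return latency_ns + size_bytes / bandwidth_gb_s  # bytes / (GB/s) = ns

# Assumed: DDR3 ~50 ns latency / 25.6 GB/s; GDDR5 ~100 ns / 176 GB/s.
for size in (64, 4096, 1 << 20):  # cache line, page, 1 MB texture chunk
    ddr3 = access_time_ns(size, 50.0, 25.6)
    gddr5 = access_time_ns(size, 100.0, 176.0)
    print(f"{size:>8} B: DDR3 {ddr3:9.1f} ns, GDDR5 {gddr5:9.1f} ns")
```

        Under these assumptions a single 64-byte cache-line fetch (the CPU-style pattern) favours the lower-latency DDR3, while bulk transfers are dominated by the bandwidth term and GDDR5 pulls far ahead, which is exactly the GPU-friendly trade-off being debated here.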

  17. Clavus says:

    Wasn’t there supposed to be a Battlefield 4 trailer? :(

  18. Premium User Badge

    Aerothorn says:

    While I appreciate the snark, in all seriousness: does this mean RPS plans on covering PS4 games? That would make it within the site’s remit.

    • PopeRatzo says:

      It better not.

    • baby snot says:

      If you can upgrade the hardware then yes. If you can’t upgrade the hardware, then it’s still not a PC. I doubt you’ll be able to upgrade the hardware in this box, but well, let’s see what happens.

      • Hahaha says:

        Going by your post they should have been reviewing original Xbox games for a long, long time now.

    • FriendlyFire says:

      It uses PC hardware, but it’s not a PC. No user upgradable hardware, no user replaceable operating system, limited tweaking capabilities, intrusive DRM, single-purpose focus… I’d be very surprised to see RPS cover it for anything more than saying “Ha! Got you!”.

      • Srethron says:

        That almost, almost sounds like the Macs of 10 years ago. Compare with an eMac, say.

        • MattM says:

          Mac OS, Windows, and Linux are all open OSs where anyone can create and distribute software. The consoles all have closed OSs where you must go through the platform owner if you want to release software. I think that’s one of the major differences between a PC and a console.

  19. GameCat says:

    I was waiting the whole time for some The Last Guardian info.

    Nothing. ;_;

  20. skinlo says:

    The entire thing was pretty underwhelming. I saw nothing we as the master race have to be worried about.

  21. Fallward says:

    Just finished watching the entire conference.

    Hardly ANY REAL gameplay footage, far too many CGI-doctored ‘videos’ that do not represent the final product at all. You can bet the PS4 will be underwhelming, just like the announcement was.

    Still, the GDDR5 system RAM and the potential of updated hardware are exciting. I guess E3 will tell us more. Although I’m an avid PC gamer with little to no interest in Sony, I will be purchasing one if Gran Turismo, Final Fantasy, and Tekken all look good.

    • PopeRatzo says:

      Gran Turismo, Final Fantasy, and Tekken already look good.

      Save your money and buy a kick-ass video card, because the future of gaming is on the PC.

      • yourgrandma says:

        I’m sorry, Tekken does not look good at all. It’s probably the lowest-resolution console game of the 360/PS3 generation.

        • Llewyn says:

          And neither does GT5, at least in motion. That amount of screen tearing should have been a clear indication to someone that they needed to dial back on the shinies in favour of some semblance of a stable frame rate.

  22. alilsneaky says:

    A closed PC with way more expensive games and no m/k.

    It’s not even halfway there.

  23. Desmolas says:

    The whole thing was the most underwhelming thing i have ever seen. RIP consoles. You were a’ight while you lasted I suppose.

    • Clavus says:

      If that let you down, you should’ve seen the last few Microsoft E3 conferences.

      This presentation was loaded with details and game announcements. It was pretty good.

    • Stochastic says:

      I guess you didn’t see that one E3 conference where they had people playing laser tag in the middle of the presentation.

      • SominiTheCommenter says:

        Don’t forget lady boners and the like. Ubisoft hires the shittiest PR people.
        Watch Dogs is looking great, actually.

  24. LazyAssMF says:

    Well, I just watched about half of the last hour and I’m not impressed AT ALL. Games look like PC games look now, and I really believe these shitty new consoles are gonna be behind PCs in one, maybe two years. I saw Watch Dogs and I think Sleeping Dogs already looks better, so… I’m disappointed, a lot. I thought games were gonna look like… WOW, but they just look like… meh.

    • Sakkura says:

      The tech is already behind PCs. The GPU supposedly has almost 2 teraFLOPS performance… well that means it falls just below what a Radeon HD 6870 offers, and that’s an upper-midrange PC GPU from the last generation (launched in 2010). Of course, comparing FLOPS figures can be misleading, but I think it’s fairly indicative of the actual hardware performance. Extra optimization work may help, but it’s still going to struggle to match a mid-range gaming PC bought in 2012-13.
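
      The “almost 2 teraFLOPS” figure comes from the usual shader-count arithmetic: peak FLOPS = shader units × clock × 2 (one fused multiply-add per cycle). A sketch using the commonly reported (at the time, rumoured rather than confirmed) PS4 GPU configuration against the HD 6870’s published specs:

```python
# Peak single-precision throughput: shaders * clock * 2 ops/cycle (FMA).
# PS4 figures were the widely reported, not officially confirmed, specs.

def peak_gflops(shader_units: int, clock_ghz: float) -> float:
    """Theoretical peak single-precision GFLOPS."""
    return shader_units * clock_ghz * 2

ps4_gpu = peak_gflops(1152, 0.8)   # ~1843 GFLOPS, i.e. ~1.84 TFLOPS
hd6870 = peak_gflops(1120, 0.9)    # 2016 GFLOPS for the 2010 card

print(f"PS4 GPU: {ps4_gpu:.0f} GFLOPS, HD 6870: {hd6870:.0f} GFLOPS")
```

      Which is why it “falls just below” the 6870 on paper, with the usual caveat that FLOPS figures ignore architecture and memory differences.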

    • Stochastic says:

      It’s far too early to be judging. At the very least let’s wait until E3 to pass judgement on this sort of thing.

      Also, Watch Dogs was actually running on a high-spec PC, not a PS4. Keep in mind that it’s also being designed to run on current-gen systems, so it’s probably not going to be pushing boundaries in the same way that new games made exclusively for next-gen systems will be.

  25. PopeRatzo says:

    All the faster to hack it, my dear.

    Let the console-modding BEGIN!

  26. amateurviking says:

    Given the x86 architecture, am I right in thinking that making an emulator for PS4 and the nextbox would be an easier proposition than the current powerPC-based efforts?

    It would be kind of amusing if there was a functional emulator available within a year or two of their release.

    • kwyjibo says:

      It should be a doddle compared to emulating the Cell on x86.

    • Baines says:

      Emulators are hitting a speed issue these days. CPU speeds don’t increase the way that they used to, so you cannot just throw a faster machine at a problem. More cores help, but cannot do everything.

      MAME (admittedly never a speed demon) is hitting that issue with more recent arcade games, and the issue is also supposedly affecting attempts to emulate more recent consoles. People have some PC-based arcade games running on home PCs, but they weren’t actually emulated, rather they’ve just been altered to run directly on home PCs.

      As for systems being quickly emulated, that generally doesn’t happen either. The Wii was emulated so fast primarily because it was hardware-wise largely an upgraded Gamecube, and the Gamecube had already been emulated.

      • FriendlyFire says:

        This isn’t the same at all though. The most costly step in emulation is the translation between different architectures. With this new generation, it would appear that two of the three primary consoles will share the same architecture as a run-of-the-mill PC. This will dramatically speed up the emulation and should allow for a PC of similar specs to emulate PS4 or XboxWhatever games.

        Heck, you could even imagine a hypervisor for the whole OS running on your PC.

  27. Inigo says:

    No sight of the console itself, but at least we got a gay Japanese midget, a bald simpleton about to burst into tears because he saw a CCTV camera burning the American flag, and Johnathan Blow desperately trying to make 2D line mazes look interesting. Good-o.

  28. WoundedBum says:

    “Aw, the PS4 gets to have last year’s PC games!”

    Saw John tweeted that, but what about Brutal Legend coming to PC? We get 2+ year-old console games too.

    • PopeRatzo says:

      Yeah, for ten bucks.

      I’m all for getting 2 year-old console games for bargain basement prices.

      You think the PS4 is going to get last year’s PC games at such low prices?

      Also, let’s remember how different the economy was when the PS3 was released. There is no guarantee that next Christmas, when this PS4 actually hits stores, that they’ll fly off the shelves and meet Sony’s projections.

      • WoundedBum says:

        £10 is a lot more expensive than most PC games from 2009, which we can get for £1.50.

        I have no problem with either, but he counted it as a bad thing, when really it’s not.

        • PopeRatzo says:

          I’m sorry, I keep forgetting that most of you guys use foreign money.

          Ten $US, I meant.

  29. ResonanceCascade says:

    The community of a PC-exclusive website is not at all impressed by a console announcement from Sony. Someone get me a defibrillator, I’ve died of shock.

    • PopeRatzo says:

      Gizmodo is also not impressed. For the record. And they are not PC-exclusive by any means.

      • ResonanceCascade says:

        That wasn’t my point. My point is, no one cares what we think about this. This isn’t for us.

        There was nothing within the realm of possibility that Sony could have announced that would have made PC gamers bat an eyelash. Even graphics-wise, Crysis 3 looks better on my PC than any of the games Sony showed.

        • saganprime says:

          It has less to do with graphics. The press conference didn’t show anything we haven’t already seen gameplay-wise. The small amount of Wii U gameplay is more impressive, but it doesn’t match the graphics quality. It is all about fun. And the PC master race.

          • ResonanceCascade says:

            Media Molecule.

          • Hoaxfish says:

            Media Molecule’s stuff has always been some of my favourite “I wish I had that” because it looks fun.

            The sculpting thing looked neat, but I got a bit confused about the whole “creation” thing… they almost seemed to be talking about it as a dev-tool rather than a user-toy.

            Then they got to the musical bit, where we’re supposed to believe that they rigged multiple animations (dancing, instrument playing) to respond to some basic PSMove-waggling (they were possibly even suggesting that the music and the actual animation itself was also created with the PSMove too). Pretty sceptical at that point.

        • Prime says:

          Surely WE’RE allowed to care about this (bugger what anyone else thinks of our opinion)? If you consider the impact the wider industry has on PC gaming I think you’ll find it’s considerable. For instance, the current “plateau” in graphical fidelity is a direct result of ageing current-gen console hardware so how is getting a look at next gen hardware specs NOT going to affect us or be of interest? This is so newsworthy it hurts.

    • Fiatil says:

      I own a PS3 and I thought this press conference was ridiculous. They actually followed up David Cage’s “Look at all the polygons!” chart with “Death to polygons!” clay sculpture guy. You can’t make this shit up.

  30. PopeRatzo says:

    No PS3 backward compatibility.

    F.U., Sony.

    You can have my mouse and keyboard when you wrest them from my cold, dead hands.

    • Kadayi says:

      It was never going to support PS3 games due to different architecture. However the Gaikai guy talked about playing streamed games. I suspect people might be able to play their old titles that way.

      • Lemming says:

        Yeah until their ISP cuts them off for reaching their monthly ‘download limit’

        • Stochastic says:

          I think Gabe was right when he said that streaming games isn’t sustainable. Or at least not yet. ISPs already hate Netflix and throttle Youtube. Just imagine what will happen if cloud streaming takes off on a mass scale.

          • Lemming says:

            Exactly, and look at all the shit the PS4 is piling in: Streaming old games, streaming new games while you are downloading them, uploading videos and screens while you play, having someone remotely take over your game. ISPs watching this must have shat bricks!

        • ulix says:

          My ISP isn’t throttling anything. Just choose your ISP accordingly.

          • SominiTheCommenter says:

            If you can.

          • SkittleDiddler says:

            You should know better than to use that line of reasoning, especially if you live in the United States or other countries where ISPs are allowed to monopolize.

          • lijenstina says:

            Yes. Anecdotal evidence is the best ISP. They also sell other great products and services. :)

        • Kadayi says:

      I don’t disagree. I think a lot of what they talked about in terms of the technology was very much an ‘optimal user experience’, wherein people have good broadband connections and no data caps. This idea that you’ll be able to play titles off the bat as soon as you purchase them seems slightly absurd, unless they’re intending to sneakily pre-load games you’re likely to buy, based on purchase history, prior to release (as sensible a move/use of resources as Amazon sending product recommendations to you locally, on the basis you might buy them).

          The other option of course is streaming whilst loading, but that in itself sounds problematic, unless they’re intending to hamstring developers into building early level rat runs for the first few hours of a game to allow for the download to build up a buffer of information, which again doesn’t sound ideal.

          However the point was lack of direct backwards compatibility is hardly a shock. The rumours of Sony abandoning the cell processor and settling on X86 architecture have been doing the rounds for ages, so why PopeRatzo is suddenly getting all righteous is perplexing.

      • Walter Heisenberg says:

        So “backwards compatibility” for this system is basically always on DRM :(.

  31. Abendlaender says:

    Help an idiot (me) out over here:
    What does this all mean? Is it like much better than a good PC, on par, what? I need a website that talks to idiots like me in an idiot language like
    “It has 8 super fast things and there are also some random letters and a 5 included, which means it is super mega duper fast, like a cheetah on a bicycle!”

    • Sakkura says:

      It’s somewhat comparable to a lower midrange gaming PC you can buy in the shops today. On the pure hardware level. The actual games will probably be better optimized to squeeze as much performance out of the hardware as possible, while PC games can afford to be a little more “wasteful”.

      • FriendlyFire says:

        Still, I’d expect a mid-high level machine bought today or even a year ago to compare favourably to those new consoles.

    • Lord Custard Smingleigh says:

      It has a good computer thinking bit.
      It has a fast and good picture making thing.
      It comes with a long-time-stuff-keeping thing (but maybe not a fast long-time-stuff-keeping thing).
      It has 8 big-things of fast short-time-stuff-keeping things.

      Thanks, Up-Goer Five Text Editor!

    • Stochastic says:

      Eurogamer did a decent job, although their explanation is pretty jargon rich: link to

    • nil says:

      It is like two AMD netbooks, taped together.
      (more precisely, it will be, when available to buy)

  32. Syzorr says:

    My biggest problem with it is that they have released an all-dancing all-streaming device that is going to suck the bandwidth out of my connection faster than a laser would melt my eyeballs. (this has not been tested empirically though)

    It just seems like they have designed a system that works well with high-speed, unlimited-bandwidth internet service, which not everyone has, and that it will automatically place a massive restriction on anyone who isn’t blessed with belonging to the land of Internet Milk and Honey… >_<

    Advantage of digital distribution models like Steam – download the game once
    Disadvantage of what PS4 seems to be proposing – you may only have to download the game once… or maybe many times… and then we are going to bug you to live stream/watch live streams/upload video content… etc etc etc

  33. derella says:

    The Blizzard announcement made me laugh.

    • Lemming says:

      You mean the ‘Jesus, we are haemorrhaging players on this shitty game, let’s port it to a console quick’ announcement?

    • Hoaxfish says:

      I wouldn’t mind them explaining the existence of one-machine multiplayer, something so far removed from the PC version’s “allowed” always-on DRM nonsense.

    • El_Emmental says:

      Same here, loved how the crowd didn’t make a sound and wasn’t impressed when the guy did his presentation.

      If it had been something like a new game, OK. But that? “We’re making a console port of a game that was highly hyped but kinda failed to be an actual breakthrough – and we’re not even hinting at some extra content for that release”.

  34. Laketown says:

    god all of you people need to get over your console anger, it’s seriously embarrassing

  35. Ricc says:

    The Drive Club guy looked like he was going to start masturbating on the stage at any moment. So intense.

    This event was much more impressive than I anticipated, though. Pretty cool seeing Jonathan Blow in front of millions.

  36. ArtyFishal says:

    Fairly middle of the road for a pc as well.

  37. Greggh says:

    BEST POST OF THE YEAR (so far).

  38. Hoaxfish says:

    So many bald men, so many jeans with suit jackets.

    I even saw one person with a tie, but an open top button.

    Who the hell dresses these people.

    • PacketOfCrisps says:

      Jeremy Clarkson’s mum.

    • El_Emmental says:

      The guy trying to be cool in his leather-like jacket was the best; it’s like one of those midlife-crisis guys trying to act young and trendy in a bar crowded with 20-somethings – “go home, dad, it’s OK to grow up”.

  39. crinkles esq. says:

    Har har. I think it’s actually a very sad state of affairs for game consoles. It’s obvious that Sony Japan has admitted total defeat and let boring western manshooter machines take control of console development. It was just a parade of disheveled western dudebros. I miss the days of crazy Ken Kutaragi spouting off insane shit on-stage. So what if the PS2 and PS3 architectures were weird and byzantine to develop for? It was part of their charm. I mean, it’s not really surprising; the PS3 had barely any breakout titles like its predecessors had. But with the PS4, they might as well have Emperor Palpatine up there giving a speech. Actually maybe I saw him…

    I mourn for the unique creative experiences brought by Japanese studios. But Sony Japan apparently believes the only way to make money on console games is to let the west take over their company. The cultural impact of Japanese games has been huge, and I’m sad to see that ecosystem of games waning.

    • WoundedBum says:

      Did you not notice some of the people who came up on stage? Capcom etc. – a few Japanese devs – and Yves Guillemot is definitely not a dudebro.

      • crinkles esq. says:

        Well of course there were Japanese devs there. But the soul of Sony Computer Entertainment is now in the hands of the west. The console is feature-driven by what the west thinks the console should be. Part of my rambling was a more general lament with the loss of the Japanese uniqueness in the titles coming out. They’re all trying to copy western style now, because they think that’s the way to stay profitable.

        • FriendlyFire says:

          Ironically, the Japanese devs became complacent… and Western devs caught up big time. That reminds me of something…

        • Stochastic says:

          Well it kind of is, isn’t it? At least for an international corporation like Sony.

        • WoundedBum says:

          Eh, I don’t know. Capcom and Square both looked like they were going to do Japanese stuff still. The conference didn’t scream dudebro.

  40. kibble-n-bullets says:

    Given that developers can essentially triple their potential market with the nod of a head, I don’t see studios being financially tempted to make platform-exclusive games. And as was noted earlier in the comments, an emulator sounds quite plausible anyway.

    All the exciting stuff is going to happen on the PC as we know it right now, but I’m happy about the added memory.

    • SominiTheCommenter says:

      But Sony pays for the exclusives, so the financial incentive is still there.

    • El_Emmental says:

      If by giving up on the Microsoft console you get “free” marketing support from Sony (even getting some console+game packages in stores), it could be well worth it.

  41. Dudeist says:

    PC roolez :) I knew it! :D

  42. Dudeist says:

    Ok, here Dude prophecy. Wot I see?
    SONY with “PC like” console.
    Valve with new Big Screen, and “invisible” Valve console and known “personel cases”
    Is here connection?

  43. Upper Class Twit says:

    Is anyone else thinking we’re starting to see some diminishing returns with all this new tech?

    As in, everyone’s talking about hardware specs, how many cores the PS4 has, what kind of GPU it has, how that’s comparable to current PCs, etc. You all saw that new Killzone demo, right? It looked pretty nice, a bit better than Crysis 3 at max everything. Hardly the leap you’d expect from a next generation though. And that might have something to do with the PS4’s relatively underwhelming hardware, but how do you think it would look if the PS4 had, say, one of those absurd Nvidia “Titans” and a really high-end CPU? A *lot* better than Crysis 3 on max? Better eyeball tessellation? Improved dynamic-range hair simulation? More pixels? (I really have a poor understanding of what all the PC graphics words mean.)

    I guess I’m just wondering how much closer to photo-realistic we can get before it starts to become pointless, and what that means from a hardware standpoint.

    First post on this site, btw.

    • Baines says:

      The photo-realistic argument comes up every console generation. The problem is, companies keep finding more uses for the extra power, and consumers acclimate to whatever the cutting edge is.

      You can always find something to sink more power into.

      You might think those new graphics are minor effects, but remember that bit about people acclimating to it. People will see those things, and those things will become the new baseline standard for people. A DVD doesn’t look that bad until you see a BD movie or HD resolution video. Dynamic shadows went from being a fantasy to the sign of a high end game to a graphical minimum for some people. Think about the graphical annoyances you experience in games, like when animations aren’t quite up to par, or lighting isn’t quite right, or whatever. Maybe in a couple more generations, it will be that the footprints left in grass just don’t look up to par in game X, and angry fanboys will complain that the individual blade modelling in game X is so bad that the company should have just thrown a decal on the grass instead. Or the lack of proper air current modeling just makes everyone’s hair look so 2015, and it just takes you right out of the game.

      And, even if you hit photorealism, then try and make a game that can throw a ton of active characters onscreen without taking a graphical hit…

      Even when it starts to look like you can’t cram more visual information into a screen, screen resolution increases. PC gamers are going to dual monitor set-ups. TVs moved to 720/1080p. Movie companies are already planning for the next TV tech upgrade.

      • roryok says:

        Yes, those Killzone developers definitely don’t fake anything either. I’ll bet that’s 100% real in-game footage. No Horsemeat whatsoever.

      • El_Emmental says:

        There’s also the 3D display market showing up, requiring even more graphical processing power.

      • Upper Class Twit says:

        Thanks for the answer, dude. You make good points, especially on the acclimating part (I remember when I thought Baldur’s Gate 2 looked “photorealistic”. I think I was twelve or something), but I still think that all the shiny bits we’re seeing with this next generation aren’t nearly as big a leap as something like no shadows to shadows back in the late 90s. It’s probably a personal preference issue at this point though. I just don’t think we should worry so much about hardware when the improvements we are seeing are so incremental.

    • mckertis says:

      “You all saw that new Killzone demo right? It looked pretty nice, a bit better than Crysis 3 at max everything.”

      But does it look better than Crysis 3 at max everything ON PC ?

      • crinkles esq. says:

        I thought the medieval dungeon crawler which Capcom showed off, which was I think running on PS4 hardware, looked much more impressive than Killzone.

        • pakoito says:

          That movie? Yeah. Movies are great.

          • crinkles esq. says:

            Surely at least some of it was pre-scripted, but it was all in-engine. There’s a thread documenting why on NeoGAF if you really care. But both technically and stylistically, it was much more compelling than the man-shoot they showed.

  44. fish99 says:

    I watched the whole thing and didn’t see anything about real-time weapon switching or historical battles involving giant crabs. Very disappointing.

  45. GreatGreyBeast says:

    I’m actually impressed. This was about refining a wheel, not reinventing it, which I am grateful for after so much ZOMG motion controls! ZOMG 3d! ZOMG second-screens! I mean, there is second-screen stuff, and social stuff, but it’s all secondary to the core experience. If all those headline-bait features fall flat on their face, it doesn’t ruin the console. And without chasing the Wii’s Cinderella story, there was time for actual improvements. The power/save state stuff is great. Putting a headphone jack on the pad is brilliant. I say well done Sony!

    The one teeny-weeny little weakness in their plan is that by becoming so much like a PC, one could ask, “So why not get a PC?” The master race is moving into the living room like never before. The Steambox is not a mirage on the horizon. IT’S ALREADY HERE. Has anybody noticed the latest updates to Big Picture mode? You now have the option to launch straight into BP mode when your computer boots, and restart/shutdown commands are right in the Exit dialog. Think about that – there is no need to touch Windows anymore. If you have Steam installed, you own a Steambox. Controller and HDMI cable sold separately.

    EDIT: Okay, I spoke a bit too soon. I can’t actually get Steam to load up straight into Big Picture without a click or two of prodding after startup. At least not on Win8. But dammit, it’s close.

    • roryok says:

      I’m not sure if I’m comfortable with PC users being referred to as “the master race”

      • El_Emmental says:

        It’s actually a joke, but sadly some people seem to miss the point.

        We all know consoles are fine for playing games without too much fiddling around, we just like to have a little fun at clueless people repeating the marketing bullcrap about how their system is “the best”.

  46. Zeushbien says:

    At least Sony has a tendency to support quirky and interesting games from time to time, unlike a certain other console manufacturer I could mention.

  47. crinkles esq. says:

    Followup thought: the 8GB of GDDR5 RAM is going to be huge in upping the minimum quality level of PC multi-platform games. So this can be nothing but a big win for the PC platform, RPS snark aside.

    • Gunrun says:

      8GB of ram without something like Windows sitting behind it is going to be insane, and it’s GDDR5 which is way way faster than anything we have right now. The PS4 might actually outmatch PCs when it comes out, at least for a while.

      • roryok says:

        Them there Steam boxes won’t necessarily have something like Windows. Might be they come with a Valve built linux distro stripped for speed. Which is essentially what the PS3 used, and probably what the PS4 will also use.

      • El_Emmental says:

        (edited :P ->) *sigh*

        As mentioned earlier, GDDR5 isn’t “DDR3 +2 generations”; it’s the same DDR3 design, modified to fit GPUs’ needs. It’s been on PC graphics cards for 5 years now (since 2008).

        It is not simply “faster” (haha); it trades one kind of speed for another, favouring “quantity of data per sec” (bandwidth) over “timings” (latency). It means that GDDR5 can write/read more data per second (quantity), but each individual data order (read/write) will be slower.

        That is not a problem for GPUs, which can set some calculations aside (and work on other ones meanwhile) and sync everything up later, but CPUs (which have a more linear way of doing calculations) would be slowed down by it.

        It also makes more sense to have a higher transfer capacity for a chip that has to load and unload heavy amounts of data (for AA/AF, textures, etc), and to keep the fast timings for the CPU.

        Also, the non graphics-dedicated RAM (“normal” DDR) can easily be upgraded thanks to the motherboard’s RAM slots (minimum of 2, usually 4 slots), at an affordable and steadily-falling price. That way, you can easily compensate for the smaller “cargo capacity” of DDR3 by getting a 2*8 GB kit ($80/£50 for Corsair/Kingston sticks).

        Finally, DDR4 final specs were released in Sept 2012, so the average new gaming PC will get it by next Christmas.
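        To put rough numbers on that bandwidth difference, here is a quick back-of-the-envelope sketch (the bus widths and transfer rates below are illustrative figures I’m assuming for the comparison, not official specs):

```python
# Peak memory bandwidth = bus width (in bytes) * effective transfer rate (GT/s).
# Figures below are illustrative, not official PS4 specs.

def peak_bandwidth_gb_s(bus_width_bits, gigatransfers_per_s):
    """Theoretical peak bandwidth in GB/s."""
    return bus_width_bits / 8 * gigatransfers_per_s

# Dual-channel DDR3-1600: two 64-bit channels at 1.6 GT/s.
ddr3 = peak_bandwidth_gb_s(128, 1.6)    # 25.6 GB/s
# GDDR5 on a 256-bit bus at 5.5 GT/s.
gddr5 = peak_bandwidth_gb_s(256, 5.5)   # 176.0 GB/s

print(f"DDR3: {ddr3} GB/s, GDDR5: {gddr5} GB/s")
```

        That is a big gap in raw throughput, but it is only half the story: the per-access latency goes the other way, which is why the tradeoff suits GPUs better than CPUs.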

        • roryok says:

          I think you meant to sigh at the beginning there. instead you sighted.

          • El_Emmental says:

            you got me :P

            Maybe I was unconsciously staring deep into the OP’s soul, trying to engrave the difference between GDDR and DDR memories (which is a common mistake, as they’ve got pretty similar acronyms)… we’ll never know !

        • Sakkura says:

          The average new gaming PC as of late 2013 won’t be getting DDR4. Intel’s upcoming Haswell processors won’t support DDR4, and AMD’s next generation of FX processors almost certainly won’t support it either (they would break motherboard compatibility otherwise).

  48. Wedge says:

    Hee hee.

  49. MOKKA says:

    So is this the right time to get a PS3?

    • Lars Westergren says:

      If you are dying to play the existing exclusive titles, plus want a Bluray drive, sure. I expect the number of new titles announced for it will plummet now. Porting to the PS3 won’t be worth the time and money for most developers.

      • roryok says:

        Blu-Ray is a complete waste of money. I bought a player last year (a Sony one) and it takes about 40 seconds to spin up a DVD. Longer for Blu-Ray. And on my 42″ 1080p TV, it’s hard to tell the difference between either.

        (and yes, I am of course using a HDMI cable)

        • Gunrun says:

          I’m sorry about your being blind, and buying a bad blu ray player.

  50. D3xter says:

    RPS apparently missed the most important thing about the reveal of the PlayStation 4, thank god we have Kotaku for illuminating us about the most important issues in gaming: link to

    • Lars Westergren says:

      Not the most pressing question of the show, but definitely worth asking.

    • El_Emmental says:

      So… to fight racism and sexism, we need to select presenters by their skin colours and genders, rather than their relevance/importance regarding the console design/games presented ?

      There are two things to say about the Sony event:
      – there weren’t objectified women in this presentation. Positive point.
      – the video game industry is mostly populated by men, especially at the higher ranks of the hierarchy. Negative point, given the existing interest in this medium and culture among many women.

      Why not enough women ?

      * Partially because it is populated by sexist executives (like hundreds of other sectors) (nb: the opposite is happening in other sectors, with a majority of women in a sector isolating the men trying to work there – with less sexual harassment though).

      * Partially because the entire society (including, and strictly enforced by, female individuals) said for years that video games are for kids, boys especially, and really not for girls, so these girls couldn’t fully embrace gaming as much as boys (sadly, VERY sadly – imagine all the excellent games from the 2000-2004 era (the “golden age” of 3D PC gaming) we would have got if we had had a strong female playerbase and 20-30 year old developers… we really missed something :/).

      There wasn’t anything specifically sexist in the PS4 presentation; only the tumblr “oppressed people” will try to rant about it to reassure themselves and free themselves from their self-induced guilt over their first-world “privileges” and personal angst when procrastinating in front of their life crises.

      The tweets mentioned in that Kotaku article are really pathetic.

      EDIT: Gosh, even the comments on Kotaku are far from the usual noise and this time they raise plenty of valid points – the “sexist agenda” conspiracy thing is getting out of control, it’s doing much more harm than good to the cause of women in the video game industry :/

    • Jamesworkshop says:

      I didn’t notice any wheelchair users either, and why were all the presenters adults (ageism in the gaming industry)? Kids play games too, they need to be represented. I mean, how can someone be expected to get anything from a press event unless the presenter looks exactly like themselves? Being able to listen to people of any gender, that’s like totally weird, who can manage that.

      At least we were spared stage presenters talking about their Girlwood, or how they were gamers first and women 7th.

      link to