Not Unrealistic

By Alec Meer on October 4th, 2007 at 10:35 am.

It’s been everywhere already, but may as well mention it here so folk have somewhere else to express their outrage/joy/paranoia/confusion. Yes, Unreal Tournament 3 system specs! Will the beefy quad-core CPU and GeForce 8800 in my PC be worth the investment at last?

Minimum System Requirements
Windows XP SP2 or Windows Vista
2.0+ GHz Single Core Processor
512 Mbytes of System RAM
NVIDIA 6200+ or ATI Radeon 9600+ Video Card

Recommended System Requirements
2.4+ GHz Dual Core Processor
1 GByte of System RAM
NVIDIA 7800GTX+ or ATI x1300+ Video Card

Huh. Guess not. Surprising, really. Clearly Windows 2000 users will explode in fury, but apart from that it’s about the most reasonable rider I’ve seen any big game demand recently. I mean, apparently my mum’s PC can run Unreal Tournament 3, which is a reality I’m not entirely sure I can deal with. I will be amused to see what the game looks like on a Radeon 9600, however. Not much, I would imagine.


24 Comments »

  1. Roman Levin says:

    You do realize there’s a slight… gap, between what back-of-the-box system requirements say and what the game actually needs to run well?

  2. Jim Rossignol says:

    You’re right Roman, but this case is a slightly unusual one – go and test UT2004 on its minimum and recommended specs, and you’ll see what I mean.

  3. MisterBritish says:

    Luckily everyone is upgrading for crysis :)

  4. Pentadact says:

    Isn’t a 6200 much worse than a 9600, and an X1300 much, much worse than a 7800GTX? We were poring over these yesterday and I just can’t make sense of them. X1300 for recommended could almost be a typo – the X1800 is the 7800 equivalent.

    Regarding the headline: YOU DID WELL!
    [/Heavy]

  5. Alec Meer says:

    The 6200/9600 thing I can understand (at a guess, it’s to do with the higher Shader Model the 6200 can handle compared to the older if more brute force-y 9600). The 7800/X1300 is a headscratcher though; skim-reading it I’d actually thought it did say X1800. Gotta be a typo.

  6. Dosfreak says:

    “Clearly Windows 2000 users will explode in fury”

    Why’s that? Bioshock works on Windows 2000 – I don’t see why Unreal Tournament 3 wouldn’t either.

  7. Thelps says:

    I’m in the same boat. Quad core, 8800 GTX, and nary a game to really test them on. ET:QW looks spectacular, but I want to really PUSH these components! Guess we’re all stuck waiting for Crysis, then.

    P.S. Gotta love this post ‘oh woe is me. My PC doesn’t break a sweat on any current games. Oh what a world, what a world!’

  8. Kieron Gillen says:

    Alec: Your mum is l33t.

    KG

  9. schizoslayer says:

    The important thing to remember is that you really really want a game like UT3 to be running at a steady 60fps. Not 10, not 20 but 60. Anybody who claims there is no difference has been playing on an out of date rig for too long.

    I fully expect that while the recommended specs (which I exceed) may get the game looking like the screenshots, I doubt they’ll get the game running smoothly at a high framerate.

    World in Conflict doesn’t exactly push my system hard but I still turn down the detail if I want to play online since anything less than 60fps in an online game is horrible.

  10. akbar says:

    Bah. My system is comfortably above the min specs for Bioshock, but it’s unplayable at 1024×768 and with most of the graphics options turned off – and Bioshock’s based on Unreal Engine 3, right? I am moodily contemplating an upgrade.

  11. Martin Coxall says:

    My penis is bigger than yours.

  12. Babs says:

    No it’s not.

  13. Jonathan Burroughs says:

    With great power comes great responsibility.

  14. The_B says:

    As much as this news is indeed Good(TM), I can’t help but be a little… surprised. But not at Epic, more at the people who thought for some reason UT3’s specs might be unreasonably high. I mean, granted – it looks really pretty in just about every preview and PR screenshot I’ve seen, but the Unreal Engine has always been famous for its adaptability, and indeed an uncanny knack for not actually being that demanding if it doesn’t want to be.

    I mean, anyone remember running UT2004 on 320×240 for a laugh?

  15. Alec Meer says:

    As has been observed, our sole experience of the UE3 engine thus far has been Bioshock, a game which (unreasonably) locked out any GPUs not capable of SM 3.0. While clearly a high-speed online shooter couldn’t possibly be that silly, I am genuinely surprised it goes as low as the Radeon 9 series.

  16. John P (Katsumoto) says:

    as long as my new pc can run it on uber high i’ll be happy

    Should be coming next weekend, 2gb ram, E6550 overclocked to 3.0, 8800 gts. Way over the recommended, so i’m hopeful!

  17. Citizen Parker says:

    If computing power were a city, then I’ve been that family that lives in the messed up neighborhood, perpetually thinking “Next year, we’re gonna get out of this place, I swear it.”

    That said, I’ve always appreciated the Unreal family of games, since they’ll generally run smoothly on damn near anything you can put some electrons through. I appreciate that some developers still think about me, and I return the favor with purchases, whereas I don’t care at all about CRYSIS simply because I know I’ll only be able to run it on the new computer I buy in 2011 or so.

    Assuming that the USD can still be used to buy things in 2011, anyway.

  18. Robin says:

    Surely requiring SM 3.0 isn’t that unreasonable these days, unless we assume people still buy/use manky old ATi cards.

  19. Alec Meer says:

    An awful lot of people still use ’em. 24% of all Steam users, for instance – http://www.steampowered.com/status/survey.html

  20. Schadenfreude says:

    Bumping Unreal Tournament 2003 up to max settings was a good ’un.

    “HOLY SHIT!!!”

  21. Joe says:

    My god, you lot are boring.

  22. mister slim says:

    I remember the UT2K4 demo running pretty well on my G3 iBook. Better than it should have, anyway. The only problem was aiming with the touchpad.