Thief System Specs Sneak Out, Don’t Induce Fear

By John Walker on January 20th, 2014 at 8:00 am.

Remember when the announcement of system requirements was this gulp-inducing deal? Would your PC be up to it? Was it going to be time to upgrade in order to play the game you were most interested in? With a new Crysis no longer being a priority, and not a single big-budget game-changer appearing in 2013, it feels almost anachronistic to even make an announcement of system specs these days. And that’s no less true today, with the specs released for Thief. Unless your PC is built out of twigs, there’s a good chance they’re not going to bother you.

Windows XP users will be disappointed.

Min Specs:

OS: Windows Vista with platform update
CPU: High-performance dual core CPU or quad core CPU
RAM: 4 GB
Graphics Card: AMD Radeon 4800 series / Nvidia GTS 250
DirectX: DirectX 10
HDD/SSD: 20 GB

Rec Specs:

OS: Windows 7 or 8
CPU: AMD FX 8000 series or better / Intel i7 Quad Core CPU
RAM: 4+ GB
Graphics Card: AMD Radeon HD / R9 series or better / Nvidia GTX 660 series or better
DirectX: DirectX 11
HDD/SSD: 20 GB

Of course, what’s far more complicated in reading these things these days is the graphics card requirements. Unless you’ve got a degree in Graphicscardology, what are you supposed to do with “AMD Radeon 4800” or “Nvidia GTS 250”? They’re meaningless numbers, bearing no relation to anything else, not even to other cards from the same company. I do wish the industry would give this some thought and standardise again.

It’s interesting to note what a silly set of things min and rec specs are now. Those min specs are so ambiguous: “high-performance dual core or quad core”. Oh, gee, thanks. And those rec specs are just “um, modern?” rather than anything specific. More than 4GB of RAM, you say? These seem to come down to, “It’ll work on an older machine, but it’ll work better on a newer machine.” Which is perhaps to be expected. However, the good news here is: “it’ll work on an older machine.”


60 Comments

  1. LordMidas says:

    My machine is modern… yet old too. An i5 3570K Ivy Bridge… phew, 16GB DDR3… phew, Nvidia GTX 460 768MB… argh! But will it run Crysis?

    • Mechorpheus says:

      Whack an Nvidia GTX 760 in there, or even splurge a bit on a Radeon R9 290, and you’ll be golden for, I’d reckon, a good portion of this current console generation, if not all of it. That CPU is great, and if you’ve not overclocked it yet I’d strongly recommend doing so; you’ll be amazed at the clock speeds you can get out of those.

      Newer Intel Haswell CPUs only offer marginal IPC boosts at the same clock, and run much hotter, so you’ll likely see smaller overclocking returns. Got to love Intel basically not caring about the enthusiast desktop anymore…

    • Bugamn says:

      I have a similar setup and I have played Crysis and Warhead. Not Crysis 3, though.

    • riverman says:

      I’m still rocking a 3GHz first-gen i5 with 8GB of DDR2 and a Radeon 7850, and it still handles any game at the full res of my 27″ LCD, barring AA and all that funky special ambient occlusion lighting. As the next-gen games look pretty much the same as the current ones, I can’t imagine this being too old for some time.

      Funny thing too is that when I upgraded, it was only because my old board died. It was a 2.66GHz Core 2 dual core, and I didn’t notice much of an improvement at all in framerate. In my experience, the GPU seems to be what gives your games wings, as long as your CPU is fast enough and you have enough RAM to keep up with the fun.

    • RealWeaponX says:

      I have an i5 3570K clocked at a humble 4GHz and no discrete GPU, and I can play Saints Row IV, which claims comparable minimum specs to Thief…

      But I doubt it’ll play Crysis.

  2. Baka says:

    The only thing confusing me is the amount of HD space it wants to take up.
    Where does the size of current games come from, anyway? Is it just that no one is interested in compressing textures and such anymore? 21 out of 24 GB of the recent MG:R port are video files; is it the same with every other game from the last few years? I cannot imagine why a game that looks like the newest Riddick iteration eats so much hard drive space.
    Not that there’s any shortage of that, simply curious.

    • Azhrarn says:

      I think for Metal Gear Rising it’s simply the fact that it’s a PS3 game. It came on Blu-ray, so they had a TON of space to work with, and they quite obviously used it. Uncompressed video is one way to get those cutscenes looking their best, while still pre-rendering them.

      • Frosty840 says:

        Huh… That’s exactly the reason the MGS2 port filled an entire dual-layer (or double-sided, can’t remember) DVD, too.

    • KenTWOu says:

      Where does the size of current games come from, anyway?

      Pre-rendered HD video cut-scenes (because current-gen consoles can’t load/show in-engine cut-scenes very fast) and different language packs are the main reasons why the size is so big, IMHO.

    • TheVGamer says:

      Higher texture resolutions and uncompressed audio, mostly. I’ll gladly sacrifice 30 gigs of space to have both of those. The only problem is downloading it all…

    • DatonKallandor says:

      There are two sources of size bloat:
      1) Uncompressed video of in-engine, non-realtime renders (cutscenes) – this is mostly found in console ports. This is also why PC ports of console games very often look BETTER in play than in cutscenes, even when the textures didn’t get upgraded (anti-aliasing and the like).

      2) Shitty development practices, aka repeated textures saved individually instead of loaded from the same file. This happens a LOT in MMOs (I suspect because the higher-ups think a big game size makes a good bullet point, even when the size is actually bullshit). A great example of this is Vanguard, a game that launched at 30 gigs and was then optimised back down to about a third of that simply by getting rid of repeated files.
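
      That kind of duplication is cheap to detect, at least. A rough Python sketch (entirely my own, nothing game-specific) that groups byte-identical files by content hash:

      import collections
      import hashlib
      import os

      def find_duplicate_files(root):
          """Group files under `root` by content hash; any group with
          more than one path is redundant data sitting on disk."""
          groups = collections.defaultdict(list)
          for dirpath, _, filenames in os.walk(root):
              for name in filenames:
                  path = os.path.join(dirpath, name)
                  digest = hashlib.sha256()
                  # Hash in 1MB chunks so huge pack files don't eat RAM.
                  with open(path, "rb") as f:
                      for chunk in iter(lambda: f.read(1 << 20), b""):
                          digest.update(chunk)
                  groups[digest.hexdigest()].append(path)
          return {h: paths for h, paths in groups.items() if len(paths) > 1}

      Run that over an install folder and every multi-path group is space the packer could have reclaimed by storing the file once and referencing it twice.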

  3. kaloth says:

    Regarding graphics cards, at least Nvidia graphics cards, there’s a quick rule of thumb for working out how your card holds up to spec.

    GPU power-wise, if you want to compare cards across generations, you subtract 1 from the first digit of the newer card’s model number and add 1 to the second digit to get a rough equivalent from the previous generation. It works something like this:

    gtx 750 ~= gtx 660
    gtx 760 ~= gtx 670
    gtx 770 ~= gtx 680
    gtx 660 ~= gtx 570
    gtx 670 ~= gtx 580

    And so on.
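
    The rule is trivial to put into code, too. A rough Python sketch (the function name is my own invention, and it only handles three-digit GTX model numbers):

    def previous_gen_equivalent(model):
        """Rule of thumb: knock 1 off the first digit, add 1 to the second."""
        hundreds, tens, ones = model // 100, (model // 10) % 10, model % 10
        return (hundreds - 1) * 100 + (tens + 1) * 10 + ones

    # Prints the same pairings as the list above: 750 ~= 660, 670 ~= 580, etc.
    for card in (750, 760, 770, 660, 670):
        print(f"GTX {card} ~= GTX {previous_gen_equivalent(card)}")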

    While it isn’t a 100% accurate way to compare cards, it’s close enough that you’ll be able to quickly work out if your card will be powerful enough to play that game you want to buy. This does NOT, however, take into account card-specific features (like DX11 compatibility, etc.). Although, unless you’ve got some ancient old thing, it probably won’t really be something you need to worry about.

    I’m sure there’s some other quick equivalency thing for the AMD cards, but as I’ve never owned one I never bothered finding out.

  4. Ladygrace says:

    These specs are pretty laughable for how much of a fuss they were making. Or were they making a go at the new consoles’ specs? Either way, a mod to run it on Windows XP will be made soon enough, I would say. I think companies that use the whole “You will have to upgrade your PC” scare tactic are pretty stupid, as it means either they are lying about their game or the game will run like rubbish, as BF4 did on release and still does.

    • PoulWrist says:

      BF4 runs like rubbish? I was getting 80-90 FPS on the training ground at 2560×1440 with most settings at high, textures at ultra and post-processing at medium. 100% resolution scale and no AA. On a nearly two-year-old GTX 680.

  5. bill says:

    Totally agree about the graphics card thing. As I don’t have a degree in Graphicscardology, the last 2-3 times I’ve actually bothered to read system requirements, I’ve just given up and not bought the game.

    My card isn’t that modern, but for the past few years we’ve been in the lovely state of almost all games working on older machines. I’ve gotten out of the habit of checking system requirements and trying to keep up with hardware numbers – I just buy things and assume they’ll work.
    But a few recent releases, probably partly down to the new console generation, have started to seem too demanding for my little old PC. Yet the card numbers are unintelligible, so I just base my choice on how shiny the game looks. If it looks like it might be too demanding for my PC, then they lose the sale.

    If only there was some kind of standardised system for showing how powerful a PC was… like a Windows Experience score or something…

    PS/ Does anyone else think it’s strange that Steam doesn’t have some kind of built-in system-requirements checker? They already know everyone’s hardware, and they know the hardware requirements… surely it wouldn’t be very hard to implement, and it would save them a lot of support hassle.

    • Mechorpheus says:

      I agree it’s a real ball-ache to decipher what the heck the numbers mean, and it doesn’t help when Nvidia or AMD re-release the same chips with a different number and call it a new card. For example, the GPU inside the Nvidia GTX 680 (which is called GK104) was the flagship single-chip part for that series. So naturally, when Nvidia released the GTX 780 (which itself contained a GK110, a significantly more powerful GPU and the same basic chip as in the GTX Titan/780 Ti, but with shader blocks disabled), they decided it would be fine to re-release the GTX 680 GPU, stick on some slightly better VRAM and call it the GTX 770.

      Oh, and just for kicks, they released that exact same GPU again for laptops and called it the GTX 780M…

      If you want my advice: if you’re trying to figure out whether a game will run on your PC, go and grab a utility called GPU-Z, which will give you a load of information on your current GPU, and then use the lists available on Wikipedia as a reference guide, or find some benchmarks for your card and try to compare it with newer models. Anandtech have their Bench system, so you can call up benchmark results for loads of CPUs, GPUs, SSDs etc. The last thing you want is to get it wrong and end up buying a game you can’t enjoy; trust me, I’ve been there before.

    • AngelTear says:

      This is a tool that many people know about, but I hope it helps someone who still doesn’t.
      It analyses your computer and tells you how it compares to minimum and recommended requirements of the games in its database (and it’s a pretty rich database). To an extent, it can also point you to what you should upgrade to get better performance if you so choose.

      http://www.systemrequirementslab.com/cyri/

      • bill says:

        I didn’t know that one. I used to use the online one from Futuremark (YouGamers or WeGamers, I always got them mixed up), but it didn’t have a massive set of games and I think it closed down.

        If that one is easy to use, I’ll check it out.

        • TheMightyEthan says:

          There’s also game-debate.com (though it seems to be running really slowly right now). You have to make an account and put in your PC specs, but once you do, you can just visit a game’s page and it will tell you how you measure up, without having to run a utility each time like CYRI does.

  6. Rao Dao Zao says:

    I still only have 3GB of RAM, I can’t run it. :(

    … Not that I particularly want to. But still.

    • airtekh says:

      You should be able to.

      I have 3GB as well, and I’ve been able to run a whole bunch of stuff recently that listed 4GB as minimum spec.

      I’m pretty confident I’ll be able to run Thief too.

      • Darth Gangrel says:

        That’s what a demo is so good for: it not only gives you a hands-on idea of how the game is, but also of how well it runs. Demos are rare, but some games still have them. I get surprised every time I visit Steam and see a “download demo” button for a game.

  7. David says:

    For my games I just use PassMark values now; Overgrowth, for example, requires a CPU with a PassMark score of 1000+ and a GPU with a PassMark score of 200+.
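
    The comparison is then only a couple of lines. A toy Python sketch (the hardware scores here are made-up placeholders; look your real ones up on PassMark’s site):

    # Placeholder scores – substitute your hardware's real PassMark numbers.
    my_cpu_mark, my_gpu_mark = 4500, 1200

    # Per-game thresholds, e.g. the Overgrowth figures quoted above.
    requirements = {"Overgrowth": {"cpu": 1000, "gpu": 200}}

    for game, req in requirements.items():
        ok = my_cpu_mark >= req["cpu"] and my_gpu_mark >= req["gpu"]
        print(f"{game}: {'should run' if ok else 'below spec'}")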

  8. Whelp says:

    Ugh graphics cards…

    If I don’t give a shit about them for just 1 year, I’m confused as hell about the names. Why do they have to keep rebranding them?
    Isn’t an R9 just a rebranded 7-series Radeon? Why change the fucking name?

    At least I know what a 4800 series is, but only because I had one of those years ago.

    • PoulWrist says:

      They don’t change the names. The R9 280 is an upgraded version of the 7000 series, but not exactly the same. They named it 280 to avoid confusion, because, really, it doesn’t matter. You’re getting the performance and feature level that 280 means compared to 240, 250, 260, 270 and 290.

      The R9 290 and 290X are completely new cards, as are all the other 200 series Radeons. Nothing confusing here. You get the performance and features you are promised.

      • TechnicalBen says:

        That’s not always true, though. Both companies have been known to just stick a new sticker on an old card.

        So you want to upgrade the VXP200 to the new VXQ750. Turns out the VXQ750 was a re-branded VXP185, and you should have got the new chip, which is a VXQ800 – so you actually paid more for a lower-spec card than you originally had.

        OK, so I made up the names above, because I can’t be bothered to google, but IIRC both Nvidia and ATI/AMD have done similar things at different times, so I’m not picking on either.

  9. CookPassBabtridge says:

    The specs won’t be worrying me, because I am a grumpy pants who feels Garrett should probably just have kept his stealthy yet comfortable retirement slippers on.

    • MajorManiac says:

      To put my grumpy hat on for a moment: I would be very happy to see more Thief games, but only if they are made by competent/stable developers.

    • karthink says:

      I can’t figure out if this game is a reboot or a sequel. And if it is the latter, what it is a sequel to. Last we saw Garrett, he was getting on a bit in years and had become a Keeper.

      I think Eidos would have received better previews if they’d just picked a new, cruft-free protagonist, like they did with Adam Jensen.

      Ooh, and if Garrett was a mysterious Keeper figure you (didn’t) get to see briefly in the game.

      • Henke says:

        Officially, it’s a reboot. There are, however, a lot of elements that make it seem kinda like a sequel to 3, like Garrett having a young girl for a protégée, for instance. Word around the TTLG forums (and by “word” I mean “wild speculation”) is that the dev team was probably well underway with making a sequel before they changed their minds and decided to make it a reboot instead.

        • karthink says:

          Hence the confusion. He’s got the mechanical eye, too. What it appears to be is a sequel to a reboot that we never got.

        • CookPassBabtridge says:

          Will it be a Gritty Reboot? I like the idea of Garrett being on the slippery slope of a rampant male prostitution-funded snuff habit, and as “payment” a deviant but talented ageing master thief passes on his robbing skills before succumbing to syphilis. At last Garrett hangs up his leather bodice and exchanges it for the Thief’s cowl, retaining just one nipple piercing to warn him of his past. It would be a touching and realistic ‘becoming’ story of the like with which I am sure we are all familiar.

          The blackjack might need a good clean first though.

  10. Shieldmaiden says:

    They bother me. My PC is made of twigs.

  11. Ultra Superior says:

    I thought the guard was trying to pull thief’s cape off and I laughed.

  12. Shadowcat says:

    However, the good news here is: “it’ll work on an older machine.”

    “…provided it’s not running an older operating system”.

  13. KillahMate says:

    Note that the port is once more being outsourced to Nixxes, which is a pretty good guarantee that the game will at the very least be well adjusted to the PC, configurable, and free of the strange slowdowns and interface omissions that plague bad ports.

  14. Surlywombat says:

    Ah, the good old days, when you knew a CPU was better than another CPU because the number was bigger (90 is better than 60).

    You also knew a graphics card was better because it was actually attached to your system, as opposed to not being attached to your system.

    Nowadays it really does seem like the GFX manufacturers go out of their way to make the numbers opaque, renaming chipsets willy-nilly and restarting numbering schemes at completely random intervals. Bastards.

  15. MykulJaxin says:

    Thief System Specs Sneak Out, Don’t Induce F.E.A.R.

    Fixed?

    Also, I’ve just been going by how much dedicated video RAM my graphics card has (i.e., it was okay for a while because I had 1.5 GB of dedicated VRAM). For me, things get confusing because I have a laptop, and that adds a whole other dimension of confusion to everything.

  16. Turkey says:

    Man, I really want to peer into the alternate reality where Microsoft never entered the console race and dragged every PC game developer with them. Either there would be a ton of amazing-looking immersive sims, or a desolate MMO wasteland.

  17. derbefrier says:

    Unless you are still playing on that WoW machine you built almost a decade ago, system specs don’t matter nearly as much as they used to. I don’t have anything too fancy, but I haven’t had to worry about system specs for quite a while; the most I might have to worry about is setting the graphics to medium instead of high. Also, you people still using a decade-old OS need to move on. Really, guys, it’s time.

    • cpt_freakout says:

      I remember, like 15 years ago or so, we had to update the ‘family computer’ components every two years in order to run the latest games decently. I’ve had my gaming laptop (I know, I know) for three years now and it’s still running most stuff at high or a mix of medium-high, and apart from The Witcher 3 (if only because 2 was so badly optimised for a while) I don’t see any game on the horizon that will really make my computer struggle.

  18. Werthead says:

    Fixed rec specs:

    OS: Windows 7 or 8
    More Money Than Sense
    CPU: AMD FX 8000 series or better / Intel i7 Quad Core CPU
    No Pre-Existing Familiarity with the Franchise Whatsoever
    RAM: 4+ GB
    Tremendous Reservoirs of Good Will
    Graphics Card: AMD Radeon HD / R9 series or better / Nvidia GTX 660 series or better
    A Yearning Desire to Buy Post-Release DLC
    DirectX: DirectX 11
    HDD/SSD: 20 GB

  19. Loopy says:

    While I suspect my ageing 1st-gen Core i5 750 will be OK for this, and I easily have enough RAM (8GB), I fear that my Radeon 5770 is not going to cope; that may be the one thing I have to upgrade…

  20. phylum sinter says:

    This was one of the games that I thought might finally make me upgrade my 5850… but I guess not.

    I suppose I might be missing out on AC4’s prettiest shadows, and maybe on Thief’s too, but I’m still not feeling a bit cheated by not seeing these things. Is it age/sense (my own) or the stagnation of our most glittering development houses that is slowing the pace?

    Who in their right mind needs a dual Titan setup? Who even needs an R9 for this generation of games? Yes, yes, if you’ve got a massive monitor then I guess you might justify a little bit of that investment… but the games coming out today, do they even have the textures etc. to be appreciated above 1080p?

    EXPLAIN YOURSELVES, ENTHUSIASTS!!!

  21. Darth Gangrel says:

    A PC built out of twigs? Don’t be ridiculous. My laptop is built out of lapels, but it still matches the minimum specs. Not that I intend to play this new Thief, not with my backlog. I haven’t played Thief 3 yet, only the demo, which was my introduction to the Thief franchise and I intend to play the older Thiefs before letting this new one steal any of my time.

  22. Darkhorse says:

    Appreciate the news. You should talk about graphics more often, like C&VG used to in the good old days of magazines. /me misses Ed Lomas

  23. sldsmkd says:

    My 4890 is more than 4000 better than the 250 they recommend to play this, but I’m confused, because they recommend a 660. Is that better than a 290? Is a 290 better than a 450? I think I have a 9400 somewhere; that’s obviously the best.
