Star Wars Jedi: Survivor system requirements, PC performance, and best settings to use
I quite enjoy the vwing-vwing lightsabering at the heart of Star Wars Jedi: Survivor, which is why it now pains me to have an extended moan about the PC version's technical troubles.
Despite the odd glimmer of joy, like better-than-expected performance on its lowest system requirements, much about playing Jedi: Survivor on Windows suggests it could have used a little more time in the bacta tank. Sluggishness and stuttering are problems on higher-end graphics cards, even before adding the strain of ray tracing effects, and FSR upscaling often fails to deliver a significant performance boost.
I’ve done my darndest to come up with a best settings guide, which should put a few more frames in your pocket compared to max quality, but if you are playing straight from launch then you can expect some turbulence along the way. EA themselves have said as much, promising a series of post-release patches to address bugs and improve performance.
Star Wars Jedi: Survivor system requirements and PC performance
An early warning of Jedi: Survivor’s performance woes comes in the form of some eye-widening system specifications, and not just its ravenous 155GB storage requirement. An 8GB GPU is a big ask for any minimum spec, as are relatively recent components like the Ryzen 5 5600X and Radeon RX 6700 XT for the recommended spec. There are some questionable equivalences being made here as well: the RTX 2070 is listed as an alternative to the RX 6700 XT, when the newer RTX 3070 is a much closer rival to AMD’s 6000 series card.
Star Wars Jedi: Survivor minimum specs
- GPU: AMD Radeon RX 580 / Nvidia GeForce GTX 1070
- VRAM: 8GB
- CPU: AMD Ryzen 5 1400 / Intel Core i7-7700
- RAM: 8GB
- OS: Windows 10
- Storage: 155GB
Star Wars Jedi: Survivor recommended specs
- GPU: AMD Radeon RX 6700 XT / Nvidia GeForce RTX 2070
- VRAM: 16GB
- CPU: AMD Ryzen 5 5600X / Intel Core i5-11600K
- RAM: 16GB
- OS: Windows 10
- Storage: 155GB
As a PC port more generally, Survivor gets some of the basics right. Mouse/keyboard controls are fully rebindable, even if a controller makes for the most comfortable Jedi-ing, and there are heaps of supported monitor resolutions (including those of an ultrawide, 21:9 or 32:9 persuasion). And while DLSS would have been nice, FSR 2 is on hand for upscaling, with its much wider compatibility range of Nvidia, AMD and Intel GPUs.
In practice, sadly, it almost doesn’t matter how you’ve built your PC: this game will find a way to disappoint you. The most noticeable issue is inconsistency, as different planets (and sub-areas of those planets) can vary wildly in the level of performance you get. After an extended opening on the urban sprawl of Coruscant, a PC that averages 40-50fps might start pulling double that upon an escape to the rocky wilds of Koboh – before dropping back down to 40fps when that same planet opens up a more free-roaming area. It’s normal for games to perform better or worse in different areas, but I’ve rarely seen one lurch back and forth like Jedi: Survivor does.
Garden variety stuttering is common as well, especially with higher settings enabled. This makes Survivor’s ray traced lighting and reflections (all contained in a single toggle) less appealing, even though the latter is a stark upgrade over screen space reflections.
FSR 2 can help, with or without ray tracing in the mix, but just like overall performance, its efficacy is mixed. To explain, we now turn to some benchmarks, which I manually recorded in an especially demanding semi-open-world section of Koboh. Here, a PC running a Core i5-11600K and RTX 4070 Ti could average 51fps at 4K, using a combination of the highest Epic preset and FSR 2 on Balanced mode. That was an easily visible improvement on the un-upscaled Epic preset, which produced 36fps. Not a bad showing for FSR, all things considered.
However, on the (far more commonly owned) RTX 3070, running Epic at 1440p, Balanced-level FSR 2 only bumped a 41fps result up to 48fps: a far less visible enhancement. And, incidentally, exactly the same as the 48fps I recorded when using Quality mode instead. So why even bother with Balanced on the RTX 3070, when it’s no faster and looks worse? And that’s not to mention an apparent bug when using the Low preset: disabling FSR here makes everything low-rez, almost as if it’s forcing the use of Ultra Performance mode. Only manually adjusting every individual setting to Low fixes it. Eh?
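To put numbers on just how uneven that FSR 2 uplift is, here’s a tiny illustrative calculation using only the frame rates measured above (this is plain arithmetic for the article’s own benchmarks, not anything pulled from the game or a real API):

```python
# Illustrative arithmetic only: percentage frame rate gain from FSR 2
# Balanced mode, using the averages quoted in the benchmarks above.
def uplift(native_fps: float, fsr_fps: float) -> float:
    """Return the percentage frame rate gain from enabling upscaling."""
    return (fsr_fps / native_fps - 1) * 100

# RTX 4070 Ti at 4K, Epic preset: 36fps native -> 51fps with FSR Balanced
print(f"RTX 4070 Ti: +{uplift(36, 51):.0f}%")  # roughly +42%

# RTX 3070 at 1440p, Epic preset: 41fps native -> 48fps with FSR Balanced
print(f"RTX 3070:    +{uplift(41, 48):.0f}%")  # roughly +17%
```

Same upscaler, same Balanced mode, yet one card gains over twice the relative uplift of the other, which is the "mixed efficacy" in a nutshell.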
One positive is that the minimum-listed GTX 1070 was able to push a playable 47fps, on Medium settings plus Quality FSR 2, at 1080p. It was briefly encouraging to see a min spec GPU stay so far above the 30fps line, without resorting to 720p. But then its AMD "alternative", the RX 580, had so little affinity with Jedi: Survivor that it repeatedly crashed on launch.
Survivor also scales poorly when using newer, more premium hardware below 4K. The RTX 3070 averaged 41fps on the Epic preset at 1440p, but dropping all the way to Low quality only sped that up to 48fps. Perhaps more disturbingly, 1080p only barely performed better, with 49fps on Epic and 60fps on Low. You know something’s wrong when a card that can handle most games at 4K can only just reach the 60fps mark on the lowest settings and monitor rez combo that a modern desktop arrangement would use.
(Well, at least in this benchmark bit – like I said, Survivor is so inconsistent that 1080p/Low will get you closer to 120fps in certain other areas. But it does show that even a powerful GPU won’t hold a solid 60fps without major quality cuts.)
As for the Steam Deck, Jedi: Survivor is effectively unplayable on Valve’s handheld, even after receiving the day one patch. Not only does it fail to even approach a steady 30fps on the lowest possible settings, but it regularly crashes to the point of the Deck needing to reset itself. Update: we’ve since published more on Steam Deck performance.
Star Wars Jedi: Survivor best settings guide
Sticking with the RTX 3070 at 1080p, because honestly it performs so badly it’s kinda fascinating, we must confront the unfortunate reality that there just isn’t going to be a massive performance upgrade hiding behind a few lowered settings. This lack of potential is evident in how the RTX 3070 ran each graphics preset: 49fps on Epic, 53fps on High, 57fps on Medium, and 60fps on Low. There’s barely a difference between each level, despite them looking visually quite distinct from one another.
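For the sake of seeing that flat scaling laid out, here’s a quick sketch that works out each preset’s relative gain over Epic, using the RTX 3070 1080p averages quoted above (again, these fps figures come straight from this article’s benchmarks, nothing else):

```python
# Preset scaling on the RTX 3070 at 1080p, using the averages quoted above.
epic_fps = 49
presets = {"High": 53, "Medium": 57, "Low": 60}

for name, fps in presets.items():
    gain = (fps / epic_fps - 1) * 100  # percentage uplift vs the Epic preset
    print(f"{name:>6}: {fps}fps ({gain:+.0f}% vs Epic)")
```

Even the full drop from Epic to Low is only worth around a fifth more performance, which is why no single lowered setting is going to rescue a struggling frame rate here.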
Still, let’s take a peek at each setting individually, to see how lowering them affects that Epic preset result of 49fps.
View Distance: Dropping to Low upped the RTX 3070’s average to 50fps, a single frame per second increase.
Shadow Quality: Low quality shadows also only barely raised performance, up to 51fps.
Anti-Aliasing: Low quality also added a mere 1fps to the average, and looked uglier, especially on moving objects. Keep this on High, at the least.
Visual Effects: Another single FPS gain when dropping from Epic to Low. Sigh.
Post Processing: …and another, 50fps on Low. Siiigh.
Foliage Detail: Finally, something interesting. Low quality produced a relatively bountiful 54fps, though it makes a lot of the game’s environment look a bit barren and sad. Medium also doesn’t look as detailed as Epic, but still makes for a speed boost up to 53fps while adding more of that set dressing.
Field of View: This setting, despite showing more of the world on-screen, doesn’t actually hurt performance at all when increased. On 16:9 monitors, though, wider values can look quite fishbowly compared to the default setting.
Ray Tracing: A simple on/off toggle. Surprisingly, this didn’t completely bring Jedi: Survivor to its knees when enabled, though still cut that 49fps average down significantly to 34fps. At 1080p. On an RTX 3070. Siiiiiiigh. This might be worth enabling in the future, should EA make good on the promise of performance improvements, but for now it’s a big drain on a game that needs every frame it can get.
AMD FidelityFX Super Resolution 2: As explained above, results using FSR 2 are uncharacteristically iffy. Sometimes it works respectably, like on the RTX 4070 Ti at 4K, though on the RTX 3070 running at 1080p, both its Quality and Balanced modes only raised that 49fps average up to 51fps.
Motion Blur: This doesn’t affect performance either way, so on or off is your call. I prefer it off.
Film Grain: Jedi: Survivor goes hard on the grain effect. Disabling it only got me another 1fps extra, but I prefer the visuals without it.
Chromatic Aberration: One last 50fps for the road, gained after turning this mild effect off.
To conclude: urrrrrrggggghhhhh. The only individual setting that provides more than a couple of frames per second is Foliage Detail, with everything else having such little effect that the only way of smoothing things out is to make further cuts en masse.
I therefore use the term "best setting" so loosely it’s in danger of slipping off the screen you’re reading this on. But, here’s what I’d use to give performance a little nudge, without unnecessarily putting the boot into quality:
- View Distance: Medium
- Shadow Quality: Medium
- Anti-Aliasing: High
- Texture Quality: High
- Visual Effects: High
- Post Processing: Medium
- Foliage Detail: Medium
- Field of View: Your pick, Default is fine
- Ray Tracing: Off
- AMD FSR 2: Quality, but only at 1440p and above
- Motion Blur: Off
- Film Grain: Off
- Chromatic Aberration: Off
These settings gave me an average of 55fps, which you may note is only barely faster than using the Epic preset with only Foliage Detail changed to Low. However, that average doesn’t tell the whole story, as I saw far fewer sub-40fps dips when using this fully custom settings combination. Hey, it’s not much, but unless this game is absolutely peppered with optimising updates, it’s better than nothing.