Show Off: 3DMark 11

By Alec Meer on May 24th, 2010 at 5:45 pm.

Coraltastic

I don’t have a DirectX 11 graphics card yet, and haven’t hitherto been moved to desire one. This trailer for the latest iteration of venerable (and oft-controversial*) PC benchmarking app 3DMark rather makes me want one, however.

Of course, the visual splendour to be found in an application dedicated to pushing high-end PCs to the absolute limit has about as much in common with the graphical capabilities of contemporary videogames as do Olympic athletes, lumberjacks or men who can successfully change a tyre to me. So it’s utterly futile to dream of playing in worlds that look quite as lovely as this. We’re a long way off PCs being able to generate scenes of this quality at 60 or even 30 frames per second. What 3DMark 11 will be good for, upon its release in the third quarter of this year (why can’t people just use months?), is to wave your big digital willy around and see if anyone’s impressed. For the rest of us, admiring the oceanic footage below may well be enough.


Here’s the video, though you should really click through to the HD version to get the full, deliberately murky effect.

Ooh, barnacley. Tell your more naive friends it’s the first Bioshock 3 footage.

If you’re toting a DirectX 10 card, you may want to give 3DMark Vantage a spin. If you’re still in DX9 climes, then you’ll want 3DMark06. If you’re still on DirectX 8, go back to Counter-Strike, old man.

* There have been a few tech press allegations over the years that NVIDIA and ATI have tailored their drivers to ensure superior 3DMark results over the competition, regardless of how accurately that reflected their cards’ real-world gaming performance. I have no idea of the truth or extent of this, or whether it still continues. Intrigue and conspiracy at even the driest of echelons, eh?

127 Comments

  1. Brumisator says:

    The sad reality is that Crysis still has the best graphics out there on any PC… and it was made 3 years ago.
    With 99% of all so-called AAA titles being ports from console to PC, we’ve been at a technological standstill for ages.

    Now every once in a while we get a little gem, like Metro 2033, maybe DiRT 2.
    But to me, the graphical “wow” factor has been completely eradicated… until the PlayStation 4 and Xbox 666 come along.

    So, 3DMark 11… okay… that’s nice… I might as well watch a high-quality prerendered video of it, and it’d have just as much bearing on my gaming.

    • alseT says:

      Xbox 666 heheheheheheheh

    • ChaK_ says:

      I don’t really mind graphics being a little bit “stagnant” at the moment.

      I’d rather have gameplay or AI innovation, though the “next-gen” has brought more of the opposite…

    • AndrewC says:

      Clint Hocking said something to the effect of us being in the ‘who gives a shit’ era of graphical improvement – graphics are so detailed and adaptable now that the remaining improvements are all so tiny or subtle that those who can notice them mostly don’t care.

      There’s certainly nothing in that video that can’t be done right now – just, you know, not quite as well.

      For those of us who were gamers when the switch to 3D happened, or the switch to hardware acceleration, there is never likely to be another technical ‘wow’ moment to match it.

      It would be awfully nice if the adaptability of shaders and the like finally stopped the march towards realism and ever more detail, and pushed games towards more individualistic art direction. TF2! Mirror’s Edge! Other games! I get the feeling that this switch has already happened, and it is only the slow cycle of games development that is keeping us from really feeling its results.

    • Tei says:

      Yeah, the race for realism is starting to hit diminishing returns… It will continue forever, but it will not be as dramatic as what we have seen from Wolfenstein to Crysis.

      Other things are calling for attention.
      There’s something now, reserved for Linux users, called FlashCache: a flash disk used as a cache. It’s like a cyborg: part memory cache, part hard disk.

      There are audio engineers (like the one from BC2) who claim audio is better processed on a CPU core. So having 4 cores will be better than 2.

      I would love to see games use more audio processing, or load very high-res textures, to make the world feel crisp. Dramatic changes in audio and textures can make older games feel really… inadequate… really quickly, but it’s something that will need more hardware.

    • Howl says:

      They did take a huge jump forward, only a lot of people didn’t notice. The ones running Eyefinity rigs are going through all their games and re-experiencing them. I can’t imagine life back on a single screen now, particularly for first person games.

      It’s simple (although not cheap, but then PC gaming never has been) to get set up with an ultra wide aspect ratio and it’s a significant a change as going from black and white to colour.

      Console led graphics are fine for now. With Eyefinity we’re having to deal with gigantic resolutions so it’s ok if the complexity doesn’t go up for a while.

    • AndrewC says:

      Do you work for Eyefinity?

      And I second the call for great sound! BC2’s sound is often awesome.

    • LionsPhil says:

      There’s something now, reserved for Linux users, called FlashCache: a flash disk used as a cache.

      (Emphasis mine.) Honestly, Tei, I wish you wouldn’t blabber on about things you evidently have no idea about.

      Here’s what it was called back when Vista did it. I hate to break it to the Linux/OS X fanboys, but for all their ills Microsoft really do try to put some effort into engineering performance tweaks. Hell, old XP does boot-time precaching, and feeds information about which programs tend to get loaded, and which files they then read, through to the defragmenter to try to keep your startup reads contiguous. They just tend to do this at the same time as the shell team are throwing on another thick layer of chrome, and the licensing lot are adding more DRM cripples, and they tend to make facepalmingly bad mistakes on the first release, so the net result is people going “lol windows is slow and bloated”.

    • bob_d says:

      It’s not so much a technical issue, but an economic one. When you have a graphical improvement, the development costs increase. (This is why the next generation of consoles is delayed – development cost is expected to double, and no one is quite sure how they’re going to manage that.) Developing for the highest-end PCs means your audience has shrunk considerably (especially lately, when sales of new graphics cards are way down), and since AAA games are having a hard time being profitable to begin with these days, you really don’t want to make it even harder. As AndrewC points out, many of those graphical improvements are pretty subtle, so how many developers want to spend more money on features people don’t even notice? (Example: Unreal 3. How many players noticed that the character’s ears had realistically translucent scapha? Not me, and I knew the feature was there.) Also, some of that processing power is being used to make development easier; what once would have required some clever trickery on the part of developers can now be done with brute processing power.

    • teo says:

      I think Crysis proved that we can still take huge leaps forward, it’s just that no one’s doing it. I saw this coming years ago and there’s not much we can do about it.

    • Azhrarn says:

      @Tei: as the owner of a Xonar D2X I resent that remark. :P
      I’d much prefer game developers to stop using that silly Creative EAX in everything, since it just doesn’t work in Vista and Windows 7!
      The Game eXtensions mode the Xonar has allows me to use EAX to some extent in all older games, and ALchemy does the same for Creative owners (but only for games programmed into ALchemy).
      Just program your game’s sound engine to output Dolby Digital or DTS so we can all enjoy the benefit. =D
      As for processing it on the CPU: my Xonar doesn’t offload as much from the CPU as the Creative cards do, so I don’t mind – just leave the realtime processing of DDL or DTS to my Xonar, and don’t force that onto the CPU. =D

    • Starky says:

      I honestly think that, as with the death of sound cards (sorry, but built-in sound is good enough, often better, especially if you use an optical out to a decent preamp – the exception being external multi-port audiophile soundcards), we’re starting to see the death of video cards – which is why Nvidia is pushing down the route of having a graphics card be more of a second CPU designed for parallelism than just a dedicated graphics renderer.

      It won’t be long (5-8 years maybe) until we see 16/32-core CPUs, maybe even 64 cores, and even then dedicated cards will become redundant and ATI and Nvidia will switch to providing onboard solutions.

      I still think it will take 10 years before you just can’t buy dedicated GPUs anymore; motherboards will have a GPU built in to push the pixels, but most of the grunt work will be done on the CPU itself. It will happen, though.

      Just like it happened to sound.
      When sound rendering took 20%+ of your CPU time, it made sense to have dedicated hardware for it; now that sound rendering may use 1-2%, it no longer does.
      The same goes for GPUs: right now GPUs maybe handle 50% of the total workload in a game – so dedicated hardware makes sense. When that drops, as it has been dropping, and rendering a game would only use 5-10% of a 32-core CPU’s resources, well, then what would be the point?

    • Azhrarn says:

      @Starky: while I admit that on-board audio is often “good enough”, the difference in quality between that Xonar D2X and my ALC889a hooked up to my Onkyo receiver is nothing short of staggering.
      Not to mention that the Xonar is a lot better at upscaling stereo to Dolby 5.1 than the ALC889a is. I use an optical connection between my audio card and the receiver, and trust me, the ALC chip only pumps out PCM audio, which is only 2-channel (and as such at best “all-channel stereo”, not surround) unless it’s pre-encoded audio from a DVD or Blu-ray.
      The Xonar can encode the 5.1 audio from any game into Dolby 5.1 in real time, which works perfectly fine over S/PDIF in surround and isn’t limited to all-channel stereo. You only get surround out of an on-board solution if you use analogue connections or if the audio is pre-encoded in the correct surround format, and in most games it is not.

    • Starky says:

      This is a subject which comes up a lot on audio forums, so just to clear things up for people who may read this.

      For music, movies and any recorded audio source…
      If you are using a digital pass-through (optical), there is minimal processing done on the source audio – and so there should be ZERO difference between onboard and dedicated, given that all the decoding is offloaded to the receiver.
      In that case there is no difference between onboard, a £10 card or a £200 card – it only matters if you are using the sound processing and DACs on the card (as in, using the analogue outputs).

      Still, PCI cards are rubbish simply due to the electrical interference a PC produces – though you’ll not really notice it unless you have good speakers. I run a pair of M-Audio AV40s – which, while budget, are pretty good – and you can hear it; it is why I switched to an external FireWire card (the firepro 610) for my audio use.

      The only problem is when it comes to 5.1 in games.
      Games need a card which supports Dolby Digital Live to encode the game audio into 5.1 on the fly, and I don’t think any current onboard chips do, so Realtek solutions are limited to 2.0 in games via optical.
      They can do 5.1 via the analogue outputs, but onboard analogue does indeed suck – and eats CPU time.

      So, in summary:
      Onboard optical is just as good as any PCI card money can buy for anything but gaming in 5.1 – the quality of sound will be decided by your receiver and speakers.

      So the ONLY reason to buy an expensive dedicated sound card is if you game in 5.1 surround sound and have good-quality speakers to back that up. Your average 5.1 home theatre speakers won’t sound different at all; game sound is just too muddy, processed and random to hear any minor differences.
      Any audiophile needing multiple inputs and outputs should of course be using an external solution, preferably FireWire.

      Final thing.
      Hardware acceleration:
      This was the big deal and the last hurrah of the SPU – EAX and other hardware sound processing became redundant from Vista onwards, really, and frankly it always sounded a bit poor, with far too much overused reverb.
      Even so, OpenAL and ALchemy still allow some use of hardware acceleration (and thus take load off the CPU) – but frankly the load is so small, so minute, that you’re maybe saving 1-2 frames per second, as the limiting factor in 90% of gamers’ systems is the GPU anyway.

      So unless you’re running a 2.0GHz dual core with a pair of powerful GPUs in Crossfire/SLI, the CPU overhead will not be an issue.

      @Azhrarn
      If you are using optical out and an Onkyo receiver, your soundcard isn’t doing any of the decoding or upscaling – the receiver is. That’s the entire point of having one (I personally have a Yamaha 765 for my entertainment system). So, as said, the only difference is 5.1 in games, as onboard does not yet support Dolby Digital Live (as you say, DVDs are not affected as they are pre-encoded), though it will in time and even that slight advantage will be gone.
      Personally I game using headphones on my PC anyway, as I find spatial awareness vastly superior with 2.0 headphones (HD 212s) than with even the best 5.1 setup. But that is very much a YMMV choice.

    • RedFred says:

      As far as audio goes you are only as good as your source.

      So if you want to run a cheap on-board chip to an expensive amp be my guest. It’s your money. However I will stick to a decent soundcard, thanks.

    • Starky says:

      That is the whole point: an optical pass-through isn’t a source – it does not decode a damn thing; all it is doing is letting the signal pass through – hence the term.

      So yes, I’ll keep using my onboard pass-through to an expensive external DAC – because a digital source (anything on a PC is digital) passing through a digital connection (optical) has ZERO degradation.

      Optical is optical is optical. It is the DAC that matters.

      External DAC >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> The best PCI DAC (unless it is in a shielded, whisper quiet passively cooled machine) >>>>> Onboard

      Have a read up on it – google it if you like – and you’ll see I am correct. Any audiophile should know these basics.

    • Azhrarn says:

      @Starky: in most cases, you would be correct.
      However, the Xonar can always output through its S/PDIF in DDL or DTS (if you set it up as such), and as a result it upscales any audio source using fewer than 6 channels to 6 channels. My receiver can also do this (if I turn off the upscaling on the card and set the receiver to do it instead), and the difference in audio quality is indeed fairly small between the two.

      As a result, I’m always using the audio-processing chip of the Xonar, even with 2-channel sources. DVD and other pre-encoded audio sources are played back as pass-through, though, and aren’t affected by the card.

      The Xonar is also electrically shielded (if that has any effect; a PC is a very noisy environment in the EM range), but I don’t have the sort of technology available to compare the signal-to-noise ratios of my onboard and Xonar PCI-Express solutions.

    • Starky says:

      @Azhrarn

      That is fine, and don’t get me wrong – I’d recommend a soundcard to someone using it as their main analogue output, or to someone who really wanted 5.1 surround in games, as you clearly do.

      The last gamer card I tried was an X-Fi Fatal1ty, and while the quality was good – much better than onboard if using analogue direct to an amp – the two features it offered that onboard does not, I do not use anyway.

      The upscaling on it was nowhere near as good as my Yamaha offers, and tended to muddy the mids and over-emphasize the bass. I’ve never used a Xonar, but I’ll take your word that its quality is equal to your receiver’s.
      And as I said before, I game with headphones on, so 5.1 in games isn’t an issue for me – and on the rare occasions I use my speakers (usually when others are playing/watching), the upscaling on my receiver is more than good enough.

      Still, my point remains: for 95% of users, onboard pass-through with an external DAC is the better solution (often better and/or cheaper, given you can put the money saved on a soundcard towards a better speaker setup), and given almost all receivers offer this functionality it’s hard to recommend that anyone use a gamer sound card unless 5.1 gaming is a priority for them.

      You’re correct that if the card is shielded and you have a quiet case (fans and hard drives produce by far the most noise in a case) you’ll maybe get a better SNR than onboard – but that does not matter with optical, as optical is not analogue and thus not vulnerable to electrical interference except in one form…

      Jitter.

      There has been no serious study of it, and what there is is anecdotal, mainly because any difference is so minor as to be moot. It’s only something you want to take heed of in a professional high-quality recording studio (which means shielded rooms, calibrated systems and the whole nine yards); for home use it will never have a noticeable impact.

      Honestly, as soon as onboard solutions can offer Dolby Digital Live (in order to output 5.1 via optical for games) I’d struggle to recommend that anyone use a gaming soundcard, when that £100 would be vastly better spent on a better receiver/active speaker setup.
      And to an audiophile I’d never recommend a gaming card anyway; external (even USB, but preferably FireWire) with at least 4 in/4 out is a vastly superior solution.

    • Tei says:

      You guys talk like game devs want a glorified personal orchestra with real-life quality.
      But that doesn’t seem to be the problem (fidelity); it’s the ability to add “effects” like echo, underwater, and others… these effects, added dynamically, are what can set new games apart from old ones. Some cards enable some effects, but that’s it. It could be that this sound engineer was lying a bit, but I guess the big picture is true: you don’t have a general set of hardware-accelerated effects, so you have to do that processing on the CPU.

    • Starky says:

      @Tei

      That was mentioned, though: the ability to do those sound effects takes a tiny amount of CPU power. Reverb, echo, distortion, what have you, are really CPU-light compared to the power of CPUs today.

      Back in the day, the processing power needed equalled a fairly large percentage of the CPU budget (up to 25% in some games); as processing power increased, that percentage decreased, because the CPU power needed does not increase.
      Back on an 800MHz Pentium I could run maybe 2-3 VSTs adding realtime effects at once before the CPU would begin to choke, so offloading that onto dedicated hardware made sense.

      Now I can run 20 VSTis and it hardly even dents my CPU, maybe peaking at 10% cycle usage, so hardware acceleration has minimal impact – in games, even audio-effect-heavy games, the percentage is smaller, maybe 5% of a quad core’s cycles.
      5% of a quad core, in a game which is GPU-limited anyway, isn’t a big deal.

      Seriously, on modern CPUs you can add all the effects you could ever want and it will have minimal performance impact.
      So the only thing you really want a soundcard for is a better-quality DAC, but if you are going digital to an external DAC that becomes redundant in everything except 5.1 games, and very soon that will become redundant also, as onboard solutions will offer 5.1 Dolby Digital Live.

      Hardware acceleration for sound is just no longer needed.

  2. Dain says:

    Shame it’s unlikely there’ll be a game like this. Underwater games like Aquanox or Deep Fighter don’t fit into the cosy FPS genres, so it’s unlikely we’ll see anything like them again.

  3. Batolemaeus says:

    It’s not really a secret that ATI/Nvidia do a lot of optimization in their drivers, not just for 3DMark. Replacing shaders is a common technique.
    Some good reading (although older): http://ixbtlabs.com/articles2/antidetect/

  4. Sobric says:

    New generation of graphics, and there’s still so much lens flare! Ahhhh!

  5. Mr_Day says:

    I don’t know about anyone else, but I suddenly want a remake of Archimedean Dynasty.

    • Malcolm says:

      Ooh – loved that game. Although there was always the sneaking suspicion that they’d set it underwater only to excuse the appalling draw distance :)

      But whatever the reason, it made for some good, if moderately paced, dogfighting and contained the action nicely.

  6. Mario Figueiredo says:

    >> I have no idea of the truth or extent of this, or whether it still continues. Intrigue and conspiracy at even the driest of echelons, eh?

    Indeed. I extend that to also having no desire of knowing whether that is true or not.

    When you live surrounded by a gaming community that swears their games look better now that they have a card running them at 100 fps, when before they had one running them at 60 fps, you know you are deeply embedded in a largely ignorant community. So caring about what they have to say on these matters is rarely a good idea, when you can’t separate genuine information from fabricated information. Besides, hyped and controversial information travels much faster on the tubes.

    In any case, I’m afraid GPUs are becoming irrelevant in the PC market. With console ports becoming an ever-increasing reality, if there isn’t a shift in game publishing for the PC, soon we will be playing to the tune of console advancements. Some companies may resist and produce high-demand graphics, but they are invariably met with a smaller player base and become a niche.

    Not sure whether I should be glad or sad. Truth be told, some games shouldn’t need a super-hyper-powered GPU to be made beautiful. A lesson that at least the consoles taught us under some circumstances. And because PC developers failed to see that, they lost an important share of players who weren’t willing to go through the experience of spending 200+ on a graphics card in addition to the game. Especially because they completely failed to scale down their games in a meaningful way.

    Games that only look good at maximum settings should have a return clause.

    • Lack_26 says:

      I know what you mean about people bitching about frames per second. I’m fine with anything over about 20-24fps; 30-40 is nice, but I really don’t see any noticeable difference once you get over 40.

      Also, would it be possible for a modern game to switch to a far more abstract look on lower-grade machines? I know it would take extra work, but surely it would open up the market to PCs further down the beast-itude scale.

    • DrGonzo says:

      Things do look nicer at higher frame rates. 100 fps DOES look better than 60; in fact 200 fps would probably be a noticeable difference over 100 fps. I understand what you mean, though: although it may look better than 60fps, 60fps is completely acceptable.

      Plus, it is the relative difference between frame rates that matters. So, as you said, 40 may not be very noticeable over 30, but 100 would be.

    • LionsPhil says:

      Er. Assuming you’re using a TFT, your refresh rate is probably about 75Hz, tops.

      It is physically impossible to display more than 75FPS on such a display, unless you’re going to change to a different frame half way through a vertical scan. The resultant “tearing” is considered a bad thing and is why games have a vsync option to explicitly avoid it.

      There doesn’t seem to be any rush to get higher refresh rates out of TFTs, not least as there’s no evident demand—they don’t blank as hard and thus don’t flicker as much as CRTs, and I’m not sure human perception can really see much faster than that. Research/engineering effort is better spent on increasing pixel density (mmm, high-density displays are lovely) and general image quality.
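
      A quick back-of-the-envelope sketch of that quantisation, if you like – illustrative Python, with the 75Hz figure and double buffering as the assumptions, not a description of any real driver:

      import math

      def vsynced_fps(render_fps, refresh_hz=75):
          # With vsync and double buffering, a finished frame waits for the
          # next vertical blank, so each frame occupies a whole number of
          # refresh intervals. Effective rate = refresh rate / that number.
          intervals = math.ceil(refresh_hz / render_fps)
          return refresh_hz / intervals

      for fps in (200, 100, 74, 40):
          print(fps, "fps rendered ->", vsynced_fps(fps), "fps displayed")

      Rendering at 200 or 100 fps both display at 75; render at 74 fps and you drop straight to 37.5.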

    • Mario Figueiredo says:

      See, this is more or less what I mean.

      The simple fact that the human eye and brain can’t register anything above 60 fps is… well, ignored. Placebo’s a bitch.

      Then there’s also the matter of the screen refresh rate, like LionsPhil mentions. Which is also largely ignored.

      I think it’s really not most users’ fault. They are led into this reasoning by word of mouth, the way a fly spreads diseases.

    • Sagan says:

      Regarding the FPS discussion:
      A point which is often forgotten in that discussion is response time. Basically, the more fps you have, the quicker the game responds to your input. Gamasutra had an article about it, where they measured the response times of various games. The article explains why the best possible response time is 3 frames, and 3 frames is obviously less time at higher fps.

    • AndrewC says:

      @ Sagan: Er, your maths suggests that if the game had a frame rate of one million, your response time would be quicker than the time it takes for electrical impulses to get from the eye to the brain and then down to your finger.

      This is about how fast the brain can process things. It’s faster than 30fps, but less than 60fps. It all gets really complicated really quickly, but it’s often bandied about that a film running at 48fps most successfully fools the brain into thinking it is actually watching something ‘real’ and ‘in the room’. The eye has a functional FPS of about 48.

      But, I mean, what kind of a game would you be playing if a reaction you’d miss at 60fps but catch at 120fps was crucial to success?

    • Uhm says:

      Where do you get this FPS of eyes from?

    • Sobric says:

      @ Mario & Andrew C

      I don’t think that “the human eye can’t register over 60fps” is true, by the way.

      http://whisper.ausgamers.com/wiki/index.php/How_many_FPS_human_eye_can_see

      I know it’s just a page on the internet, but it feels like the right answer to that statement, which is: “Well over 200 FPS actually, but since the human eye doesn’t see in frames it’s irrelevant”.

      It’s also worth noting that hardcore FPS freaks are seeking steady FPS more than max FPS. Since the difference between 200 FPS and 80 FPS isn’t really noticeable, when that explosion goes off in your face and drops your FPS to 80, you really don’t notice it in terms of performance – but you certainly would if your FPS dropped from 60 to 20. That’s what many “hardcore” PC gamers are after.

      Anyway, LionsPhil is bang on. The refresh rate on your monitor caps “what you can see” so meh.

    • jarvoll says:

      According to Biological Psychology by Breedlove, Rosenzweig and Watson (and my endless lectures on brain function), the visual association area in the parietal cortex (high-level processing of visual information, think of it as post-processing) can detect differences of around 5FPS up to about 60FPS, and this is the area of visual processing in the brain that we are consciously aware of. However, the superior colliculi (whose only function is to *immediately* detect fast movement and swing your eyes around to focus on it; this is why if you’re sneaking you should never move quickly, the colliculi will see you), the thalamus (initial processing and sorting of visual information) and the primary visual cortex in the occipital lobe (the main area of visual processing) can all detect visual information up to about 120FPS.

      We can’t consciously tell whether, say, 100FPS is slower or faster than 110FPS, but we CAN tell that it’s different. To illustrate the difference between conscious and unconscious processing, imagine a person who has lost their visual association area (recall: the conscious part) due to stroke, and is therefore “blind”. Experiments on these kinds of patients, who can’t consciously see ANYTHING, have shown that the other visual bits of the brain are still going: when put in a room full of big objects that block movement, and asked to navigate across it with no walking stick or other aid, these patients manage to walk around all the obstacles despite not being able to see them. How? They “feel” their way based on information they don’t realize they’re getting from the remaining, primary visual processing areas of their brain.

      So, basically, real life is 120FPS, and even if we can’t consciously tell when something is not displaying this quickly, we can still *feel* it, and so for games to feel like real life, they will ultimately need to display at 120FPS.

    • Sagan says:

      @AndrewC:
      OK, let’s say the eye can see 48 images per second. If the game runs at 30 fps, and it takes three frames for the game to react to you pressing a button, then there would be a perceived delay for your eye of 4.8 frames. (3 * (1 / 30) / (1 / 48) = 4.8)
      If the game ran at 60 fps, with the same three-frame delay, your eye would perceive the reaction to you pressing the button 2.4 eye-frames later. At 144 fps you would finally see the reaction with only 1 perceived frame of delay.
      And with your million frames, it would appear as if pressing the button led to an instant response from your computer.

      Consciously it also appears as if there were an instant response at 30fps, but the response time is one of the reasons why, when playing a fast game, it just “feels” better at higher fps. You don’t really need this to play the game well. After all, the game reacts to your input within 1 frame; it just takes longer until it is visible for you on screen. But a shorter reaction time just feels better.
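
      To make the arithmetic explicit, here it is as a little Python sketch – the three-frame pipeline and the 48 fps “eye rate” are just the assumptions from this thread, not measured values:

      PIPELINE_FRAMES = 3  # best-case input-to-display latency, per that Gamasutra article
      EYE_FPS = 48         # the assumed perceptual frame rate from above

      for game_fps in (30, 60, 144):
          delay_s = PIPELINE_FRAMES / game_fps  # wall-clock delay before you see the reaction
          eye_frames = delay_s * EYE_FPS        # the same delay counted in "eye-frames"
          print(game_fps, "fps ->", round(delay_s * 1000, 1), "ms =", round(eye_frames, 1), "eye-frames")

      Which prints 100 ms = 4.8 eye-frames at 30 fps, 50 ms = 2.4 at 60 fps, and 20.8 ms = 1.0 at 144 fps.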

    • jarvoll says:

      …and I’ve just read the article linked above, and it doesn’t seem to give any actual evidence for its figure of 200FPS, except an anecdote about USAF pilot intake exams. The phenomenon they describe has nothing to do with the saturation of the primary visual cortex, though; it’s simply a part of our memory system called Iconic Memory (in which we have a very short memory for the most recent thing we’ve seen; if we consciously make efforts to remember it, it goes from Iconic memory to Short-Term memory; the equivalent for audition is Echoic memory).

    • Miker says:

      @AndrewC “But, I mean, what kind of a game would you be playing if a reaction time you’d miss at 60fps instead of 120fps was crucial to success?”

      Competitive Quake, perhaps?

    • Mario Figueiredo says:

      @Jarvoll

      I don’t have the book anymore. I read it, I think, 2 years ago when I borrowed it from a friend, and I don’t remember frames per second being discussed anywhere. But your explanation doesn’t make sense to me either.

      Detection of a disturbance in the visual field really has nothing to do with our ability to observe continuous motion in the way it is being discussed here. It’s like saying that because I can hear a pin drop when I’m in complete silence, I will be able to hear it too at a rock concert. Nor can I understand how you measure a visual disturbance in FPS; the thought alone boggles me. But I’ve seen that being attempted before, and I won’t pursue it any further. In any case, you even agree that we are limited to ~60 fps of visual input processing. That you call that conscious observation also completely confuses me; there’s nothing conscious about visual input processing, especially at the speed it takes place. So, I’m really not sure what you are trying to say.

      A visual disturbance is processed differently by our brain, as you acknowledged. That particular event has a different processing pattern. We are indeed a lot more sensitive to the tiniest disturbances in both vision and audition. But you cannot then grab that and pretend that the exact same visual scene played at 60 fps and then at 120 fps will appear different. It won’t. It will appear exactly the same, because it will be the other part of your brain processing that scene.

      I can see one point you make there, however: a game could take advantage of this by running at a higher refresh rate and using such disturbances in, say, sneaking games like you exemplify. The problem is that you have no guarantee the suggestion will trigger in the player’s brain. More often it won’t, because the visual area of a screen is much smaller and much brighter, reducing the player’s actual ability to discern details (as on a very bright sunny day). So you would actually have to lower your FPS (increase the number of frames in which the object is on the scene) to implant the suggestion, if I make myself understood. Effectively rendering the increase in frame rate useless.

      As for that link, you are absolutely right, however. It’s really not very helpful and is, to my knowledge, riddled with flaws.

    • Mario Figueiredo says:

      @Miker

      Whether a game runs at 100 or at 60 fps, events will happen at the same time. There’s no desync between different frame rates; what happens is the dropping of frames. So a higher frame rate will not improve player response times.

      However, there’s of course a limit. A general consensus among game developers is that frame rates below 30 will be dropping too many frames. A player running at those will indeed start having problems playing. You have witnessed this before, I’m sure: it’s particularly hard to aim, because while you are aiming you keep losing frames and the object is no longer where you’d expect it to be. But keep in mind that even then the game will be synced. If you are playing at 5 fps and fire your shot in the same frame as you would have fired at 100 fps, you will hit your target.
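
      The decoupling I’m describing is essentially a fixed-timestep loop, something like this sketch (the names and the 60Hz tick are illustrative assumptions, not any particular engine’s code):

      TICK = 1.0 / 60.0  # fixed simulation step; 60Hz is just an assumption

      def simulate(frame_seconds, total_seconds=1.0):
          # Run total_seconds of wall time at a given render-frame duration.
          elapsed, accumulator, ticks = 0.0, 0.0, 0
          while elapsed < total_seconds - 1e-9:
              elapsed += frame_seconds       # one rendered frame's worth of wall time
              accumulator += frame_seconds
              while accumulator >= TICK:     # catch the simulation up, tick by tick
                  ticks += 1
                  accumulator -= TICK
              # a real loop would render the latest state here; a slow renderer
              # simply never draws some intermediate states ("dropped frames")
          return ticks

      print(simulate(1 / 100), simulate(1 / 5))  # -> 60 60: same events either way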

    • Urthman says:

      It all gets really complicated really quickly, but it’s often bandied about that a film running at 48fps most successfully fools the brain into thinking it is actually watching something ‘real’ and ‘in the room’. The eye has a functional FPS of about 48.

      AndrewC, you are aware that FPS in films is an extremely different beast from FPS on a computer? Each frame of a film contains a blur of an analogue reality. Rendered frames are digital chunks with nothing between each chunk. (Motion blur tries to simulate the difference, but it is not a very good simulation.)

  7. ChaK_ says:

    Still don’t have a DX11 GPU either.

    Waiting for a big game to make me switch – maybe BF3, if it ever comes. BC2 has DX11, but well… not yet!

  8. Vague-rant says:

    Personally, I’d be glad for PC graphics to move at a slow pace. I’m a student, so graphical prowess isn’t really my main concern when balanced against rent, food etc. If things were to go down a path where I could just pick up a card at the start of a console generation with the title “Xbox-whatever equivalent”, I’d then happily sit back and not upgrade till it overheated and blew up my system.

    I have a DX10 card at present and will almost definitely not upgrade any time in the next few years no matter how prominent DX11 becomes.

    Also: Bioshock 3: Sponsored by MSI

    • bookwormat says:

      Personally, I’d be glad for PC graphics to move at a slow pace. [...] If things were to go down a path where I could just pick up a card at the start of a console generation with the title “Xbox-whatever equivalent”, I’d then happily sit back and not upgrade till it overheated and blew up my system.

      this.

    • PHeMoX says:

      Student or not, rich or poor, it’s not good for us consumers when graphics improvement stalls big time!

      Even now, when progress is still going strong and at quite an incredible pace, customers risk getting ripped off by the many variations of graphics cards in a certain price range anyway.

      In other words, if you literally buy each new 3D graphics card that comes out, you’re not benefiting from progress at all, but wasting money.

      Hence, nothing really changes when you as an individual decide not to upgrade for a longer time. Most 3D cards, especially ones that were high-end at some point, will actually last longer than an Xbox generation.

    • Tom OBedlam says:

      ^this completely

      I’m a student and conceivably will be until I’m 30 (if all goes well). I’m a pretty dedicated gamer, but I work from a laptop because that’s the most convenient – and most expensive – platform for me to work from. Due to the inherent space restrictions in a laptop, with cards built to order, I can’t upgrade my old Inspiron 6400 any higher. I’ve got a 256MB video card and 4GB of RAM, and I can play ME2 on OK settings, but when ME3 comes out, if I can’t run it I’m not going to buy a new computer to run it. No matter how much I want to play it.
      Is there a reason why games can’t scale backwards? If someone is happy to play on crappy, blurry settings, what stops games from allowing that?

    • archonsod says:

      “Student or not, rich or poor, it’s not good for us consumers when graphics improvements stalls big time!!”

      On the contrary, when they can’t simply rely on flashy lights to sell the next game, developers have to start looking at other areas.

      Give them around three more years to get over their large breast fascination and we might even see them looking at improving areas like gameplay…

  9. AndrewC says:

    It sure is pretty though. Can’t wait for the giant blue cat-people demo!

  10. LionsPhil says:

    Bah, 3DMark2001 was best. That meadow after zooming to Earth is still more impressive than anything Oblivion manages.

    The bumpy rust effect in this is kind of nice, but mostly it seems like only a marginal step up from ’05. Which is, frankly, fine by me, as it’s time for the graphics race to the top to whimper out and die. All it’s done for us is make computers hotter and noisier, and made the expected visual level of a “full” game a gigantic and mercilessly expensive art investment.

    Better processors are far more exciting because they give us more interesting game-world simulation. One day we might even get real-time fluid physics. (See Dwarf Fortress on a map with a waterfall for how much this hurts currently—and it’s dealing with nice, big quantisations thanks to being tile-based.)

  11. Mo says:

    The ATi/nVidia “optimizations” have been going on for years. Anyone remember the whole q3test1 controversy? As I recall, ATi cards would out-perform nVidia cards in the Quake 3 flyby demo, but if you renamed the executable to anything but “q3test1.exe”, ATi cards wouldn’t do so well anymore. They’ve gotten smarter since then. :)

  12. Starky says:

    The days of boasting about better graphics are done with, and it’s not at all due to consoles having games ported over.
    It’s simply due to diminishing returns: more CPU and GPU resources are needed to make ever more marginal improvements in graphical fidelity.
    Shadows can seem slightly more real (ambient occlusion), edges can seem slightly sharper (higher/better AA), models can have more polygons – but to notice all this detail you really have to stop, pay attention and focus on it.

    And in a fast-paced game this really doesn’t happen.

    Sure, in a video it looks amazing, but what you are basically seeing is a game cutscene – what difference does it make whether it is pre-rendered or in-engine if you cannot interact? And if you COULD interact, you’d never take the time to ooh and aah at those visuals, or at least you’d stop noticing after 5 minutes.

    Crysis had the same thing: it looked amazing, still does, and is still probably the best-looking game available when you start pushing it at high resolutions and stupidly high settings. Yet after 15 minutes you stop noticing the graphics and just play the game – with maybe a few exceptions, or when it changes environment enough to wow you again (like switching to the ice levels with frozen waves).

    Bottom line is that budget (both in real money the studio can spend and hardware resources) is vastly better spent on Gameplay, AI, features and design.

    • panik says:

      “Yet after 15 minutes you stop noticing the graphics and just play the game”
      But should those graphics lower at any point, you would shout “WTF…my beautiful graphics”.

    • Jason Moyer says:

      That’s because graphical consistency is more important than graphical fidelity.

    • PHeMoX says:

      The graphical consistency in Crysis is more than excellent, which is why you might feel adjusted to it at some point, but it’s a bit stupid to suggest it loses its wow factor, which is probably what was meant here.

      Graphics are meant to be looked at, not to make you stop and stare, although it’s quite an accomplishment if a game achieves exactly that!

      I say, more more more of generation-defining Crysis-like games please, sir yes sir!

    • Starky says:

      @PHeMoX

      But it DOES lose its wow factor; that is why PS1 games look like garbage even though at the time they were ultra-realistic-looking and awesome.

      We just get used to new graphics and they become the norm – and, as was said above, it is only when we lose them that we notice.
      In other words, it is only when you stop playing Crysis and load up another game that you go “why doesn’t this look as good as Crysis?”
      It’s the same reason stylized games and 2D games age much better. Abstract doesn’t age.

      Still, that wasn’t my main point – my point was that the reason more games are not pushing Crysis-level graphics is simply that the budget required to do so is better spent on other things.

    • Bremze says:

      Yeah, it would be nice if slowing down the graphical advances would leave more money for gameplay, but that won’t happen in AAA games anytime soon. You know where the money not spent on graphics will go?

      “The LA Times indicates that Modern Warfare 2 cost between $40 million to $50 million to develop. When factoring in the production and distribution of materials on top of marketing expenses, the launch budget was somewhere around $200 million for the game.”

      Yep, development cost a quarter of the budget; the rest went to marketing. Although printing, shelf space and distribution costs aren’t broken out, I’ll guess AT LEAST HALF THE 200 MILLION BUDGET WENT TO MARKETING.

    • archonsod says:

      I’m not even sure it’s diminishing returns. Few devs are making their own graphics engines these days, and a third-party engine always has limitations beyond the scope of your developers. Plus it’s a bit pointless: you could license the UT3 engine and spend two years making the graphics the absolute best they could be, or you could release six games over the same period with graphics comparable to your peers’. Guess which the average beancounter is going to pick as the better choice for the business.

  13. Miker says:

    Though I can’t say that I can tell the difference between 60 fps and 100 fps, I’m firmly in the camp of preferring 60 fps to 30 fps by a wide margin. Going from, say, Dirt 2 at 60 fps to the Split/Second demo at 30 fps is frankly rather jarring, as I play most of my games at 60 fps — especially racing games.

    And to play devil’s advocate, I’m pretty sure that no self-respecting Quake player will play at 30 fps, and the most hardcore won’t deign to play at 60 fps. It’s a bit crazy, but hey, they have their standards.

    • PHeMoX says:

      When FPS is not continuously around the same value, you will notice that regardless of 200 fps or 60 fps.

      When it’s a rock solid 200 fps all the time, you won’t notice the difference compared to a rock solid 60 fps all the time.

      Then again, there’s no such thing as such a perfect scenario throughout any game.

  14. Radiant says:

    I have a high-spec card and have only 1-2 AAA games that look different than they do on the fucking Xbox.
    Fuck you, DirectX 11.

  15. cliffski says:

    I just downloaded this crapware.
    First it spammed PhysX into my systray, then it turned out that if I wanted the ‘trial’ rather than paying, I needed to give them my email address.
    Done that… given the option to choose 4 different demos, but none of them is in the trial edition.
    total
    waste
    of time
    I hope me downloading this crap cost them $0.01 in bandwidth, at least.

  16. bob_d says:

    The Unigine DirectX 10 and 11 comparison video really shows off the features of 11 better, as you have some reference for what’s actually different. The hardware tessellation, where you can use a displacement map to turn a low-poly model into a high-poly model, actually got me excited – as a developer, at least.
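
    To give a feel for why that’s exciting, here’s a toy sketch of tessellation-plus-displacement – illustrative Python, not real DX11/HLSL, with a made-up heightmap:

    def tessellate_and_displace(vertices, heightmap, levels=3):
        # vertices: (x, y) points along a flat strip whose normal points straight up.
        # 1. Tessellate: insert midpoints `levels` times. On DX11 hardware the
        #    tessellator does this on-chip, so the high-poly mesh never has to
        #    be authored, stored or uploaded.
        for _ in range(levels):
            refined = []
            for a, b in zip(vertices, vertices[1:]):
                refined += [a, ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)]
            refined.append(vertices[-1])
            vertices = refined
        # 2. Displace: push each new vertex along the normal by the sampled height.
        return [(x, y + heightmap(x)) for x, y in vertices]

    bump = lambda x: max(0.0, 0.2 - abs(x - 0.5))      # stand-in "displacement map"
    coarse = [(0.0, 0.0), (1.0, 0.0)]                  # two vertices stored on disk...
    print(len(tessellate_and_displace(coarse, bump)))  # ...nine on screen: prints 9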

  17. Bob's Lawn Service says:

    I am somewhat underwhelmed.

  18. considerations says:

    @lack_26
    I’ve been thinking about this for a while, because I am considering working on my own game at the moment, and I’m not sure this is entirely a reasonable option. To a certain point, while designing an art style you decide you want to make your game look and feel a certain way, and as a result that is how you end up constructing the game… which should be obvious, and I’m not trying to say you don’t realize this. But to create the same “feel”, it would probably be extremely difficult without just overhauling the whole art style. A better option is something like TF2, which pretty much looks good no matter what (at least to me…).

    I am at a point in my design process where I’m stuck between a lower-end art style that would be both more playable (on more computers, I mean) and easier to make, and a higher-end art style that would require significantly more work but would allow for more detail and a more expressive world…
    I could technically do both, but because it is supposed to be a single-player game I don’t believe it is a good idea, as it would result in alternate feels.

    I could have been misreading what you wrote, but if this is what you meant that is why I (personally) would not want to do that.
    I would be interested in doing that in a multi-player game, though it would be difficult to “balance” the different art styles…

    @DrGonzo
    How large of an improvement that is will also be relevant to a particular person’s opinion and experience. For most people there will be no difference between the higher level frame rates. I used to only notice differences in frame rate up to about 30 fps, then it all looked the same. Now I see the difference between 30 and 60…so before it would have made no difference to me, but now it is a bigger deal…

    • considerations says:

      I predicted that reply fail, wooooo!

    • Lack_26 says:

      Nope, that’s pretty much what I meant, and you give some perfectly good reasons why not to do it. I’d imagine having to take the game’s art direction in two directions would be a big problem.

      But what about simply having more block colours, as opposed to just having a blurry version of the existing textures (which usually looks absolutely horrible)? For example, take a game like Fallout 3: on low settings it looks horrible and murky (still playable, but blurry and not aesthetically pleasing). If you re-did those textures to look more akin to cel-shading (or even something similar to how TF2 does it), it might look better than a blurry mess on a low-spec computer.

      (I just used Fallout 3 as an example; I’m not sure its art direction would suit it.) Although re-doing textures would take a lot of time and probably massively increase the actual amount of DVD/HDD space the game takes up.

    • considerations says:

      It would be better if they did something along those lines, truthfully.

      One thing that has bothered me about more recent high-end games is that when I turn their settings down they look worse than old games AND don’t run as well.

      I remember that turning down the settings on Unreal Tournament 3 resulted in a murky mess of a game. Playable, but uglier than it should have been.

      I think it should be possible to do something along the lines of what you are suggesting… perhaps they just need someone to think of the proper algorithm to simplify the textures, rather than using the textures they generated but at lower settings? Hmmm… I will think about this more, and perhaps post again… (have to run for the moment, just about to leave… um… work…)

  19. PHeMoX says:

    This looks quite incredible. :)

  20. Sagan says:

    Looks like my comment didn’t post.

    Well, I’ll just write the gist of it again.

    Basically, if you read current scientific papers, you will find that we can pretty much render arbitrarily complex geometry in real time now. If everyone had a high-end graphics card, games would already have geometry as complex as you want. Like John Carmack’s MegaTexture, except for geometry.

    I fully believe that a big selling point for the next console generation is going to be that they allow effectively infinite geometry. Or at least geometry so detailed that the constraint is disk space, not the computing power of the graphics card. That is to say, much more detailed than can be displayed on monitors.

  21. Superbest says:

    I wish that when people talked about a game having “good” or “bad” graphics, they considered it at settings other than maximum. Truth is, I doubt most consumers play games at the highest settings, and it’s ridiculous that a current game at medium or low settings runs at 20 fps and still looks nauseatingly horrible compared to a game from 2003 that runs at 160 fps.

  22. Peter Radiator Full Pig says:

    How long does it take to make a scene like that?
    At the moment I play more 2D games than not. And the graphics can be gorgeous, when done right.

    Look at that Super Meat Boy video.

  23. Wulf says:

    Did anyone ever feel that graphics were a crutch? I did.

    Developers invariably opt for one of the following three scenarios:

    - We have no art director, we’re going to pile on shaders and polygons for our wow factor.
    - We have an art director, we don’t need to pile on shaders/polygons to show you something truly beautiful.
    - We have no art director, and we’re just not going to give a crap about that stuff anyway.

    Examples (respectively):

    - Oblivion
    - Guild Wars
    - Diablo III

    Now, of the three, the ArenaNet philosophy is the one I subscribe to: I believe that any game that wants to use graphics to any aesthetic effect should have at least one art director, or it shouldn’t bother at all. Look at Aquaria, for example, or Machinarium: they look better than a good lot of triple-A 3D games. What I hope is that we’re now entering the age where, perhaps, it’s not that no one gives a crap about graphics, but rather that people actually do give a crap about art, and don’t just cheese their way through with high polygon counts.

    I hope we’re getting to the point where it’s important for a development house to have an art director, or even a few. Guild Wars puts a lot of modern games to shame, because they went for real art, using proper artists, rather than just cheesing through with graphical effects, shaders and whatnot to give people their visual bling-bling, which doesn’t actually look that pretty at all. And did you see the trailer for Guild Wars 2? A lot of that looks like it’s straight out of a painting.

    Speaking of paintings, I’m going to cite one of my few console loves here, one that I sat down and played from start to end, one that was so beautiful I could have bloody wept over it: Okami. If you’re unfamiliar with it, look it up. The art of Okami was breathtaking – it emulated a Japanese watercolour style – and it was one of my favourite games for not only the gameplay, but the aesthetics, and for actually looking like something special. What if we had more games that looked like this? (Skip in a little way for footage from the game’s engine, too.)

    That was on the PlayStation 2. So anything, anything can do art, and anything can look amazing. What I find is that games that went for polygon-pushing and shader-stuffing age a hell of a lot quicker visually than their more artistic counterparts. So yes, I’d vote for leaving graphics where they are, because better graphics cards are just a crutch for developers who want easier work. I want to see more art; I want to see games that look genuinely beautiful just because they were put together by artists.

    Aquaria can run on really old machines, and it’s still bloody beautiful, amazingly so. Guild Wars is old. Uru Live has awe-inspiringly alien locales of the like we’ve rarely seen. Ecco: Defender of the Future on the Dreamcast is responsible for one of the most memorable undersea environments, and very little else has come close to it. Even Valve get it, concentrating on characterisation and making their characters less plastic instead of just piling on the latest graphical features. Vampire: The Masquerade – Bloodlines, years later (using a tweaked Source engine), has some of the most emotive characters we’ve ever seen in gaming.

    For me, the latest graphics card has always been an e-peen thing, and I’m glad we’re moving away from that, because I really couldn’t care less about next-gen graphical hardware. I want developers to start making games that actually look good, rather than using every possible feature of the latest graphics cards to fake some kind of visual excellence.

    More art directors, more aesthetic attention, more art in games (if not necessarily art games) then, that’s what I say. I want something that looks as beautiful as Guild Wars, as alien as Uru Live, and with characters as emotive as Bloodlines’. THAT would be a triumph.

    • Vandelay says:

      Making a dig at a game that hasn’t even been released yet? We haven’t even seen that much of Diablo 3.

      Agree with pretty much everything else you have said though. Art style is much more important than the pixel pushing.

    • Mr Labbes says:

      I absolutely agree. Currently I’m playing Okami on the PS2, and it’s absolutely gorgeous. A lot of older games, like Baldur’s Gate 2 or even UFO: Enemy Unknown (or whatever you may want to call it), still uphold their art style, which is rather impressive.
      Also, I still think that Guild Wars and the Source engine are the prettiest graphics I need. I mean, it’s nice that ME2 looks as awesome as it does (or Mirror’s Edge, which is absolutely gorgeous), but if games stayed at the graphical fidelity of HL2, I could live with that.
      Then again, I have never been a graphics whore.

    • jaheira says:

      More art is fine, but why not have a higher technical standard at the same time? Okami is gorgeous, yes, but surely it would have been somewhat more gorgeous at, say, a higher resolution? It’s possible that the devs of that game were not constrained by tech, but I doubt it.

      Contrary to most here, it would seem, I AM a graphics whore. The fact that PC games no longer push the tech envelope makes me a sad puppy.

    • Wulf says:

      @Vandelay

      As I point out below (to Al3xand3r, who apparently likes l33tspeak), screenshots can be very telling. The problem is that it looks like loads of things are just mashed up, with no sense of what should be there and what shouldn’t; it has no visual cohesion, and the screenshots just look like a real mess to me. The lighting looks horrible, the ambience is off, and overall even Diablo II had a better sense of conveying that sort of environment and atmosphere.

      Compare screenshots of Diablo III, Diablo II and Torchlight, and you’ll see what I’m getting at. There will always be fanboys, I get that, but I can’t make excuses for the lack of visual cohesion. It’s just… chaotic. Really chaotic. And not in a good way.

      @Mr Labbes

      Precisely.

      I’m not a graphics whore either, but I am an art whore. I love seeing games that are pretty not because of the fidelity, as said, but simply because of the attention to detail, striking a balance between visual variety and cohesion (which Guild Wars did so, so well), and seeing things that are as picturesque as a painting. I mean, for all my love of Mass Effect 2, it has dead-faced characters and somewhat boring environments, but that’s largely because of the engine they use, I think. Mass Effect 2 has great graphics, but outside the cutscenes there’s very little that could be considered picturesque.

      @Jaheira

      Because I’ve never actually seen a higher technical standard that has better art. Can you name an example? I can’t. And I’ve been with gaming a long time. As I pointed out above, there tend to be three approaches: get an art director and make breathtakingly picturesque things which don’t need to push the limits of hardware to be beautiful, cheese through it with polygon-pushing and shader-shoving, or simply not bother at all.

      The problem? If you go for a very high-graphics approach, everything needs very high-res textures, lots of bump-mapping, this and that. And there’s the problem: they don’t have the resources to make all this stuff look beautiful, so it just looks passable. Mass Effect 2 (again, outside of the cutscenes) was pretty good to look at, but the environments were passable rather than beautiful. I think the only couple of missions in Mass Effect 2 that made me raise an eyebrow at the art were Jacob’s and the snow planet of the Firewalker missions. Those were a bit special. But the thing is that they were exceptions rather than the norm.

      A studio, even a triple-A studio, doesn’t have the resources to make something both picturesque and high-fidelity, and given the choice between something with bucketloads of polygons and graphical effects, or something that looks beautiful just because it is, I’d opt for the latter. I find it more immersive, and I find bland environments much more noticeable in a very negative way. It might be because I stop to look at everything – and I do, I take in all the details – and with a lot of triple-A games I feel really disappointed. Only a very few manage something so pretty-as-a-picture that it makes me smile. And even technology can be pretty; other games have done it.

      And to be honest, if someone were to say that the future was filled more with the likes of Guild Wars 2 than Mass Effect 2, I would smile. Just compare screenshots of the two to see what I mean. I want Mass Effect 2 to look as good as Guild Wars 2; I want it to have some really picturesque scenes. I want to go to a planet in ME and stand on a cliff-face overlooking the most stunning vista, with a distant cityport with domes, and spires, and ships flying to and from it. But if they go for the high-res stuff, it limits what they can do on screen. Does that make sense at all?

      I mean, in Omega and the Citadel they tried to do this, but the ships were shrouded in shadow and low on detail, and that stood out like a sore thumb to me – it was as if they had to hide that they couldn’t do it better, because the environment the player was in was high-fidelity, so they couldn’t do much with the details. It’s like… do you want a field with a few incredibly detailed trees, or a beautiful, hand-crafted forest?

    • Starky says:

      Eh? Diablo 3, are you kidding me? Blizzard has ALWAYS been about art direction over everything except gameplay in their games.
      All of their games so far have had second-to-none art direction – which is why they age fairly gracefully (Diablo 2 still looks good today; compare that to 3D games of the time).

      Now I know a lot of people are having a go at D3’s choices, but that seems to be more personal preference than real argument.

      Hell, look at a video tour of the Blizzard HQ some time, it’s more like an art gallery than a Dev studio.
      You may not like the art direction Blizzard choose for some of their games, but to claim that Blizzard have no art direction is simply absurd.
      Everything they do in their games is meticulous, and deliberate.

      For the record I’m not sold on D3’s art style either – though I’ve not seen enough to condemn it – but, like or dislike, I doubt it will matter if the gameplay is good.

    • Thants says:

      I think Mirror’s Edge qualifies as technically and artistically impressive.

    • Wulf says:

      @Starky

      “Eh? Diablo 3, are you kidding me? Blizzard has ALWAYS been about art direction over everything except gameplay in their games.”

      No, I’m not. I really don’t like what they consider art.

      “All of their games so far have had second-to-none art direction [...]”

      In your opinion, anyway, but I just don’t see it. I’m sorry. It looks ugly to me.

      “(Diablo 2 still looks good today; compare that to 3D games of the time).”

      The people who did Diablo II are the people who went on to become ArenaNet and Runic. The people who made Guild Wars and Torchlight. If you ask me, that’s where all the artistic talent went.

      “Now I know a lot of people are having a go at D3’s choices, but that seems to be more personal preference than real argument.”

      There can be no real argument where critiquing is concerned though, can there? We all have different standards.

      “Hell, look at a video tour of the Blizzard HQ some time, it’s more like an art gallery than a Dev studio.”

      So they like a little ego boost from putting their art up on their walls; they hardly compare to the concept artists of Guild Wars (see the link in my reply to you below), in my opinion. Frankly, I don’t think they even come close. I think it’s mildly insulting for someone to suggest that they do, but whatever.

      “You may not like the art direction Blizzard choose for some of their games, but to claim that Blizzard have no art direction is simply absurd.”

      To claim that the amateurish efforts of Blizzard qualify as art is equally absurd.

      “Everything they do in their games is meticulous, and deliberate.”

      I won’t disagree with anything but the art, which I think looks rushed and messy.

      “For the record I’m not sold on D3’s art style either – though I’ve not seen enough to condemn it – but, like or dislike, I doubt it will matter if the gameplay is good.”

      And like many, you seem to think that gameplay somehow boosts art. If you can tell from the screenshots that the art style is lacking, there was no need to make this post. I’m not talking about gameplay – I was never talking about gameplay – I’m just talking about how the game looks, aesthetically, which to me is horrible.

      @Thants

      Huh, you may have me there. Are there really that many examples of it, though?

    • Wulf says:

      Like I said below though, at the end of the day, I don’t know if we’ll ever convince each other… and since this is all opinions, all subjective, an area where there can be no facts, I really don’t see any reason to continue hashing it out.

      I think Blizzard are amateurs, and I think their concept art is generic pap.

      You think Blizzard are an example of seminal art direction, and you think their art belongs in a gallery.

      I recognise how you feel, but it’s not how I feel, and that’s all there is to it.

    • Lilliput King says:

      @Wulf

      You suggested Diablo 3 was using graphics as a crutch, and that the studio made a conscious decision based on that:

      “We have no art director, and we’re just not going to give a crap about that stuff anyway.”

      Regardless of how you feel about Blizzard’s art, you were wrong there, weren’t you?

      Anyway, Crysis counts as a technical and artistic achievement. The appearance of the aliens was notable for the gameplay descending into sludge but also for the transformation of the island, which was brilliant to behold.

    • Bremze says:

      @Wulf

      I’ll just address the Diablo 3 screenshots part. Is there something in particular that you dislike? The only complaint I can think of is that the watercolour-like environments clash with the monsters, making them look out of place, but hopefully that’ll be fixed.

    • Starky says:

      @Wulf

      No I do not think gameplay boosts art, I never said that.

      I think that if the gameplay is good enough, art doesn’t matter – if the game is fun, the game is fun, visuals be damned. Still, good visuals are a good thing to have.

      I’m not trying to change your mind: you hate the D3 art style, as do many. Personally I think it looks great in parts but a bit bland in others, and, as said above, the watercolour backgrounds clash with the characters.
      Still, we’re seeing alpha footage; D3 is at least a year from release, and I’m sure Blizzard will sort it all out by then – they never release a game with anything short of a mirror shine of polish.

    • skalpadda says:

      Wulf:
      Out of curiosity, what is it about the Guild Wars art direction that makes you hold it up as a good example? I’ve only played the game very briefly, but from that, and from screenshots, I can’t say anything stood out as particularly imaginative or unique about it – though I’m open to the idea that I just haven’t seen enough, or the right things.

      I don’t get your claim that Blizzard don’t give a crap about art direction at all, though. I can’t think of many developers who can match the sheer number of diverse influences and ideas they draw on while still maintaining a consistent art style in each game. If their aesthetics don’t tickle your fancy, that’s fair enough, but calling them incompetent at art is ludicrous.

    • skalpadda says:

      Sure that’s neat, but in the game?

  24. Carter says:

    But can it run Crysis?

    I’ll get my coat

  25. Legionary says:

    This is all very well, but it lacks any kind of meaning. We still haven’t reached far into DX10’s capabilities in terms of visual improvements; the bottleneck is the disk space and memory budget, and DX11 isn’t going to improve either of those things.

    The ability to render a very pretty sequence in the murk doesn’t say a single thing about the future of gaming.

  26. Ricc says:

    Not a single game out there has ever managed to consistently engross me with graphics alone. Be it Crysis or Metro 2033, you can’t play for longer than a few minutes without noticing that this round shape is a polygon, or that these textures are muddy, or that some object in the background just changed its level of detail as you came closer.

    We are not even *close* to photo realism, and everybody who says that games look like Pixar movies these days is a god-damned liar. ^^

    Call me a graphics whore, but not having to constantly notice the things above is what makes me look forward to the games of the future.
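
    For what it’s worth, that last pop is usually plain discrete level-of-detail selection – a minimal sketch in C, with entirely made-up cutoff distances, of the integer jump you’re seeing:

        #include <stdio.h>

        /* Pick a discrete level of detail from camera distance. The visible
           "pop" is exactly this integer jump between models; games hide it
           with more levels or cross-fades. */
        int pick_lod(float distance, const float *cutoffs, int levels)
        {
            for (int lod = 0; lod < levels - 1; lod++)
                if (distance < cutoffs[lod])
                    return lod;
            return levels - 1;   /* beyond the last cutoff: cheapest model */
        }

        int main(void)
        {
            const float cutoffs[] = { 10.0f, 30.0f, 80.0f };  /* metres, invented */
            for (float d = 5.0f; d <= 105.0f; d += 25.0f)
                printf("distance %6.1fm -> LOD %d\n", d, pick_lod(d, cutoffs, 4));
            return 0;
        }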

  27. Al3xand3r says:

    The more detail, the more development cost in general. But the fact that we still get advances in hardware does help lower-end studios make graphics that stand up better. Not to the same effect as a company that spends millions, but still: their code can be a bit sloppier, the excess power still lets them have vast open spaces, push tons of polygons, run physics, AI, actors, etc, and their models can be that much less optimised without an obvious performance hit. Tools also advance, automating certain processes or making them easier. So, whether any game surpasses Crysis or not, the advance in hardware is still very welcome if you ask me. Just not as urgent for the end user, which is also a good thing in my opinion.

  28. James G says:

    Incidentally, on the subject of framerates, new nVidia drivers are out, and most of the early responses I can see suggest quite favourable performance increases. (I was more interested in double-checking that they were stable.)

  29. Al3xand3r says:

    Also, lol @ the dig @ Blizzard’s work for lacking an art director when that is their primary quality. Funny troll :)

    • Wulf says:

      I totally disagree. I think Diablo III looks like an ugly mess. I’ve stressed before that I don’t care about cartoony; it just looks like a general mish-mash and hodgepodge, all slammed together, with things in the D3 screenshots that stick out like they don’t belong there, poor lighting work, horrible ambience, and so on. The Torchlight screenshots look much, much better by comparison.

    • Skurmedel says:

      But that is your opinion, Wulf – totally subjective. How is it a sign that they have no art director, or don’t care about it? Take a look at what their concept artists release on DeviantArt and you’ll probably think otherwise. You might not like their style, but one can hardly say they don’t know their craft.

      And yes, personally I think Diablo III looks good in the limited material that has been released.

    • Starky says:

      Indeed – as I said above, more than any other company I can think of, Blizzard take great care over their artistic direction. Not everyone may like it (some people hated the “WoW look” – I personally wasn’t sold on it until I saw it in game and saw how it fit the world – and some of the architecture in that game is mindbogglingly good).

      You may not like the direction they choose, but there is nothing chaotic about it – everything in their games is precisely designed to be how they want it to be, and they have the best artists in the business working for them.

    • Wulf says:

      @Skurmedel

      “But that is your opinion, Wulf – totally subjective.”

      Precisely. It is just my opinion. So why do you and Al3xand3r feel so invested in the opinion of a random person on the Internet? One can never speak factually on these things; I’m just going by how I think they are. So no, I’m not trying to present my opinion as fact – it’s always something I avoid.

      “How is it a sign that they have no art director, or don’t care about it?”

      I already covered this, but if I have to repeat myself…

      I said: I think Diablo III looks like an ugly mess. I’ve stressed before that I don’t care about cartoony; it just looks like a general mish-mash and hodgepodge, all slammed together, with things in the D3 screenshots that stick out like they don’t belong there, poor lighting work, horrible ambience, and so on. The Torchlight screenshots look much, much better by comparison.

      That’s my opinion, yes. But it’s also my opinion that they don’t have a great art director – and if they have any art directors at all, then they’re no-talent hacks. Again, this is my opinion. People critique art with their opinions, reviews exist to critique games, and I’m pretty good at spotting bad art. Art means a lot to me, on various levels, and I just don’t think that Diablo III is very picturesque or pretty, going by the screenshots.

      “Take a look at what their concept artists release on DeviantArt and you’ll probably think otherwise.”

      I’ve seen their work. And frankly? Compared to the likes of Dociu (Guild Wars’ concept artist) it’s utter crap. It’s generic fantasy pap that just about any student out of art college could manage; it isn’t particularly striking or amazing in any way that I can see.

      But again, this is my opinion.

      “You might not like their style, but one can hardly say they don’t know their craft.”

      Now who’s trying to push their opinion as fact? I can say what I like, and if I believe it to be the truth then that matters – and I do believe what I say is the truth, at least from my personal perspective. It won’t be your truth, but I’m never going to say that Diablo III represents a group of people who know good art. I don’t believe that.

      “And yes, personally I think Diablo III looks good in the limited material that has been released.”

      That’s your opinion and you’re welcome to it!

      @Starky

      “some people hated the “WoW look””

      I will point out that I absolutely love Torchlight’s aesthetic, and that’s all bright and colourful too. Sometimes there’s more to an opinion about Blizzard’s artistic talent (or lack thereof) than “WoW gayness”, and I really hate how that’s cited every time someone says they’re just not impressed with it. I mean, if it were as great as everyone’s making out, then I should be as sold on it as I was on Torchlight, and yet… I’m not.

      “You may not like the direction they choose, but there is nothing chaotic about it [...]”

      That’s your opinion; to me it all looks a bit slapdash. There are colourful powers that don’t affect nearby architecture with their light, there was that screenshot with the radioactive bright orange boars in a dark dungeon – it just doesn’t look visually cohesive at all. I’ll often look at a Diablo III screenshot and see something that looks out of place, something that doesn’t look right, and it comes over as messy and amateurish to me.

      “[...] and they have the best artists in the business working for them.”

      This I couldn’t agree more with, really. Dociu is one of the best artists in the industry, and Blizzard does not compare. Look up Blizzard concept art and compare it with some of his stuff. In my opinion, they don’t even come close.

      But again, I’m surprised at the investment in my opinion if you’re all so sure about what you say. At the end of the day, though, we’ll have to agree to disagree. You see something in their art that I don’t; I’ll never convince you, and you’ll never convince me, and that’s that. I’m quite confused as to how Blizzard’s work could ever be mistaken for artistic panache, really. Shrug.

    • Starky says:

      @ wulf
      I’d hardly call it “investment” – that sounds like a roundabout way of trying to dismiss us as fanboys, a phrase I loathe almost as much as the actual fanboys who caused it. It’s almost like you have to pre-defend yourself against the accusation in any discussion to prevent it rearing its ugly head.

      This is just a random internet discussion to pass the time – I doubt any of us really care what you think, like or dislike – the discussion is simply a way to waste time (like almost all activities involving the internet).

      After all, like it or dislike it, D3 will sell millions of copies – 4-5 right off the bat, and maybe 5 more over the long tail, plus expansions and DLC. Blizzard will make a vast amount of money, as they always do. People will love it and hate it, but the vast majority of us will probably like it well enough.

      Still, what surprises me is how vehemently you hate it. If anyone seems to be overly invested, it’s you, Wulf – disliking something is one thing, but the insults and aggression towards Blizzard are quite another.
      You must be aware that you’re borderline flame-baiting, right? It’s probably thanks to the quality of RPS, and the standard these comment sections usually enjoy, that no one has really bitten.

  30. Hmm says:

    Dear companies creating PC hardware, dear PCGA, whatever…
    If you funded a PC-exclusive game that looks like this (SubCulture 2?) every once in a while, I would run to the store to buy a new PC immediately.
    There are people who buy consoles for ONE or two games. If you want anyone to get a DX11 card, you NEED to give them reasons. Maybe such games wouldn’t sell millions, but they would push people towards PC gaming = hardware would sell better.

    • jaheira says:

      Correctamundo.

    • Tom OBedlam says:

      I can guarantee you it wouldn’t work on me. I have to live to a budget; you can make the swankiest-looking game you like and, no matter how much I want to run it, I can’t/won’t upgrade just to play it. Before I updated my graphics card from 128MB to 256MB, I exclusively caught up on games like KOTOR that I hadn’t got round to playing.

    • Eamo says:

      Except that this demo revolves very much around depth of field effects which, while very pretty indeed, are annoying as hell in a game. Look at how much of this demo revolves around what are basically camera-focusing tricks. It looks great, but the reality is that in a PC game it is impossible to determine which part of the screen the user is currently looking at, so you can’t do any kind of depth-of-field tricks; you have to use the totally unrealistic but much more user-friendly approach of drawing the entire screen as if everything were in focus.

      Put the entire scene into focus and it will look very much like a lot of games out there as regards graphics quality.
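
      To make the focusing point concrete, here is a rough sketch of the thin-lens maths a depth-of-field effect approximates. The numbers are purely illustrative, and the catch is exactly as said: the focus distance s has to be guessed, because the game cannot know which part of the screen you are looking at.

          #include <math.h>
          #include <stdio.h>

          /* Thin-lens circle of confusion (blur-spot diameter, in metres)
             for an object at distance d, with a lens of focal length f and
             aperture diameter A focused at distance s. */
          double circle_of_confusion(double A, double f, double s, double d)
          {
              return A * f * fabs(d - s) / (d * (s - f));
          }

          int main(void)
          {
              /* Illustrative only: a 50mm f/2 lens (25mm aperture), focused at 2m. */
              for (double d = 0.5; d <= 8.0; d *= 2.0)
                  printf("object at %4.1fm -> blur spot %.3fmm\n",
                         d, 1000.0 * circle_of_confusion(0.025, 0.05, 2.0, d));
              return 0;
          }

      An object at exactly 2m comes out perfectly sharp; everything nearer or farther gets a progressively larger blur spot, which is the murky look the demo is trading on.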

  31. whaleloever says:

    I was hoping that the camera would pan out at the end and it would be a giant underwater MSI logo. And then a big 3D MSI logo would spin across the screen, as a bunch of puffer fish formed the shape of an MSI logo and then the camera zoomed out some more and it turned out that the sea spelled out the MSI logo and then God came down and he was an MSI logo. That’s what I was hoping for.

  32. jarvoll says:

    I am firmly of the opinion that, these days, animation blending and collision have much more to offer game visuals than incremental shader improvements. A hyper-realistic model loses all sense of realism the minute it turns on the spot without its legs moving, clips through the environment, or refuses to react fluidly to having an intended animation interrupted.
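
    At heart that blending is just a weighted interpolation run per joint, every frame – a minimal sketch, positions only (rotations want quaternion slerp, and real engines layer many such blends):

        #include <stdio.h>

        typedef struct { float x, y, z; } Vec3;

        /* Blend one joint position between two poses: w = 0 gives pose A,
           w = 1 gives pose B. Done per joint, per frame, this is what lets
           a run animation smoothly interrupt a walk. */
        Vec3 blend_joint(Vec3 a, Vec3 b, float w)
        {
            Vec3 r = { a.x + (b.x - a.x) * w,
                       a.y + (b.y - a.y) * w,
                       a.z + (b.z - a.z) * w };
            return r;
        }

        int main(void)
        {
            Vec3 walk = { 0.0f, 0.9f, 0.1f }, run = { 0.0f, 1.0f, 0.4f };
            Vec3 mid  = blend_joint(walk, run, 0.5f);   /* halfway between */
            printf("blended joint: %.2f %.2f %.2f\n", mid.x, mid.y, mid.z);
            return 0;
        }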

  33. Bassism says:

    I’m going to go ahead and agree with a lot of what has been said so far.

    I think Wulf has the right idea in suggesting that games be made with a focus on art direction as opposed to fancy graphics. There are countless old games that still look rather beautiful, despite their total lack of graphical prowess. And, again, as pointed out, better graphics make it increasingly harder to make things look ‘nice’.

    The biggest reason I look forward to stagnant video hardware is that whole constant need to upgrade. Now, I realise you don’t strictly need to upgrade, and for as long as I’ve been a gamer I’ve waited until I pretty much can’t run anything before upgrading again (and in desktop form I go for middle-of-the-road hardware). Playing the “I’m a student” card, I really can’t afford computer upgrades with any sort of regularity. Even my plans to buy a new computer this summer have fallen through, leaving me with the old MBP for yet another year.

    But there’s really no reason why my computer, which could pump out some jaw-droppingly beautiful games three years ago without breaking a sweat, and can even run Crysis and look pretty good, should only be able to run GTA 4 at 800×600 and 15fps, while looking worse than something that came out 10 years ago.

    This whole issue was a big part of the exodus away from PC gaming in the first place.

    It may very well be the ‘fault’ of console ports, but I feel like the big developers and the mass market might have reached a point where they realise that you don’t need a trillion pixels onscreen with flash-bang shaders to be pretty.

    Developers can pump out nice-looking games on consoles years after their release. The only reason we can’t play nice-looking games on a three-year-old PC is that hardware vendors keep pushing out new hardware, which makes devs want to push out more shaders.

    Anyway, rant over.

    Most people in this thread: I agree.

    • blargh says:

      My PC is 3 years old, and was a midrange PC at the time I bought it (C2D 3GHz, 2GB RAM, 8800GT). I can still play games with fancy graphics on it just fine, and in fact have not had to reduce visual settings except in a couple of games.

      The reason GTA4 runs like crap, though, is that the port was a piece of crap. You need only look at Digital Foundry’s comparison of the PC version (running on an i7 and a GTX 295) with the PS3 and 360 versions to see how a PC with hardware miles ahead of, and a whole lot more powerful than, a current-generation console couldn’t provide a decent playing experience. The framerate was all over the place, jumping around between 20 and 60 whether there was something on screen or not. Absolutely horrendous. It made me realise how little effort some companies put into porting console games to PC, and how pathetic some people’s attempts were to defend Rockstar by claiming that the game was just “CPU hungry”, when even a top-of-the-line CPU gave you the same framerate you’d get on an entry-level quad.

      The point is, a lot of companies don’t really seem to make an effort these days. Don’t reward them for laziness by buying their games. Vote with your wallet, and buy from developers that actually give a damn.

  34. DerangedStoat says:

    For a display of graphical prowess, they completely failed to explain how the MSI logos were being magically spotlighted throughout the demo (ignoring, for the time being, why the logos on a submersible would need to be lit up at all, considering no one is going to be able to see them at that depth).

  35. Thiefsie says:

    Nothing new there.

    Bokeh was in Just Cause 2, lens flare has been everywhere since the dark ages, and frankly… why? We don’t see lens flare with our eyes; it is an ugly artifact of using a lens and should not be held in the high regard it supposedly is.

    It is ‘less real’ having lens flare than not…

    Colour me unimpressed…

    Sadly (actually I’m more pleased about this than anything else), the need to upgrade video cards has been lacking for the past couple of years.

  36. Thiefsie says:

    Also…

    Sub Culture and Archimedean Dynasty remakes

  37. Magic H8 Ball says:

    Anonymous Coward said:
    Except that this demo revolves very much around depth of field effects which, while very pretty indeed, are annoying as hell in a game.

    It worked well in Crysis and in Clear Sky. The latter even used it when reloading the weapon, which didn’t make much sense (I can’t imagine a hardcore merc being unable to change a mag without looking at the gun) but was an interesting game mechanic.

  38. Scilantius says:

    ^this

    Definitely more Sub Culture please!

    • Scilantius says:

      *sighs*

      Reply fail – I was trying to reply to Thiefsie. I really ought to figure out how this works one day.

    • blargh says:

      I’ve figured it out, actually.

      The first post always fails to reply properly, and anything after that works. If you clear your browser cache regularly, then this will happen more often than not.

    • Berzee says:

      Make sure that you see “click here to cancel reply” above the Name textbox. If you don’t see it, you aren’t replying and you have to fiddle with things (clearing cache as was mentioned, prolly)

  39. Jacques says:

    DirectX 11? I’m still more than happy playing Total Annihilation on my laptop.

    I’ve never understood the graphics thing; I’d rather play a well-optimised game with good art direction.

  40. Ben McMahon says:

    Actually, 120Hz has a number of benefits.

    Firstly, just because it feels smooth at 60Hz doesn’t mean you get no benefit at higher frequencies. The brain is good at compensating, but that doesn’t mean image quality can’t be improved during fast motion.

    Higher frequencies mean the image is sharper when there is a great deal of movement. For example, if I move the mouse fast enough in FPS games on my CRT at 85Hz I can see a trail of images; this effect is reduced at 100Hz.

    In the case of TFTs, the processing artifacts are reduced and the image is sharper during movement. Check out the review of a 120Hz panel on TFT Central: you can clearly see the reduction in trails/artifacts, and the reviewer said motion was perceptibly more responsive at 120Hz.

    Finally, you can also benefit from higher frame rates with vsync, and from lower input lag.
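
    To put rough numbers on that – it’s only trivial arithmetic – halving the refresh interval halves both how long a ghost image lingers and the worst-case wait when a frame just misses vsync:

        #include <stdio.h>

        int main(void)
        {
            /* Time between refreshes, which is also the worst-case extra
               latency added when a frame narrowly misses a vsync deadline. */
            const double hz[] = { 60.0, 85.0, 100.0, 120.0 };
            for (int i = 0; i < 4; i++)
                printf("%6.0f Hz -> %5.2f ms between refreshes\n",
                       hz[i], 1000.0 / hz[i]);
            return 0;
        }

    That works out at roughly 16.7ms at 60Hz against 8.3ms at 120Hz.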

  41. Sudogamer says:

    @Starky:

    Final thing.
    Hardware acceleration:
    This was the big deal and the last hurrah of the SPU – EAX and other sound processing became redundant from Vista onwards, really, and frankly it always sounded a bit poor: way too much overused reverb.
    Even so, OpenAL and Alchemy still allow some use of hardware acceleration (thus taking load off the CPU) – but frankly the load is so small, so minute, that you’re maybe saving 1-2 frames per second, as the limiting factor in 90% of gamers’ systems is the GPU anyway.

    @Starky, great post. Probably better than my article “What happened to EAX?”, and I agree that the EAX ‘effect’ (and other hardware-accelerated ‘features’) was nothing more than a bit of selective compression, EQ and reverb.

    However, my biggest gripe with modern games is that there’s hardly any attention to the environment – i.e. you could be standing in a deep, vast chasm, surrounded by water, and your NPCs will still sound like they’re in rich-sounding studio vocal booths. Ruins the immersion somewhat :)
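
    For anyone curious what that all-software path looks like in practice, here’s a minimal OpenAL sketch – everything mixed on the CPU, no sound card doing anything clever. It’s only an illustration: error handling and the wait for playback to finish are omitted, and the samples are assumed to be raw 16-bit mono PCM at 44.1kHz.

        #include <AL/al.h>
        #include <AL/alc.h>

        /* Open the default output, upload one PCM buffer and start it
           playing. All the mixing happens in software on the CPU. */
        void play_pcm(const short *samples, int count)
        {
            ALCdevice  *dev = alcOpenDevice(NULL);         /* default device */
            ALCcontext *ctx = alcCreateContext(dev, NULL);
            alcMakeContextCurrent(ctx);

            ALuint buf, src;
            alGenBuffers(1, &buf);
            alBufferData(buf, AL_FORMAT_MONO16, samples,
                         count * (int)sizeof(short), 44100);
            alGenSources(1, &src);
            alSourcei(src, AL_BUFFER, (ALint)buf);
            alSourcePlay(src);                             /* non-blocking */
        }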

    • Starky says:

      @Sudo good article man

      It’s a message I’ve been telling people for the past two years, but people are only just starting to believe it – I just wish Realtek would hurry up and provide 5.1 for games via optical, as you mentioned in your article, and then there will be literally no need for a sound card ever again.

      I liked EAX when it was used correctly; sadly that was very rare (AvP springs to mind as a game that used it right) – and I think Creative shot themselves in the foot by keeping it closed in an industry that is more and more abandoning any kind of closed standard.
      It’s the same reason why PhysX will only ever work for as long as Nvidia keeps throwing money at devs.

  42. Urthman says:

    1. I largely agree with Wulf: I wish more developers would take the attitude that blowing gamers away with graphics means investing in brilliant artists and art direction rather than just investing in better graphics technology.

    2. On the other hand, better tech makes a big difference even to stylised games. Have you seen Okami or Zelda: Wind Waker running in an emulator at 1920×1080 with 8x AA? Utterly fantastic compared to what the original console can do on a television. Even something totally abstract and simple like Everyday Shooter looks fantastic when you can have diagonal lines as crisp as vertical ones.

    3. It’s only a matter of time before someone makes another Crysis for PC. Video cards will keep getting more powerful (or cheaper, depending on how you look at it), there will be a critical mass of DX11 cards out there, and then a game will come out that blows away anything you can possibly do on a PS3 – and suddenly the PC graphics arms race will be on again.

  43. Scundoo says:

    Water is too clear
    Too much lens-flare

    Not impressed
