  1. #41
    Network Hub
    Join Date
    Feb 2012
    Posts
    155
    Ignoring the whole Xbox/Windows 8 political thing, DirectX 11.2 actually sounds quite awesome.

    Essentially, for a minor bump in version, you get:


    • Dynamic resolution and hardware overlays, which should give designers another option for maintaining framerate by gradually dropping a little bit of resolution instead of removing details from the scene, all without any interface text becoming fuzzy. This and the reduction of frame latency they also announced could make games a lot smoother without noticeably changing image quality.
    • Hardware support for virtual textures, which can be used to create the huge, seamless (mega)textures John Carmack espoused in the game "Rage", but hopefully a lot more reliably than doing it in 'software'. It streams textures to the GPU, letting more memory be spent on whatever you're currently looking at close up rather than what is just out of the way, putting more pixels where it matters.
    • Mappable GPU buffers, which help reduce one of the major chokepoints for moving data between the CPU and GPU. That bottleneck is what currently makes GPU physics so slow and limited, for example.
    • Realtime shader linking, a surprisingly important feature depending on how it's used, can make GPUs a lot more flexible. Ever noticed how art applications don't use GPU acceleration for much beyond simple, fixed-function filters like Gaussian blur, even though the whole rest of the pixel-pushing experience needs more speed and less lag? The GPU "shaders" that produce graphical effects were just too inflexible: you couldn't make one comprehensive enough to cope with anything an artist might want to do, yet small and simple enough to run fast. Now custom shaders can be assembled rapidly out of pre-made routines even while the app is running, meaning you can create new effects on the fly and use them in far more situations than before.


    So yeah, those aren't minor features, and I'm a bit surprised they're not just bumping the version number to 12 instead of 11.2.
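
    To make the dynamic resolution idea concrete, here's a rough sketch of how a game might drive it (my own pseudologic and made-up numbers, not the actual D3D 11.2 API):

```python
# Sketch (assumptions, not real API): a controller that nudges the 3D
# render resolution down a little when frames run long, while the hardware
# overlay keeps the UI at native resolution so text never goes fuzzy.

NATIVE_W, NATIVE_H = 1920, 1080
TARGET_FRAME_MS = 16.7          # ~60 fps budget
MIN_SCALE, MAX_SCALE = 0.7, 1.0
STEP = 0.01                     # drop/raise resolution very gradually

def update_render_scale(scale, last_frame_ms):
    """Return the new render scale after one frame."""
    if last_frame_ms > TARGET_FRAME_MS and scale > MIN_SCALE:
        scale -= STEP           # shave a few pixels instead of cutting detail
    elif last_frame_ms < TARGET_FRAME_MS * 0.9 and scale < MAX_SCALE:
        scale += STEP           # headroom again: creep back toward native
    return round(min(MAX_SCALE, max(MIN_SCALE, scale)), 4)

def render_resolution(scale):
    # Only the 3D scene renders at this size; the UI stays at native.
    return int(NATIVE_W * scale), int(NATIVE_H * scale)
```

    The point being that each step changes the image so little that you'd struggle to see the transition.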

  2. #42
    Secondary Hivemind Nexus soldant's Avatar
    Join Date
    Jun 2011
    Location
    Terra Australis Incognita
    Posts
    4,494
    Quote Originally Posted by Eleven View Post
    Dynamic resolution and hardware overlays, which should give designers another option for maintaining framerate by gradually dropping a little bit of resolution instead of removing details from the scene, all without any interface text becoming fuzzy. This and the reduction of frame latency they also announced could make games a lot smoother without noticeably changing image quality.
    Question - how is this different to what BIS do with the ARMA games? They have an option for a 3D resolution which affects 3D rendering and an 'interface' resolution which affects UI rendering. The text stays sharp but the rest of the game looks like an absolute blurry mess at anything other than native resolution. If what you've listed here is a similar sort of setup, that's not going to be a particularly good solution. Is it the same thing, or is this fancy new tech indistinguishable from sorcery?
    Nalano's Law - As an online gaming discussion regarding restrictions grows longer, the probability of a post likening the topic to the Democratic People's Republic of Korea approaches one.
    Soldant's Law - A person will happily suspend their moral values if they can express moral outrage by doing so.

  3. #43
    Secondary Hivemind Nexus Hypernetic's Avatar
    Join Date
    Mar 2012
    Location
    Philadelphia, PA
    Posts
    2,154
    Quote Originally Posted by soldant View Post
    Question - how is this different to what BIS do with the ARMA games? They have an option for a 3D resolution which affects 3D rendering and an 'interface' resolution which affects UI rendering. The text stays sharp but the rest of the game looks like an absolute blurry mess at anything other than native resolution. If what you've listed here is a similar sort of setup, that's not going to be a particularly good solution. Is it the same thing, or is this fancy new tech indistinguishable from sorcery?
    It's an overlay, not all rendered together. Have you ever used Mumble or anything like that? I assume it's similar - the UI is processed separately and displayed as an overlay on top of the 3D image.

    Or is that what ARMA does?

  4. #44
    Secondary Hivemind Nexus soldant's Avatar
    Join Date
    Jun 2011
    Location
    Terra Australis Incognita
    Posts
    4,494
    Quote Originally Posted by Hypernetic View Post
    It's an overlay not all rendered together. Have you ever used mumble or anything? I assume it's like that. The UI is processed separately and displayed as an overlay on top of the 3D image.

    Or is that what ARMA does?
    I don't know exactly how it works, but the monitor remains at (for example) native resolution, which keeps the UI looking crisp, while the 3D viewport is rendered at a lower resolution and appears to be upscaled to native (causing blur). It might be similar to Mumble or the Steam overlay, except that the entire screen isn't rendered at a low res; it stays at native, and only the 3D viewport is rendered lower.

    EDIT: So to clarify, your monitor is set at 1920x1080 and the UI renders at that resolution, but the 3D view is rendered at 1280x720 and upscaled.
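
    Putting those example numbers in code (just my illustration of the arithmetic, nothing API-specific):

```python
# The ARMA-style setup described above: UI at native resolution, 3D
# viewport rendered smaller and stretched to fit. Each rendered 3D pixel
# ends up covering 1.5x1.5 screen pixels, which is where the blur comes from.

NATIVE = (1920, 1080)      # monitor and UI resolution
VIEWPORT_3D = (1280, 720)  # 3D scene render resolution

def upscale_factor(src, dst):
    """How much each axis gets stretched when the 3D view is upscaled."""
    return dst[0] / src[0], dst[1] / src[1]

def pixels_saved(src, dst):
    """Fraction of per-frame 3D pixels the GPU no longer has to shade."""
    return 1 - (src[0] * src[1]) / (dst[0] * dst[1])
```

    At 720p-in-1080p the GPU skips over half the 3D pixels, which is why the trade-off is tempting despite the blur.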
    Nalano's Law - As an online gaming discussion regarding restrictions grows longer, the probability of a post likening the topic to the Democratic People's Republic of Korea approaches one.
    Soldant's Law - A person will happily suspend their moral values if they can express moral outrage by doing so.

  5. #45
    Secondary Hivemind Nexus Hypernetic's Avatar
    Join Date
    Mar 2012
    Location
    Philadelphia, PA
    Posts
    2,154
    Quote Originally Posted by soldant View Post
    I don't know exactly how it works, but the monitor remains at (for example) native resolution which keeps the UI looking crisp, but the 3D viewport is rendered at a lower resolution and appears to be upscaled to native (causing blur). It might be similar to Mumble or the Steam overlay except the entire screen isn't rendered at a low res, rather it stays at native and it's the 3D viewport that's rendered at a lower res.

    EDIT: So to clarify, your monitor is set at 1920x1080 and the UI renders at that resolution, but the 3D view is rendered at 1280x720 and upscaled.
    I think this new thing is the opposite. So you would have the 3D view at 1920x1200 and the UI at a lower resolution that is easier to read? I guess. If that's the case I'm all for it, I can barely read small print out of one eye as it is.

  6. #46
    Secondary Hivemind Nexus soldant's Avatar
    Join Date
    Jun 2011
    Location
    Terra Australis Incognita
    Posts
    4,494
    Quote Originally Posted by Hypernetic View Post
    I think this new thing is the opposite. So you would have the 3D view at 1920x1200 and the UI at a lower resolution that is easier to read? I guess. If that's the case I'm all for it, I can barely read small print out of one eye as it is.
    No I think it's closer to the ARMA thing, since it says this:
    Quote Originally Posted by post
    option for maintaining framerate by gradually dropping a little bit of resolution instead of removing details from the scene, all without any interface text becoming fuzzy.
    And that's pretty much what ARMA2 and 3 do - the UI renders at native resolution so that it doesn't look like a blurry mess (you can scale it however you like) while the 3D port renders at whatever and looks like someone smeared grease all over your face.
    Nalano's Law - As an online gaming discussion regarding restrictions grows longer, the probability of a post likening the topic to the Democratic People's Republic of Korea approaches one.
    Soldant's Law - A person will happily suspend their moral values if they can express moral outrage by doing so.

  7. #47
    Secondary Hivemind Nexus Sparkasaurusmex's Avatar
    Join Date
    May 2012
    Location
    Texas
    Posts
    1,399
    What video cards will be able to support this 11.2? Nothing that's out yet?

  8. #48
    Secondary Hivemind Nexus Sakkura's Avatar
    Join Date
    Jul 2012
    Location
    Denmark
    Posts
    1,382
    Quote Originally Posted by Sparkasaurusmex View Post
    What video cards will be able to support this 11.2? Nothing that's out yet?
    Presumably at least the Radeon HD 7000 series, since it uses the same architecture as the Xbone. Nothing is confirmed though.

  9. #49
    Secondary Hivemind Nexus Hypernetic's Avatar
    Join Date
    Mar 2012
    Location
    Philadelphia, PA
    Posts
    2,154
    Quote Originally Posted by Sparkasaurusmex View Post
    What video cards will be able to support this 11.2? Nothing that's out yet?
    It depends on whether it requires hardware changes or not. I'd hope that at least 11.1 cards could run it; if new 11.2 hardware is required, expect developers to never use most of the features.

    I just built a new PC a month ago with a GTX 770 in it, so I seriously hope they don't require new shit.

  10. #50
    DX 11.1 hardware WON'T support 11.2... C'mon now, when was a DX change EVER supported by a previous generation? They WANT you to buy more GPUs; that's part of the whole point.

  11. #51
    Secondary Hivemind Nexus Sakkura's Avatar
    Join Date
    Jul 2012
    Location
    Denmark
    Posts
    1,382
    Quote Originally Posted by Micheldemono View Post
    DX 11.1 hardware WON'T support 11.2... C'mon now, when was a DX change EVER supported by a previous generation? They WANT you to buy more gpu's, that's part of the whole point.
    This time. The Xbox One uses GCN graphics, which is DX 11.1 hardware.

  12. #52
    Secondary Hivemind Nexus Hypernetic's Avatar
    Join Date
    Mar 2012
    Location
    Philadelphia, PA
    Posts
    2,154
    Quote Originally Posted by Micheldemono View Post
    DX 11.1 hardware WON'T support 11.2... C'mon now, when was a DX change EVER supported by a previous generation? They WANT you to buy more gpu's, that's part of the whole point.
    But not many people are going to upgrade for a few odd features, especially ones that most games won't even implement. That would just make it a dead API.

    Hell, just because Microsoft makes a new DirectX/Direct3D version doesn't mean hardware manufacturers will even pick it up. Most "DirectX 11.1" cards are still Direct3D feature level 11_0, meaning they don't even use all of the 11.1 shit.

    Quote Originally Posted by Sakkura View Post
    This time. The Xbox One uses GCN graphics, which is DX 11.1 hardware.
    Also this.

  13. #53
    Network Hub
    Join Date
    Feb 2012
    Posts
    155
    According to a Microsoft blog post on the first of July, the hardware feature level hasn't been defined yet, so we won't know what will or won't support it for a while yet. They do say that all of the new features will be optional for hardware makers to implement, which means that game makers can't depend on having any of them. It will probably work like most DX11 support in games at the moment, where games don't depend on it but can use it if you have the hardware.

    http://blogs.msdn.com/b/chuckw/archi...3-edition.aspx
    Last edited by Eleven; 08-07-2013 at 04:23 PM. Reason: removing duplicated duplicate wording

  14. #54
    Network Hub
    Join Date
    Jul 2011
    Posts
    133
    Quote Originally Posted by soldant View Post
    Question - how is this different to what BIS do with the ARMA games? They have an option for a 3D resolution which affects 3D rendering and an 'interface' resolution which affects UI rendering. The text stays sharp but the rest of the game looks like an absolute blurry mess at anything other than native resolution. If what you've listed here is a similar sort of setup, that's not going to be a particularly good solution. Is it the same thing, or is this fancy new tech indistinguishable from sorcery?
    Doing it 'by hand' like ARMA does means that if either of the layers changes (either UI or 3D view) then the whole screen has to be redrawn (potentially a *lot* of pixels). Pushing it down as a standard feature of the hardware means it can be more efficient about building up the final image - it could even generate the final framebuffer sent to your monitor on the fly.

    Basically it's adding a real-time compositing/layering bit of hardware at the end of the graphics card pipeline. This used to be really popular back in the days of separate 2D/3D graphics cards, or separate MPEG decoder cards - the two outputs were generated by different hardware and the actual VGA output composed on the fly by overlaying the two (usually with some kind of overlay mask - anyone who's tried to take a screenshot of a video playback and got a bright pink square has seen it. :) )
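
    A toy model of that colour-key overlay trick, for anyone who never hit the pink square (my own illustration - real overlay hardware did this at scanout, not in software):

```python
# Old-school hardware overlay: the 2D card draws the UI with a "key"
# colour wherever the video/3D window sits; at scanout the overlay
# hardware swaps every keyed pixel for the other source's output.
# Screenshot tools read only the 2D framebuffer, so they capture the key
# colour instead - hence the infamous bright pink/magenta square.

KEY = (255, 0, 255)  # magenta colour key

def scanout(ui_layer, video_layer):
    """Compose the final output on the fly, pixel by pixel."""
    return [video_px if ui_px == KEY else ui_px
            for ui_px, video_px in zip(ui_layer, video_layer)]

ui = [(0, 0, 0), KEY, KEY, (255, 255, 255)]  # UI row with a keyed window
video = [(9, 9, 9)] * 4                      # decoder/3D output row
final = scanout(ui, video)                   # keyed pixels get video instead
```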

  15. #55
    Secondary Hivemind Nexus soldant's Avatar
    Join Date
    Jun 2011
    Location
    Terra Australis Incognita
    Posts
    4,494
    Quote Originally Posted by OrangyTang View Post
    By pushing it down as a standard feature of the hardware means it can be more efficient about building up the final image - it could even generate the final framebuffer sent to your monitor on the fly.
    Okay so if I'm reading this right, it's a similar sort of technique but implemented at the hardware level for improved performance? Because if so... who the hell wants to use that? It looks ugly as sin.
    Nalano's Law - As an online gaming discussion regarding restrictions grows longer, the probability of a post likening the topic to the Democratic People's Republic of Korea approaches one.
    Soldant's Law - A person will happily suspend their moral values if they can express moral outrage by doing so.

  16. #56
    Network Hub
    Join Date
    Feb 2012
    Posts
    155
    I'm guessing here, but everything I have seen so far makes me think the dynamic resolution feature works quite subtly. The idea seems to be that you drop very small amounts of resolution, a few pixels at a time, since only people with magnifying glasses are likely to tell a (properly anti-aliased) image has dropped from 1080p to 1075p. Even that small drop means roughly ten to twenty thousand fewer pixels to render (depending on whether the width scales with the height), helping to keep the framerate up without having to resort to removing graphical effects like games do now.

    If it works like I think it does, it dynamically chooses the minimum amount of resolution to drop to maintain a framerate, and you may never notice that your game, which runs at 1080p 95% of the time, occasionally drops a little resolution when things get busy.
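
    For what it's worth, here's the arithmetic (my own numbers, assuming a 16:9 frame):

```python
# Pixel savings from dropping 1080p to "1075p": about ten thousand fewer
# pixels per frame if only the height drops, nearly twenty thousand if the
# width scales down proportionally with it.

def pixels(w, h):
    return w * h

native = pixels(1920, 1080)                          # 2,073,600 pixels
height_only = native - pixels(1920, 1075)            # width stays at 1920
proportional = native - pixels(round(1075 * 16 / 9), 1075)
```

    Either way it's a saving of well under one percent of the frame, which is why nobody should be able to spot a single step.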
