  1. #41
    Network Hub orcane's Avatar
    Join Date
    Sep 2011
    Location
    Switzerland
    Posts
    136
    Could have, would have, should have.

    I'm sorry but this is silly. You're basically asking them to endlessly support an old operating system because it's remotely possible to port new APIs to it, because an entirely different API can do it too.

    Well newsflash, DirectX is not OpenGL and whether we like it or not the former has been the standard for Windows gaming for quite a while. Sure Microsoft could do a lot of things, but in the end they decided to "catch up" with Direct3D 10 with the new kernel and driver model of their new operating system in mind without having to tie it back into their "expiring" previous OS, and it's idiotic to blame them for that. Windows 2000 was a lot closer to XP than Vista and even that was eventually outdated for games because it stopped getting the newer DLLs with all the little changes and advances in minor DirectX releases.

    I severely doubt that the whole DX10 thing was just a PR stunt to get people to shell out for a new OS and hardware, because realistically that wouldn't work and everyone could already see that back then - the enthusiast market is tiny compared to Joe User and corporate users who have zero interest in the latest DX10 games/hardware. So the majority of Windows users didn't care, and the smarter enthusiasts knew that eventually they'd upgrade anyway and new cards will support the API that wouldn't be used widely in games for years to come.

    So in terms of conspiracy theories yours is pretty weak.
    Last edited by orcane; 26-06-2012 at 03:13 PM.
    Stealth Mode!

  2. #42
    Network Hub
    Join Date
    May 2012
    Posts
    356
    Quote Originally Posted by Unaco View Post
    Except for things like, as Soldant says, "new Windows Display Driver Model introduced in Vista (which enables useful things like preventing the GPU driver from taking down the whole OS in some cases) isn't something that could be ported back to XP, and ultimately enables a technological progression beyond DX9?" The D3D10 stuff needed WDDM and hardware changes... WDDM couldn't be ported back to XP, and so an updated OS was required.

    Then there were things like overhauling the whole API, deprecation of DirectSound and the new X-Plat Audio stuff, better hyper/multi-threading support, better paging of gfx memory. There were advances... they may not have been blindingly obvious to users, and may have been more behind the scenes, in the hands of the Developers and how the OS is handling things.
    WDDM may have been introduced in part to reduce system crashes, among other things, but it does not require a hardware upgrade to a DX10-compatible card; it supports DX9 cards.

    DirectX 10 may have overhauled and deprecated some functionality, but since when was deprecating outdated functions a requirement for a hardware change? Especially when hardly anyone transitioned over to Vista. Nothing you actually mention necessitates a hardware upgrade, beyond Microsoft making one a requirement if you wanted to use the "new" rendering features of Direct3D 10.

    As I said before, the visible additions to Direct3D that Microsoft used in its marketing did not actually require new hardware.

    Quote Originally Posted by orcane
    I severely doubt that the whole DX10 thing was just a PR stunt to get people to shell out for a new OS and hardware, because realistically that wouldn't work and everyone could already see that back then - the enthusiast market is tiny compared to Joe User and corporate users who have zero interest in the latest DX10 games/hardware. So the majority of Windows users didn't care, and the smarter enthusiasts knew that eventually they'd upgrade anyway and new cards will support the API that wouldn't be used widely in games for years to come.
    Evidently you missed the marketing push around the time of Vista; there was a very heavy emphasis on DirectX. As I said, they even mentioned dropping OGL support, which drove a lot of developers over to DirectX out of fear of OGL support being cut. MS was planning to layer OGL over Direct3D for the sake of backwards compatibility, but that would have killed performance and meant OGL could not be updated beyond the supported version, with no ability to extend it. By blocking OGL ICDs as well, you couldn't restore the native support for OGL.

    There was genuine fear that OGL development would disappear on Windows platforms:

    http://www.opengl.org/discussion_boa...87#post1153687
    Last edited by byteCrunch; 26-06-2012 at 03:50 PM.

  3. #43
    Secondary Hivemind Nexus b0rsuk's Avatar
    Join Date
    Nov 2011
    Posts
    1,308
    Don't forget Shadowrun. It was hyped as a Vista game, but a simple crack enabled it to work on Windows XP:


    1. Extract RARs
    2. Install
    3. Go into your installation folder and delete the srsw_shadowrun.dll.
    4. Copy srs_shadowrun.dll and srsx_shadowrun.dll to your installation folder.
    5. If you want the XP compatibility, copy the files from the xp/ subdir to your installation folder.
    6. If you want a shortcut on your desktop, this is the time to make one!
    7. Play!
    8. Have fun!


    It worked until they released a patch.

    pass

  4. #44
    Secondary Hivemind Nexus SirKicksalot's Avatar
    Join Date
    Jun 2011
    Posts
    2,508
    Shadowrun and Halo 2 loaded bigger chunks of data into RAM than XP allowed. IIRC Halo 2 only did this once, in the Gravemind level. The cracks forced them to split the data.
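
The workaround described above can be sketched in spirit (this is illustrative Python, not the actual crack, and the 64 KiB cap is a made-up stand-in for whatever limit XP imposed): instead of one read larger than the cap, stream the same payload in chunks that each fit under it.

```python
import io

MAX_CHUNK = 64 * 1024  # hypothetical per-read cap, for illustration only

def load_in_chunks(stream, total_size, max_chunk=MAX_CHUNK):
    """Read total_size bytes from stream without any single read
    exceeding max_chunk, then reassemble the full payload."""
    parts = []
    remaining = total_size
    while remaining > 0:
        part = stream.read(min(remaining, max_chunk))
        if not part:
            raise IOError("unexpected end of stream")
        parts.append(part)
        remaining -= len(part)
    return b"".join(parts)

# A 256 KiB payload loads fine even though no single read exceeds 64 KiB.
data = bytes(range(256)) * 1024
loaded = load_in_chunks(io.BytesIO(data), len(data))
assert loaded == data
```

The end result in memory is identical; only the transfer is split, which is presumably why the cracked games behaved the same once the data made it in.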

  5. #45
    Activated Node
    Join Date
    Apr 2012
    Posts
    31
    Quote Originally Posted by byteCrunch View Post
    Except for the fact that in actuality DX10 and 11 could have just been DX9.1 and 9.2. DX10 was more or less created as a separate product that required an update to help push sales of Windows Vista and hardware, not to mention all the really misleading marketing surrounding it and Microsoft's threats to completely drop OpenGL support with Vista.

    There are no magic new features in DX10 that actually required a hardware upgrade. Just look at Crysis: you could enable all the DX10 features on a DX9 card. It was purely to help sell hardware and Vista, by restricting DX10 to Vista onwards. This ultimately misled many, many gamers into believing that the only way to access these latest advances was by upgrading to Vista and upgrading their GPU, which is just a load of crap.

    If developers really want the latest and greatest, OpenGL is really the only way, since all the latest GPU features are always OGL extensions long before they make it into DX. Tessellation, for example, had been in OGL for at least three years (and was possible long before even that) before DX incorporated it, and as far as I am aware, with updated drivers all features of OGL are fully backwards compatible right back to XP.

    DX10 and 11 isn't progress, that's just Microsoft trying to catch up.
    Are you kidding?

    DX10 is a redesign from scratch of the whole API, and also a redesign of GPU hardware with shaders being unified and geometry shaders added (as well as compute shaders, although they were only exposed via CUDA or ATI Stream).

    It includes a ton of things that require a hardware upgrade, such as geometry shaders, correct sRGB blending, additional texture formats, shader features like texture lookups in vertex shaders, integer operations in shaders, system values in shaders, higher limits, etc. (these are just off the top of my head).
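
On the "correct sRGB blending" point: the arithmetic can be sketched in Python (this is not D3D code, just an illustration of the maths). Blending gamma-encoded sRGB values directly gives a different, darker result than the correct approach of decoding to linear light, blending, then re-encoding — which is what DX10-class hardware does in the blend unit.

```python
def srgb_to_linear(c):
    # sRGB electro-optical transfer function (piecewise gamma curve)
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    # Inverse of the above
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def blend_naive(a, b, t=0.5):
    # Blend performed directly on sRGB-encoded values (pre-DX10 behaviour)
    return a * (1 - t) + b * t

def blend_linear(a, b, t=0.5):
    # Decode to linear light, blend, re-encode (sRGB-correct blending)
    return linear_to_srgb(srgb_to_linear(a) * (1 - t) + srgb_to_linear(b) * t)

# 50/50 blend of black (0.0) and white (1.0):
naive = blend_naive(0.0, 1.0)    # 0.5
correct = blend_linear(0.0, 1.0) # ~0.735, visibly brighter
```

Since the decode/encode has to wrap around the fixed-function blend stage, it can't simply be emulated in a DX9 pixel shader, which is why this one genuinely needed new hardware.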

    OpenGL had a version of tessellation that was not hardware accelerated, not programmable and not used by anyone.

    In other words, your whole post is bullshit.

    Regarding restricting DX10 to Vista, it was obviously done to sell Vista, although porting WDDM and DX10 to XP would have taken some engineering effort.

  6. #46
    Network Hub orcane's Avatar
    Join Date
    Sep 2011
    Location
    Switzerland
    Posts
    136
    An effort they understandably didn't want to make for an aging operating system that wouldn't get much use out of the new feature anyway.

    Making Halo 2 and Shadowrun (and Alan Wake, before it was moved to Xbox exclusively) Vista-exclusive was bad, but that had nothing to do with DirectX 10 and everything to do with Vista PR. It's justified to complain about this, but not porting DX10 to XP was fine.

    It also didn't affect most games. Before reinstalling with Windows 7, I had Vista64 and XP on my PC from 2009, and IIRC I only played a single game in that time that had no DX9 render path for XP (Just Cause 2).

  7. #47
    Secondary Hivemind Nexus soldant's Avatar
    Join Date
    Jun 2011
    Location
    Terra Australis Incognita
    Posts
    4,364
    Quote Originally Posted by Finicky View Post
    In my case neither : (bunch of devices) ... had driver support, and they never got it either.
    Mysteriously, Windows 7 recognised all 3 during the OS install. (Oh MS, you only try when you have new shit to hawk, don't you.)
    Vista was the end process of what, 6 years development? Granted much of that development only took shape in the latter quarter of that period of time, but with a public release candidate and ample time to sort out driver issues, device manufacturers still didn't manage to get it right. Creative in particular deserve to be blamed to hell and back; they knew full well that the audio system in Vista was being rewritten and their current drivers were useless, but they didn't do jack shit about it. Windows XP was built in a time when 512MB of RAM was adequate, single core processors were pretty much all the consumer market had, and 16-bit legacy support still couldn't be dropped because too many apps relied upon it. You can't keep bolting things on, it'll collapse eventually. You can't blame Microsoft for 3rd parties not getting their act together.

    Quote Originally Posted by b0rsuk View Post
    This is very vague and of no relevance to most gamers.
    I'm not justifying DX10's existence, but DX10, 11 or a hypothetical DX20 all rely on a new display driver model which wasn't going to get ported back to XP. It would have been far too difficult to do so. If it was so incredibly easy, just an arbitrary decision made by marketing, then projects like the one designed to enable DX10 support on XP would have been successful. Spoiler: they never achieved anything of note and they all died.

    Quote Originally Posted by byteCrunch View Post
    ...and Microsofts threats to completely drop OpenGL support with Vista.
    Completely drop OpenGL support? That was more FUD spread by idiots with a chip on their shoulder.

    Quote Originally Posted by wuwul View Post
    Regarding restricting DX10 to Vista, it was obviously done to sell Vista, although porting WDDM and DX10 to XP would have taken some engineering effort.
    Some engineering effort, involving porting the new display driver model into an ageing kernel from an entirely different generation to the newer kernels. WinXP simply isn't robust enough to support something like that. There was no way it was going to be backported.

    Also it's kind of amusing that people think DX10 was the major selling point of Vista. I shouldn't have to remind you that PC gamers aren't the majority in the PC world at all; by and large PCs do work-related tasks. The casual "PC gamers" don't even need a meaty GPU capable of using DX10 effectively, so you can count them out of the 'marketing' hype. Really, Microsoft arbitrarily limited DX10 to sell more copies of Vista... copies which were probably pirated, or OEM versions bundled with bits and pieces of hardware. Sure thing...

  8. #48
    Network Hub
    Join Date
    Jun 2011
    Posts
    445
    Quote Originally Posted by soldant View Post
    Vista was the end process of what, 6 years development? Granted much of that development only took shape in the latter quarter of that period of time, but with a public release candidate and ample time to sort out driver issues, device manufacturers still didn't manage to get it right. Creative in particular deserve to be blamed to hell and back; they knew full well that the audio system in Vista was being rewritten and their current drivers were useless, but they didn't do jack shit about it. Windows XP was built in a time when 512MB of RAM was adequate, single core processors were pretty much all the consumer market had, and 16-bit legacy support still couldn't be dropped because too many apps relied upon it. You can't keep bolting things on, it'll collapse eventually. You can't blame Microsoft for 3rd parties not getting their act together.
    I remember seeing a pie chart showing the causes of reported crashes in Vista in the early days, and something like 40% were due to video drivers. The whole myth that Vista was "shit" is pretty much based on this kind of thing, while everyone gladly praises Win7 for being awesome despite the fact that it's virtually identical. Just goes to show how effective a re-brand can be.

  9. #49
    Secondary Hivemind Nexus Lukasz's Avatar
    Join Date
    Jun 2011
    Posts
    1,641
    but win7 is to vista what XP SP1 was to vanilla XP, isn't it?

    other problem with vista was that it was too power hungry, and it was installed on notebooks which could not run it properly. that problem does not exist with win7 because tech moved on, and i think it is less power hungry than vista.

    so the issue is not that a rebrand is super effective (it does contribute though), but that win7 genuinely was a better system for users on day one than vista.
    and for users it does not matter whether it is virtually identical. it matters only that when they had vista they had troubles. when they got win7 they did not have those troubles.

  10. #50
    Secondary Hivemind Nexus soldant's Avatar
    Join Date
    Jun 2011
    Location
    Terra Australis Incognita
    Posts
    4,364
    Quote Originally Posted by Lukasz View Post
    but win7 is to vista what XP SP1 was to vanilla XP, isn't it?
    Not even. WinXP didn't start to get good until at least SP2, by which time technology was moving ahead and driver support was much improved. Win7 is to Vista as Win98 was to Win95. XP was still pretty bad with SP1... in fact I remember a lot of people bitching that SP1 made it worse initially.

    Quote Originally Posted by Lukasz View Post
    and for users it does not matter whether it is virtually identical. it matters only that when they had vista they had troubles. when they got win7 they did not have those troubles.
    That's true, but that doesn't make it any less irrational nor does it mean that Vista was really a bad product.

  11. #51
    Secondary Hivemind Nexus Lukasz's Avatar
    Join Date
    Jun 2011
    Posts
    1,641
    Quote Originally Posted by soldant View Post
    nor does it mean that Vista was really a bad product.
    it kinda does.

    it was worse for most people than xp.
    therefore
    it was a bad product because a superior product was on the market.

    because of tech advances and the ironing out of bugs, Vista 2.0 aka Win7 became a better choice than XP. That's why it is a good product.

  12. #52
    Secondary Hivemind Nexus soldant's Avatar
    Join Date
    Jun 2011
    Location
    Terra Australis Incognita
    Posts
    4,364
    Quote Originally Posted by Lukasz View Post
    it was worse for most people than xp.
    therefore
    it was a bad product because a superior product was on the market.
    So my 1080p TV is a bad product if I try to play 480p video on it and it looks bad because there's no 1080p source available?

    Vista wasn't inherently bad. Driver support was bad, but the OS itself was not. Microsoft can't be blamed for that.

  13. #53
    Network Hub orcane's Avatar
    Join Date
    Sep 2011
    Location
    Switzerland
    Posts
    136
    They actually can be blamed for many things in Vista.

    Despite some improvements, Vista was quite the resource hog and had several issues that weren't fixed until 7 (and some never; e.g. it was a massive step backwards in letting users customize their UI experience, which was exacerbated by the mandatory start menu redesign in 7). These things were not driver problems, like the way the OS rendered the desktop/UI, which occasionally made it seem unresponsive until they completely changed it for 7. Vista was also quite a mess before SP1 - it was sluggish, copy/move was slow, UAC was obnoxious, etc.

    It was clearly usable and in several ways an improvement over XP, but not for the average user. It's too easy, and inaccurate, to say "Vista was good, just the 3rd party drivers were bad".

  14. #54
    Network Hub
    Join Date
    May 2012
    Posts
    356
    Quote Originally Posted by soldant View Post
    Completely drop OpenGL support? That was more FUD spread by idiots with a chip on their shoulder.
    Microsoft started said FUD; they demonstrated that they were going to layer OGL 1.4 over DirectX, and they left the OpenGL Architecture Review Board, among several other things.

    At the end of the day it achieved the desired result, with a lot of OGL graphics programmers worried they might be out of a job if they didn't switch, so they did. This is why we are at the point where DirectX is so dominant in gaming; only now, thanks to mobile platforms, indie developers and OGL ES, are we even seeing OGL make a comeback in the gaming space. Widespread OGL use would give people on non-Windows platforms access to more games.
    Last edited by byteCrunch; 27-06-2012 at 11:25 AM.

  15. #55
    Network Hub orcane's Avatar
    Join Date
    Sep 2011
    Location
    Switzerland
    Posts
    136
    DirectX was already dominant in Windows gaming before Vista, and OpenGL largely ignored in Ati/Nvidia Windows drivers...

  16. #56
    Lesser Hivemind Node
    Join Date
    Jun 2012
    Posts
    900
    Quote Originally Posted by soldant View Post
    Vista was the end process of what, 6 years development? Granted much of that development only took shape in the latter quarter of that period of time, but with a public release candidate and ample time to sort out driver issues, device manufacturers still didn't manage to get it right. Creative in particular deserve to be blamed to hell and back; they knew full well that the audio system in Vista was being rewritten and their current drivers were useless, but they didn't do jack shit about it. Windows XP was built in a time when 512MB of RAM was adequate, single core processors were pretty much all the consumer market had, and 16-bit legacy support still couldn't be dropped because too many apps relied upon it. You can't keep bolting things on, it'll collapse eventually. You can't blame Microsoft for 3rd parties not getting their act together.

    Hell if I can't!

    As I said, win 7 DID recognise all the above hardware in the same pc with legacy drivers to support it.

    Why do you focus on Creative so much? Are you missing the part where it also didn't support my (extremely popular and not even 2 years old) nForce 2 mobo's built-in sound and network cards? Or that there was no driver that worked with the ever-so-popular, still relevant and recent Radeon 9800 Pro (granted, I blame AMD for this more)?

    And again, win 7 had proper drivers included for all of those!
    Clearly they can if they want to; they just couldn't be arsed with vista.

    Even in the case of Creative sound cards, MS shared the responsibility to make an OS that frigging FUNCTIONS with current (and extremely common and popular) hardware.
    Like one in three people had an Audigy or Audigy 2 back then...
    Last edited by Finicky; 27-06-2012 at 03:29 PM.

  17. #57
    Secondary Hivemind Nexus Xercies's Avatar
    Join Date
    Jun 2011
    Posts
    2,139
    If companies couldn't be bothered to make new drivers and stuff for Vista, I think that tells you a lot about how much they expected Vista to suck.

  18. #58
    Lesser Hivemind Node
    Join Date
    Jun 2011
    Posts
    978
    Quote Originally Posted by Mistabashi View Post
    People who call themselves "IT professionals" can be a surprisingly superstitious and poorly-informed bunch from my experience.
    Nobody who called themselves an IT professional would be recommending Windows in the first place. Unix has cheaper licensing, which means more IT budget for you, and far less chance of having the support outsourced to Bangalore.

    Quote Originally Posted by byteCrunch View Post
    Except for the fact in actuality DX10 and 11, could have just been DX9.1 and 9.2
    We already had DX 9.0a, 9.0b, 9.0c and so on through the letters of the alphabet. The entire reason DX10 exists is that they rebuilt the API from scratch, removing most of the useless bloat it had accumulated since 6.0.
    The reason it required a hardware upgrade was that a large part of that bloat was down to backwards compatibility with hardware that, should it still happen to exist anywhere, would likely be of great interest to a museum. It's also why it wasn't backwards compatible with older versions of Windows: they shifted things around between the HAL and the API. None of which makes a blind bit of difference to the average consumer, but it makes a world of difference to the people who actually pay Microsoft to use the API.

  19. #59
    Secondary Hivemind Nexus alms's Avatar
    Join Date
    Dec 2011
    Posts
    3,704
    Quote Originally Posted by soldant View Post
    That's true, but that doesn't make it any less irrational nor does it mean that Vista was really a bad product.
    It could be argued that with all the time and resources spent on developing Vista, one had every right to expect a much better product. Then again, most projects stuck in development that long eventually suck at release time.

    And I still think most of the Vista (graphics) drivers fiasco at launch is to be blamed at least 50% on manufacturers.
    Looking for your daily bundle fix? Join us on The onward march of bundles
    Stalk my Steam profile, or follow my fight against the backlog on HowLongToBeat.

    "You take the Klingon's detached hand"

  20. #60
    Secondary Hivemind Nexus Lukasz's Avatar
    Join Date
    Jun 2011
    Posts
    1,641
    Quote Originally Posted by soldant View Post
    So my 1080p TV is a bad product if I try to play 480p video on it and it looks bad because there's no 1080p source available?
    YES
    if there had been a 720p TV on the market which played 480p much better
