Intel set to release their first graphics card by 2020

Intel GPU

Watch out, Nvidia and AMD. Intel have confirmed they’re going to release their first discrete GPU by 2020 – over Twitter, no less. The news was first announced by Intel CEO Brian Krzanich during an analyst event last week, according to MarketWatch, and the race for the best graphics card is about to get a whole lot more interesting, with Intel set to release GPUs for data centre and AI applications as well as gaming – probably not unlike what our crack team of Photoshop monkeys have cooked up for us above.

For the three of you who have been following Intel’s job appointments over the last six months, this move into graphics cards will probably come as no surprise. They did, after all, hire former AMD Radeon graphics man Raja Koduri back in November 2017 to be not only their new chief architect, but also senior vice president of Intel’s new Core and Visual Computing division and general manager of a new computing solutions group.

That’s the smile of a man with three different job titles.

At the time, Intel said Koduri would be responsible for expanding Intel’s already meaty integrated graphics department with “high-end discrete graphics solutions for a broad range of computing segments”, but remained shtum on exactly how fast those discrete GPUs might arrive. Now we know.

It’s not yet clear whether Intel’s first graphics cards will be serious rivals to what Nvidia and AMD currently have on offer for gaming (or, indeed, will have on offer by 2020, be it Nvidia’s Turing graphics cards or their successors), but given Intel’s know-how over in the world of integrated graphics, I wouldn’t be surprised if we suddenly found ourselves with a three-horse GPU race instead of the current two.

Whatever Intel have up their sleeve, though, it will probably be some time before we hear anything more. The earliest possible date for a potential reveal would probably be at Las Vegas tech show CES next year, but MarketWatch argue that even this would be far too ambitious, given that most graphics architecture and chip development cycles tend to take at least three years to complete. With that in mind, even Intel’s own release window of 2020 is an aggressive one, according to MarketWatch, but I guess we’ll have to wait and see what happens.

Either way, you can be sure we’ll keep you up to date as the Great Graphics Card Battle Royale kicks into gear.

30 Comments

  1. Drib says:

    While Intel doesn’t exactly have a huge history of real graphics cards… this is interesting and I hope they make a solid entry. Not just because I want Intel to continue succeeding or whatever, but also because having a third company in the race for this might just help keep costs down and offer more options.

    Fingers crossed, I suppose.

    • HiroTheProtagonist says:

      Intel has previously done discrete graphics cards, but they tended to be less effective for the usual purposes (gaming) than offerings from ATI/nVidia/Voodoo/3dfx. I don’t hold much hope for them competing in the high end market, but a reasonably strong lower-mid market offering could easily compete with the GTX 1050 or RX 550.

      And in the end, it’s more about introducing competition into a market that’s been stagnant for nearly 20 years.

      • sosolidshoe says:

        I mean, sure, competition is “good” (assuming you subscribe to the ideological underpinnings of neoliberal consumer capitalism, fnar fnar), but do we really need more at the “affordable” end? Even through Nvidia’s period of ostensible total dominance of the gaming community, there’s been plenty of back & forth over which brand is a better buy at any given time for budget builds, and a lot of the time ATI were “winning” that end of the market with ease.

        The issue has been ATI’s (AMD’s? I can never keep up with the naming and renaming and unrenaming) singular inability to challenge at the “enthusiast” end of the market, and because of that inability Nvidia have been able to use the usual buzzwords about markets or whatever to push up the ceiling on pricing for enthusiast and midrange products – and don’t anyone try “cryptomining” me, this was happening long before that lunacy warped the market completely – and ATI have seemingly been happy to go along with that by just bumping up their own offerings to remain “a bit below the other guys”.

        If Intel come along and settle in at the low-to-midrange end of the market, all that’s going to do is start a battle between them and ATI that likely ends with one or both dropping out of discrete graphics entirely, leaving us with either the same situation as now, the same situation as now with a different colour of “other team”, or an even worse scenario where Nvidia have no real competition at all for a period of time.

        If Intel entering discrete graphics is going to be good for the industry and for consumers, as opposed to just good for Intel’s shareholders and bosses, they need to be willing and able to take a serious shot at Nvidia’s enthusiast crown.

        • Ryuthrowsstuff says:

          “The issue has been ATI’s (AMD’s? I can never keep up with the naming and renaming and unrenaming)”

          You mean that whole thing where a company named ATI was bought by a company named AMD? And for the past 8 years the entire entity has simply been “AMD”. There hasn’t been any re-naming or un-renaming. AMD bought ATI in 2006 and retired the brand four years later. It’s all been AMD since that point. No back and forth. Very little to keep straight.

          MEANWHILE. The market’s hardly been stagnant for 20 years. The market only barely existed 20 years ago. At worst we’re talking about half a decade where there’s been little fighting at the very, very top end of the market.

      • airmikee99 says:

        Stagnant for 20 years? Either you’re smoking some of the best shit medical botanists have ever grown, or you haven’t got a clue what you’re talking about.

        define: stagnant
        showing no activity; dull and sluggish. (can’t use the first definition because that’s about a body of water, which definitely doesn’t apply to this situation.)

        If the GPU market had been stagnant for 20 years we’d all still have DirectX 2.0 GPUs running at 70 MHz with 4 MB of video RAM.

        • HiroTheProtagonist says:

          I meant competitively. There’s been no real competition in the market, and this move by Intel is arguably the first serious threat to the duopoly in nearly 20 years. You might want to learn to read context before assuming things.

          • Ryuthrowsstuff says:

            Dude, the first discrete consumer 3D GPUs to hit the market were released in the mid-1990s. Taking it as 1995, that’s only 23 years the product category has even existed. And they really didn’t start to become a regular part of PC gaming until like ’98. Manufacturers didn’t start to merge and/or collapse (or pull out to other areas of the market, like Matrox) until the ’00s.

            The “duopoly” hasn’t existed much longer than 10 years at best.

      • FriendlyFire says:

        The market and Intel’s strategy are so different this time around that using their prior attempts at entering the GPU space as reference points is largely meaningless.

        If Intel puts in enough money and hires enough competent staff, I don’t see a reason why they couldn’t put out a competitive product. Their primary goal is most likely going to be competing with Nvidia in the enterprise compute sector, and that means they can’t settle for the equivalent of a midrange GPU. They need something that can effectively compete with Nvidia, and that should translate into an interesting high-end consumer GPU.

        The biggest problem Intel’s going to face is that they’ll have to adapt to narrower margins if they want to be competitive, and I’m not sure they’re going to be savvy enough to recognize that. They can’t rest on brand power here.

        • mitrovarr says:

          If they focus on the compute sector, the cards will likely be very poor for gaming. The drivers will be the hard part – making good drivers that offer strong gaming performance and aren’t buggy is a hugely difficult task. Nvidia and ATI (before it was part of AMD) both struggled for years to make good drivers before they succeeded, and they work almost continuously on them now, often introducing tweaks just to make one game work right or work faster. Catching up to those decades of driver work and tweaking is going to be incredibly difficult.

          • Excors says:

            It’s not like Intel is new to this – their integrated GPUs have to deal with almost exactly the same issues, perhaps just lagging behind by a couple of years since their drivers don’t need to bother immediately supporting cutting-edge games that demand too much performance anyway. And Intel already has teams that work with game developers to debug and optimise their games on Intel GPUs, and the tools for those developers to use, etc. A high-end gaming GPU would still require a big increase in software effort, but nowhere near as much as starting from scratch.

            I’d imagine they care more about compute than gaming though, since that’s a growing market and they don’t want to be left behind (like they were with mobile), and it’s probably easier for them to compete in since drivers are less important and silicon expertise is more important (and Intel is good at silicon).

    • brucethemoose says:

      Ah, but they do have a history. Intel attempted to make a discrete GPU more than once, but the experiments ended in failure before actual products were ever realized.

      Xeon Phi is kinda related to GPUs too (it’s basically the salvage of those old failures). If you’re willing to sell a few superfluous organs, you can buy one today.

      The 2020 release date implies that they started designing this GPU before hiring Raja, actually. Two years isn’t a “tight” timeframe, it’s an impossible one, even for an expensive halo chip like a datacenter GPU.

    • fish99 says:

      Exactly, the sector needs more competition. The current near-monopoly nVidia has is bad news for customers.

      So, great news.

  2. TotallyUseless says:

    Introducing the Intel GFX card, with a whopping 120fps in MS Word at 4K and 30fps in MS Excel at 4K! The most cutting-edge graphics card on the market!

    • Cederic says:

      Trust me, I know businesses that would pay good money for a chip that can run their Excel spreadsheets with updates at 30fps.

  3. icarussc says:

    Shtum? What on earth is shtum??

  4. woodsey says:

    So how does it work if more than one person buys him?

  5. nottorp says:

    It’s not the first Intel graphics card.
    link to en.wikipedia.org
    Not that they have a glorious history or anything.

  6. avrus96 says:

    Intel Larrabee confirmed!

  7. phuzz says:

    So Intel are producing a graphics card, while also partnering with AMD for on-chip graphics?
    I guess it makes sense to someone.

    • tehfish says:

      I don’t see the issue.

      Currently Intel lack a high-performance GPU, so for decent high-performance onboard graphics *now* they have no choice but to outsource.

      With perhaps a year or two of development time they could conceivably create a similar chip in-house. But right now they cannot.

      • phuzz says:

        It’s rare for a company to partner with someone who is their main (and pretty much only) competitor in another market segment (i.e. CPUs). It’s even rarer for them to be simultaneously planning to compete with them in the same segment they’re currently partnering in.
        Like I said, I’m sure it does make sense, it’s just very unusual, hence noteworthy.
