
Game Over For Graphics Cards?


Mini0n

Game over for graphics cards?

How Intel's Larrabee GPU tears up the rulebook

Spare a thought for AMD and NVIDIA. They have been happily smacking each other upside the head for a decade. But at least they have been doing so safe in the knowledge that their GPUs are distinct from CPUs and, in graphics processing terms, inherently superior to them.

At least, that used to be the case. Intel has now unveiled Larrabee, a co-processor based on an entirely new approach to graphics processing. If Intel has done its sums correctly, not only will the very definition of the GPU be unceremoniously defenestrated, but also AMD and NVIDIA's graphics chips could even be pummelled to the very brink of existence.

If that sounds like hubris, try this for size: Larrabee could tear up the rulebook for CPUs, too. That's right, a single new architecture might just take over as the ultimate all-round processor, eventually cannibalising sales of Intel's own conventional CPUs.

What exactly is this deathly destroyer, this harbinger of doom made manifest in 45nm silicon? Some sort of retro-engineered alien technology? Well, here's the really hilarious bit. In simple terms, it's just a metric crapload of old Pentium MMX processors crammed into a single processor die and mounted on a PCI Express board much like any other graphics card.

Intel won't yet be drawn on exactly how many of these cores Larrabee contains. Given that chips based on the Larrabee architecture won't go on sale until late next year or early in 2010, it's entirely possible Intel has yet to finalise the core count.

Watching the clock cycles

However, we do know that the Pentium-derived design of the cores makes them much smaller than those found in an Intel Core 2 die. In fact, Intel suggests 10 Larrabee cores can fit in the same space as a single 65nm Core 2 Duo die. Extrapolate out from that using the knowledge that Larrabee will be based on 45nm silicon technology and we reckon Larrabee chips will boast at least 32 cores at launch.
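As a rough sanity check on that core-count guess (back-of-envelope figures and assumptions, not Intel's), ideal area scaling from 65nm to 45nm alone roughly doubles how many of these small cores fit in a Core 2 Duo-sized footprint:

    # Back-of-envelope check of the "at least 32 cores" extrapolation.
    # Assumptions: die area scales with the square of the process node, and a
    # Larrabee die can be somewhat larger than a 65nm Core 2 Duo die.
    # Real layouts never shrink this cleanly.
    cores_per_core2_die_65nm = 10        # figure suggested by Intel
    ideal_area_shrink = (65 / 45) ** 2   # ~2.09x more cores per unit area at 45nm
    cores_at_45nm = cores_per_core2_die_65nm * ideal_area_shrink
    print(f"~{cores_at_45nm:.0f} cores in a Core 2 Duo-sized die at 45nm")
    # ~21 cores in the same footprint; a die roughly 50 per cent larger than a
    # Core 2 Duo would clear the 32-core mark.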

Beyond the Pentium lineage, the other major feature of the Larrabee core architecture is a super-wide floating point unit. Capable of handling 16 single-precision operations per clock cycle, it's four times as wide as the equivalent unit in one of Intel's existing desktop CPUs. Factor in each core's additional ability to support four software threads and the chip's potential is truly staggering. It's just possible that Larrabee might deliver 100 times the floating-point punch of a Core 2 Duo chip.
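To put a hedged number on that last claim (assumed figures for illustration, not Intel specs), compare raw vector width times core count for a two-core Core 2 with 4-wide SSE against a hypothetical 32-core Larrabee with the 16-wide unit described above:

    # Rough peak-throughput comparison behind the "100 times" figure.
    # Every number here is an assumption for illustration, not an Intel spec.
    core2_ops_per_cycle    = 2 * 4     # 2 cores x 4-wide SSE  = 8 ops/cycle
    larrabee_ops_per_cycle = 32 * 16   # 32 cores x 16-wide VPU = 512 ops/cycle
    print(f"{larrabee_ops_per_cycle / core2_ops_per_cycle:.0f}x raw vector throughput")
    # 64x on width and core count alone; multiply-add support on the new unit,
    # or a higher core count, is what a 100x claim would have to lean on.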

Impressive as that sounds, what we don't know is how good this battery of general purpose and x86-compatible cores will actually be for graphics processing. Sure, it will be much more programmable than any graphics chip before. But that doesn't mean it will be fast. Roll on 2009.

Source: techradar.com

Interesting.


Game over for graphics cards?

How Intel's Larrabee GPU tears up the rulebook

(...) AMD and NVIDIA (...) have been happily smacking each other upside the head for a decade (...) At least, that used to be the case. Intel has now unveiled Larrabee, a co-processor based on an entirely new approach to graphics processing. If Intel has done its sums correctly, not only will the very definition of the GPU be unceremoniously defenestrated, but also AMD and NVIDIA's graphics chips could even be pummelled to the very brink of existence.

Basically. lol


Game over for graphics cards?

How Intel's Larrabee GPU tears up the rulebook

Impressive as that sounds, what we don't know is how good this battery of general purpose and x86-compatible cores will actually be for graphics processing. Sure, it will be much more programmable than any graphics chip before. But that doesn't mean it will be fast. Roll on 2009.

Source: techradar.com

Interesting.

There's still that caveat, though.


I've been wary of this ever since I first heard about it. Nvidia and ATI have great difficulty getting games, and applications in general, to scale across 2 or more GPUs. Intel itself, and AMD too, have multi-core processors that don't perform the way most users imagine. Performance never scales 100% with multiple cores/units, because full utilisation is simply impossible (see the sketch at the end of this post).

Into this scenario rides Intel on a white horse, with several dozen cores on a single chip. We all know what a performance misery its integrated graphics are, unable to compete with Nvidia's and ATI's, and the drivers for them are no better. What makes Intel think it will succeed at the first attempt where competitors who have been at this for many years have not?
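That scaling complaint is essentially Amdahl's law. A minimal sketch with illustrative numbers only (nothing measured on real hardware):

    # Amdahl's law: the serial fraction of the work caps the speedup, which is
    # why performance never scales 100% with more cores or GPUs.
    def speedup(units, parallel_fraction):
        return 1 / ((1 - parallel_fraction) + parallel_fraction / units)

    for n in (2, 4, 32):
        print(n, round(speedup(n, 0.90), 1))
    # Even with 90% of the work perfectly parallel: 2 -> 1.8x, 4 -> 3.1x, 32 -> 7.8x.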

