
ATI's MultiVPU solution: don't get caught in the crossfire?

Filed under: Hardware

We'll have to give it to ATI for keeping our gaze fixed on their new product for so long, all the while feeding us incomplete information about prospective performance and features. Now that the curtain has dropped on ATI's multi-GPU solution, we can only wonder what they have been doing for the past six months. Initially ATI claimed that their solution would be flexible and elegant: it would, for example, work on any motherboard with two PCIe slots, regardless of configuration, and we would be able to combine any two ATI PCIe graphics cards and get a boost in performance.

ATI was also quick to dismiss NVIDIA's solution as cumbersome: it requires a special SLI motherboard, two identical graphics cards and, last but not least, an internal SLI connector to establish communication between the two cards. Looking at ATI's Crossfire solution, however, they managed to eliminate none of these "drawbacks," as their solution has roughly the same requirements as NVIDIA's. You will need a new motherboard sporting an ATI chipset with Crossfire support, a "master" graphics card that pairs with any second ATI PCIe graphics card and, last but not least, an external dongle to let the two cards talk to each other.

So we are left scratching our heads: exactly how is this solution more elegant and flexible than NVIDIA's? At least NVIDIA's solution works with any 6800- or 6600-series graphics card; Crossfire requires the purchase of a $500+ master card, so much for flexibility. And what is with that external dongle? An internal connector that establishes communication, keeps the bracket free of cable clutter and leaves room for a second DVI or S-Video output is a far more elegant solution. By the looks of it, the affordable SLI alternative that Crossfire was pitched as a few months ago has turned into an expensive and inflexible solution that offers nothing substantial over NVIDIA's. For the time being we'd suggest you stick with NVIDIA's solution and don't get caught in the crossfire.

Sander Sassen.
