One of the biggest challenges for the Nouveau open-source driver for NVIDIA graphics hardware in recent times has been GPU / video memory re-clocking. In a small step forward, NVIDIA has contributed re-clocking patches for the GK20A graphics processor.
Re-clocking has long been a major challenge for the Nouveau driver: it is needed to obtain maximum graphics performance while also maintaining optimal performance-per-Watt and staying efficient at idle. With the Linux 3.16 kernel, select generations of GPUs gain faster performance from re-clocking, though it can still be buggy. Now, for Tegra K1 owners, NVIDIA has come to the table with re-clocking code for the "GK20A" GPU found within this high-end NVIDIA ARM SoC.
AMD has just published a massive patch-set for the Linux kernel that finally implements HSA (Heterogeneous System Architecture) support in open-source. The set of 83 patches implements a Linux HSA driver for Radeon-family GPUs and also serves as a sample driver for other HSA-compatible devices. This big driver is in part what well-known Phoronix contributor John Bridgman has been working on at AMD.
In this article, just to put the initial CentOS/SL results into some perspective, I have initial data from a single Intel Core i7 system running these new releases plus Fedora and Ubuntu Linux. As some initial metrics to get our benchmarking started, I tested the four Linux distributions on an Intel Core i7 4770K system with 8GB of RAM, a 150GB Western Digital VelociRaptor HDD, and Intel HD Graphics 4600. The hardware and its settings were kept the same throughout testing.
Originally I had also hoped to test Scientific Linux / CentOS 6.5 for this first article, but after completing the 7.0 tests and trying to boot the 6.5 releases, a kernel error prevented the testing from being carried out (on the initial boot it was the i915 DRM error about detecting more than eight display outputs; when booting without DRM/KMS mode-setting support, there was an agpgart error). The i915 issue is corrected in later kernel revisions, but on this system it prevented the 6.5 releases from running nicely. I will be running the new vs. old CentOS/SL comparison on an older, more workstation-focused system.
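For readers wanting to try the same no-KMS fallback, kernel mode-setting is typically disabled by appending standard kernel parameters to the boot loader's command line. The GRUB entry below is purely illustrative (the kernel version and root device are placeholders); the parameters themselves are standard kernel options:

```shell
# At the GRUB boot menu, press 'e' to edit the kernel line and append one of:
#   nomodeset        - disable kernel mode-setting for all DRM drivers
#   i915.modeset=0   - disable KMS only for the Intel i915 driver
# Example (hypothetical kernel version and root device):
linux /vmlinuz-2.6.32-431.el6.x86_64 ro root=/dev/sda1 nomodeset
```

Note that on the system tested above, even this fallback hit a separate agpgart error, so the 6.5 testing was deferred to other hardware.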
Succeeding last month's NVIDIA 340.17 Linux beta driver is the first official release in the 340.xx driver series for Linux / Solaris / BSD. The NVIDIA 340.24 driver was released this morning; it brings new features but is heavier on the fixing side.
The main feature of the NVIDIA 340.24 driver (and carried over from the 340.17 driver) is initial Linux support for G-SYNC monitors. The proprietary NVIDIA Linux driver can now handle G-SYNC, NVIDIA's variable refresh-rate technology similar in nature to AMD FreeSync and VESA Adaptive-Sync. The support came just months after we reported NVIDIA was working on G-SYNC Linux support.
Complementing the earlier Linux 3.16 file-system tests on an SSD (and the later Btrfs testing), here are benchmarks of EXT4, XFS, and Btrfs comparing the Linux 3.15 and 3.16 kernels on a traditional rotating hard drive.
As has become common practice at Phoronix, for each new development kernel we benchmark the most commonly used mainline Linux file-systems on both a hard drive and a solid-state drive. With the SSD results covered in the aforelinked articles, this article presents results from a high-performance Western Digital HDD in a Core i7 Haswell system running Ubuntu, comparing the mainline stable Linux 3.15 kernel against a daily snapshot of Linux 3.16 from this week.
X.Org Server 1.16 has been delayed. However, it's not been delayed over outstanding bugs as in some of the more notorious past releases, but rather over letting a late feature into this latest revision of the X11 server.
There was a lot of developer interest in, and pressure for, letting non-PCI device support be merged for the 1.16 stable release. This non-PCI support is needed to allow the open-source NVIDIA Tegra graphics driver to work properly in an easy manner, since the GPU is not exposed as a PCI VGA device. Thierry Reding at NVIDIA, with support from other developers, worked out this non-PCI graphics support on the X.Org Server side.
The Deepin Desktop Environment is written using Google's Go language and makes heavy use of HTML5. DDE also uses Compiz as its compositing window manager. Since some desktop environments / window managers have in the past impaired full-screen Linux gaming performance, I ran some simple Linux gaming benchmarks on Sunday to see whether the Deepin 2014 performance differed at all from upstream Ubuntu 14.04 LTS. Ubuntu 14.04 was tested with the stock Unity 7.2 desktop using Compiz, GNOME Shell 3.10.4, and Xfce 4.10, all from the stock Trusty Tahr archive.
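Comparisons like this can be reproduced with the Phoronix Test Suite by logging into each desktop environment in turn and running the same full-screen test. The test profile named below is just one example of the sort of OpenGL game benchmark used, not the exact test list from this article:

```shell
# Under each desktop session (Deepin, Unity, GNOME Shell, Xfce), run the
# same full-screen OpenGL game benchmark, e.g. the Xonotic test profile:
phoronix-test-suite benchmark xonotic
```

Keeping the hardware, drivers, and test settings identical across sessions isolates the desktop's compositor as the only variable.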
NVIDIA has today released their 331.89 Linux, Solaris, and FreeBSD graphics drivers within the long-lived 331.xx graphics driver branch.
The NVIDIA 331.89 graphics driver for Linux (and Solaris/FreeBSD) includes support for the GeForce GT 730 graphics card, support for X.Org Server 1.16, and a variety of bug-fixes. The bug-fixes and the GeForce GT 730 hardware enablement are both welcome, while nearly all NVIDIA 331 driver users will appreciate the X.Org Server 1.16 support: the xorg-server update isn't even scheduled for release until next month, it will make it into the H2'2014 Linux distribution updates, and chances are the AMD Catalyst driver won't support the new server for some months, based upon their historical turnaround times.
For some brief benchmarking during Independence Day in the US, I ran some tests comparing Ubuntu 14.04 LTS stable against a fresh development snapshot of Ubuntu 14.10.
Compared to Ubuntu 14.04, Ubuntu 14.10 "Utopic Unicorn" in its current development state has the Linux 3.15 kernel (but will end up using Linux 3.16), Unity 7.3.0, Mesa 10.2 (Mesa 10.3 should make it in time for Ubuntu 14.10), and GCC 4.8.3 (while GCC 4.9 should make it for 14.10).