Graphics Leftovers

Filed under: Graphics/Benchmarks
  • RADV Lands VK_PIPELINE_CREATE_DISABLE_OPTIMIZATION_BIT

    The RADV Vulkan driver within Mesa has landed its VK_PIPELINE_CREATE_DISABLE_OPTIMIZATION_BIT support so applications/games can opt to disable optimizations when compiling a Vulkan pipeline. This is the feature covered just the other day as a way to help reduce stuttering with DXVK.
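
    Since the bit is just part of the pipeline create flags, an application opts in with a one-line change. Here is a minimal C sketch of passing it through VkGraphicsPipelineCreateInfo; the helper name and the rest of the pipeline setup are placeholders rather than code from RADV or any particular game.

        #include <vulkan/vulkan.h>

        /* Minimal sketch: request a fast, unoptimized pipeline compile.
         * All other members of the create info (shader stages, render pass,
         * state objects) are assumed to be filled in as usual. */
        static VkResult create_fast_compile_pipeline(VkDevice device,
                                                     VkGraphicsPipelineCreateInfo info,
                                                     VkPipeline *out_pipeline)
        {
            /* Ask the driver to skip costly shader optimizations for this
             * compile, trading some GPU efficiency for a much quicker
             * vkCreateGraphicsPipelines() call. */
            info.flags |= VK_PIPELINE_CREATE_DISABLE_OPTIMIZATION_BIT;

            return vkCreateGraphicsPipelines(device, VK_NULL_HANDLE, 1, &info,
                                             NULL, out_pipeline);
        }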

  • DXVK 0.51 Brings Fixes & Asynchronous Pipeline Compilation Support

    DXVK 0.51 is now available as the latest version of this library for running Direct3D 11 games under Wine via the Vulkan graphics API.

    The DXVK 0.51 release most notably adds asynchronous pipeline compilation support for Vulkan drivers offering VK_PIPELINE_CREATE_DISABLE_OPTIMIZATION_BIT. This is the feature for reducing stuttering in games running on DXVK, and as of this morning it is supported by the RADV driver. We'll see how long it takes until the NVIDIA Vulkan driver and others support this feature. For now, though, DXVK ships with this support disabled and requires setting the DXVK_USE_PIPECOMPILER=1 environment variable, as the feature can cause hangs in Prey and potentially other titles.
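
    To give a sense of what the asynchronous path does with that bit, below is a rough C sketch of the general two-pass idea -- compile an unoptimized pipeline right away so the frame can be drawn, and build the fully optimized one on a worker thread. The struct and function names here are assumptions for illustration, not DXVK's actual implementation.

        #include <pthread.h>
        #include <vulkan/vulkan.h>

        /* Illustrative only: the "compile fast now, optimize later" approach.
         * Synchronization around reading job->optimized is omitted. */
        struct pipeline_job {
            VkDevice device;
            VkGraphicsPipelineCreateInfo info;   /* fully populated create info */
            VkPipeline optimized;                /* written by the worker thread */
        };

        static void *compile_optimized(void *arg)
        {
            struct pipeline_job *job = arg;
            /* Full optimizations: slower to create, faster to execute. */
            vkCreateGraphicsPipelines(job->device, VK_NULL_HANDLE, 1, &job->info,
                                      NULL, &job->optimized);
            return NULL;
        }

        static VkPipeline get_pipeline_without_stalling(struct pipeline_job *job,
                                                        pthread_t *worker)
        {
            VkGraphicsPipelineCreateInfo fast = job->info;
            VkPipeline unoptimized = VK_NULL_HANDLE;

            /* Cheap compile for immediate use in this frame. */
            fast.flags |= VK_PIPELINE_CREATE_DISABLE_OPTIMIZATION_BIT;
            vkCreateGraphicsPipelines(job->device, VK_NULL_HANDLE, 1, &fast,
                                      NULL, &unoptimized);

            /* The optimized compile happens off the render thread; later frames
             * switch to job->optimized once the worker has finished. */
            pthread_create(worker, NULL, compile_optimized, job);
            return unoptimized;
        }

    As for trying it today, enabling the feature is a matter of setting the environment variable in front of the usual Wine command, e.g. DXVK_USE_PIPECOMPILER=1 wine Game.exe, with Game.exe standing in for whatever title is being launched.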

  • VK9 Gets Better Support For Shaders, 64-bit Fixes

    While the rapidly maturing DXVK library has been capturing much of the limelight when it comes to piping Direct3D over Vulkan, the VK9 project targeting Direct3D 9 on top of Vulkan continues making progress.

  • Intel's Mesa Driver Prepares To Kill Off The Blitter

    Jason Ekstrand has spent some time away from the Intel ANV Vulkan driver to kill the hardware blitter usage within the i965 Mesa OpenGL driver.

    With a set of patches posted on Friday, the Intel Mesa driver eliminates its hardware blitter usage for Intel Sandy Bridge hardware and newer. Ekstrand explained that the graphics hardware blitter has been degraded on recent generations of Intel graphics, "On Sandy Bridge, the blitter was moved to another ring and so using it incurs noticable synchronization overhead and, at the same time, that synchronization is an endless source of GPU hangs on SNB. Some time around the Ivy Bridge time frame, we suspect that the blitter ended up with somewhat slower paths to memory than the 3D engine so it's slower in general. To make matters worse, the blitter does not understand any sort of compression at all and so using it frequently means having to do some sort of resolve operation."

  • Latest Intel ARB_gl_spirv Patches Published By Igalia

    It's almost one year since the release of OpenGL 4.6, and while there is support outside of the Mesa tree, mainline Mesa still doesn't support this latest OpenGL revision due to holdups around SPIR-V ingestion support.

    Intel's i965 and AMD's RadeonSI drivers would have supported OpenGL 4.6 with mainline Mesa months ago, but they've been held up on the ARB_gl_spirv extension and the related ARB_spirv_extensions support. This work allows SPIR-V modules to be used by OpenGL alongside GLSL, and also allows GLSL to be used as a source language for creating SPIR-V modules for OpenGL consumption. This is basically all about better interoperability between OpenGL and Vulkan -- not an easy task to implement.
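
    For those curious what consuming a SPIR-V module looks like on the OpenGL side once ARB_gl_spirv is in place, here is a minimal C sketch using the GL 4.6 entry points glShaderBinary() and glSpecializeShader(). It assumes a 4.6-capable context, a function loader such as GLEW or glad, and a SPIR-V blob already produced by something like glslangValidator.

        #include <GL/glew.h>  /* any loader exposing the GL 4.6 / ARB_gl_spirv entry points */

        /* Minimal sketch: upload a SPIR-V binary instead of GLSL source, then
         * select its entry point with glSpecializeShader() in place of the
         * usual glCompileShader() call. */
        static GLuint load_spirv_vertex_shader(const void *spirv_words,
                                               GLsizei spirv_size)
        {
            GLuint shader = glCreateShader(GL_VERTEX_SHADER);

            glShaderBinary(1, &shader, GL_SHADER_BINARY_FORMAT_SPIR_V,
                           spirv_words, spirv_size);
            /* "main" is the entry point; no specialization constants overridden. */
            glSpecializeShader(shader, "main", 0, NULL, NULL);

            return shader;  /* attach to a program and glLinkProgram() as usual */
        }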

  • RADV Adding New Bit To Help Avoid Stuttering With DXVK

    The RADV Vulkan driver will soon have VK_PIPELINE_CREATE_DISABLE_OPTIMIZATION_BIT to help avoid stuttering with DXVK for running Direct3D 11 games on Wine over Vulkan.

    While DXVK performance is already quite compelling and the project handles a surprising number of D3D11 games rendered via Vulkan considering how young it is, DXVK users on RADV -- and potentially on other Vulkan Linux drivers -- may soon see less stuttering.

  • Vulkan layer for Direct3D 11 & Wine 'DXVK' updated with fixes for Dark Souls 3, Overwatch & more

    DXVK [GitHub] is an incredible project that brings Direct3D 11 support to Wine using Vulkan, and another exciting release is now out.

More in Tux Machines

Red Hat News/Leftovers

Cloudgizer: An introduction to a new open source web development tool

Cloudgizer is a free open source tool for building web applications. It combines the ease of scripting languages with the performance of C, helping manage the development effort and run-time resources for cloud applications. Cloudgizer works on Red Hat/CentOS Linux with the Apache web server and MariaDB database. It is licensed under Apache License version 2. Read more

James Bottomley on Linux, Containers, and the Leading Edge

It’s no secret that Linux is basically the operating system of containers, and containers are the future of the cloud, says James Bottomley, Distinguished Engineer at IBM Research and Linux kernel developer. Bottomley, who can often be seen at open source events in his signature bow tie, is focused these days on security systems like the Trusted Platform Module and the fundamentals of container technology. Read more

TransmogrifAI From Salesforce

  • Salesforce plans to open-source the technology behind its Einstein machine-learning services
    Salesforce is open-sourcing the method it has developed for using machine-learning techniques at scale — without mixing valuable customer data — in hopes other companies struggling with data science problems can benefit from its work. The company plans to announce Thursday that TransmogrifAI, which is a key part of the Einstein machine-learning services that it believes are the future of its flagship Sales Cloud and related services, will be available for anyone to use in their software-as-a-service applications. Consisting of less than 10 lines of code written on top of the widely used Apache Spark open-source project, it is the result of years of work on training machine-learning models to predict customer behavior without dumping all of that data into a common training ground, said Shubha Nabar, senior director of data science for Salesforce Einstein.
  • Salesforce open-sources TransmogrifAI, the machine learning library that powers Einstein
    Machine learning models — artificial intelligence (AI) that identifies relationships among hundreds, thousands, or even millions of data points — are rarely easy to architect. Data scientists spend weeks and months not only preprocessing the data on which the models are to be trained, but extracting useful features (i.e., the data types) from that data, narrowing down algorithms, and ultimately building (or attempting to build) a system that performs well not just within the confines of a lab, but in the real world.