why do GPUs need to support Graphics APIs (on a hardware level)


Like, for example, “GPU XXX supports DX12”, which is touted as a feature on the retail box as well.
From what I understand, the API is standardized around a certain graphics pipeline to draw various stuff on screen (polygon pipeline, brightness-of-a-pixel pipeline, etc.), it’s hardware agnostic, and it’s up to GPU manufacturers to “optimise” the workflow of said pipeline.
You can go into some detail, ELI10 if you will lol

In: Technology

4 Answers

Anonymous 0 Comments

Graphics APIs like DX12 are hardware agnostic to a degree, but the hardware needs to be capable of providing the features which are part of the API.

Take ray tracing for example. Say that the next version of DX includes an API for ray tracing (the existing API for it isn’t a core part of DX12). For a GPU to be compatible with that API it has to have the hardware to support ray tracing.

In theory that could be supported by implementing it in software in the driver, but that would be so slow it would be useless for most purposes. So the GPU needs hardware support for the feature to be deemed fully compatible.
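
To make that concrete, here’s a rough C++ sketch of how an application can ask the D3D12 driver whether the GPU actually exposes ray tracing before trying to use it. `CheckFeatureSupport` and the `D3D12_RAYTRACING_TIER_*` values are real D3D12 API; the helper function name is just made up for illustration.

```cpp
#include <windows.h>
#include <d3d12.h>

// Ask the driver whether this device has a ray tracing path at all.
// If the GPU lacks the hardware (and the driver offers no fallback),
// the reported tier stays at TIER_NOT_SUPPORTED.
bool GpuSupportsRaytracing(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS5, &options5, sizeof(options5))))
        return false;
    return options5.RaytracingTier != D3D12_RAYTRACING_TIER_NOT_SUPPORTED;
}
```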

Anonymous 0 Comments

A graphics API is indeed a standardized way to ask the GPU to do stuff (this includes drawing as well as general computation).

To support an API, a GPU has to be able to handle all the features that the API exposes to the programmer.
This includes stuff like supported shader instructions and texture formats.

For example, DX12 requires Shader Model 5.1, whose instruction set includes atomic operations (the `Interlocked…` intrinsics). These instructions generally require hardware support, so a GPU that lacks the circuitry to perform them cannot claim to be DX12 compatible.
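
As a rough illustration (C++ against the real D3D12 `CheckFeatureSupport` API; the helper name is made up), this is roughly how a program can ask which shader model the GPU/driver pair actually supports:

```cpp
#include <windows.h>
#include <d3d12.h>

// Query the highest shader model the device can run, starting from the
// highest one we care about (5.1 here). The runtime lowers the value if
// the hardware can't do it.
bool SupportsShaderModel5_1(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_SHADER_MODEL sm = { D3D_SHADER_MODEL_5_1 };
    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_SHADER_MODEL, &sm, sizeof(sm))))
        return false;
    return sm.HighestShaderModel >= D3D_SHADER_MODEL_5_1;
}
```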

Anonymous 0 Comments

The API is a standard way to talk to a GPU – API support means the GPU can listen, and respond, to that. Back around the time Half-Life 2 came out, a lot of GPUs were DX7-level and known as “fixed function”: you sent them triangles and they would draw and shade them in one specific way. Later versions of DX let you send along a small program called a “shader” that determines the color of every individual pixel. Back then, when a GPU advertised DX8 or DX9 support, that meant it could understand and run those shader programs and color the pixels in non-traditional ways. That’s how you got all those exotic, interesting-looking materials like the water and stained glass in HL2.
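
If it helps to picture it: conceptually a pixel shader is just a tiny function the GPU runs once per pixel. Real shaders are written in HLSL, but here’s the idea sketched as plain C++ (the struct and function are purely illustrative, not any actual API):

```cpp
// A made-up stand-in for a pixel shader: fixed-function hardware had one
// baked-in version of this; programmable GPUs let the game supply its own.
struct Color { float r, g, b; };

Color ShadePixel(Color textureSample, float lightIntensity)
{
    // "Non-traditional" materials come from doing arbitrary math here:
    // distorting, tinting, or combining several inputs per pixel.
    return { textureSample.r * lightIntensity,
             textureSample.g * lightIntensity,
             textureSample.b * lightIntensity };
}
```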

Anonymous 0 Comments

Standardization, mostly.

If there were no API mediating the interaction between the OS and the GPU, then every single application that required access to the GPU would need to be coded to account for every possible GPU the developer wanted to support. That’s a huge hassle for developers, but also for users, who would run into cryptic compatibility errors.

Having an API mediate GPU programming lets the developer target a specific feature set instead. The API then makes a number of calls available; if the hardware doesn’t support them, it isn’t compliant with that feature set (e.g. DX12 requires SM 5.1, and SM 5.1 requires hardware support, so if the GPU doesn’t have that, it isn’t DX12 compliant and it’s assumed it cannot run applications requiring DX12).
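
For example (a rough C++ sketch; `D3D12CreateDevice` and the feature-level constants are the real Windows API, but error handling is stripped down), a game simply asks for the feature set it needs when creating the device, and the call fails outright on hardware that can’t provide it:

```cpp
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// The application targets a feature set, not a specific GPU model.
// If the installed GPU/driver can't provide feature level 12_0,
// device creation fails and the app can fall back or show an error.
ComPtr<ID3D12Device> CreateDx12Device()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(
            nullptr,                  // default adapter
            D3D_FEATURE_LEVEL_12_0,   // the feature set we require
            IID_PPV_ARGS(&device))))  // fails if the hardware can't do it
        return nullptr;
    return device;
}
```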