Vulkan VS DX12, a user perspective VS coders' needs
category: gfx [glöplog]
xtri1m: i'm not familiar with vulkan yet, but i'd be very surprised if that was all needed. (I write a lot of Metal code, and a small fraction of that is needed to render a triangle - more than GL, but probably not unbearable for 4k)
The primary point of vulkan/dx12 - in my view - is that a higher level API can't make certain assumptions and thus has to do things in a less efficient way.
In the past, the APIs have already moved in this direction by, for instance, introducing vertex buffer objects - the driver can assume a lot of things about a VBO that it can't about a naked array.
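A rough sketch of that difference (the vertex data and function names are made up for illustration, and the buffer entry points assume a GL 1.5+ context, via a loader on Windows): with a client-side array the driver only sees a raw pointer at draw time, while glBufferData's usage hint lets it park the data in video memory once.
Code:
// Sketch of the VBO point above; data and names are illustrative only.
#include <GL/gl.h>

static const float tri[] = { -0.5f, -0.5f,  0.5f, -0.5f,  0.0f, 0.5f };

// Naked client-side array: the driver only sees a raw pointer at draw time,
// so it can assume nothing and may have to copy/validate the data every draw.
void drawClientSideArray() {
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, tri);
    glDrawArrays(GL_TRIANGLES, 0, 3);
}

// VBO: the GL_STATIC_DRAW usage hint promises "written once, drawn many times",
// so the driver can place the data in video memory up front.
GLuint createStaticVbo() {
    GLuint vbo = 0;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(tri), tri, GL_STATIC_DRAW);
    return vbo;
}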
Exposing a lot of what the driver has to go through to developers will also force people (well, those people who care) to write more "driver friendly" code, which should improve performance.
Vulkan/dx12 still don't mean there's no driver. The driver still has a lot to do, to mask differences in hardware for instance (whether those differences are by design or unintentional), and whatever nvidia does in their "game ready drivers"..
xTr1m: it looks scary for sure, but maybe you could cut corners, say in the device/pipeline/swapchain creation part. I am not really fond of skipping error/compatibility checks, but isn't that demoscene tradition anyway?
It might still be too big for 4k, but I'm just trying to be optimistic and not give up so fast ;)
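To give a rough idea of what cutting those corners could look like (purely illustrative, names made up, every error and compatibility check dropped, first GPU and first queue family taken blindly):
Code:
// Corner-cutting sketch: instance + device creation with zero checks,
// in 4k-intro spirit. Assumes GPU 0 exists and its queue family 0 can do
// graphics, which holds on typical single-GPU desktops.
#include <vulkan/vulkan.h>

VkDevice createDeviceNoChecks() {
    VkApplicationInfo app = {};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_0;

    VkInstanceCreateInfo ici = {};
    ici.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    ici.pApplicationInfo = &app;

    VkInstance instance;
    vkCreateInstance(&ici, nullptr, &instance);        // VkResult ignored

    uint32_t count = 1;                                // just take GPU number 0
    VkPhysicalDevice gpu;
    vkEnumeratePhysicalDevices(instance, &count, &gpu);

    float priority = 1.0f;
    VkDeviceQueueCreateInfo qci = {};
    qci.sType = VK_STRUCTURE_TYPE_DEVICE_QUEUE_CREATE_INFO;
    qci.queueFamilyIndex = 0;                          // assume it does graphics
    qci.queueCount = 1;
    qci.pQueuePriorities = &priority;

    VkDeviceCreateInfo dci = {};
    dci.sType = VK_STRUCTURE_TYPE_DEVICE_CREATE_INFO;
    dci.queueCreateInfoCount = 1;
    dci.pQueueCreateInfos = &qci;

    VkDevice device;
    vkCreateDevice(gpu, &dci, nullptr, &device);       // VkResult ignored again
    return device;
}
Surface/swapchain creation and the pipeline still need their extensions and a pile of structs on top of this, which is where most of the remaining boilerplate lives.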
Quote:
Who really cares about adoption rates and cross-platform-ness of these APIs when it comes to demo coding?
I'm not wasting my precious spare time to make sure my demo runs on linux/OSx, that's just a pain in the ass to do.
I agree there. Demosceners tend to write demos for very obscure platforms.
I think the point is more that if you DO write a demo for Linux/OS X/Windows or whatever, you want to be able to assume it runs on that 'platform', rather than having to have a very vague limited subset... ("Only with kernel version X and driver version Y on hardware A or B from vendor Z").
I vote for option 6:
"Apple Metal FTW!"
:) Of all the new APIs to me it seems to hit the sweet spot in terms of ease of use (just look at the amount of code needed to get a simple triangle on screen with DX12 or Vulkan) and important performance features (command buffers).
The API is super streamlined and very elegant.
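For a feel of the per-frame command buffer flow, here is a rough sketch using Apple's metal-cpp C++ wrapper; the pipeline state, render pass descriptor and drawable are assumed to already exist (e.g. coming from a CAMetalLayer), so this is an illustration of the flow rather than a complete program.
Code:
// Per-frame command buffer flow, sketched with Apple's metal-cpp wrapper.
// Pipeline, pass descriptor and drawable are assumed to come from elsewhere.
#include <Foundation/Foundation.hpp>
#include <Metal/Metal.hpp>

void drawFrame(MTL::CommandQueue* queue,
               MTL::RenderPipelineState* pipeline,
               MTL::RenderPassDescriptor* pass,
               MTL::Drawable* drawable)
{
    // Command buffers are cheap to create; record one (or several, from
    // different threads) per frame and hand them to the queue.
    MTL::CommandBuffer* cmd = queue->commandBuffer();

    MTL::RenderCommandEncoder* enc = cmd->renderCommandEncoder(pass);
    enc->setRenderPipelineState(pipeline);
    enc->drawPrimitives(MTL::PrimitiveTypeTriangle, NS::UInteger(0), NS::UInteger(3));
    enc->endEncoding();

    cmd->presentDrawable(drawable);
    cmd->commit();   // nothing blocks here; the GPU runs the work asynchronously
}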
Have a look at the "Afterpulse" game by Digital Legends for a good example of what's possible.
Of course Metal will most likely stay Apple-only..
In general, I guess option 5 is the most likely one in the end :) Although it's sad we demoscene coders are not kicking everybody's ass with super-cool multithreaded effects code in the foreseeable future. There are just 3 or 4 hardcore coders left in the scene that could pull it off..
"Apple Metal FTW!"
:) Of all the new APIs to me it seems to hit the sweet spot in terms of ease of use (just look at the amount of code needed to get a simple triangle on screen with DX12 or Vulkan) and important peformance features (command buffers).
The API is super streamlined and very elegant.
Have a look at the "Afterpulse" game by Digital Legends for a good example of what's possible.
Of course Metal will most likely stay Apple-only..
In general, i guess option 5 is the most likely one in the end :) Although it's sad we demoscene coders are not kicking everybodys ass with super-cool multithreaded effects code in the forseeable future. There are just 3 or 4 hardcore coders left in the scene that could pull it off..
Meanwhile, on a very dark corner:
"Nobody likes me" - AMD Mantle
"Nobody likes me" - AMD Mantle
Quote:
Meanwhile, on a very dark corner:
"Nobody likes me" - AMD Mantle
vulkan is mantle - Khronos built Vulkan starting from the Mantle API that AMD donated.
Just so you know,
Vulkan is a democratisation of the years-old Mesa Gallium architecture from Linux. Very same architecture. Mesa Gallium was created some years ago to simplify the porting of open-source GPU drivers, and it was laughed at for a long time. Once it became effective, it brought so many advantages that it was evident it would conquer the world. Amongst its "hidden features": very complex drivers can be ported to strange, patched OSes (they have GL3.x on Amiga OS4 because of Gallium), more low-level system-wide-compatible code, and my favourite: old hardware keeps being supported. I have better 3D Gallium drivers today on my Ubuntu notebook from 2007 than the original Windows 3D driver that shipped with this machine, and shaders compile better. So if demomakers happen to use low-level Vulkan things, I will see that as another good side effect.
Vulkan is dead....
DX12 is alive...