3D acceleration kills the demoscene
category: offtopic [glöplog]
3D acceleration kills the demoscene? THEN THE FUCK GO MAKE A DEMO ABOUT IT!
and good luck!!!
Aww, the nostalgia ... hehe.
And of course everyone here turned out to be wrong, except Zone. ;)
my favourite pc to watch demos on is dosbox. just sayin'
NECROPOST!
But indeed funny to see that after nine years the scene embraced change :)
What I find interesting is how oldschool prods, at least at @party and Blockparty, seem rather hot at the moment . . . and of course there are the oldschool focused parties in Europe as well, like XParty for Commodore . . .
And mobile has been growing for a few years.
Folks seek out a challenge wherever they can.
Speaking for this most recent @party, Luis Gonzalez, for example, wanted to write a demo for a BlackBerry. All the 3D libraries were proprietary, so he decided to make one of his own.
And of course there are neglected platforms getting new love, like Krüe and his ColecoVision, the Carnegie Mellon Computer Club and their Apple Lisa with a hacked HDMI output, and of course the two folks (galt and Blacklight, respectively) who have banged their brains against the wonderful idiocy of Dr Claw's trashpicked Mac SE at @party for two years running, producing the first two Mac SE prods listed on Pouet (although I imagine the KansasFest folks have done lots with it before). I do hope someone steps up next year; it could become a tradition ( :
And that's just a few examples.
Moot point; the scene is already dead.
ok, now I understand the concept of "scene spirit".
Software rendering is not so complicated, and it may be useless now, but for sure a 4k in full asm is stronger than a Crinkler state-of-mind trick.
Maybe we would want to say that using such devices deprecates the artefact, but DOS is no longer accessible from many computers, even if DirectX 7 is still enjoyable.
re: "but dos is no more accessible from many computers even if directx7 is still enoyable. "
DOS is accessible from any computer that can access the internet. Exhibit A:
http://www.dosbox.com/
Of course, I have a machine running DOS in my living room, but I am what statisticians call an outlier. We do impromptu demoshows all the time. ( :
i enjoy eating cock
Whatever. Other than triangle culling and rasterizing, the GPU does little that it isn't explicitly coded to do with shaders or buffer allocations.
GPU programming has a lot of limitations to deal with, just like a C64. It's a great challenge. I mean the parallel execution and the 2D-cached data access.
On the other hand, CPU code is easier to write, because you are able to sort data, build trees, use a uniform grid hash space with linked lists (sketched below), and a lot more.
However, the limitations of the GPU will ease as the scene moves on to CUDA.
I will show an effect at Assembly which I actually wanted to do 12 years ago, when the hardware was not fast enough to do it in realtime, and it's still slow on the CPU.
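For anyone who hasn't written one: here is a minimal sketch of that "uniform grid hash space with linked lists", in plain C. This is my own illustrative code, not from any prod mentioned here; the names (cell_of, build_grid) and the grid size are made up. Each cell stores the head of a singly linked list of particle indices, and insertion is a push-front, which relies on exactly the scattered writes and pointer-chasing that an SM3.0 shader can't do.

/* illustrative sketch: uniform grid spatial hash with linked lists */

#define GRID  64                      /* 64x64x64 cells, power of two */
#define CELLS (GRID * GRID * GRID)

typedef struct { float x, y, z; } vec3;

static int cell_of(vec3 p, float cell)
{
    /* quantize and wrap each coordinate into the grid */
    int cx = (int)(p.x / cell) & (GRID - 1);
    int cy = (int)(p.y / cell) & (GRID - 1);
    int cz = (int)(p.z / cell) & (GRID - 1);
    return (cz * GRID + cy) * GRID + cx;
}

/* head[c] = first particle in cell c (-1 if empty),
 * next[i] = next particle sharing a cell with particle i.
 * Rebuilding is O(n), so you can redo it every frame. */
void build_grid(const vec3 *pos, int n, float cell, int *head, int *next)
{
    for (int c = 0; c < CELLS; c++) head[c] = -1;
    for (int i = 0; i < n; i++) {
        int c   = cell_of(pos[i], cell);
        next[i] = head[c];            /* push-front into the cell's list */
        head[c] = i;
    }
}

The usual GPU workaround of that era was to sort particles by cell key instead of chaining pointers, which is exactly the kind of restriction that CUDA's arbitrary global-memory writes lift.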
will it work on radeon? :grin:
Yes, it will be DirectX 9 - shader model 3.0.
It's a shame, too, to have lost DOS. I don't completely understand the decision; the system could have stayed compatible even with 64 bits. It's laziness.
CUDA sucks. Use OpenCL.
Right now I'm enjoying my holidays digging into GLSL and all the great possibilities that main() { } offers to my old demoscene engine.
The philosophy of acceleration has, at last, come back to the roots! Full access to the machine's video buffers, plus an execution pipeline and logical operations oriented towards making anything imaginable, like in the old days!!!!
Take a look at this! Despite the bottleneck inconveniences of the browser layering / JavaScript / mandatory "valid"-driver blacklisting, this astonishing tool can show the beauty and power of GLSL. It's C! It's buffers! New 3D acceleration rocks!