The 64 MB limit at BP07
category: general [glöplog]
> if someone could throw out four or five suggestions for different
> "level" systems
It'd be nice if we had a standardised list like this, so a demo could list itself as running on config X.
Every couple of years you could add a new 'level' of hardware to keep up with the new capabilities of graphics cards.
The big thing the PC scene is missing IMO is you can't really judge how good the code is anymore. Well, I'm a software rendering coder, so I find it hard to judge D3D stuff anyway... but I have an idea of how things work (I read a lot of articles on hardware and rendering techniques when I'm bored).
But usually I'm watching some kkaptured video, and have no idea what hardware this demo runs fast or slow on. The coders could have done some awesome effect, either via ingenious code, or via dumb brute force of new hardware, and most people wouldn't have a clue.
The Amiga, for example, has several discrete levels: OCS stuff, ECS stuff, AGA stuff, and then a few levels of CPU from the 68000 to the 68060. It's easy to keep track of and judge the merits of what's running on the hardware.
Perhaps standardised specs would make things easier. They could also impede progress... who knows.
Then again, like I said, I'm not a 3D accel coder, so maybe for you guys you can still be wowed at all the code that's coming out...
> "level" systems
I'd be nice if we had a standardised list like this, and a demo could list it as running on config X.
Every couple of years you could add a new 'level' of hardware, to come with the new capabilities of graphics cards.
The big thing the PC scene is missing IMO is you can't really judge how good the code is anymore. Well, I'm a software rendering coder, so I find it hard anyway to judge D3D stuff anyway... but I have an idea of how things work (I read a lot of articles hardware and rendering techniques when I'm bored.)
But usually I'm watching some kkaptured video, and have no idea what hardware this demo runs fast or slow on. The coders could have done some awesome effect, either via ingenious code, or via dumb brute force of new hardware, and most people wouldn't have a clue.
Amiga for example has several discreet levels, mostly. OCS stuff, ECS stuff, AGA stuff, and then a few levels of CPU 68000-68060. It's easy to keep track of and judge the merits of what's running on the hardware.
Perhaps standardised specs would make things easier. They could also impede progress... who knows.
Then again, like I said, I'm not a 3D accel coder, so maybe for you guys you can still be wowed at all the code that's coming out...
I don't agree that the demoscene is about doing the maximum possible on the most common platforms; I believe it's more about doing the maximum possible on whatever platform you decide on. If not, TBL wouldn't be able to win Assembly with Amiga demos and still be true to "the demoscene spirit", as Amigas are not common any more. The fact that most people choose to make demos on common platforms is a different thing. They simply choose to do so.
So, make a low-end demo, make a mid-end demo or make a fucking high-end demo. Use 64MB of disk-space, it's all good.
So is that size limit a bad thing? My answer is no. As larger demogroups start creating demos, the size limit will drive innovation: better compression methods will be created, procedural textures will become standard, and developers will search for new ways to deliver rich quality in a small package. In the long run this will affect the scene in a very positive way. Instead of shipping larger files, devs could use these methods to pack 60 MB worth of demo into 10 MB of space.
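A minimal sketch of the procedural-texture idea, in plain C: a classic plasma built from a few summed sines, so about twenty lines of code stand in for a 64 KB stored bitmap. All the constants here are arbitrary, purely for illustration:

```c
#include <math.h>
#include <stdio.h>

#define SIZE 256  /* a 256x256 8-bit texture = 64 KB if stored on disk */

static unsigned char tex[SIZE][SIZE];

int main(void)
{
    /* Classic plasma: sum a few sines and remap [-4,4] to 0..255.
       The frequencies are arbitrary; tweak to taste. */
    for (int y = 0; y < SIZE; y++)
        for (int x = 0; x < SIZE; x++) {
            double v = sin(x * 0.060)
                     + sin(y * 0.045)
                     + sin((x + y) * 0.030)
                     + sin(sqrt((double)(x * x + y * y)) * 0.025);
            tex[y][x] = (unsigned char)((v + 4.0) * (255.0 / 8.0));
        }
    printf("%d KB of texture from ~20 lines of code\n", SIZE * SIZE / 1024);
    return 0;
}
```

Scale the same trick up (noise octaves, layered filters) and whole texture sets cost code instead of megabytes.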
gloom: it's just my own curiosity really, no need for this.
But on the other hand, having standard compo machines at parties based on what the majority of sceners own (probably two machines, one with an ATI and one with an Nvidia card, I guess) would be a bit more fair, and a common playground might clear some things up.
That standard would evolve with time, of course (assuming such a standard can be defined).
64 MB is a good thing from my point of view. in the 90s a 3d object just had a diffuse map. now with modern hardware you need diffuse + glossiness + specular + normal maps in hi resolution. so this takes disk space. there is no magic.
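To put rough numbers on that (the map count and resolution are illustrative, not from any particular demo):

```c
#include <stdio.h>

int main(void)
{
    /* One material with four 2048x2048 maps
       (diffuse, glossiness, specular, normal). */
    long texels = 2048L * 2048L;
    long raw = texels * 4 * 4;   /* 4 maps at 4 bytes/texel (RGBA8): 64 MB */
    long dxt = texels * 4 / 2;   /* 4 maps at 0.5 byte/texel (DXT1-style): 8 MB */
    printf("raw: %ld MB, block-compressed: %ld MB\n", raw >> 20, dxt >> 20);
    return 0;
}
```

So a single hero material stored raw already eats the whole 64 MB budget, and even with GPU block compression you only fit a handful.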
kusma: i'm talking about pc only of course, it's obvious that anyone can choose any platform around & still kick ass, & as tbl recently proved, screen resolutions are totally irrelevant.
64 MB.. excellent for noise demos (which suffer the most from compression! :) )
I second nagato^. When I watch a PC demo, I'm not able to judge it properly and I don't want to read books in order to be able to.
If more information about a demo was published on the big screen (including actual size, zipped and unzipped), I would have more "ahh" and "ohh" moments during the PC compo. But 2006's BP PC-demo compo was a boring mess with only very few exceptions (well, at least 8 out of 30 afair).
Since when was the demoscene about people telling other people what they should or should not do?
Preacher: Since the dawn of time.
the pc-demo-compo is not the 64k-intro-compo.
It's not the wild compo either.
and use gzip so you can easily use 100 MB of unpacked data!
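For instance, a minimal loader along these lines, using zlib's gz* API (the path and the unpacked-size cap are made-up examples):

```c
#include <stdlib.h>
#include <zlib.h>

/* Read a gzip-compressed data file into memory at startup.
   'max_size' is whatever unpacked budget you allow yourself. */
unsigned char *load_gz(const char *path, unsigned max_size, unsigned *out_size)
{
    gzFile f = gzopen(path, "rb");
    if (!f)
        return NULL;
    unsigned char *buf = malloc(max_size);
    if (!buf) {
        gzclose(f);
        return NULL;
    }
    int n = gzread(f, buf, max_size);   /* transparently inflates */
    gzclose(f);
    if (n < 0) {
        free(buf);
        return NULL;
    }
    *out_size = (unsigned)n;
    return buf;
}
```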
Quote:
> And still 10MB managed to look aesthetically better than 64MB, oh my.

so if you're making some cubes+glow, don't worry, you can still make it in 10 MB (although it probably should be 64k or less if you know what you're doing, but hey).
Well, we'll pack a video into 63.5 MB. The remaining 0.5 MB we'll fill with player code and some 4k-intro-like effects. It would be cool, you'll see...
Preacher: Doom is right. but then again has anyone ever listened to those people?
maali: i read that, bitch. :)
and what will happen when someone wants to use good HDR maps, an SH PRT lighting system, some high-resolution textures (diffuse, normals, speculars and others), and also high-quality models :P ?
this will take 64 MB :P
ps: it was an offer of my services ;)
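A rough estimate of just the PRT part (the mesh size and SH order are hypothetical figures, only to show the scale):

```c
#include <stdio.h>

int main(void)
{
    /* Per-vertex precomputed radiance transfer: order-3 SH is
       9 coefficients, stored per colour channel as 32-bit floats. */
    long verts = 500000;              /* hypothetical hero mesh */
    long bytes = verts * 9 * 3 * 4;   /* 9 coeffs * RGB * 4 bytes */
    printf("PRT data alone: %.1f MB\n", bytes / (1024.0 * 1024.0));
    return 0;
}
```

That comes out around 51 MB before a single texture or sample hits the disk, so yes, the budget fills up fast.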
the PC demo category is dead, just release videos if you can't fit something in 64k
I'd better stop the endless upgrading of my PC and start downloading videos in H.264 MOV format. Same size, fewer problems, more fps.
manwe: if your only reason to watch demos is having them at a smaller size and higher fps, then indeed that's a good idea. what happened to the proof of concept that something extremely beautiful can be rendered in realtime? so what if it takes more space than a video render of the thing, does that mean it's not worth witnessing your machine rendering it _realtime_?
I'm still waiting for a 64MB high quality (as in at least 1280x720@60fps) capture of a demo that looks as good as the original.
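The arithmetic backs that up; assuming a hypothetical six-minute demo (length is a guess, everything else follows from it):

```c
#include <stdio.h>

int main(void)
{
    /* 64 MB spread over a hypothetical 6-minute demo. */
    double mbit_s  = 64.0 * 8.0 / (6 * 60);                   /* ~1.4 Mbit/s */
    double bits_px = mbit_s * 1e6 / (1280.0 * 720.0 * 60.0);  /* at 720p60 */
    printf("%.2f Mbit/s average, %.3f bits per pixel\n", mbit_s, bits_px);
    return 0;
}
```

That's roughly 0.026 bits per pixel, which is why a capture at that size never looks like the realtime original.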
most of the latest top notch demos can't be rendered in realtime on not-so-up-to-date machines anyway. you can't call 5 fps "realtime".
so if this 64 MB limit allows using large tables for optimisations, i'd say go for it. otherwise people again won't be able to judge those "cool looking particle engines that are not meant to be fast" from video captures.
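The "large tables" idea in its simplest form, a precomputed sine table; with 64 MB of headroom the same trick scales up to baked lighting, distance fields and the like (the sizes here are arbitrary):

```c
#include <math.h>

#define TAB_SIZE 4096                 /* power of two for cheap wrapping */
static float sin_tab[TAB_SIZE];

void init_tables(void)
{
    for (int i = 0; i < TAB_SIZE; i++)
        sin_tab[i] = (float)sin(i * (6.283185307179586 / TAB_SIZE)); /* 2*pi */
}

/* Inner-loop lookup instead of calling sinf(); phase is in table units. */
static inline float fast_sin(unsigned phase)
{
    return sin_tab[phase & (TAB_SIZE - 1)];
}
```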
i guess it's time to re-introduce the interactive mode in demos (just like fr did recently!)