*-only demos
category: general [glöplog]
BadSector: you're right about the GeForce period.
and you're right too about your upcoming prod being flamed, because that period has just been terminated by the lame NV30+ series and the pretty outstanding ATi R300+ series.
:P
What if we turn it around?
Have the party organizers make a video version of the demos (perhaps top 3 or so)?
Otherwise making a video version might be hard, because people don't have the equipment for it, or would have to write a video encoder into the demo itself, which may take even more time than making it compatible with other hardware...
I think that the party is the best place for making video-versions of demos, because you can gather all the required equipment and knowledge there.
i opt for video versions, i haven't been able to watch a decent demo on my pc for years (i have a nifty viper770 :D) so a video would be awesome (i'm still waiting for a high-quality vidcapture of the Popular Demo)
everything what isnt a R9800P sucks total bollux and should be burnt! ;p
and every demo that doesnt run on it, is pover code!
aaah, with ati-propaganda like this, i should get free cards from them! :D
Vendor specific demos suck. even though i've got a geforce.
look: cocoon's demo was reported to run on gffx-es, being slow tho. this i wouldn't consider such a big problem, if and when the compomachine has a radeon installed. I think - and call me a fool on this, i won't get mad, that's my nick anyways - that sooner or later nVidia, or [insert manufacturer name here], might release a card which will be able to run the demo decently, and then we'll be happy.
The first time I saw 2nd reality it was slow like a dead snail, but that was my vidcard. Same for Raw Confession. who cares? I'll enjoy the prod when I'm able to keep up with the reqs... And I'll bitch then if i don't like it... Till that time i'll only curse the "PC incompatibility" and stuff...
I'd be rather pleased if Breakpoint raised the bar a little, since they're currently the place people are most eager to release their stuff at. And hey, if you don't like it, release it at some other party.
I'm not saying people need to make it 1:1 compatible, but at least giving people the ability to run it in a spartan mode would be a *major* development.
Oh and Scali, I don't think it's nice to make the organizers do the capturing. There are plenty of opportunities -even at the party itself- to find someone with a capture device to help you out ;)
well i think it would be a major step back instead of a development to let people watch a demo in a spoiled 'spartan' mode. you should either watch it or not watch it, there are no steps in between.
I probably shouldn't really take part in this discussion as I'm the one who asked, but I still feel the need to articulate my opinion. Personally, I'm absolutely against a) and more for b), though I could happily live with c) too. To those commenting on the time demands of b) and c), I advise to read my original post again ;) - I've either organized or helped organize several compos at various German parties during the last few years, and I'm very conscious of the amount of extra work involved. I can't say I'm anxious to spend twice the time I already do on compos, but the whole issue is personally important to me, since I feel there's something very wrong about a significant percentage of the "better" demos released in a year only running on a specific brand of hardware.
I also don't think it's really comparable to GUS-only demos; GUS was one particular piece of hardware that was in itself very widely spread among sceners, and that provided capabilities not available with anything else.
Neither is true for the current generation of graphics cards. As seems to be quite evident from both the "what graphics card will you buy next" and "what is the best graphics card you own" polls on scene.org, the distribution is relatively inhomogeneous - assuming that all the "non-pixelshading" cards are NV, we have about 54% NV and about 36% ATI as of the time I'm writing this. For what it's worth, that means an NV-only or ATI-only demo by definition misses 40% of the sceners. Which would, maybe, be ok if there really were *big* differences between the manufacturers.
The fact is, there are not, at least not on the level (pixelshader 2.0 level cards) we're talking about here. There is DX9, and there are standardized ARB vertex program, fragment program, and vertex buffer extensions for OpenGL, which means there *are* standardized and working ways of accessing the biggest part of the "new" functionality of ps2.0 cards from both APIs. Things like different performance characteristics (i.e. vendor-specific extensions being faster than ARB extensions in some cases) really don't count here. If you feel the need to optimize, by all means do so. If your "optimization" means it won't work on 40% of the installed cards, and you drop the compatible path, that's pure arrogance, nothing else. I don't see any reason not to punish that type of attitude in a competition, really. If you feel the need to make a "pc demo" that only runs on a pc very much like yours, with everyone else having to watch a divx, put that demo into the wild compo where it belongs.
So much for the radical and uncompromising part. I know the counterarguments too. Stuff like floating point textures is still not properly supported on NV cards, and I can understand it when some people count that as an essential or at least very important feature for the visual quality of their demo. And while the situation is relatively clear-cut in the pixelshader 2.0 range, it's not nearly as easy if you're targeting somewhat lower-end specs, where especially under OpenGL you basically have no choice but to use non-ARB extensions.
Also, even though my experience shows that with careful coding, at least "doesn't run at all" can be pretty much avoided ;), there are more subtle things like shaders passing validation and working okay on NV while looking awful on ATI and the like, which are really impossible to catch during development if you don't switch the gfx card in your coding machine regularly ;)
Thus, while b) is still an ideal to me, at least currently I'd vote for c) myself, for practical reasons - and I hope long posts like this help convince sceners that an okay level of compatibility can be achieved with manageable effort, and also make them appreciate, already during the compo, the additional work (even though it's often not that much) involved in making a compatible demo.
Disclaimer: This is purely my personal opinion and in no way implies this is the attitude shared by the Breakpoint organizing team as a whole. (Though, of course, I hope my co-organizers agree with me on the matter ;)
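To make the "standardized ARB extensions" point concrete: on the OpenGL side, the compatible path starts with probing the extension string for the ARB names instead of the NV_* ones. Below is a minimal sketch in C; the helper name is mine, and in a real demo the string would come from glGetString(GL_EXTENSIONS). Note that a plain strstr() is not enough, since one extension name can be a prefix of another.

```c
#include <string.h>

/* Whole-word search in a space-separated GL extension string.
   strstr() alone would let "GL_ARB_fragment_program" falsely match
   inside "GL_ARB_fragment_program_shadow". In a real demo, extlist
   is the result of glGetString(GL_EXTENSIONS). */
static int has_extension(const char *extlist, const char *name)
{
    size_t len = strlen(name);
    const char *p = extlist;
    while ((p = strstr(p, name)) != NULL) {
        int start_ok = (p == extlist) || (p[-1] == ' ');
        int end_ok   = (p[len] == ' ') || (p[len] == '\0');
        if (start_ok && end_ok)
            return 1;
        p += len;
    }
    return 0;
}
```

With something like this you check for GL_ARB_vertex_program and GL_ARB_fragment_program first, and only fall back to vendor paths (or a dumbed-down mode) when they're absent.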
the point of a spartan mode is that people at least have an inkling of what they're missing out on with the full enchilada, gem. Like a screenshot, only moving ;)
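A "spartan mode" in code terms is just a fallback configuration: when a required feature is missing, drop the heavy effects and keep the demo watchable instead of refusing to run. A tiny illustrative sketch (all feature names and thresholds here are invented, not taken from any actual demo):

```c
/* "Spartan mode" fallback: pick a reduced setup on weaker cards
   instead of bailing out. Fields and limits are purely illustrative. */
struct render_cfg {
    int ps20_effects;   /* fancy per-pixel effects on/off   */
    int texture_size;   /* base texture resolution           */
    int postfx;         /* fullscreen post-processing on/off */
};

static struct render_cfg pick_cfg(int has_ps20, int vram_mb)
{
    struct render_cfg cfg;
    cfg.ps20_effects = has_ps20;
    cfg.texture_size = (vram_mb >= 128) ? 512 : 256;
    cfg.postfx       = has_ps20 && (vram_mb >= 64);
    return cfg;
}
```

The demo then runs everywhere, and the people on older cards know exactly which knobs were turned down for them.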
Completely agree with you, ryg
actually, i prefer a Radeon 9500-9800 as the only competition hardware.
- looking at the scene.org poll, ATIs are the more popular PS2.0 cards, even without considering how worthless the FX5200 is.
- there are many NV-only demos, but few ATI-only demos. I don't mind a few demos slipping through the net.
- the usual reason for NV-only is ignorance, while the usual reason for ATI-only is hardware features.
- all those coders who complain that they don't have a radeon will keep NV-compatibility naturally while trying to fix their code.
- i doubt that those unhappy people using NV-only extensions will switch to ATI-only extensions. nobody is incompatible on purpose.
- it will be very easy to organise, leading to a fitter, happier and more productive ryg.
If you code on a GeForce, you should not be afraid of ATI-compatibility. If you use DirectX, your demo will automatically run, except for a few minor problems. If you are using OpenGL you should avoid NV-specific extensions; it should be easy to find out what a Radeon is capable of. But I don't know much about OpenGL.
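On the DirectX side, "finding out what a Radeon is capable of" is a caps query: D3D9 reports the supported pixel shader version in D3DCAPS9::PixelShaderVersion, and the demo picks a render path from that. A sketch (PS_VERSION mirrors D3D9's D3DPS_VERSION encoding; the path names and pick_path helper are mine):

```c
/* Same version encoding D3D9 uses for D3DCAPS9::PixelShaderVersion,
   i.e. the D3DPS_VERSION(major, minor) macro. */
#define PS_VERSION(major, minor) (0xFFFF0000u | ((major) << 8) | (minor))

enum render_path { PATH_FIXED, PATH_PS11, PATH_PS20 };

/* Pick the best render path the reported caps can handle.
   In a real demo, ps_version would come from
   IDirect3DDevice9::GetDeviceCaps() -> caps.PixelShaderVersion. */
static enum render_path pick_path(unsigned ps_version)
{
    if (ps_version >= PS_VERSION(2, 0)) return PATH_PS20;
    if (ps_version >= PS_VERSION(1, 1)) return PATH_PS11;
    return PATH_FIXED;
}
```

Since the caps mechanism is vendor-neutral, the same binary picks the right path on a Radeon, a GeForce, or whatever DX9 card comes next.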
I don't know about the rest, but I could live with chaos' proposal. Can we have a vote? :)
*vote* :P
that may as well lead to me finishing my ps2.0 intro. *vote* :)
I am for law enforcement. I am not against the newest technology, but demos should stay something fun.
I guess most sceners are students or people who just want to have fun while watching demos. Just imagine how expensive a decent graphics card can be for someone on a small budget and you're done.
The only thing to respect should be the DirectX API. It was designed for that. Then optimize, and try to make the whole thing work with a majority of gfx cards.
My preferred demos are Farbrausch demos because they work on my office's PC :p and seriously, they work on almost every decent PC tested so far.
and hum.. don't use PS2.0 as long as it's reserved to a minority.
i won't frown if some people can't see my intro tho... i'm doing it 'cos i wondered if the thing i'm doing is possible or not... so far it seems it's possible, so it's more like an experiment. anyway, a lot of comments came in for the recent Cocoon demo, so i think enough people have the HW to watch it.
There is always the question of how new the hardware in the compo PC should be, yes.
My rules of thumb for this are simple - new featureset after about one year delay, "just" faster cards/CPUs always if we can get them.
Being shown in the compo is always a big thing. If we can make the demos in the compo run somewhat smoother than on the average scener's PC by using faster hardware, why not? There are no losers with this.
As for new features, this is always a ticklish subject, so I won't elaborate much. The bottom line of my "1 year" rule of thumb is that the "we want new hardware" and "I want it to run on my PC" factions both equally flame me for it, so it seems to be about right ;)
Seriously, I don't think I've ever heard of any ati-only demo or intro (as in ati-specific code, like one could easily write in opengl). If such a thing exists, I guess it's a somewhat sucky prod, since I don't seem to remember it.
Ryg, don't overdo this, you usually get fucked up enough during the party, no? :)
I think that chaos' suggestion is just fine. I doubt that it would lead to a big amount of ati-only prods, so go ahead.
Now, if I just could get my thumb out of my ass and install that 9800 pro that lator gave me too many months ago...
I am for law enforcement too. All the biggest releases should at least work with both NV & Ati. If someone wants to stay vendor-specific, that's also ok, but those releases should stay outside parties.
i agree with Ryg
i had to buy an Nvidia card to view the 2002 and 2003 demos, and i will NOT buy an Ati card to view the 2004 prods! if you release a prod based on limited hardware, then you will have a small audience. And having no audience makes a prod useless.
Simply make the hardware choice in the demo's launcher, and limit your effects if they are not compatible, but don't limit the size of your audience.
As I said, I doubt that having just an ATI and demanding that the prods work on that will lead to productions demanding an ATI card to work. Sure, nvidia has a nice rep for making sucky drivers, but if nvidia makes drivers that aren't compatible with dx9 / ARB extensions, that should not be a valid argument for making nvidia-specific code when one can write nice code that should work on all modern cards instead.
zone: Quite frankly, the demolauncher is not why you make a demo, it never has been, and never will be.
If you want to look at 2002 and 2003 demos, then there are 2002 and 2003 demos for you; if that's OK with you, then who cares? Who cares if you can run a demo anyways, ehh :)
As long as a demo is well coded and supports its APIs in a correct way, then most likely you can watch a DX9 demo when you buy a new card. However, locking yourself to geforce3 or 4, as for instance Shiva likes to do, without ANY reason, now that's stupid, but then again, he's german and claims NOT to be a nazi, so of course something is wrong with him.
And it's not about "limiting" your effects to work on older cards, it's about writing duplicate versions, since 2.0 shaders can't just be stripped down to work as old 1.x shaders.
To all you NON-coders who nag about prods being 2.0-shader only: Get a book, start to code, make non-pixelshader demos or whatever then, see if we care.
lamers.
maybe demanding law enforcement is easier if a) you aren't a coder or b) you are a coder, but you only like to make things that run on a TNT (or other cards from the 90s, not this century)
but if you do try and use features from even dx8-class cards, you'll find quite quickly that compatibility from "compatible code" (i mean something that you've tested with the debug runtimes and on the ref rast) is still open to being fucked over by different driver versions (future or past), different cards by the same manufacturer, or the inability of manufacturers to conform to standards.
and that's just whether it runs without glitching, let alone whether it runs fast.
"ATIvsNVIDIA" covers only a fraction of it. of course if you're an opengl coder and you decide to only use nvidia- or ati-specific extensions, it's asking for trouble and it's hard to accept that.
but how about this - i'll write a prod on my gf4 with dx9, test on the ref rast and debug runtimes, and what if it doesn't work on the ATI compomachine (or runs at a fraction of the speed on the ATI)? is it my fault or ATI's fault for shit drivers?
finally, don't forget that democoders do this for free, and that there's a reason pc games go through masses of QA and have loads of patches released - cos it's quite hard to make something work on many pc configs. and that's if you've got the hardware to test with in the first place.
parties should try and help the coders, not penalise them for what might not be their fault.
Quote:
but how about this - i'll write a prod on my gf4 with dx9, test on the ref rast and debug runtimes, and what if it doesn't work on the ATI compomachine (or runs at a fraction of the speed on the ATI)? is it my fault or ATI's fault for shit drivers?
Assuming you write for gf4, and the ATi card in question is a Radeon 9500+, I don't think this can happen.
The main problem areas are people using nv-only extensions (I haven't seen anyone use ATi-only extensions, and I don't think we will, since ATi has very few vendor-specific extensions; most stuff is supported through ARB extensions) and people using DX9 features that the GeForce FX doesn't support (like fp textures) or supports very badly (like pixelshaders).
For the first I say: use ARB extensions instead, or move to Direct3D.
For the second I say: Don't buy a GeForce FX. Radeons support it NOW, and it's FAST. The next GeForce will support these features as well (or nv will die), and since D3D is not vendor-specific, the demos will automatically work then. Same goes for S3 and XGI DX9 cards.
And that's why I agree with chaos.