Harmful code in demos?
category: general [glöplog]
Some people also reported xtal/complex screwed up their monitors. iirc it had something to do with that weird 50hz videomode.
Also unintentionally of course.
nowadays most demos are released at parties, thus the compo machine works as a reasonably decent filter. (i.e. usually the compo crew is the one falling victim first to this)
ooh, that could score massive comedy points for writing a worm once you find out they keep the compo machine on the party network...
I've downloaded gigabytes and never had any problems with "malicious" code. Buggy code? Yeah, all the time, but not malicious code.
IMO, everything on scene.org can be trusted.
hey, see what i found on my harddisk... :-)))
apparently some hardcore stuff on atari st (overscan and sync-scrolling) has fucked monitors in the past, but i figure this is probably an urban myth
in about 15 years in the atari scene, i haven't come across a single case of a monitor that was verifiably damaged by overscan/syncscroll code, or any other code for that matter.
having said that, it's perfectly possible to switch the shifter to 640*400 mono @ 72hz while a regular 50hz RGB monitor is connected. actually, this is what's done for a *very short period* to fool the shifter into displaying gfx in the left border (iirc). but! in the case of overscan code, you also need to switch back to 50hz to get a stable display. so the monitor is never exposed to a highres signal for more than milliseconds at a time. of course one could produce some code that only switches to 72hz, never back to 50, and in that case a monitor theoretically might indeed be damaged.
as for the other way round, i know for a fact that a highres screen connected to a RGB output results in something that looks and sounds like something nosfe would appreciate. ie: scary! :P
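for the curious, the switch i'm talking about boils down to poking a couple of values into the shifter resolution register. here's a very rough sketch in C rather than the cycle-exact 68k asm you'd really need for border tricks, and assuming i remember the address right ($FFFF8260, supervisor mode only):

/* ST shifter resolution register (byte): 0 = 320x200 low res,
   1 = 640x200 med res, 2 = 640x400 mono @ ~72hz */
#define SHIFTER_RES (*(volatile unsigned char *)0xFFFF8260UL)

/* the harmless overscan case: flick to mono and straight back, so a
   50hz RGB monitor only ever sees an instant of the highres signal
   (real border code does this with cycle-exact timing, not like this) */
void left_border_flick(void)
{
    SHIFTER_RES = 2;   /* hires for a moment... */
    SHIFTER_RES = 0;   /* ...and back to lowres @ 50hz */
}

/* the theoretical monitor killer: switch to 72hz and never come back */
void stay_in_hires(void)
{
    SHIFTER_RES = 2;
}

needless to say, don't try the second one on a real RGB monitor ;)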
Well, this sometimes happens during the coding sessions when debugging the overscan :D
Misfit II fumbled around with some texture displacement state that the nVidia drivers didn't reset at d3d init, so the intros that ran after Misfit II in the compo looked rather funny, but the organizers were nagged into rerunning the fucked up prods.
At TP2k or something (2k1?) the organizers thought the stereo equipment was malfunctioning, especially as the sound in Mental Excession was so eurhm.. "slightly off tune", so they reran a lot of prods (including The Product). There could've been a real problem too, but I promise you, you couldn't hear it while Mental Excession was running..
But fucking up HW? Never heard of it, well, except trying to force a lot of screen modes that the monitor doesn't like, but that's not really doable nowadays anyway.
The CBM PET can suffer hardware damage from a "killer poke" into the CRT controller registers.
I also recall that there was an IBM graphics card that could fry the monitor... I think you did something like write 0 into the horizontal refresh frequency register.
But these days it must be quite difficult to nuke hardware with code, right? I suppose the regular trojan threat is more realistic, even if it's extremely rare.
Self destruction? Check out this bastard.
Not only can you set the number of scanlines too high, and break the monitor, but this program:
FOR I=0 TO 255: OUT 6,I: NEXT
will actually cause the computer to start smoking. Literally.
xeron: ... and thus evolution selected the system for extinction :)
Sweet looking keyboard though!
That computer is a good example of why drugs are bad.
Quote:
That computer is a good example of why drugs are bad.
That computer is a good example of why drugs are good.
shifter: how about fucking things up when the gfx-driver version is newer than the compo date? that way you'd bypass the compo-machine, and start a lot of confusion as an extra bonus!
kusma: or just checking the actual date of the party.
Gargaj: i think it's more usual for the compo machine to have a wrongly set clock than a hacked driver ;)
plek and I actually had an idea quite some time ago for the ultimate compo machine demo: first you enter a 4k that hashes all the system specs together and displays them on the screen, possibly as a design element. then you just write that down during the compo and make your real demo depend on that hashstring - voila, perfect compomachine-only demo :)
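something like this, as a very rough sketch in C - with a toy FNV-1a hash and made-up spec strings standing in for whatever a real 4k would actually query at runtime (GPU name, driver version, CPU id, ...):

#include <stdio.h>
#include <stdint.h>

/* toy FNV-1a, just to boil the spec strings down to a short digest */
uint32_t fnv1a(const char *s, uint32_t h)
{
    while (*s) { h ^= (uint8_t)*s++; h *= 16777619u; }
    return h;
}

int main(void)
{
    /* placeholders - the real thing would query these at runtime */
    const char *specs[] = { "GeForce 6800 GT", "ForceWare 81.98", "P4 3.0GHz" };
    uint32_t h = 2166136261u;
    unsigned i;

    for (i = 0; i < sizeof specs / sizeof specs[0]; i++)
        h = fnv1a(specs[i], h);

    /* step 1: the throwaway 4k shows this on screen as a design element,
       you write it down during the compo.
       step 2: hardcode that digest into the real entry and make it derail
       whenever the hash doesn't match. */
    printf("compo machine hash: %08X\n", h);
    return 0;
}

of course anyone with the exact same hardware/driver combo would still get the same hash, so you'd want to mix in a few more specs to be safe.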
Gargaj: awesome! :)
from oldskool.org..
Fun with the Atari ST
Fred Butzen writes:
"Here's one of my stories. Not really old school, but it's funny.
In the mid-80s, MWC wrote one of the first C compilers for the Atari ST -- the "Jackintosh", so called because Jack Tramiel (formerly of Commodore) designed it with Atari as a rip-off of the Macintosh. It had a GUI built around Digital Research's GEM interface. It had two floppies, a 68000 CPU, and 512 kilobytes of RAM -- which in those days was a lot of memory. To save money, the hardware was totally sleazy. From time to time it would simply stop working, and you'd have to open up the unit and reseat all of the chips in their sockets. If you were in a hurry, you'd just lift up the unit about six inches and drop it. A total piece of junk.
Part of the savings was that the machine had no MMU and no parity on the memory. This meant that the machine was totally unreliable; however, it also meant that a kid hacker could do anything to the hardware, and there's no way the machine would stop him. (Mind you, that anything included smoking the tube, but that's another story.) For example, I wrote a little program that reset the base address of video memory to zero, which was where the operating system lived. (Video used RAM, of course, rather than memory on a separate card.) It was kind of cool, because when you ran a program that allocated memory -- say, a sort program, you could see the memory being allocated and deallocated on the screen. The difference between qsort and shellsort was really obvious just by the patterns they drew on the screen.
Anyway, as a programming exercise I wrote a version of the game of life for this machine. It ran fine; however, I made one mistake: I forgot to set the clipping rectangle around the screen. So, the first time I built a glider gun, the glider went creeping off the screen, and just kept on going, crashing through memory. Mind you, there's no MMU, so the machine is still running while the glider is rampaging through memory. Because the 68000 used memory-mapped ports for its hardware, you could see where the glider was going because the disk drives started running, the lights flashed on the keyboard, the tube blinked, and so on. Finally, the glider hit something really vital and the machine died with "streaky bombs" -- a sign that the operating system was really, really sick.
Of course, I had to show this to all the other guys. We laughed ourselves sick watching that happen."
aehm... it seems someone stuck a kärcher up mr. butzen's rectum before he started to speak...
maybe it will clean away that glider floating around in his addled brain...