Live Coding Compo Framework
category: code [glöplog]
Hi!
I'm making a separate thread for the Live Coding tool feedback.
Here's the link:
-------------------------------------------------------------------------------------
http://plastic-demo.nazwa.pl/LiveCoding/liveCoding_v0_9.rar
-------------------------------------------------------------------------------------
Please read the ReadMe.txt file before starting.
Tested on Windows Vista/7, 32-bit and 64-bit, with various NVIDIA and AMD cards. Basically, if you have Visual Studio 2010, just run Run.bat.
Please change the path in the .bat for other VS versions (it also works on Express). You can post your variants here and we'll include them in a package update. The precompiled package contains binaries for 64-bit systems only; on a 32-bit machine you need to build it yourself.
Please write your feedback here and we will make patches before the party starts. After the party I'll add the tool as a production. Our goal is to make it work at any party that would like to run the compo. Believe me - it's fun :)
bonzaj
we might try this out at revision shadowparty next weekend :)
First piece of feedback:
An always-on-top mode would help a lot with single-monitor systems.
Also, the window resolution is not exported to/imported from the config file.
Otherwise it's working nicely, although I'd prefer HLSL. I'll play around with it :)
BoyC:
thanks - we will fix it tomorrow by adding extra entries to the config file. We were also informed that the tool crashes when no audio devices are enabled in the system :).
The tool is designed so that it should work with other compilers as well, as long as they support a post-build step. We are using pipes for communication, so Mac and Linux ports are possible. If anyone is willing to do them, go ahead. After the party we will release the source on GitHub for convenience.
BoyC: and why haven't you entered the competition yet? :)
Yeah no worries, took about 2 minutes to fix these issues :)
As for the compo, I've never done live coding before and want to test the waters a bit before I enter :)
bonzaj, it works nicely after plugging in a recording device (aka a microphone). I still have the feeling, though, that nothing is being recorded.
sorry, my bad, there is fft :D
yes, there's an FFT in one of the textures. We will need to pick 8 textures for the compo. At WeCan we had something like:
3 tiled textures
1 tiled normal map texture
1 WeCan logo
3 funny textures showing sceners etc. :)
Maybe we could allow each contestant to enter the competition with one "user texture"?
well, that would probably be a tileable noise texture for everyone then ;)
oh, and I've just talked to cupe and here is some more feedback: a smoothly integrated FFT would be really nice. The FFT itself is just way too noisy to be used as a sync source.
what do you mean by a smoothly integrated FFT?
Something like - integrate it within an octave and then smooth out the result with a low-pass filter? Just asking to make sure, since I don't know what will give the best results.
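The "integrate inside an octave, then low-pass" idea can be sketched as follows. This is a minimal Python illustration of the concept only - the function names, the power-of-two bin layout, and the 0.9 smoothing factor are my assumptions, not what the tool actually does:

```python
def octave_bins(spectrum, num_bins=8):
    """Collapse a linear FFT magnitude spectrum into logarithmic (octave) bins.
    Bin i averages spectrum indices [2^i, 2^(i+1)), i.e. each bin spans one octave."""
    bins = []
    for i in range(num_bins):
        lo, hi = 2 ** i, min(2 ** (i + 1), len(spectrum))
        band = spectrum[lo:hi]
        bins.append(sum(band) / len(band) if band else 0.0)
    return bins

def smooth(prev_bins, new_bins, lam=0.9):
    """One-pole low-pass filter: keep lam of the old value, blend in (1-lam) of the new."""
    return [lam * p + (1.0 - lam) * n for p, n in zip(prev_bins, new_bins)]
```

Averaging within an octave kills most of the bin-to-bin noise, and the low-pass on top removes the frame-to-frame flicker, which is what makes the result usable as a sync source.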
Hey, I just got a chance to look at your tool.
Things I like:
1. Offline shader pre-computation so the visuals don't stutter on compile. Major plus!
2. You made a tool and distributed it. thanks!
3. Livecoding at demoscene events is now a thing <3
Things I don't like:
1. The code is not shown on screen. This is a major deal-breaker for me. In the words of TOPLAP (the Temporary Organization for the Promotion of Live Artistic Programming) Manifesto: "Give us access to the performer's mind, to the whole human instrument. Obscurantism is dangerous. Show us your screens." I don't want to think "how the hell did (s)he do that?", but "What the hell does that mean and why does it work?". I want to watch the coder scramble and make errors and find beautiful things by accident. I want to watch the process, not the result. You don't do piano performances behind a curtain. The performer's screen must be identical to the audience's.
2. Needs more inputs: you need at least temporally smoothed FFT, integrated FFT, smoothed integrated FFT. FFT alone is completely useless once you actually try to animate some geometry with it. No more than 8 bins or so are necessary... you can pack that into four uniform arrays if you want. Your FFT texture might be too fine-grained to use well (mipmaps of the FFT texture might help with this).
I realize that my first point will not be addressed for this year's Revision (if at all), and that is fine, as I'm very happy about the existence of this competition as it is. Writing a suitable editor that displays code and visuals takes some time... I have written a livecoding tool, which we used to livecode GLSL visuals at this year's tUM, that displays code and text - it isn't exactly stable (although it didn't crash during the 2-hour performance), but it is pretty on screen and has some unique livecoding-related features. I never got around to making the code pretty as well and releasing it... :/
Dfox tried lobbying me into participating in the competition, and so far I have said neither yes nor no, but I guess I'll concentrate on the other compos and sit this one out. At least for this year.
Maybe we can meet at revision, have a beer, see how the compo goes and think about what can be improved? Maybe have a look at both tools? I am also totally willing to give the code away, but at the moment deployment is not exactly end-user ready.
Thanks for making this happen.
Quote:
The code is not shown on screen.
Awww, what's the point then? :(
What Gargaj said. No point in watching the compo, then.
so it's not a live coding event but a live waiting for stuff to appear on a screen event. yaay :P
what did the mercury boys use for that livecode-DJ-visual stuff at tUM? isn't that sufficient already?
IQ had a live coding tool, didn't he?
As far as I know, we'll switch between the code and the output regularly.
bonzaj: oh, I meant what cupe said in "things I don't like, 2." - most important for me, to do bass sync for instance, is the temporal smoothness, i.e. a low-pass I guess. I'm not an expert on this tbh, but in the current state the FFT is barely usable for doing anything really cool.
bonzaj: what worked for me was this:
Make 8 FFT bins. Make four sets of those. Call the original 8-element array FFT and the others FFTs, FFTi and FFTsi, for example. Initialize with zeroes.
each frame, with lambda being something like 0.9:
Code:
FFTs = FFTs*lambda+(1-lambda)*FFT
FFTi += FFT
FFTsi += FFTs
Needs some tuning, and my setup was a bit more complex (I had something like FFTs = max(FFTs*0.9, FFT_smoothed_a_tiny_bit)).
You can also make the speed of the exponential decay independent of the framerate by applying math :)
The integrated ones can be used to animate geometry: for example, rotating an object by FFTsi[0]-FFTsi[5] will make it turn left on the bass and right on the hi-hats. And go nuts on the dubstep wobble.
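For reference, the recipe above can be written out as a runnable sketch (Python for illustration only - in the actual tool this would live in the host code). The names follow the post; the exp(-dt/tau) helper is my guess at the "applying math" part, which the post doesn't spell out:

```python
import math

class FFTState:
    """The four 8-bin arrays from the recipe above: raw FFT comes in each frame;
    FFTs is smoothed, FFTi integrated, FFTsi smoothed-then-integrated."""
    def __init__(self, bins=8):
        self.FFTs  = [0.0] * bins
        self.FFTi  = [0.0] * bins
        self.FFTsi = [0.0] * bins

    def update(self, FFT, lam=0.9):
        # FFTs = FFTs*lambda + (1-lambda)*FFT   (exponential moving average)
        # FFTi += FFT; FFTsi += FFTs            (running integrals, per frame)
        for i, x in enumerate(FFT):
            self.FFTs[i] = self.FFTs[i] * lam + (1.0 - lam) * x
            self.FFTi[i] += x
            self.FFTsi[i] += self.FFTs[i]

def framerate_independent_lambda(dt, tau=0.1):
    """One way to make the exponential decay framerate-independent:
    lam = exp(-dt/tau) keeps the decay time constant fixed regardless of dt.
    (My assumption - one of several ways to 'apply math' here.)"""
    return math.exp(-dt / tau)
```

The raw FFT jumps around every frame; FFTs follows it smoothly, and the integrated arrays only ever grow, which is why their differences make good rotation angles.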
I mean you might as well just use http://glsl.heroku.com/e at that point, no?
oh noez, webgl...
Maali: we used my tool. It's called "keksfabrik" for some reason. A week before revision is way too short for changing plans though...
A short video of the tool from when I had just added FFT support is here: http://vimeo.com/65411868 (shows the code editor and a little bit how the integrated FFT is used).
Oh, and i forgot another (really unimportant) request:
3. Multi-pass rendering: generate mipmaps from all passes, provide the outputs of all other passes as textures to the current pass (shaders defaulting to pass-through), and have the last pass output to the screen. For postprocessing or feedback effects.
Damn, now i feel guilty of "Oh, you did that thing. Great, but you should have done it MY way. That I didn't tell you about.". Sorry.
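The multi-pass wiring described above can be sketched as a simple data-flow loop. This is plain Python pseudo-structure, not real GL code - the pass functions and names are made up purely for illustration:

```python
def run_passes(passes, ):
    """Run shader-like passes in order; each pass sees the outputs of all
    earlier passes, and the last pass's output is what goes on screen."""
    outputs = {}
    for name, fn in passes:
        outputs[name] = fn(dict(outputs))  # each pass reads earlier results
    return outputs[passes[-1][0]]          # final pass -> screen

# Hypothetical passes: a scene pass, and a post-process that reads its output.
scene = lambda tex: 0.5                        # stand-in: rendered brightness
post  = lambda tex: min(1.0, tex["scene"] * 2) # stand-in: tone-map using "scene"
result = run_passes([("scene", scene), ("post", post)])
```

In a real renderer each "output" would be a texture attached to a framebuffer object (with mipmaps generated after the pass), but the ordering and visibility rules are exactly the ones sketched here.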
nice tool, been playing with this most of today. All I'm saying is, this live coding compo is gunna be a lot of fun :)
Gargaj: nah, you need music input... and even if glsl.heroku had that (does it, maybe?), you'd still need the integrated fft. synched animation without any state is... difficult.
dfox: alright, that's something. I hope they crank up the font size.