Raymarching Beginners' Thread
category: code [glöplog]
... and assuming you can generate an image like you posted above, the remaining thing to do is to perform a distance transform (educate me if I'm wrong), and voilà: heightmap.
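Roughly, that last step could look like this (just a rough GLSL sketch; riverDist() is a made-up stand-in for the actual distance transform of the rendered network):
Code:
// Sketch: turn the distance-to-the-river-network into a height value.
// riverDist() is a placeholder; in the real thing it would sample the
// distance transform of the generated network image.
float riverDist(vec2 p) {
    // cheap stand-in: distance to a single sine-wiggled line
    return abs(p.y - 0.3 * sin(p.x * 2.0));
}

float heightmap(vec2 p) {
    float d = riverDist(p);
    // height 0 in the river beds, rising (and flattening out) away from them
    return 1.0 - exp(-3.0 * d);
}
Adding the usual fbm on top of that would put valleys exactly where the network was drawn.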
I agree with IQ, first person to do erosion realtime wins ALL the siggraphs.
CHALLENGE ACCEPTED
Cool! I did something a bit like that a while back (using a height map, then using a separate texture to add water with a shader moving the water downhill to form rivers + lakes... not nearly so successfully as that demo though).
I think the idea was to do erosion in a single step though, rather than iteratively. It's not hard to do it iteratively :)
Hum, I looked at those papers quickly... They explicitly simulate an erosion process. One can make such a simulation fast enough for real-time by throwing enough computing power at it; I don't dispute that.
However, I'm looking for another kind of princess. It would be a 2D => 1D function which, when added to a height-map, erodes it. That function would not explicitly simulate an erosion process. If one uses a distance transform, as explained by xTr1m, the problem is reduced to having a function able to represent some diffusion-aggregation pattern, *without* simulating diffusion-aggregation. (Disclaimer: it's for the sport, I don't claim it would be useful.)
btw. I'm tinkering with raymarching again and ran into a problem using the blend() operator (colors == normals). This is the blend between a cube and a sphere:
this is the same thing with min()
Code:
float maxcomp(vec3 v) { return max(v.x, max(v.y, v.z)); }

float sdSphere(vec3 p, float s) {
    return length(p) - s;
}

float sdBox(vec3 p, vec3 s) {
    vec3 di = abs(p) - s;
    float mc = maxcomp(di);
    return mc < 0.0 ? mc : length(max(di, 0.0));
}

vec3 grad(vec3 p, float eps) {
    // forward differences; field(p) has to be subtracted or this isn't a gradient
    vec2 e = vec2(eps, 0.0);
    return (vec3(field(p+e.xyy), field(p+e.yxy), field(p+e.yyx)) - field(p)) / e.x;
}
vec4 rm(vec3 p, vec3 d, float maxSteps, float maxDist) {
    vec3 startPos = p;
    float i = 0.;
    float l = 1.;
    float dist = 1.;
    float steps = 0.;
    float epsilon = .0001;
    for (; i < 1. && l > epsilon && dist < maxDist; i += 1.0 / maxSteps) {
        l = blend(sdSphere(p + somemovementshit), sdBox(p + somemovementshit));
        p += (l * d * .5);
        dist = length(startPos - p);
        steps += 1.;
    }
    // calculate normalized z
    if (l < epsilon) {
        vec3 n = grad(p, dist * .01);
        return vec4(abs(n), i);
    }
    else {
        return vec4(0, 0, 0, i);
    }
}

void main() {
    vec4 r = rm(eye, dir, 256, 10.);
}
What I'd expect to get is a sphere and a cube which "merge" when they are within a certain distance of each other, but the sphere also disappears sometimes...
Another problem is that the raymarching function needs like 128-256 steps to NOT leave empty parts in the objects (read: return vec4(0, 0, 0, i)) and I'm not sure why.
raer: have you tried a bigger epsilon value? If you need that many steps to avoid holes, it implies that the ray is taking too many steps to reach the surface... which implies that you need to increase the epsilon or the march step.
Not sure on the blend part, it's not something I've gotten as far as using, but how does it work out if you replace it with a straight min()?
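For what it's worth, one way to get away with a bigger epsilon without making close-up surfaces mushy is to scale the hit threshold with the travelled distance. A rough sketch, where pixelScale is just a made-up constant:
Code:
// Sketch: the hit threshold grows with the travelled distance, so distant
// surfaces register a hit earlier and the step count stays low.
const float pixelScale = 0.001;   // assumed constant, roughly one pixel's angular size

float march(vec3 origin, vec3 dir, float maxDist) {
    float dist = 0.0;
    for (int i = 0; i < 128; i++) {
        float l = field(origin + dir * dist);
        if (l < pixelScale * dist || dist > maxDist) break;
        dist += l;
    }
    return dist;
}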
The normal colors don't seem to make sense in terms of which direction they represent when you compare the two images (for example, the blue on the sphere: is it Z or Y?)
I changed my raymarching loop to
Code:
vec4 rm(vec3 origin, vec3 dir, float maxSteps, float maxDist) {
    float l = 1.;
    float dist = 0.1;
    float steps = 0.;
    float epsilon = .0001;
    vec3 p;
    for (; steps < maxSteps; steps++) {
        p = origin + dir * dist;
        l = field(p);
        if ((abs(l) < epsilon) || dist > maxDist) {
            break;
        }
        dist += l * 0.75;
    }
    // calculate normalized z
    float i = steps / maxSteps;
    if (abs(l) < epsilon) {
        vec3 n = grad(p, maxDist * l);
        return vec4(n, i);
    }
    else {
        return vec4(0, 0, 0, i);
    }
}
And the holes are mostly gone. The blend operator is still an issue...
ignore the normal colors.
Some images again:
blend
min
blend
min
same size/camera/settings, just the operator changed...
What do you use as blend operator?
Finding a proper blend operator is not as trivial as one might think.
If somebody has a properly working one - tell us - I tried several formulas from several papers - they all sucked.
I just realised, I have all images on pouet blocked for safety (at work). I might have missed something :)
DOH. Sorry. I forgot the blend operator while copy-pasting...
Code:
float opBlend(float d1, float d2) {
    float dd = cos((d1 - d2) * PI);
    return mix(d1, d2, dd);
}
Lol. I was thinking there was some glsl blend() function I'd forgotten about :D
ok... opBlend only seems to work when the objects are pretty close to each other. I tried a13X_B's metaball formula from the toolbox thread and it works pretty well.
If you're interested in figuring out why that blend op didn't work, I see a pretty big flaw in it: the blending needs to depend on the distance between the two objects, but it doesn't have that info. You're just telling it how far away each one is from the current position. They could be the same distance from the point but in opposite directions, so actually far apart.
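For comparison, the usual polynomial smooth-minimum (the widely circulated formula, not something from this thread) also only sees the two distance values, but it falls back to a plain min() as soon as they differ by more than k, so the blend stays local to where the surfaces actually meet:
Code:
// Polynomial smooth minimum: behaves exactly like min(d1, d2) when the two
// distances differ by more than k, and rounds off the union where they are close.
float smin(float d1, float d2, float k) {
    float h = clamp(0.5 + 0.5 * (d2 - d1) / k, 0.0, 1.0);
    return mix(d2, d1, h) - k * h * (1.0 - h);
}
With k set to some fraction of the object size it gives the merge-when-close behaviour described above, and away from the junction it is exactly min().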
Quote:
I agree with IQ, first person to do erosion realtime wins ALL the siggraphs.
It seems like you all lost.
Oh, and a WebGL demo is here.
I never said anything about realtime. What I said is that still today nobody has been able to do procedural erosion. Procedural as in Perlin noise. All solutions are based on simulation, and therefore need a grid (bitmap).
The problem of procedural erosion is actually deeper. Erosion is a global process (soil moves from one part to another, etc.) yet procedural functions only have (in principle) local knowledge, as they are a function of the point under consideration, f(x,y,z). Of course, f could cast rays or run some global process internally, but then we are probably again in the domain of simulation, not proceduralism.
There is only one person so far who ever claimed to have solved the problem of procedural erosion, but they never gave actual code, a paper or a demo, so the problem pretty much stays unsolved. Whoever solves it wins siggraph :D
You mean that "erosion fractal" thing?
iq - I just wanted to give some pointers on how to implement such a thing...
Yet I don't see how a simulated/iterative process can be easily implemented as a procedural function.
What about a combination of turtle graphics, but instead of drawing straight lines, drawing FBM-deformed lines: push, rotate x degrees left, draw fbm forward, pop, rotate x degrees right, draw fbm forward, pop... and/or making a tree of FBM lines using a grammar with L-systems... That should generate a picture similar to what las posted:
Finally perform a distance transform. I'd like to see the result of this proposal, anyone got some spare time to test this? :)
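A minimal sketch of what that might look like, with a hard-coded three-segment "tree" standing in for the L-system output and no FBM deformation yet (sdSegment is the standard 2D segment distance):
Code:
// Distance to a line segment in 2D.
float sdSegment(vec2 p, vec2 a, vec2 b) {
    vec2 pa = p - a, ba = b - a;
    float h = clamp(dot(pa, ba) / dot(ba, ba), 0.0, 1.0);
    return length(pa - ba * h);
}

// Tiny hard-coded "turtle tree": a trunk and two rotated branches
// (the push / rotate / draw / pop steps). The min over all segments
// is already the distance transform of the drawn structure.
float branchDist(vec2 p) {
    float d = sdSegment(p, vec2(0.0, 0.0), vec2(0.0, 0.5));
    d = min(d, sdSegment(p, vec2(0.0, 0.5), vec2(-0.3, 0.9)));
    d = min(d, sdSegment(p, vec2(0.0, 0.5), vec2( 0.3, 0.9)));
    return d;
}
A real version would expand the L-system grammar instead of hard-coding three segments and displace each segment with fbm before taking the min; the resulting distance field could then be turned into (or carved into) the heightmap as discussed above.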
Using fBm and twisting it a lot => I tinkered a bit in that direction, but so far failed. The problem is that fBm makes planar-graph-like structures, not tree-like structures.
L-Systems => *yes*. They can be evaluated in a procedural fashion. Procedural distance transform on such a beast, still no can do for me...
Looks like IQ is up to something! https://twitter.com/iquilezles/status/247802128951832577/photo/1