Raymarching Beginners' Thread
category: code [glöplog]
Hmm.. maybe cool for 2002.. but looking at the news, all that's been added since is reflections and AA. I mean, seriously, a choice of just spheres? A plane primitive would fix the ugly floor AND speed it up.
This made me laugh: "It shows realistic 3D world" "It's alien world which is made of balls."
AND they are touching.
:D
Obviously, balls not touching are not realistic ;)
In a gay alien world, yes.
just to get you guys up to date on how realtime ray tracing can actually look today - and with no need for ugly self-tuned fake BRDF models or a hand-rolled stack to track secondary rays.. ;) ;)
http://www.youtube.com/user/jaccobikker#p/a/u/0/OHWxsUUataw
yumeji: Perhaps not relevant, but I was doing realtime "raytracing" on the Amiga at one time. Never got beyond simple prototyping, but it does work.
So yeah, you can at least approximate the shape of the reflection of one sphere on another. I only made a rough approximation and fitted it to a deformed ellipse, but I imagine you could make it look decent with some processing power. And maybe you could even get the envmapping right! :)
(Yes I know, one of the reflections is the wrong colour, but that's what you get.)
Toxie: interesting! I had a thought to start something like that during lunch today. Only got as far as a lit sphere on a plane so far, and an attempt at shadows (which failed miserably until I ran out of lunch :( ).
Are you doing some kind of "fast/interactive" + "slow/stationary" rendering there? It's hard to tell from youtube, but it looked slightly noisy when moving, then perfect once it stops.
364fps? :|
Looks like textured polys with a (fake?) reflection. Is any of it actually traced, doom?
(And if it is all traced, I imagine it's 364spf :D )
I did it :)
Can somebody give me some feedback on my code?
xernobyl: It's easier to do screenshots in WinUAE ;)
@pinda: At first glance it looks ok. What's wrong with it?
pinda: looks ok from a quick glance through. Is it drawing your plasma? If so, next task is to raymarch a simple cube/sphere/whatever :)
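Something like this is roughly all it takes to get started - just a sketch off the top of my head, assuming a fullscreen quad and a "resolution" uniform (all names are made up):

// Minimal sphere-tracing sketch (GLSL fragment shader). Assumes a fullscreen quad
// and a "resolution" uniform; names and constants are only illustrative.
uniform vec2 resolution;

float sdSphere(vec3 p)
{
    return length(p) - 1.0;                 // distance to a unit sphere at the origin
}

void main()
{
    vec2 uv = (gl_FragCoord.xy / resolution) * 2.0 - 1.0;
    vec3 ro = vec3(0.0, 0.0, -3.0);         // camera / ray origin
    vec3 rd = normalize(vec3(uv, 1.5));     // ray direction
    vec3 col = vec3(0.0);
    float t = 0.0;
    for (int i = 0; i < 64; i++)            // march: step forward by the distance estimate
    {
        vec3 p = ro + rd * t;
        float d = sdSphere(p);
        if (d < 0.001)                      // close enough: shade with one diffuse light
        {
            vec3 n = normalize(p);          // for a unit sphere at the origin the normal is just p
            col = vec3(max(dot(n, normalize(vec3(1.0, 1.0, -1.0))), 0.0));
            break;
        }
        t += d;
    }
    gl_FragColor = vec4(col, 1.0);
}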
psonice: It does an inverse trace of four points per sphere per level of recursion.
I.e. I set up the equation for the full trace: (u,v) -> reflect -> reflect -> (x,y,z)_on_target_sphere. Then I solve for (u,v) given (x,y,z) as opposed to the other way around.
The world-space coordinates I use for each sphere are educated guesses as to which points on it will form the outline of its reflection, based on where the sphere is with respect to its parent in the reflection hierarchy (i.e. points from sphere A's horizon as seen from sphere B). I only take four samples, which is enough to define the squashed ellipse shape that is then rendered. A better version would sample more points for a more accurate reflection.
It also looks especially shit in a still image because I was testing it with a crude disc mesh and a very crappy rasteriser. If I'd finished it I would have fixed that and added tint to the reflections and so on.
@psonice: while i do program similar things for a living ;), the video wasn't made by me, but by jacco, who is an expert in ultra-fast highly tweaked path tracing on GPUs by now..
what he does though is simple progressive rendering, i.e. use a fixed number of samples per frame - so during navigation it still looks a bit noisy, while on a static camera it can progressively refine the picture with more and more samples..
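the accumulation part itself is trivial btw - this is not jacco's actual code, just a sketch of the idea, with "accumTex" and "frameCount" as made-up names (the host app resets frameCount to 0 whenever the camera moves):

// Progressive accumulation sketch (GLSL fragment shader) - not jacco's code,
// just the general idea. All names are illustrative.
uniform sampler2D accumTex;   // running average from previous frames
uniform vec2 resolution;
uniform float frameCount;     // frames accumulated so far; host resets to 0 on camera movement

vec3 pathTraceSample(vec2 uv)
{
    // one noisy sample from the actual path tracer would go here;
    // stubbed with a flat colour so the sketch stands on its own
    return vec3(0.5);
}

void main()
{
    vec2 uv = gl_FragCoord.xy / resolution;
    vec3 newSample = pathTraceSample(uv);
    vec3 prev = texture2D(accumTex, uv).rgb;
    // running average: after N accumulated frames, blend the new sample in with weight 1/(N+1)
    vec3 result = mix(newSample, prev, frameCount / (frameCount + 1.0));
    gl_FragColor = vec4(result, 1.0);
}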
doom: cunning. Just the bare minimum amount of tracing necessary.
toxie: ah, ok. I was thinking before that the scene is a bit simple for that kind of technique, but on watching again, I see the lighting, especially around the glass spheres.. plus the other videos he's uploaded. Good shit :)
he just chose that simple scene because it's as famous in the community as the nowadays boring cornell box.. so more interesting stuff is definitely possible (although, of course, it still has to be made up of simple primitives in this engine)..
maybe so, but he's doing some pretty complex scenes in his other videos :)
pinda
Didn't try to compile it - doesn't look bad at all, but you should also read up on how to print the shader error output (see the lighthouse3d GLSL tutorial).
toxie: Try pathtracing in 4k with cool content... ;)
archee demonstrated something like that some time ago, but the scene was kinda static (static == boring).
And self-tuned fake BRDFs and fake stacks are a hell of a lot of fun - at least if you don't give a fuck about what "reality" is like.
Creating fake materials nobody has seen before is the fun part.
toxie: I don't see any caustics!
yeah, complex scenes = tough. I had a quick experiment with that the other day, managed to get the framerate down to <10fps very fast for a (small!) bunch of cubes :( Complex lighting is tough too, like soft shadows + caustics + reflected light.
I think mixed raytrace + raymarch is the way to go. Raytracing is just soooo much faster for many cases, and it's perfectly accurate. If only it was as easy to work with as raymarching! I just fixed up the quick raytracer I wrote at lunch time - it now renders a nice lit + reflective sphere on a textured plane, with working (sharp :( ) shadows. It runs at 43fps, at 1920x1200. On a radeon 2600. That's kind of unthinkable with a raymarcher :D
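for reference, the whole thing really is just a couple of intersection tests - a rough sketch of the same kind of scene (lit sphere on a checker plane with a hard shadow; no reflection bounce here, and all constants are made up):

// Analytic raytrace sketch (GLSL fragment shader): one sphere on a ground plane,
// diffuse lighting and a hard shadow ray. Names and constants are illustrative only.
uniform vec2 resolution;

const vec3 sphereC = vec3(0.0, 1.0, 0.0);
const float sphereR = 1.0;
const vec3 lightDir = normalize(vec3(0.6, 1.0, -0.4));

float iSphere(vec3 ro, vec3 rd)                  // hit distance, or -1.0 on a miss
{
    vec3 oc = ro - sphereC;
    float b = dot(oc, rd);
    float c = dot(oc, oc) - sphereR * sphereR;
    float h = b * b - c;
    return (h < 0.0) ? -1.0 : -b - sqrt(h);
}

float iPlane(vec3 ro, vec3 rd)                   // ground plane y = 0
{
    return (rd.y >= 0.0) ? -1.0 : -ro.y / rd.y;
}

void main()
{
    vec2 uv = (gl_FragCoord.xy / resolution) * 2.0 - 1.0;
    vec3 ro = vec3(0.0, 1.5, -4.0);
    vec3 rd = normalize(vec3(uv, 1.8));

    float tS = iSphere(ro, rd);
    float tP = iPlane(ro, rd);
    vec3 col = vec3(0.0);

    if (tS > 0.0 && (tP < 0.0 || tS < tP))       // sphere is the closest hit
    {
        vec3 p = ro + rd * tS;
        vec3 n = normalize(p - sphereC);
        col = vec3(1.0, 0.4, 0.2) * max(dot(n, lightDir), 0.0);
    }
    else if (tP > 0.0)                           // plane hit: checker texture + shadow ray toward the light
    {
        vec3 p = ro + rd * tP;
        float check = mod(floor(p.x) + floor(p.z), 2.0);
        float shadow = (iSphere(p + lightDir * 0.001, lightDir) > 0.0) ? 0.2 : 1.0;
        col = vec3(0.3 + 0.5 * check) * lightDir.y * shadow;
    }
    gl_FragColor = vec4(col, 1.0);
}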
...and so, any suggestions on how to deform objects when raytracing? The problem is finding the intersection when, say, you add perlin noise to a sphere.
When raymarching, you use the ray's current position as the perlin seed value, and that naturally gives you the correct answer because the ray gets closer to the surface until it hits. At that point, the seed value is the intersection position.
When you trace, what do you use as that seed for the perlin function? If you give it the ray origin, that's the camera - and it changes when the camera moves. If you use the intersection point of a regular sphere as the seed, it's still wrong when the camera moves, because the angle of the ray changes and it hits a different point on the surface. (This was the best result I got though - a nice, smoothly distorted sphere.. but it animates when the camera rotates.)
I guess really we have to calculate the intersection of the distorted sphere directly. Which is hard. Any clues on how to calculate that efficiently?
Slightly related: normally the whole OS slows down when stuff like this is rendering if the framerate is low, because the GPU is too busy to update the screen. But the OS is running very slow and laggy while this simple raytrace test is running, and it's doing 90+fps in my current window. Why might that be?!
psonice: can you ray march the segment of the ray that intersects with your sphere?
This is how I was planning on doing it :)
hmm.. that's possible yeah I guess. With the bounding sphere you remove most of the march at least. I'd like to do it in one step though if possible :)
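something like this, maybe, if anyone wants to try it - rough sketch only, the sine term is just a stand-in for proper noise, and it would plug into the usual marching main() loop:

// Hybrid sketch: analytic bounding-sphere test first, then raymarch only between
// the entry and exit points. Displacement and constants are illustrative.
float displacedSphere(vec3 p)
{
    float d = length(p) - 1.0;                                       // base sphere
    d += 0.1 * sin(5.0 * p.x) * sin(5.0 * p.y) * sin(5.0 * p.z);     // fake "noise" displacement
    return d;
}

// entry/exit distances of the bounding sphere, or vec2(-1.0) on a miss
vec2 boundingSphere(vec3 ro, vec3 rd, vec3 c, float r)
{
    vec3 oc = ro - c;
    float b = dot(oc, rd);
    float h = b * b - (dot(oc, oc) - r * r);
    if (h < 0.0) return vec2(-1.0);
    h = sqrt(h);
    return vec2(-b - h, -b + h);
}

// march only the [entry, exit] segment; returns hit distance or -1.0
float traceInsideBound(vec3 ro, vec3 rd)
{
    vec2 bound = boundingSphere(ro, rd, vec3(0.0), 1.15);            // 1.15 > max displaced radius
    if (bound.y < 0.0) return -1.0;                                  // missed the bound entirely
    float t = max(bound.x, 0.0);                                     // start at the entry point
    for (int i = 0; i < 48; i++)
    {
        float d = displacedSphere(ro + rd * t);
        if (d < 0.001) return t;                                     // hit the displaced surface
        t += d * 0.5;            // conservative step: the displacement breaks the exact distance bound
        if (t > bound.y) return -1.0;                                // walked out the far side
    }
    return -1.0;
}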
Another thing I figured out: if it's not possible to distort the sphere directly, you can distort the normals very easily, and as texturing is quite easy you could also do normal mapping or even parallax occlusion and stuff. And from there, you can still raytrace reflections.. just have to watch out for lack of self-shadow/reflection. So fine details don't need to be raytraced anyway.
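i.e. keep the analytic hit and just bend the normal before lighting - a quick sketch, with "noise3" standing in for whatever noise function you actually use:

// Fake-displacement sketch: keep the analytic sphere intersection, then perturb
// the shading normal with a noise gradient before lighting. Names are illustrative.
float noise3(vec3 p)                    // cheap sine stand-in; swap in real perlin/value noise
{
    return sin(7.0 * p.x) * sin(7.0 * p.y) * sin(7.0 * p.z);
}

// perturb the geometric normal n at surface point p by the tangential noise gradient
vec3 bumpNormal(vec3 p, vec3 n, float strength)
{
    float e = 0.01;
    vec3 grad = vec3(noise3(p + vec3(e, 0.0, 0.0)) - noise3(p - vec3(e, 0.0, 0.0)),
                     noise3(p + vec3(0.0, e, 0.0)) - noise3(p - vec3(0.0, e, 0.0)),
                     noise3(p + vec3(0.0, 0.0, e)) - noise3(p - vec3(0.0, 0.0, e))) / (2.0 * e);
    grad -= n * dot(grad, n);           // keep only the part tangent to the surface
    return normalize(n - strength * grad);
}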