Memo & NPRWizard
Hey Memo, I've been experimenting with an adaptive edge detection routine that turns raster images into clean vector outlines, a bit like digital hatching. How would you go about turning that into a real-time shader that keeps line thickness consistent across scales without dropping any detail?
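For context, the detection pass boils down to roughly this (a simplified Python sketch, not the real routine; the Sobel gradients and the local-mean threshold are stand-ins for the actual adaptive logic, and `k` is a made-up tuning constant):

```python
import numpy as np
from scipy import ndimage

def adaptive_edge_map(gray: np.ndarray, k: float = 1.5) -> np.ndarray:
    """Gradient-magnitude edges with a locally adaptive threshold.

    `gray` is a float image in [0, 1]. Comparing against a local mean keeps
    the threshold adaptive: busy regions need a stronger gradient before a
    pixel counts as an edge.
    """
    gx = ndimage.sobel(gray, axis=1)
    gy = ndimage.sobel(gray, axis=0)
    mag = np.hypot(gx, gy)
    local_mean = ndimage.uniform_filter(mag, size=31)
    return mag > k * local_mean
```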
Sounds like a classic distance‑field approach. Compute the edge map in a first pass, then in a second pass turn that into a signed distance field. In the shader you sample the distance field and use a step or smoothstep to decide where to draw the line. The thickness is just a constant in screen space, so it stays the same regardless of zoom. If you need real‑time, build the distance field at a fixed resolution and use mipmaps for different scales. Keep an extra channel for intensity so you don’t lose detail when the field is downsampled. Finally, run a small anti‑aliasing pass, maybe a 2‑pixel Gaussian blur, to clean up any jaggedness. That way the outline stays crisp at all scales without dropping data.
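If it helps, here's the shape of those two passes as a CPU-side Python sketch (using scipy's Euclidean distance transform; for open strokes the unsigned distance is all you need, so take "signed" loosely, and `half_width_px`/`aa_px` are just illustrative constants):

```python
import numpy as np
from scipy import ndimage

def edge_distance_field(edges: np.ndarray) -> np.ndarray:
    """Pass 2: distance in pixels from each pixel to the nearest edge pixel.

    distance_transform_edt measures distance to the nearest *zero*, so the
    edge mask is inverted first. For open strokes an unsigned field is
    enough; a true sign only makes sense inside closed regions.
    """
    return ndimage.distance_transform_edt(~edges.astype(bool))

def stroke_coverage(dist: np.ndarray, half_width_px: float = 1.5,
                    aa_px: float = 1.0) -> np.ndarray:
    """Final pass: ink coverage via a smoothstep around the half-width.

    half_width_px is a screen-space constant, which is exactly what keeps
    the line the same thickness at every zoom level.
    """
    t = np.clip((dist - half_width_px) / aa_px + 0.5, 0.0, 1.0)
    return 1.0 - t * t * (3.0 - 2.0 * t)  # 1 - smoothstep(0, 1, t)
```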
That distance‑field idea is neat, but let me tell you: real line thickness in an NPR shader shouldn’t be a fixed screen‑space constant, because that kills the visual weight you get from proper stroke scaling. What you need is a hierarchical edge map: first detect edges, then compute a multiresolution signed distance field, but keep the stroke width tied to scene depth and camera f‑stop, like a manual pen stroke. Then, in the final pass, use a small hatching kernel that samples the distance field and applies a cross‑hatch pattern whose density changes with distance. A 2‑pixel Gaussian blur is fine for smoothing, but a 3‑tap separable filter gives you better anti‑aliasing without the blur halo that erases fine detail. Finally, cache the distance field in a texture that you update only when the geometry changes; there’s no need to recompute it every frame. That way you keep the lines crisp, the strokes meaningful, and the artifact‑laden glory of a hand‑drawn frame intact.
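To make that concrete, here's the math as a Python mock-up (not the actual shader; the near/far clamps, the falloff curve, and the hatch spacing are all made-up illustrative choices):

```python
import numpy as np
from scipy import ndimage

NEAR, FAR = 0.5, 50.0  # hypothetical scene depth bounds

def depth_scaled_half_width(depth: np.ndarray, base_px: float = 1.5) -> np.ndarray:
    """Stroke half-width that thins with depth, perspective-style, and is
    clamped so distant lines never vanish entirely."""
    return base_px * NEAR / np.clip(depth, NEAR, FAR)

def crosshatch_mask(xs: np.ndarray, ys: np.ndarray, depth: np.ndarray,
                    spacing_px: float = 6.0) -> np.ndarray:
    """Two hatch directions at +/-45 degrees in screen space; spacing widens
    with depth so the hatch density changes with distance."""
    s = spacing_px * np.clip(depth / NEAR, 1.0, 4.0)
    h1 = 0.5 + 0.5 * np.cos(2.0 * np.pi * (xs + ys) / s)
    h2 = 0.5 + 0.5 * np.cos(2.0 * np.pi * (xs - ys) / s)
    return np.maximum(h1, h2)

def separable_3tap(img: np.ndarray) -> np.ndarray:
    """The 3-tap separable [1, 2, 1] / 4 filter: one pass per axis, a much
    tighter kernel than a 2-pixel Gaussian, so fine detail survives."""
    k = np.array([0.25, 0.5, 0.25])
    tmp = ndimage.convolve1d(img, k, axis=0, mode="nearest")
    return ndimage.convolve1d(tmp, k, axis=1, mode="nearest")
```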
That sounds solid—just remember to use a high‑precision depth buffer so the stroke width scales correctly; otherwise you get jitter at close distances. Also consider 4× supersampling when building the distance field if you want to keep aliasing out of the hatching kernel. Caching it on geometry change keeps the shader snappy. Let me know how the cross‑hatch density turns out in practice.
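Something like this for the supersampled build (a sketch only; `factor=2` per axis gives you the 4× sampling, and the average-pool downsample is just one reasonable choice among several):

```python
import numpy as np
from scipy import ndimage

def supersampled_distance_field(edges_hi: np.ndarray, factor: int = 2) -> np.ndarray:
    """Build the field at factor x factor the output resolution
    (factor=2 per axis = 4x supersampling), then average-pool back down.

    Dividing by `factor` converts distances to output-pixel units before
    the downsample averages them.
    """
    dist_hi = ndimage.distance_transform_edt(~edges_hi.astype(bool)) / factor
    h, w = dist_hi.shape
    h, w = h - h % factor, w - w % factor  # crop so the grid divides evenly
    return (dist_hi[:h, :w]
            .reshape(h // factor, factor, w // factor, factor)
            .mean(axis=(1, 3)))
```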
I’ll hit that high‑precision depth buffer right away—no more jitter, promise. The 4× supersampling will give me a crisp distance field, and my hatching kernel will keep that bold, cross‑hatch look even at tight angles. I’ll log the stroke density per pixel, so I can tweak the brush width until the lines feel like they belong in a hand‑drawn comic. Let me know if you need the code snippets for the depth‑aware kernel.
Sounds good—just ping me when you hit the kernel code, and we can tweak the sampling offsets together. Happy hacking!