Memo & NPRWizard
Hey Memo, I've been experimenting with an adaptive edge detection routine that turns raster images into clean vector outlines, a bit like digital hatching. How would you go about turning that into a real-time shader that keeps line thickness consistent across scales without dropping any detail?
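(For context, a minimal offline sketch of what an adaptive edge pass like that might look like; this is not the original routine, just one common shape for it. It assumes a grayscale float image in [0, 1] and a local-mean threshold built with scipy.ndimage; the sigma and bias values are illustrative knobs.)

```python
import numpy as np
from scipy import ndimage

def adaptive_edge_map(gray, sigma=4.0, bias=0.05):
    """gray: float32 image in [0, 1]; returns a binary edge mask."""
    gx = ndimage.sobel(gray, axis=1)   # horizontal gradient
    gy = ndimage.sobel(gray, axis=0)   # vertical gradient
    mag = np.hypot(gx, gy)             # gradient magnitude
    # Locally adaptive threshold: compare each pixel's magnitude to the
    # blurred (regional) magnitude plus a small bias, so the edge map
    # adapts to local contrast instead of using one global cutoff.
    local_mean = ndimage.gaussian_filter(mag, sigma)
    return mag > (local_mean + bias)
```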
Sounds like a classic distance-field approach. Compute the edge map in a first pass, then in a second pass turn it into a signed distance field. In the shader you sample the distance field and use a step or smoothstep to decide where to draw the line. The thickness is just a constant in screen space, so it stays the same regardless of zoom. If you need it real-time, build the distance field at a fixed resolution and use mipmaps for different scales. Keep an extra channel for intensity so you don't lose detail when the field is downsampled. Finally, run a small anti-aliasing pass, maybe a 2-pixel Gaussian blur, to clean up any jaggedness. That way the outline stays crisp at every zoom level without throwing detail away.
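(A CPU-side reference of the two passes Memo describes, assuming a binary edge mask as input. The function names, the half-width/AA parameters, and the use of scipy's Euclidean distance transform are sketch choices; in an actual fragment shader the smoothstep band would typically be kept in screen space with fwidth on the sampled distance.)

```python
import numpy as np
from scipy import ndimage

def signed_distance_field(edges):
    """Positive distance outside the strokes, negative inside."""
    outside = ndimage.distance_transform_edt(~edges)
    inside = ndimage.distance_transform_edt(edges)
    return outside - inside

def smoothstep(e0, e1, x):
    t = np.clip((x - e0) / (e1 - e0), 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

def draw_outline(sdf, half_width_px=1.5, aa_px=1.0):
    """Stroke coverage in [0, 1]: 1 inside the line, 0 outside.
    half_width_px is the screen-space constant Memo mentions;
    aa_px is the soft band that stands in for the post-blur."""
    return 1.0 - smoothstep(half_width_px - aa_px, half_width_px + aa_px, sdf)
```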
That distance-field idea is neat, but let me tell you: real line thickness in an NPR shader shouldn't be a fixed screen-space constant; that kills the visual weight you get from proper stroke scaling. What you want is a hierarchical edge map: detect edges first, then build a multiresolution signed distance field, and keep the stroke width tied to scene depth and camera f-stop, the way a manual pen stroke varies. In the final pass, use a small hatching kernel that samples the distance field and applies a cross-hatch pattern whose density changes with distance. A 2-pixel Gaussian blur is fine for smoothing, but a 3-tap separable filter gives you better anti-aliasing without the blur halo that erases fine detail. Finally, cache the distance field in a texture and update it only when the geometry changes; there's no need to recompute it every frame. That way you keep the line crisp, the strokes meaningful, and the artifact-laden glory of a hand-drawn frame intact.
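(A hedged sketch of the depth-aware variant NPRWizard is describing, reusing the SDF helpers above. It assumes a linear depth buffer in [0, 1] and pixel-coordinate grids xs/ys; the f-stop mapping, the near/far widths, and the hatch frequencies are illustrative choices, not a formula from either poster.)

```python
import numpy as np

def stroke_half_width(depth, f_stop, near_px=3.0, far_px=0.75):
    """Wider strokes up close, thinner far away; a wider aperture
    (smaller f-stop) exaggerates the falloff a little."""
    base = near_px + (far_px - near_px) * depth
    return base * (1.0 + 0.25 / max(f_stop, 1.0))

def cross_hatch(xs, ys, depth, base_freq=0.15):
    """Two diagonal stripe patterns; density rises with distance so
    far geometry reads as denser, darker hatching."""
    freq = base_freq * (1.0 + 2.0 * depth)
    a = 0.5 + 0.5 * np.sin((xs + ys) * freq)
    b = 0.5 + 0.5 * np.sin((xs - ys) * freq)
    return np.minimum(a, b)   # roughly: 0 = ink, 1 = paper

def shade(sdf, depth, xs, ys, f_stop=2.8):
    hw = stroke_half_width(depth, f_stop)
    outline = np.clip((hw - sdf) / (2.0 * hw), 0.0, 1.0)  # crude coverage
    hatch = cross_hatch(xs, ys, depth)
    return np.minimum(1.0 - outline, hatch)               # ink wins
```

In a real pipeline the sdf input here would be the cached texture NPRWizard mentions, regenerated only when geometry changes, while shade runs per frame.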