Nicolas Vizerie took a look at Intel’s paper on MorphoLogical Anti-Aliasing (MLAA) recently and became intrigued, but noticed that the original algorithm wasn’t well suited for use with GPU pixel shaders. He did some work, and now has a demonstration running on an nVidia GeForce 8700M GT.

The original technique is not well suited to a GPU using pixel shaders alone, so some adaptation was needed. The reason is that the algorithm scans edges and patches pixels based on the edge length and on the configuration at the edge extremities (to sum up). Those extremities can be far from the current pixel, so under the purely parallel pixel-shader model each pixel has to recompute the distance from itself to the extremities on its own. For an edge of length N, each of its N pixels performs an O(N) scan, so the total cost becomes O(N^2), which can lead to performance problems. The obvious solution is to precompute a bilateral distance texture, so each pixel can look up its distances to both extremities in constant time.
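To make the contrast concrete, here is a minimal CPU-side sketch in C of the idea, not Vizerie’s actual shader code: the one-row edge mask, `isEdge`, `scanDistances`, and `buildDistanceTexture` are all hypothetical names invented for illustration. `scanDistances` is what each pixel-shader invocation would have to do in a naive port; `buildDistanceTexture` stands in for the precomputed bilateral distance texture.

```c
#include <stdio.h>

#define W 16
#define MAX_SEARCH 64  /* fixed loop bound, as a D3D9 shader would need */

/* Hypothetical one-row edge mask standing in for the edge texture:
 * 1 where a horizontal edge runs through column x. */
static const int edgeMask[W] = {0,0,1,1,1,1,1,1,1,0,0,0,0,0,0,0};

static int isEdge(int x)
{
    return (x >= 0 && x < W) ? edgeMask[x] : 0;
}

/* Naive port: what each pixel-shader invocation would do on its own,
 * walking left and right to the extremities. O(N) per pixel, hence
 * O(N^2) over the N pixels of an edge. */
static void scanDistances(int x, int *left, int *right)
{
    *left = 0;
    while (*left < MAX_SEARCH && isEdge(x - *left - 1))
        ++*left;
    *right = 0;
    while (*right < MAX_SEARCH && isEdge(x + *right + 1))
        ++*right;
}

/* Distance-texture idea: precompute both distances for the whole row
 * (two sweeps, O(W) total); the patching pass then reads them back in
 * O(1) per pixel. A GPU version would build this texture in a handful
 * of ping-pong passes rather than with sequential sweeps. */
static void buildDistanceTexture(int leftDist[W], int rightDist[W])
{
    for (int x = 0; x < W; ++x)
        leftDist[x] = (isEdge(x) && isEdge(x - 1)) ? leftDist[x - 1] + 1 : 0;
    for (int x = W - 1; x >= 0; --x)
        rightDist[x] = (isEdge(x) && isEdge(x + 1)) ? rightDist[x + 1] + 1 : 0;
}

int main(void)
{
    int leftDist[W], rightDist[W];
    buildDistanceTexture(leftDist, rightDist);

    /* Both approaches agree; only the cost differs. */
    for (int x = 0; x < W; ++x) {
        if (!isEdge(x))
            continue;
        int l, r;
        scanDistances(x, &l, &r);
        printf("pixel %2d: scan = (%d,%d), texture = (%d,%d)\n",
               x, l, r, leftDist[x], rightDist[x]);
    }
    return 0;
}
```

With the distances available per pixel, the final patching pass can derive the blend weights directly from the edge length and the crossing pattern at each extremity, without any searching.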

via MLAA (MorphoLogical AntiAliasing) on the GPU using Direct3D9.0 – GameDev.Net Discussion Forums.