I’ve gone over why matcaps can be an efficient way to get good-looking materials in 3D graphics, but we also came across one of their main downsides: they’re camera-relative, and that’s pretty limiting. So if we’re hell-bent on using matcaps we either live with this limitation – obviously unacceptable – or we hack the **** out of it!
For those unfamiliar, a matcap (material capture) is a single image that describes the way an object (generally a sphere) made of some sort of material appears within some sort of environment. I have no idea where and when matcaps drew their first breaths, but it was when ZBrush rolled onto the scene that I first encountered them. I was blown away by how a program that appeared to simulate millions of data points could also make an object appear to be made of light-scattering materials like marble, jade, and skin. That was a pretty big mind-bend in the days when real-time skin shaders were in their infancy and desperately (and very often poorly) trying to replicate their pre-rendered cousins.
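The trick that makes a single sphere image usable as a material is the lookup: a surface normal in view space maps straight to a pixel on that sphere. Here’s a minimal sketch of that mapping in Python (the function name `matcap_uv` is mine, not from any particular engine) – and it also shows exactly why matcaps are camera-relative, since everything is keyed off the view-space normal:

```python
import numpy as np

def matcap_uv(normal_view):
    """Map a view-space surface normal to matcap texture coordinates.

    The matcap image is a picture of a sphere facing the camera, so the
    normal's x and y components (each in [-1, 1]) can be remapped
    directly to UVs in [0, 1]. Because the normal is in *view* space,
    rotating the camera changes the lookup - hence the limitation.
    """
    n = np.asarray(normal_view, dtype=float)
    n = n / np.linalg.norm(n)      # defend against unnormalized input
    return n[:2] * 0.5 + 0.5       # (x, y) -> (u, v)

# A normal pointing straight at the camera samples the image centre:
print(matcap_uv([0.0, 0.0, 1.0]))  # -> [0.5 0.5]
```

In a fragment shader this is the familiar one-liner `uv = n.xy * 0.5 + 0.5` after transforming the normal into view space.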
Last year I was jamming around with Unity on Android and was desperate to get some more interesting materials on my objects. I’m certainly not the kind of person who wouldn’t want to blow an entire game’s graphics budget on a single subsurface shader, but my phone’s GPU just didn’t seem willing to get on the same wavelength. Luckily, from the dusty caverns of the ol’ skull-box I remembered those matcap thingies…