by mh » Sun Apr 15, 2012 1:22 pm
As I understand it, you've got your depth buffer and your screen buffer. The screen buffer contains a set of palette indexes, and the depth buffer contains - well - depths. If you know the fog colour, you can use it to build a lookup table - make it 256x256 or something - then cross-reference depth with palette index on it and get a new palette index for the fogged colour.
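A rough sketch of building such a table, assuming a greyscale-distance nearest-colour search and a table laid out as table[foglevel][paletteindex] - all the names here are made up for illustration, not from any particular engine:

```c
#include <stdlib.h>

typedef struct { unsigned char r, g, b; } rgb_t;

/* find the palette index whose colour is closest to (r,g,b)
   by squared distance - a simple nearest-colour search */
static int nearest_index (const rgb_t pal[256], int r, int g, int b)
{
    int best = 0;
    long bestdist = 3 * 256L * 256L;

    for (int i = 0; i < 256; i++)
    {
        long dr = pal[i].r - r, dg = pal[i].g - g, db = pal[i].b - b;
        long dist = dr * dr + dg * dg + db * db;
        if (dist < bestdist) { bestdist = dist; best = i; }
    }

    return best;
}

/* build the 256x256 table: row = fog level (0 = no fog, 255 = full fog),
   column = source palette index, entry = fogged palette index */
void build_fog_table (unsigned char table[256][256],
                      const rgb_t pal[256], rgb_t fog)
{
    for (int level = 0; level < 256; level++)
    {
        for (int idx = 0; idx < 256; idx++)
        {
            /* colour * factor + fog * (1 - factor),
               with factor = (255 - level) / 255 */
            int r = (pal[idx].r * (255 - level) + fog.r * level) / 255;
            int g = (pal[idx].g * (255 - level) + fog.g * level) / 255;
            int b = (pal[idx].b * (255 - level) + fog.b * level) / 255;

            table[level][idx] = (unsigned char) nearest_index (pal, r, g, b);
        }
    }
}
```

At draw time the inner loop then becomes a single `table[foglevel][screenpixel]` fetch per pixel, which is why this approach stays fast in software.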
That assumes an 8-bit screen buffer. If you've got a 32-bit screen buffer you can just plug in the standard blend equation ((colour * factor) + (fog * (1 - factor))) and out comes the end result.
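For the 32-bit case, that blend equation applied per channel with an 8-bit factor might look like this (a sketch, assuming factor 255 = fully visible and 0 = fully fogged):

```c
/* standard blend: (colour * factor) + (fog * (1 - factor)),
   done per 8-bit channel of a packed 32-bit pixel */
unsigned int fog_blend_rgba (unsigned int colour, unsigned int fog,
                             unsigned int factor) /* 0..255 */
{
    unsigned int result = 0;

    for (int shift = 0; shift <= 24; shift += 8)
    {
        unsigned int c = (colour >> shift) & 0xFF;
        unsigned int f = (fog >> shift) & 0xFF;
        unsigned int blended = (c * factor + f * (255 - factor)) / 255;
        result |= blended << shift;
    }

    return result;
}
```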
You can also do the same with a hardware accelerated engine by copying your depth buffer to a texture then blending it over the scene as a full-screen quad, using a shader to get your blend factor from the depth. It's slower than just using standard fixed-pipeline fog, and also slower than putting a fog equation into a shader, but it works.
One thing you do lose with this kind of setup is the ability to have fullbrights shining through fog. You can do it in software, using the same colour for all 256 entries of the relevant columns of your table, but hardware doesn't give you that kind of fine-grained control, and I can't see a straightforward way of doing it via a shader.
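The software trick is just a post-pass over the table described above - for each fullbright palette index, write the original index down the entire column so every fog level maps it back to itself. A sketch (names are illustrative, not from any real engine):

```c
/* force the listed fullbright palette indexes to survive fog:
   every fog level in those columns maps back to the original index */
void keep_fullbrights (unsigned char table[256][256],
                       const int *fullbrights, int count)
{
    for (int i = 0; i < count; i++)
    {
        int idx = fullbrights[i];

        for (int level = 0; level < 256; level++)
            table[level][idx] = (unsigned char) idx;
    }
}
```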