I've recently read a lot about software (mostly scientific/math and encryption related) that moves part of its computation onto the GPU, which yields a 100-1000x (!) speed increase for supported operations.
Is there a library, API, or other way to run something on the GPU via C#? I'm thinking of a simple Pi calculation. I have a GeForce 8800 GTX, if that's relevant at all (I'd prefer a card-independent solution, though).
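For illustration, something like this naive single-threaded Monte Carlo estimate is the kind of loop I'd hope to offload (plain .NET, nothing GPU-specific yet):

    using System;

    class PiCpu
    {
        static void Main()
        {
            const int samples = 10000000;
            var rng = new Random(42);
            int inside = 0;

            // Sample random points in the unit square and count how many
            // land inside the quarter circle of radius 1.
            for (int i = 0; i < samples; i++)
            {
                double x = rng.NextDouble();
                double y = rng.NextDouble();
                if (x * x + y * y <= 1.0)
                    inside++;
            }

            // Ratio of areas: (quarter circle) / (unit square) = Pi / 4.
            double pi = 4.0 * inside / samples;
            Console.WriteLine("Pi ~= {0}", pi);
        }
    }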
It's a very new technology, but you might investigate CUDA. Since your question is tagged C#, here is a .NET wrapper.
As a bonus, it appears that your 8800 GTX supports CUDA.
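To give a feel for the data-parallel structure a CUDA kernel exploits, here's a rough CPU-only sketch (using .NET's Parallel.For) that splits the Monte Carlo samples across workers and combines the partial counts at the end. A GPU version follows the same pattern, just with thousands of lightweight threads and a reduction step; the exact host-side calls depend on which wrapper you pick, so treat this purely as an illustration of the decomposition:

    using System;
    using System.Threading.Tasks;

    class PiParallel
    {
        static void Main()
        {
            const int samplesPerWorker = 1000000;
            const int workers = 16;            // stand-in for "many GPU threads"
            long insideTotal = 0;
            object gate = new object();

            Parallel.For(0, workers,
                () => 0L,                                  // per-worker partial count
                (worker, state, inside) =>
                {
                    var rng = new Random(worker * 9973 + 1); // independent seed per worker
                    for (int i = 0; i < samplesPerWorker; i++)
                    {
                        double x = rng.NextDouble();
                        double y = rng.NextDouble();
                        if (x * x + y * y <= 1.0)
                            inside++;
                    }
                    return inside;
                },
                inside => { lock (gate) insideTotal += inside; }); // final reduction

            double pi = 4.0 * insideTotal / ((double)workers * samplesPerWorker);
            Console.WriteLine("Pi ~= {0}", pi);
        }
    }

Each worker is completely independent until the final sum, which is exactly why this kind of problem maps so well onto the GPU.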