It seems to me that one could theoretically use WebGL for computation, such as computing primes or π or something along those lines. However, from what little I've seen, the shader itself isn't written in JavaScript, so I have a few questions:
If relevant, in this specific case, I'm trying to factor fairly large numbers as part of a [very] extended compsci project.
EDIT:
I've done shader-based computation from JavaScript in Chrome using WebGL, solving the travelling salesman problem as a distributed set of smaller optimization problems evaluated in the fragment shader, as well as a few other genetic optimization problems.
Problems:
You can put floats in (r: 1.00, g: 234.24234, b: -22.0), but you can only get integers out (r: 255, g: 255, b: 0). This can be overcome by encoding a single float into 4 output integers per fragment. That operation is so heavy, though, that it almost defeats the purpose for 99% of problems. You're better off solving problems that have simple integer or boolean sub-solutions.
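As a rough sketch of the decoding half of that trick (this assumes the fragment shader packed a value in [0, 1) across the four 8-bit RGBA channels using the common fract()-based scheme; the constants below belong to that assumed scheme, not to anything specific in my setup):

```js
// Reassemble one float in [0, 1) from 4 bytes produced by gl.readPixels
// with format gl.RGBA and type gl.UNSIGNED_BYTE.
function unpackFloat(bytes, offset) {
  const r = bytes[offset]     / 255;
  const g = bytes[offset + 1] / 255;
  const b = bytes[offset + 2] / 255;
  const a = bytes[offset + 3] / 255;
  // Weights mirror the packing side: 1, 1/255, 1/65025, 1/16581375.
  return r + g / 255 + b / 65025 + a / 16581375;
}
```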
Debugging is a nightmare of epic proportions, and the community and tooling around this are, at the time of writing, still actively taking shape.
Injecting data into the shader as pixel data is VERY slow, and reading it out is even slower. To give you an example, reading and writing the data to solve a TSP problem takes 200 and 400 ms respectively, while the actual 'draw' or 'compute' time for that data is 14 ms. To be worthwhile, your data set has to be large enough (and shaped the right way) that the compute time dominates the transfer overhead.
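For context, the slow parts are the texture upload and the readback around the draw call. A minimal sketch of that round trip, assuming `gl` is an existing WebGLRenderingContext with a shader program and render target already set up (the dimensions are made up):

```js
const width = 1024, height = 1024;

// Upload: pack the input into an RGBA / UNSIGNED_BYTE texture, one texel per work item.
const input = new Uint8Array(width * height * 4);
const tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0, gl.RGBA, gl.UNSIGNED_BYTE, input);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);

// ... draw a full-screen quad so the fragment shader runs once per output texel ...

// Readback: this stalls the pipeline and is usually the slowest step of all.
const output = new Uint8Array(width * height * 4);
gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, output);
```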
JavaScript is weakly typed (on the surface...), whereas OpenGL ES is strongly typed. In order to interoperate we have to use things like Int32Array or Float32Array in JavaScript, which feels awkward and constraining in a language normally touted for its freedoms.
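In practice that looks something like this (the uniform locations here are hypothetical placeholders):

```js
// GLSL wants exact binary layouts, so the JavaScript side has to commit to a type up front.
const weights = new Float32Array([0.25, 0.5, 0.75, 1.0]); // 32-bit IEEE 754 floats
const flags   = new Int32Array([1, 0, 1, 1]);             // 32-bit signed integers

// A plain untyped Array would lose the guarantee that the bytes match
// what the shader declares for these uniforms.
gl.uniform1fv(weightsLocation, weights);
gl.uniform1iv(flagsLocation, flags);
```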
Big number support comes down to using 5 or 6 textures of input data, combining all of that pixel data into a single number structure (somehow...), and then operating on that big number in a meaningful way inside the shader. Very hacky, and not at all recommended.
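Purely to illustrate the kind of plumbing involved (the limb layout below is an arbitrary choice, a sketch rather than a recommendation), splitting a non-negative BigInt into 16-bit limbs so each limb fits in one RGBA texel might look like:

```js
// One 16-bit limb per RGBA texel: R = low byte, G = high byte, B and A unused.
// The returned Uint8Array can be uploaded via texImage2D(..., gl.RGBA, gl.UNSIGNED_BYTE, ...).
function bigIntToLimbTexture(value, limbCount) {
  const texels = new Uint8Array(limbCount * 4);
  for (let i = 0; i < limbCount; i++) {
    const limb = Number(value & 0xFFFFn);
    texels[i * 4]     = limb & 0xFF;
    texels[i * 4 + 1] = (limb >> 8) & 0xFF;
    value >>= 16n;
  }
  return texels;
}
```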