I'm looking to parallelize some complex math, and WebGL looks like the perfect way to do it. The problem is that you can only read 8-bit integers from textures, and I would ideally like to get 32-bit numbers out of them. My idea is to use the four color channels to get 32 bits per pixel, instead of four separate 8-bit values.
My problem is that GLSL doesn't have a "%" operator or any bitwise operators!
TL;DR: How do I split a 32-bit number into four 8-bit numbers using only the operators available in GLSL?
Some extra info on the technique (using bitwise operators):
How to store a 64 bit integer in two 32 bit integers and convert back again
You can bitshift by multiplying/dividing by powers of two.
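A minimal sketch of that idea, assuming GLSL ES 1.0 with highp floats (the names shiftRight and packBytes are mine, not from the original): GLSL lacks the >> and % operators, but mod() is a built-in function, and a right shift by n bits is just a division by 2^n followed by floor(). Keep in mind that a highp float only represents integers exactly up to 2^24, so a genuine 32-bit value cannot be round-tripped through a single float, which is one reason the float-encoding approach further below exists:

// Emulate (v >> bits) for a non-negative, integer-valued float.
float shiftRight(float v, float bits) {
    return floor(v / exp2(bits));
}

// Split an integer-valued float into four bytes (low byte first),
// scaled to [0, 1] so they can be written to an RGBA8 render target.
vec4 packBytes(float v) {
    float b0 = mod(v, 256.0);
    float b1 = mod(shiftRight(v, 8.0), 256.0);
    float b2 = mod(shiftRight(v, 16.0), 256.0);
    float b3 = mod(shiftRight(v, 24.0), 256.0);
    return vec4(b0, b1, b2, b3) / 255.0;
}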
As pointed out in the comments, the approach I originally posted seemed to work but was incorrect. Here's one by Aras Pranckevičius instead; note that the source code in his post is HLSL and contains a typo. This is a GLSL port with the typo corrected:
// Powers of 255 used to spread a float in [0, 1) across the four channels.
const vec4 bitEnc = vec4(1., 255., 65025., 16581375.);
const vec4 bitDec = 1. / bitEnc;

// Encode a float in [0, 1) into an RGBA color, one "digit" per channel.
vec4 EncodeFloatRGBA (float v) {
    vec4 enc = bitEnc * v;
    enc = fract(enc);
    // Remove from each channel the part already carried by the next channel.
    enc -= enc.yzww * vec2(1./255., 0.).xxxy;
    return enc;
}

// Decode the RGBA color back into the original float.
float DecodeFloatRGBA (vec4 v) {
    return dot(v, bitDec);
}
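For completeness, a hedged usage sketch (the varying name vUv and the computeResult placeholder are mine, not from Aras's post): the shader writes the encoded value as its fragment color, you read the RGBA8 pixels back with gl.readPixels on the JavaScript side, divide each byte by 255, and apply the same dot product as DecodeFloatRGBA to recover the float. The value passed to EncodeFloatRGBA must lie in [0, 1) for the round trip to work.

// Illustrative fragment shader; assumes EncodeFloatRGBA from above
// is defined earlier in the same shader source.
precision highp float;
varying vec2 vUv;

// Placeholder for whatever math you are parallelizing;
// must return a value in [0, 1).
float computeResult(vec2 uv) {
    return fract(sin(dot(uv, vec2(12.9898, 78.233))) * 43758.5453);
}

void main() {
    gl_FragColor = EncodeFloatRGBA(computeResult(vUv));
}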