Nota bene: I realize this is an immensely complicated question with about a million levels of nuance that I'm trying to reduce to a single number...
I'm about to undertake a large video encoding project using H.264. We're creating multiple bitrate profiles to accommodate a range of connection speeds, processors, and devices.
Generally speaking, what kind of compression ratio should I be expecting to see (while staying within a reasonable level of quality)?
For example, a 640x360 (16:9) video at 24 frames per second with 16 bits per color channel (48 bits per pixel) yields an uncompressed stream of approximately 33 MB/s.
I've been told that, for that file, 500 Kbit/s (about 62.5 KB/s) is not an unreasonable video bitrate. That seems insane: more than 530:1 compression, or 99.8% of the data thrown away. Is my math wrong?
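A quick sketch of that arithmetic (assuming 48 bits per pixel, i.e. 16 bits per channel, which is the depth that makes the ~33 MB/s figure work out; plain 16-bit color at 2 bytes/pixel would give about a third of that):

```python
# Uncompressed bitrate and compression ratio for the example above.
width, height, fps = 640, 360, 24
bits_per_pixel = 48  # assumed: 16 bits per channel x 3 channels

uncompressed_bps = width * height * fps * bits_per_pixel
uncompressed_mb_per_s = uncompressed_bps / 8 / 1_000_000  # megabytes/s

encoded_kbps = 500  # the suggested H.264 bitrate
ratio = (uncompressed_bps / 1000) / encoded_kbps

print(f"uncompressed: {uncompressed_mb_per_s:.1f} MB/s")  # ~33.2 MB/s
print(f"compression ratio: {ratio:.0f}:1")                # ~531:1
```

So the numbers are internally consistent at that color depth; the ratio really is on the order of 500:1.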
I'm just looking for a rough outer guide for quality, like "more than 500x compression is crazy" or "less than 400x is a waste of bandwidth". I've looked everywhere, and nothing gives me any kind of expected compression...
In an interesting document called the H.264 Primer, a simple formula is given as a hint for computing the "ideal" output bitrate, based on the video's characteristics:
[image width] x [image height] x [framerate] x [motion rank] x 0.07 = [desired bitrate]
where the image width and height are expressed in pixels, and the motion rank is an integer between 1 and 4: 1 for low motion, 2 for medium motion, and 4 for high motion (motion being the amount of image data that changes between frames; see the linked document for more information).
So for instance, if we take a 1280x720 video at 24 FPS with medium motion (a movie with slow camera movements and few scene changes), the expected ideal bitrate would be:
1280 x 720 x 24 x 2 x 0.07 = 3,096,576 bps => approximately 3,100 kbps
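The formula is trivial to put in code if you want to tabulate bitrates for several profiles (the helper name below is ours; the 0.07 constant and the motion ranks come from the primer):

```python
def estimate_bitrate(width, height, fps, motion_rank):
    """Rule-of-thumb H.264 bitrate from the H.264 Primer's formula.

    motion_rank: 1 (low motion), 2 (medium), or 4 (high).
    Returns the suggested bitrate in bits per second.
    """
    return width * height * fps * motion_rank * 0.07

bps = estimate_bitrate(1280, 720, 24, 2)
print(f"{bps / 1000:,.0f} kbps")  # ~3,097 kbps for the example above
```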
This is purely a rule of thumb, and in my opinion, the only way to accurately find the ideal bitrate is trial and error :)