I can't seem to find any official documentation on this, and my searches are turning up a lot of conflicting information...
From iphonedevsdk.com:
The accelerometers used in the first- and second-generation iPhones (and, I assume, the iPod touches as well) operate in two modes: +/- 2 g and +/- 8 g. (Actually, as you observed, they report accelerations somewhat outside the nominal range; accuracy just isn't spec'd outside that range.)
Apple operates them in the +/- 2 g mode. There is a tradeoff: in that mode the resolution is nominally 0.018 g, according to the datasheet (though my first-generation iPhone reports steps of 0.018168604 g, according to a modified version of AccelerometerGraph). In the +/- 8 g mode the resolution would be four times cruder, about 0.072 g.
I assume Apple decided that the finer resolution would be more useful than the wider range. (I'd rather see finer resolution than 0.018 g. So neither of us is fully satisfied.)
You cannot change the mode through any published API. Since acceleration is passed as a double, Apple could in principle allow a mode change and simply consult the current mode when rescaling the A/D value before reporting the acceleration. (The obvious place to set the mode would be the call that sets an application up to receive accelerometer updates.) For backward compatibility, though, the OS would have to reset the mode to +/- 2 g when each application starts, and no background process could be allowed to change the accelerometer mode.
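That 0.018 g step is easy to see empirically. Below is a rough sketch of the kind of measurement the "modified AccelerometerGraph" above performs, written against the modern Core Motion API (CMMotionManager) rather than the old UIAccelerometer delegate it actually used: collect raw samples and take the smallest gap between distinct readings as an estimate of the A/D step.

```swift
import CoreMotion

// Sketch only: estimate the accelerometer's quantization step from raw samples.
// Assumes Core Motion (CMMotionManager); the original AccelerometerGraph sample
// used the old UIAccelerometer delegate instead. Run this somewhere the manager
// stays alive (e.g. a view controller or a playground).
let motionManager = CMMotionManager()
var samples: [Double] = []

motionManager.accelerometerUpdateInterval = 1.0 / 100.0   // 100 Hz
motionManager.startAccelerometerUpdates(to: .main) { data, _ in
    guard let z = data?.acceleration.z else { return }
    samples.append(z)
    guard samples.count == 1000 else { return }

    motionManager.stopAccelerometerUpdates()

    // The smallest nonzero gap between distinct readings approximates the A/D step.
    let distinct = Array(Set(samples)).sorted()
    var step = Double.greatestFiniteMagnitude
    for i in 1..<distinct.count {
        let gap = distinct[i] - distinct[i - 1]
        if gap > 1e-6 { step = min(step, gap) }
    }
    print("estimated resolution: \(step) g")   // nominally ~0.018 g in +/- 2 g mode
}
```

In the +/- 2 g mode this should land near 0.018 g; in a hypothetical +/- 8 g mode you would expect roughly 0.072 g.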
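And to make that last point concrete: there is no published way to select the range. The `desiredRange` property below is invented purely to show where such a setting would most naturally sit if it existed, next to the call that starts accelerometer updates; because acceleration is reported as a double in units of g, only the scaling of the raw counts would change.

```swift
import CoreMotion

// Hypothetical only: `desiredRange` is NOT a real CMMotionManager property.
// It just marks where a +/- 2 g vs +/- 8 g selection would most naturally go
// if Apple ever exposed one, alongside the call that starts updates.
final class MotionSource {
    private let motionManager = CMMotionManager()

    func start() {
        motionManager.accelerometerUpdateInterval = 1.0 / 60.0

        // motionManager.desiredRange = .plusMinus8g   // invented, does not exist

        motionManager.startAccelerometerUpdates(to: .main) { data, _ in
            guard let a = data?.acceleration else { return }
            // Acceleration arrives as doubles in units of g, so a wider hardware
            // range would only change the scaling of the raw counts, not this API.
            print(a.x, a.y, a.z)
        }
    }
}
```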