Can someone point me to a good definition of Gauge32 vs Counter32? I understand that Counter32 can wrap, but Gauge32 can't.
I'm trying to understand their semantics. For example, I've heard you should take the difference between two Counter32 readings to get a value/second. Is there something like that for a Gauge32 value?
Thanks for any insight.
The best definition of these (i.e. the definition) is in the sections of the RFC that define them: RFC 2578.
As the RFC says, a Counter32 has no defined initial value, so a single reading of a Counter32 has no information content. This is why you have to take two (or more) readings to make sense of it. An example of this is the number of packets received on an Ethernet interface. If you take a reading and get back 4 million packets, you haven't learned anything: the wire could have been pulled out of the interface for the past year, or it could be passing millions of packets per second. You have to take multiple readings to know anything.
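To make the "difference between two readings" idea concrete, here is a minimal sketch (not from the RFC, and with made-up sample values) of turning two Counter32 readings into a per-second rate, including the modulo-2^32 wrap the question mentions:

```python
COUNTER32_MOD = 2**32  # Counter32 values wrap modulo 2^32 per RFC 2578

def counter32_rate(old_value, new_value, old_time, new_time):
    """Per-second rate from two Counter32 samples.

    Assumes the counter wrapped at most once between the two readings;
    if it could have wrapped more than once, the poll interval is too
    long and the result is unreliable.
    """
    elapsed = new_time - old_time
    if elapsed <= 0:
        raise ValueError("readings must be taken at increasing times")
    delta = (new_value - old_value) % COUNTER32_MOD  # handles a single wrap
    return delta / elapsed

# Hypothetical packet-counter readings taken 60 seconds apart, across a wrap:
print(counter32_rate(4_294_967_000, 1_500, 0, 60))  # ~29.9 packets/sec
```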
A Gauge32, on the other hand, measures some quantity at a moment in time and may go up or down. You can't necessarily make meaningful observations about two (or more) readings over time. An example of this is free disk space. You can fetch the value now, and an hour from now, and find that the change is zero -- but you can't draw the conclusion that nothing has been written to disk over the course of the hour. It's possible that the disk is getting hammered with constant additions and deletions that do not result in a net change in free space.
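A short sketch of the contrast, again with made-up numbers: each Gauge32 sample stands on its own, so you keep the samples themselves (or summaries of them) rather than differencing readings.

```python
# Hypothetical free-disk-space samples (in kilobytes) polled over an hour:
free_kbytes_samples = [52_428_800, 52_428_800, 51_200_000, 52_428_800]

# Differencing the first and last samples suggests "nothing changed",
# which, as described above, tells you nothing about activity in between.
print(free_kbytes_samples[-1] - free_kbytes_samples[0])  # 0

# Summaries of the individual samples are what actually carry information:
print(min(free_kbytes_samples),
      max(free_kbytes_samples),
      sum(free_kbytes_samples) / len(free_kbytes_samples))
```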