I'm trying to implement an X.509 certificate generator from scratch (I know about the existing ones, but I need yet another one). What I cannot understand is how to calculate the SHA-1 (or any other) fingerprint of the certificate.
RFC 5280 says that the input to the signature function is the DER-encoded tbsCertificate field. Unfortunately, the hash I calculate differs from the one produced by OpenSSL. Here's a step-by-step example.
1. Take a certificate in its DER (binary) form.
2. Compute its fingerprint with openssl x509 -fingerprint.
3. Extract the DER-encoded tbsCertificate field and hash it with the sha1sum utility.

Now, the hashes I get at steps 2 and 3 are different. Can someone please give me a hint what I may be doing wrong?
Ok, so it turned out that the fingerprint calculated by OpenSSL is simply a hash over the whole certificate (in its DER binary encoding, not the ASCII PEM one!), not only the TBS part, as I thought.
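To illustrate, here is a minimal Python sketch that reproduces the OpenSSL fingerprint; it assumes the certificate is already available in DER form in a file named cert.der (a placeholder path):

```python
import hashlib

# Read the whole DER-encoded certificate (binary, not PEM!).
with open("cert.der", "rb") as f:   # "cert.der" is a placeholder path
    der = f.read()

# The fingerprint is simply the hash of the complete certificate bytes.
fingerprint = hashlib.sha1(der).hexdigest().upper()
print(":".join(fingerprint[i:i + 2] for i in range(0, len(fingerprint), 2)))
```

The printed value should match the hex part of `openssl x509 -inform DER -in cert.der -noout -fingerprint -sha1`.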
For anyone who cares about calculating a certificate's digest (the value that actually gets signed), it is done differently: the hash is calculated over the DER-encoded tbsCertificate part only (again, not the PEM string), including its own ASN.1 header (the tag 0x30 == ASN1_SEQUENCE | ASN1_CONSTRUCTED and the length field). Please note that the outer certificate's ASN.1 header is not taken into account.
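Here is a hedged sketch of that calculation, again assuming a DER file named cert.der. It parses just enough of the DER structure (one tag byte plus a short- or long-form length) to find the tbsCertificate slice, which keeps its own header but drops the outer one:

```python
import hashlib

def read_header(der, offset):
    """Parse a DER tag/length at `offset`; return (header_len, content_len).

    Handles the definite short and long length forms, which is all a
    certificate needs.
    """
    length_byte = der[offset + 1]
    if length_byte < 0x80:                      # short form: one length byte
        return 2, length_byte
    num_len_bytes = length_byte & 0x7F          # long form: 0x8N + N length bytes
    content_len = int.from_bytes(der[offset + 2:offset + 2 + num_len_bytes], "big")
    return 2 + num_len_bytes, content_len

with open("cert.der", "rb") as f:               # "cert.der" is a placeholder path
    der = f.read()

# Skip the outer Certificate SEQUENCE header; it is NOT part of the digest.
outer_hdr_len, _ = read_header(der, 0)

# The first element inside is tbsCertificate. Its own header (0x30 tag and
# length bytes) IS included in the hashed data.
tbs_hdr_len, tbs_content_len = read_header(der, outer_hdr_len)
tbs = der[outer_hdr_len:outer_hdr_len + tbs_hdr_len + tbs_content_len]

print(hashlib.sha1(tbs).hexdigest())
```

Piping the same tbs bytes through the sha1sum utility should produce an identical value; for verifying a real signature you would of course use the hash algorithm named in the certificate's signatureAlgorithm field rather than SHA-1.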