I am trying to implement simple validation of credit card numbers. I read about the Luhn algorithm on Wikipedia:
- Counting from the check digit, which is the rightmost, and moving left, double the value of every second digit.
- Sum the digits of the products (e.g., 10: 1 + 0 = 1, 14: 1 + 4 = 5) together with the undoubled digits from the original number.
- If the total modulo 10 is equal to 0 (if the total ends in zero) then the number is valid according to the Luhn formula; else it is not valid.
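To check that I am reading the steps correctly, here is the calculation for the commonly cited test number 79927398713:
- Doubled digits (every second digit from the right, skipping the check digit): 1, 8, 3, 2, 9 become 2, 16, 6, 4, 18, and their digit sums are 2 + 7 + 6 + 4 + 9 = 28.
- Undoubled digits: 3 + 7 + 9 + 7 + 9 + 7 = 42.
- Total: 28 + 42 = 70, which ends in zero, so the number is valid.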
The description of the Luhn algorithm on Wikipedia is easy to understand. However, I have also seen other implementations of it on Rosetta Code and elsewhere (archived).
Those implementations work very well, but I am confused about why they can use an array to do the work. The array they use seems to have no relation to the Luhn algorithm, and I can't see how it achieves the steps described on Wikipedia.
Why are they using arrays? What is their significance, and how are they used to implement the algorithm as described on Wikipedia? A sketch of the kind of code I mean follows below.
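This is my own rough reconstruction of the array-based style, not the exact Rosetta Code source; the name luhnTable is mine, but the hard-coded array is the one those implementations seem to use:

// Rough reconstruction of the array-based style I am asking about (not the exact Rosetta Code source).
// luhnTable is my own name; the hard-coded values are the ones those implementations use.
function luhn_with_array(value) {
    var luhnTable = [0, 2, 4, 6, 8, 1, 3, 5, 7, 9];
    var digits = value.replace(/\D/g, "");
    var sum = 0;
    for (var i = digits.length - 1, even = false; i >= 0; i--, even = !even) {
        var d = parseInt(digits.charAt(i), 10);
        // every second digit from the right goes through a table lookup
        // instead of being doubled and digit-summed explicitly
        sum += even ? luhnTable[d] : d;
    }
    return sum % 10 === 0;
}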
Unfortunately, none of that code worked for me, but I found a working solution on GitHub:
// takes the form field value and returns true on valid number
function valid_credit_card(value) {
    // accept only digits, dashes or spaces
    if (/[^0-9-\s]+/.test(value)) return false;

    // The Luhn Algorithm. It's so pretty.
    var nCheck = 0, nDigit = 0, bEven = false;
    value = value.replace(/\D/g, "");

    for (var n = value.length - 1; n >= 0; n--) {
        var cDigit = value.charAt(n),
            nDigit = parseInt(cDigit, 10);

        if (bEven) {
            if ((nDigit *= 2) > 9) nDigit -= 9;
        }

        nCheck += nDigit;
        bEven = !bEven;
    }

    return (nCheck % 10) == 0;
}
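A quick sanity check with standard Luhn test values (79927398713 is the commonly cited test number, and 4111 1111 1111 1111 is a widely used dummy card number, not a real one):

// Quick sanity check with standard test values (not real card numbers)
console.log(valid_credit_card("79927398713"));          // true  - commonly cited Luhn test number
console.log(valid_credit_card("4111 1111 1111 1111"));  // true  - spaces are allowed and stripped
console.log(valid_credit_card("4111-1111-1111-1112"));  // false - last digit changed, so the mod 10 check fails
console.log(valid_credit_card("4111x1111"));            // false - rejected by the character filter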