I have a series of ASCII flat files coming in from a mainframe to be processed by a C# application. A new feed has been introduced with a Packed Decimal (COMP-3) field, which needs to be converted to a numerical value.
The files are being transferred via FTP using ASCII transfer mode. I am concerned that the binary field may contain bytes that will be interpreted as very low ASCII codes or control characters instead of a value, or worse, may be lost in the FTP process.
What's more, the fields are being read as strings. I may have the flexibility to work around this part (e.g. by reading the file through a stream of some sort), but the business will give me pushback.
The requirement read "Convert from HEX to ASCII", but clearly that didn't yield the correct values. Any help would be appreciated; it need not be language-specific as long as you can explain the logic of the conversion process.
I have been watching the posts on numerous boards concerning converting Comp-3 BCD data from "legacy" mainframe files to something usable in C#. First, I would like to say that I am less than enamoured by the responses that some of these posts have received - especially those that have said essentially "why are you bothering us with these non-C#/C++ related posts" and also "If you need an answer about some sort of COBOL convention, why don't you go visit a COBOL oriented site". This, to me, is complete BS, as there is going to be a need for probably many years to come (unfortunately) for software developers to understand how to deal with some of these legacy issues that exist in THE REAL WORLD. So, even if I get slammed on this post for the following code, I am going to share with you a REAL WORLD experience that I had to deal with regarding COMP-3/EBCDIC conversion (and yes, I am he who talks of "floppy disks, paper tape, disc packs, etc." - I have been a software engineer since 1979).
First - understand that any file you read from a legacy mainframe system such as an IBM machine is going to present the data to you in EBCDIC format. In order to convert any of that data to a C#/C++ string you can deal with, you are going to have to use the proper code page translation to get the data into ASCII format. A good example of how to handle this would be:
StreamReader readFile = new StreamReader(path, Encoding.GetEncoding(37)); // code page 37 = EBCDIC to ASCII translation.
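A side note that may or may not apply to your environment (treat it as an assumption to verify): if the application targets .NET Core or .NET 5+ rather than the full .NET Framework, code page 37 is not available until you register the code-pages encoding provider from the System.Text.Encoding.CodePages package:
// Only needed on .NET Core / .NET 5+; the full .NET Framework ships code page 37 out of the box.
// Requires a reference to the System.Text.Encoding.CodePages package.
Encoding.RegisterProvider(CodePagesEncodingProvider.Instance);
Encoding ebcdic = Encoding.GetEncoding(37);   // IBM EBCDIC (US/Canada)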
This will ensure that anything you read from this stream is converted to ASCII and can be used as a string. This includes "Zoned Decimal" (PIC 9) and "Text" (PIC X) fields as declared by COBOL. However, it does not convert COMP-3 fields to the correct binary equivalent when they are read into a char[] or byte[] array - no code page (UTF-8, UTF-16, Default or whatever) will translate packed data correctly. To get at the raw bytes, you are going to want to open the file like this:
FileStream fileStream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read);
Of course, the "FileShare.Read" option is "optional".
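To make that concrete, here is a minimal sketch of pulling one fixed-length record from that stream and slicing out the packed field so it never passes through a text encoding. The record length, field offset and field size below are made-up illustration values - take the real ones from the copybook for your feed - and Unpack is the method shown further down.
// Illustrative numbers only - substitute the record length, offset and field size from your copybook.
byte[] record = new byte[100];                               // assumed fixed record length
int bytesRead = fileStream.Read(record, 0, record.Length);   // read one raw record, no translation

byte[] packedField = new byte[3];                            // assumed: a PIC S9(5) COMP-3 field is 3 bytes
Array.Copy(record, 20, packedField, 0, packedField.Length);  // assumed field offset of 20 bytes

Decimal value = Unpack(packedField, 0);                      // scale 0 - no implied decimal places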
When you have isolated the field that you want to convert to a decimal value (and then subsequently to an ASCII string if need be), you can use the following code - it has been basically stolen from the Microsoft "UnpackDecimal" posting that you can get at:
I have isolated (I think) the most important parts of this logic and consolidated them into two methods that you can do with what you want. For my purposes, I chose to leave this as returning a Decimal value, which I could then use however I wanted. Basically, the method is called "Unpack" and you pass it a byte[] array (no longer than 12 bytes) and the scale as an int, which is the number of decimal places you want returned in the Decimal value. I hope this works for you as well as it did for me.
private Decimal Unpack(byte[] inp, int scale)
{
    long lo = 0;
    long mid = 0;
    long hi = 0;
    bool isNegative;

    // this nybble stores only the sign, not a digit.
    // "C" hex is positive, "D" hex is negative, and "F" hex is unsigned.
    switch (nibble(inp, 0))
    {
        case 0x0D:
            isNegative = true;
            break;
        case 0x0F:
        case 0x0C:
            isNegative = false;
            break;
        default:
            throw new Exception("Bad sign nibble");
    }

    long intermediate;
    long carry;
    long digit;

    for (int j = inp.Length * 2 - 1; j > 0; j--)
    {
        // multiply by 10
        intermediate = lo * 10;
        lo = intermediate & 0xffffffff;
        carry = intermediate >> 32;
        intermediate = mid * 10 + carry;
        mid = intermediate & 0xffffffff;
        carry = intermediate >> 32;
        intermediate = hi * 10 + carry;
        hi = intermediate & 0xffffffff;
        carry = intermediate >> 32;

        // By limiting input length to 14, we ensure overflow will never occur
        digit = nibble(inp, j);
        if (digit > 9)
        {
            throw new Exception("Bad digit");
        }
        intermediate = lo + digit;
        lo = intermediate & 0xffffffff;
        carry = intermediate >> 32;
        if (carry > 0)
        {
            intermediate = mid + carry;
            mid = intermediate & 0xffffffff;
            carry = intermediate >> 32;
            if (carry > 0)
            {
                intermediate = hi + carry;
                hi = intermediate & 0xffffffff;
                carry = intermediate >> 32;
                // carry should never be non-zero. Back up with validation
            }
        }
    }
    return new Decimal((int)lo, (int)mid, (int)hi, isNegative, (byte)scale);
}

private int nibble(byte[] inp, int nibbleNo)
{
    int b = inp[inp.Length - 1 - nibbleNo / 2];
    return (nibbleNo % 2 == 0) ? (b & 0x0000000F) : (b >> 4);
}
If you have any questions, post them on here - because I suspect that I am going to get "flamed" like everyone else who has chosen to post questions that are pertinent to today's issues...
Thanks, John - The Elder.