I have some CLOB columns in a database into which I need to put Base64-encoded binary files. These files can be large, so I need to stream them; I can't read the whole thing into memory at once.

I'm using org.apache.commons.codec.binary.Base64InputStream to do the encoding, and I'm running into a problem. My code is essentially this:
FileInputStream fis = new FileInputStream(file);
Base64InputStream b64is = new Base64InputStream(fis, true, -1, null);
BufferedReader reader = new BufferedReader(new InputStreamReader(b64is));
preparedStatement.setCharacterStream(1, reader);
When I run the above code, the update fails during execution with

java.io.IOException: Underlying input stream returned zero bytes

thrown from deep inside the InputStreamReader code.

Why would this not work? It seems to me that the reader would read from the Base64 stream, which would read from the file stream, and everything should be happy.
This appears to be a bug in Base64InputStream; you're calling it correctly. You should report it to the Apache Commons Codec project.
Simple test case:
import java.io.*;
import org.apache.commons.codec.binary.Base64InputStream;

class tmp {
    public static void main(String[] args) throws IOException {
        FileInputStream fis = new FileInputStream(args[0]);
        Base64InputStream b64is = new Base64InputStream(fis, true, -1, null);
        while (true) {
            byte[] c = new byte[1024];
            int n = b64is.read(c);
            if (n < 0) break;                                   // proper end of stream
            if (n == 0) throw new IOException("returned 0!");   // contract violation
            for (int i = 0; i < n; i++) {
                System.out.print((char) c[i]);
            }
        }
    }
}
The read(byte[]) method of InputStream is not allowed to return 0 for a non-empty buffer, but Base64InputStream does return 0 for any file whose length is a multiple of 3 bytes. That zero-byte read is exactly what makes InputStreamReader throw.
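Until that's fixed, one possible stop-gap is to wrap the encoding stream in a small FilterInputStream that never lets a 0-byte bulk read reach InputStreamReader. The sketch below is my own workaround idea, not anything provided by Commons Codec: it retries a couple of times and then treats a persistent 0 as end of stream, which matches the failure mode the test case shows (the spurious 0 appears once the file data is exhausted). Treat it as a sketch under that assumption, not a vetted fix.

import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

/**
 * Hypothetical stop-gap: guarantees that a bulk read() never returns 0 for a
 * non-empty buffer, which is what InputStreamReader chokes on. A persistent 0
 * is treated as end of stream, on the assumption (from the test above) that
 * the spurious 0 only occurs once the underlying file is exhausted.
 */
class NonZeroReadInputStream extends FilterInputStream {

    NonZeroReadInputStream(InputStream in) {
        super(in);
    }

    // InputStreamReader only uses this bulk read, so overriding it is enough;
    // FilterInputStream.read(byte[]) also delegates here.
    @Override
    public int read(byte[] b, int off, int len) throws IOException {
        if (len == 0) {
            return 0; // zero-length requests may legitimately return 0
        }
        for (int attempt = 0; attempt < 3; attempt++) {
            int n = super.read(b, off, len);
            if (n != 0) {
                return n; // real data, or a proper -1 at end of stream
            }
        }
        return -1; // still 0 after retries: assume the stream is really done
    }
}

With that in place, the reader in the original code would be built as

BufferedReader reader = new BufferedReader(new InputStreamReader(new NonZeroReadInputStream(b64is)));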