I'm struggling to use JdbcIO with Apache Beam 2.0 (Java) to connect to a Cloud SQL instance from Dataflow within the same project.
I'm getting the following error:
java.sql.SQLException: Cannot create PoolableConnectionFactory (Communications link failure
The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.)
According to the documentation, the Dataflow service account *@dataflow-service-producer-prod.iam.gserviceaccount.com should have access to all resources within the same project as long as it has the "Editor" role.
When I run the same pipeline with the DirectRunner, everything works fine.
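For reference, the runner is selected through the pipeline options, roughly like this (the project id and bucket below are placeholders, not my actual configuration):

import org.apache.beam.runners.dataflow.DataflowRunner;
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

// Placeholder options setup; swapping DataflowRunner.class for
// DirectRunner.class is the only change between the failing and working runs.
DataflowPipelineOptions options = PipelineOptionsFactory.create().as(DataflowPipelineOptions.class);
options.setProject("my-project");              // same project as the Cloud SQL instance
options.setTempLocation("gs://my-bucket/tmp");
options.setRunner(DataflowRunner.class);
Pipeline p = Pipeline.create(options);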
This is the code I'm using:
import java.sql.ResultSet;
import org.apache.beam.sdk.coders.DoubleCoder;
import org.apache.beam.sdk.coders.KvCoder;
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.io.jdbc.JdbcIO;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;

private static final String JDBC_URL = "jdbc:mysql://myip:3306/mydb?verifyServerCertificate=false&useSSL=true";

PCollection<KV<String, Double>> exchangeRates = p.apply(JdbcIO.<KV<String, Double>>read()
    .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration.create("com.mysql.jdbc.Driver", JDBC_URL)
        .withUsername(JDBC_USER)
        .withPassword(JDBC_PW))
    .withQuery("SELECT CurrencyCode, ExchangeRate FROM mydb.mytable")
    .withCoder(KvCoder.of(StringUtf8Coder.of(), DoubleCoder.of()))
    .withRowMapper(new JdbcIO.RowMapper<KV<String, Double>>() {
        public KV<String, Double> mapRow(ResultSet resultSet) throws Exception {
            return KV.of(resultSet.getString(1), resultSet.getDouble(2));
        }
    }));
EDIT:
Using the following approach outside of Beam, in another Dataflow job, works fine with the DataflowRunner, which tells me that the database itself is probably not the problem.
java.sql.Connection connection = DriverManager.getConnection(JDBC_URL, JDBC_USER, JDBC_PW);
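For anyone who wants to reproduce that check, a standalone version looks roughly like this (the class name, query and printed output are just placeholders; only the DriverManager call comes from the snippet above):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ConnectivityCheck {
    // Replace with the same values used in the pipeline.
    private static final String JDBC_URL = "jdbc:mysql://myip:3306/mydb?verifyServerCertificate=false&useSSL=true";
    private static final String JDBC_USER = "user";
    private static final String JDBC_PW = "password";

    public static void main(String[] args) throws Exception {
        // try-with-resources closes the connection even if the query fails.
        try (Connection connection = DriverManager.getConnection(JDBC_URL, JDBC_USER, JDBC_PW);
             Statement statement = connection.createStatement();
             ResultSet rs = statement.executeQuery("SELECT 1")) {
            rs.next();
            System.out.println("Connection OK: " + rs.getInt(1));
        }
    }
}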
Following these instructions on how to connect to Cloud SQL from Java:
https://cloud.google.com/sql/docs/mysql/connect-external-app#java
I managed to make it work.
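Note that this route also needs the Cloud SQL MySQL socket factory on the classpath, alongside the MySQL JDBC driver; that should be the com.google.cloud.sql:mysql-socket-factory Maven artifact linked from the page above, unless its coordinates have changed.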
This is what the code looks like (you must replace MYDBNAME, MYSQLINSTANCE, USER and PASSWORD with your own values).
Heads up: the MYSQLINSTANCE format is project:region:instancename (the "instance connection name" shown on the instance page in the Cloud Console).
And I'm using a custom class (Customer) to store the values for each row, instead of key-value pairs.
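The Customer class is a plain POJO along these lines (adapt the fields to your schema; keep in mind that AvroCoder needs a no-argument constructor to be able to decode instances):

// Sketch of a POJO for one row of the Customers table; field names mirror the query columns.
public class Customer {
    private int customerId;
    @org.apache.avro.reflect.Nullable private String name;
    @org.apache.avro.reflect.Nullable private String location;
    @org.apache.avro.reflect.Nullable private String email;

    public Customer() {}  // required by AvroCoder

    public Customer(int customerId, String name, String location, String email) {
        this.customerId = customerId;
        this.name = name;
        this.location = location;
        this.email = email;
    }
}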
import org.apache.beam.sdk.coders.AvroCoder;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

p.apply(JdbcIO.<Customer>read()
    .withDataSourceConfiguration(
        JdbcIO.DataSourceConfiguration.create(
            "com.mysql.jdbc.Driver",
            "jdbc:mysql://google/MYDBNAME?cloudSqlInstance=MYSQLINSTANCE"
                + "&socketFactory=com.google.cloud.sql.mysql.SocketFactory"
                + "&user=USER&password=PASSWORD"
                + "&useUnicode=true&characterEncoding=UTF-8"))
    .withQuery("SELECT CustomerId, Name, Location, Email FROM Customers")
    .withCoder(AvroCoder.of(Customer.class))
    .withRowMapper(new JdbcIO.RowMapper<Customer>() {
        @Override
        public Customer mapRow(java.sql.ResultSet resultSet) throws Exception {
            // Per-row debug logging; remove once the pipeline works.
            final Logger LOG = LoggerFactory.getLogger(CloudSqlToBq.class);
            LOG.info(resultSet.getString(2));
            // Columns are 1-based: 1 = CustomerId, 2 = Name, 3 = Location, 4 = Email.
            return new Customer(
                resultSet.getInt(1),
                resultSet.getString(2),
                resultSet.getString(3),
                resultSet.getString(4));
        }
    }));
I hope this helps.