I'm using Python and its MySQLdb module to import some measurement data into a MySQL database. The amount of data is quite large (currently about 250 MB of CSV files, with plenty more to come).
Currently I use cursor.execute(...) to import the metadata. That part isn't problematic, as there are only a few entries.
The problem is that when I try to use cursor.executemany() to import larger quantities of the actual measurement data, MySQLdb raises a
TypeError: not all arguments converted during string formatting
My current code is
def __insert_values(self, values):
    cursor = self.connection.cursor()
    cursor.executemany("""
        insert into values (ensg, value, sampleid)
        values (%s, %s, %s)""", values)
    cursor.close()
where values is a list of tuples containing three strings each. Any ideas what could be wrong with this?
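For reference, each batch passed to __insert_values looks roughly like this (illustrative values, not the actual data):

values = [
    ("ENSG00000139618", "0.731", "S01"),
    ("ENSG00000157764", "1.204", "S01"),
    ("ENSG00000141510", "0.095", "S01"),
]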
Edit:
The values are generated by
    yield (prefix + row['id'], row['value'], sample_id)
and then read into a list one thousand at a time, where row comes from a csv.DictReader.
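The surrounding reader code looks roughly like this (a sketch; the file handling and the names csv_path, prefix and sample_id are assumptions based on the description above):

import csv
from itertools import islice

# Methods on the same class as __insert_values shown above.
def __generate_values(self, csv_path, prefix, sample_id):
    # Yield one (ensg, value, sampleid) tuple per CSV row.
    with open(csv_path) as csv_file:
        for row in csv.DictReader(csv_file):
            yield (prefix + row['id'], row['value'], sample_id)

def __import_file(self, csv_path, prefix, sample_id):
    # Consume the generator in chunks of one thousand tuples and hand
    # each chunk to __insert_values().
    rows = self.__generate_values(csv_path, prefix, sample_id)
    while True:
        batch = list(islice(rows, 1000))
        if not batch:
            break
        self.__insert_values(batch)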
In retrospect this was a really stupid but hard-to-spot mistake: values is a reserved keyword in SQL, so the table name values needs backticks around it.
def __insert_values(self, values):
    cursor = self.connection.cursor()
    cursor.executemany("""
        insert into `values` (ensg, value, sampleid)
        values (%s, %s, %s)""", values)
    cursor.close()