I am new to JSON and SqlBulkCopy. I have JSON-formatted POST data that I want to bulk copy/insert into Microsoft SQL Server using C#.
JSON Format:
{
"URLs": [{
"url_name": "Google",
"url_address": "http://www.google.com/"
},
{
"url_name": "Yahoo",
"url_address": "http://www.yahoo.com/"
},
{
"url_name": "FB",
"url_address": "http://www.fb.com/"
},
{
"url_name": "MegaSearches",
"url_address": "http://www.megasearches.com/"
}]
}
Classes:
public class UrlData
{
public List<Url> URLs {get;set;}
}
public class Url
{
public string url_address {get;set;}
public string url_name {get;set;}
}
How can I do that efficiently?
Since you only need to load about 10 to 50 URLs, there's obviously no need to use SqlBulkCopy - it's meant for thousands of inserts. The exception is if you need to repeat this operation many times.
So, if you have a list of URLs, i.e. a List<Url>, just loop through the list and insert each URL into the database, e.g.:
string insertQuery = "insert into TUrls(address, name) values(@address, @name)";
foreach (Url url in listOfUrls)
{
    using (SqlCommand cmd = new SqlCommand(insertQuery, connection))
    {
        cmd.Parameters.AddWithValue("@name", url.url_name);
        cmd.Parameters.AddWithValue("@address", url.url_address);
        // don't forget to take care of the connection (open it before the loop) - I omit this part for clarity
        cmd.ExecuteNonQuery();
    }
}
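If you're wondering how to get listOfUrls from the posted JSON in the first place, here's a minimal sketch, assuming you use Json.NET (Newtonsoft.Json) and that json holds the raw POST body (both are assumptions, not part of your code):
using Newtonsoft.Json;

// "json" is assumed to be the raw POST body string
UrlData urlData = JsonConvert.DeserializeObject<UrlData>(json);
List<Url> listOfUrls = urlData.URLs;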
But if you really need to use SqlBulkCopy, you have to convert your Url objects to a DataTable first. To do this, look at Marc Gravell's answer:
Here's a nice 2013 update using FastMember from NuGet:
IEnumerable<SomeType> data = ...
DataTable table = new DataTable();
using (var reader = ObjectReader.Create(data))
{
    table.Load(reader);
}
Yes, this is pretty much the exact opposite of this one; reflection would suffice - or if you need quicker, HyperDescriptor in 2.0, or maybe Expression in 3.5. Actually, HyperDescriptor should be more than adequate. For example:
// remove "this" if not on C# 3.0 / .NET 3.5 public static DataTable ToDataTable<T>(this IList<T> data) { PropertyDescriptorCollection props = TypeDescriptor.GetProperties(typeof(T)); DataTable table = new DataTable(); for(int i = 0 ; i < props.Count ; i++) { PropertyDescriptor prop = props[i]; table.Columns.Add(prop.Name, prop.PropertyType); } object[] values = new object[props.Count]; foreach (T item in data) { for (int i = 0; i < values.Length; i++) { values[i] = props[i].GetValue(item); } table.Rows.Add(values); } return table; }
So you can use either of Marc's solutions to create a DataTable from your List<Url>. Then you just need to write that table to the destination table on the server:
string csDestination = "put here connection string to database";
using (SqlConnection destinationConnection = new SqlConnection(csDestination))
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection))
{
    destinationConnection.Open();
    bulkCopy.DestinationTableName = "TUrls";
    // the DataTable columns are named after the Url properties,
    // so map them to the TUrls column names
    bulkCopy.ColumnMappings.Add("url_address", "address");
    bulkCopy.ColumnMappings.Add("url_name", "name");
    bulkCopy.WriteToServer(dataTableOfUrls);
}
Hope it helps.
UPD
Of course you can, because SqlBulkCopy is essentially just an insert command that is executed a bit differently, that's all. I'd recommend using intermediate (staging) tables when the operation has a high probability of going wrong (and you want to keep the destination table occupied for as little time as possible), or when you need to do some data transformations - but only if you actually feel the need for it.
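For illustration, here's a minimal sketch of the intermediate-table approach, assuming a hypothetical staging table TUrls_Staging with url_address and url_name columns (the table name and columns are assumptions): bulk copy into the staging table first, then move the rows into TUrls in one statement.
using (SqlConnection connection = new SqlConnection(csDestination))
{
    connection.Open();

    // 1. Bulk copy into the (assumed) staging table
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.DestinationTableName = "TUrls_Staging";
        bulkCopy.WriteToServer(dataTableOfUrls);
    }

    // 2. Move/transform rows from staging into the destination table
    string moveQuery =
        "insert into TUrls(address, name) " +
        "select url_address, url_name from TUrls_Staging";
    using (SqlCommand cmd = new SqlCommand(moveQuery, connection))
    {
        cmd.ExecuteNonQuery();
    }
}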