I'm using NHibernate currently. I have a situation where I need to save a bunch of records to the database like this:
var relatedTopics = GetRelatedTopics(topic);
foreach (var relatedTopic in relatedTopics) // could be anywhere from 10 - 1000+
{
    var newRelatedTopic = new RelatedTopic { RelatedTopicUrl = relatedTopic, TopicUrl = topic.Name };
    _repository.Save(newRelatedTopic);
}
When there are a ton of records to save, hitting the database once per record is obviously very taxing. What's a better approach? Is there some sort of batch update I can do? Am I better off using a DataSet?
Thanks
Setting adonet.batch_size can improve the situation. To do that, set the property when building the session factory. Example (using Fluent NHibernate):
m_sessionFactory = Fluently
    .Configure()
    .Database(MsSqlConfiguration
        .MsSql2005
        .ConnectionString(c => c.FromConnectionStringWithKey("testme")))
    .Mappings(m => m.FluentMappings
        .AddFromAssemblyOf<TestImpl>())
    .ExposeConfiguration(config =>
    {
        // enable ADO.NET batching; this default can be overridden per session
        config.SetProperty("adonet.batch_size", "1");
        m_configuration = config;
    })
    .BuildSessionFactory();
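If you are not using Fluent NHibernate, the same property can be set in hibernate.cfg.xml instead; a sketch (the batch size of 100 here is illustrative, tune it to your workload):

```xml
<hibernate-configuration xmlns="urn:nhibernate-configuration-2.2">
  <session-factory>
    <!-- send up to 100 INSERT/UPDATE statements per database round trip -->
    <property name="adonet.batch_size">100</property>
  </session-factory>
</hibernate-configuration>
```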
Then set the batch size on the session just before the save:
using (ISession session = m_nhibernateSessionFactory.GetSession())
using (var tx = session.BeginTransaction())
{
    session.SetBatchSize(1000);
    foreach (var server in serverz)
    {
        session.SaveOrUpdate(server);
    }
    tx.Commit();
}
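For very large runs it also helps to flush and clear the session periodically, so the first-level cache doesn't keep every saved entity in memory until commit. A sketch of that variation (the interval of 1000 is an assumption; matching it to the batch size is a common choice):

```csharp
using (ISession session = m_nhibernateSessionFactory.GetSession())
using (var tx = session.BeginTransaction())
{
    session.SetBatchSize(1000);
    int count = 0;
    foreach (var server in serverz)
    {
        session.SaveOrUpdate(server);
        if (++count % 1000 == 0)
        {
            // push the pending batch to the database and
            // release the entities from the session cache
            session.Flush();
            session.Clear();
        }
    }
    tx.Commit();
}
```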