Time to save 1000+ records to the database
Question:

I'm using NHibernate currently. I have a situation where I need to save a bunch of records to the database like this:
var relatedTopics = GetRelatedTopics(topic);

foreach (var relatedTopic in relatedTopics /* could be anywhere from 10 - 1000+ */)
{
    var newRelatedTopic = new RelatedTopic { RelatedTopicUrl = relatedTopic, TopicUrl = topic.Name };
    _repository.Save(newRelatedTopic);
}
When there are a ton of records to save, this is obviously very taxing, since it hits the database that many times. What's a better approach? Is there some sort of batch update I can do? Am I better off using a DataSet?
Thanks
Answer:
Setting adonet.batch_size could improve the situation.
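If you are not using Fluent NHibernate, the same property can be set in hibernate.cfg.xml. This is a minimal sketch; the batch size of 100 is an illustrative value you should tune for your workload:

```xml
<!-- hibernate.cfg.xml: enable ADO.NET command batching -->
<hibernate-configuration xmlns="urn:nhibernate-configuration-2.2">
  <session-factory>
    <!-- group up to 100 INSERT/UPDATE statements into one database round trip -->
    <property name="adonet.batch_size">100</property>
  </session-factory>
</hibernate-configuration>
```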
To do that, you have to:

- set adonet.batch_size in the NH configuration, for example:
m_sessionFactory = Fluently
    .Configure()
    .Database(MsSqlConfiguration
        .MsSql2005
        .ConnectionString(c => c.FromConnectionStringWithKey("testme")))
    .Mappings(m => m.FluentMappings
        .AddFromAssemblyOf<TestImpl>())
    .ExposeConfiguration(config =>
    {
        config.SetProperty("adonet.batch_size", "1");
        m_configuration = config;
    })
    .BuildSessionFactory();
- set the batch size on the session just before the save:
using (ISession session = m_nhibernateSessionFactory.GetSession())
using (var tx = session.BeginTransaction())
{
    session.SetBatchSize(1000);
    foreach (var server in serverz)
    {
        session.SaveOrUpdate(server);
    }
    tx.Commit();
}
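With thousands of records in a single session, memory can also become a concern, because the session's first-level cache holds every saved entity until it is cleared. A common complement to command batching is to flush and clear the session at regular intervals. The sketch below is not part of the original answer; `serverz` is reused from the code above, and the flush interval of 100 is an illustrative value:

```
using (ISession session = m_nhibernateSessionFactory.GetSession())
using (var tx = session.BeginTransaction())
{
    session.SetBatchSize(100);
    int count = 0;
    foreach (var server in serverz)
    {
        session.SaveOrUpdate(server);
        if (++count % 100 == 0)
        {
            session.Flush(); // push the batched statements to the database
            session.Clear(); // evict saved entities from the first-level cache
        }
    }
    tx.Commit();
}
```

Matching the flush interval to the batch size means each flush sends roughly one full batch, which keeps both round trips and session memory bounded.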