How to import a large data set using XPO efficiently within a transaction

When you need to import a large data set into a database as XPO persistent objects, the straightforward approaches may be inappropriate. If you create and commit objects one by one, you cannot roll back the changes already committed when a later object fails to commit. If you instead create all objects within a single XPO transaction or unit of work, the changes can be rolled back, but keeping every object in memory until the final commit requires a lot of memory.

The solution demonstrated in this example commits objects in small batches, creating a unit of work for each batch and disposing of it after the batch is committed. To be able to roll back all batches at once, it uses a database-level transaction accessed through the XPO data layer's command channel. Although XPO provides a public API for database transactions (see Using Explicit Transactions), it cannot be used in this scenario because explicit transactions belong to sessions, and here each batch uses a separate session.
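The following C# sketch outlines this approach. It is an illustration, not the example's exact code: the Customer class, connection string, and batch size are placeholders, and you should verify the command-channel constants (CommandChannelHelper.Command_ExplicitBeginTransaction and its commit/rollback counterparts, from DevExpress.Xpo.DB.Helpers) against the XPO version you use.

```csharp
using System;
using DevExpress.Xpo;
using DevExpress.Xpo.DB;
using DevExpress.Xpo.DB.Helpers;

// Placeholder persistent class; the actual example defines its own objects.
public class Customer : XPObject {
    public Customer(Session session) : base(session) { }
    string name;
    public string Name {
        get { return name; }
        set { SetPropertyValue(nameof(Name), ref name, value); }
    }
}

public static class Importer {
    public static void Import(string connectionString, int totalCount, int batchSize) {
        // A single data layer (and therefore a single database connection) is shared
        // by all batches, so the explicit transaction spans every unit of work below.
        IDataLayer dataLayer = XpoDefault.GetDataLayer(connectionString, AutoCreateOption.DatabaseAndSchema);
        ICommandChannel commandChannel = (ICommandChannel)dataLayer;
        // Open a database-level transaction through the data layer's command channel.
        commandChannel.Do(CommandChannelHelper.Command_ExplicitBeginTransaction, null);
        try {
            for(int start = 0; start < totalCount; start += batchSize) {
                // Each batch gets its own short-lived unit of work. Disposing it after
                // the commit releases its object cache, keeping memory usage flat.
                using(UnitOfWork uow = new UnitOfWork(dataLayer)) {
                    int end = Math.Min(start + batchSize, totalCount);
                    for(int i = start; i < end; i++)
                        new Customer(uow) { Name = "Customer " + i };
                    uow.CommitChanges();
                }
            }
            // Every batch succeeded; make all of them permanent at once.
            commandChannel.Do(CommandChannelHelper.Command_ExplicitCommitTransaction, null);
        }
        catch {
            // A failure in any batch rolls back all previously committed batches.
            commandChannel.Do(CommandChannelHelper.Command_ExplicitRollbackTransaction, null);
            throw;
        }
    }
}
```

The key design point is that the command-channel transaction lives at the connection level, below XPO sessions, which is why separate units of work can participate in it as long as they share the same data layer.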

Below is the managed memory allocation chart this example produces if you log GC.GetTotalMemory(true) values in the CreatePersistentObject method:

[Managed memory allocation chart]
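A hypothetical sketch of such logging (reusing the Customer placeholder from above; CreatePersistentObject is the method name the example uses, but this body is illustrative) could look like this:

```csharp
// Hypothetical sketch: create one object and log total managed memory.
// Passing true makes GC.GetTotalMemory wait for a garbage collection before
// returning, so the logged values approximate live objects only.
static void CreatePersistentObject(UnitOfWork uow, int index) {
    new Customer(uow) { Name = "Customer " + index };
    Console.WriteLine(GC.GetTotalMemory(true));
}
```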