We all know that Entity Framework (EF) has poor support for batch operations. If you use EF to insert, update, or delete many rows, it issues one statement per row, which wastes a great deal of time. So how can we optimize EF operations on large amounts of data?
Let's skip the preamble and go straight to the numbers; an optimization claim means nothing without a before-and-after comparison.
Statistics for inserting the same 3,814 rows into a SQL Server database:
Before optimization: the average time was 2479 seconds
After optimization: the average time was 149 seconds
The insert code for the call is as follows:
Code before optimization:
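The original code block did not survive extraction. A typical unoptimized EF6 insert looks like the sketch below; the `Person` entity and `TestContext` names are illustrative assumptions, not taken from the original post.

```csharp
using System.Collections.Generic;
using System.Data.Entity;

// Hypothetical entity and context, for illustration only.
public class Person
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class TestContext : DbContext
{
    public DbSet<Person> People { get; set; }
}

public static class UnoptimizedInsert
{
    // Every entity goes through the EF change tracker, and SaveChanges
    // emits one INSERT statement (one database round-trip) per row.
    public static void Insert(List<Person> people)
    {
        using (var context = new TestContext())
        {
            context.People.AddRange(people);
            context.SaveChanges();
        }
    }
}
```

With thousands of rows, the per-row round-trips and change-tracking overhead are what make this path slow.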
Optimized code:
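The optimized code block is also missing. Assuming the same illustrative `Person`/`TestContext` types, the optimized version would use the `BulkInsert` extension method that Z.EntityFramework.Extensions adds to `DbContext`:

```csharp
using System.Collections.Generic;
using System.Data.Entity;
// Z.EntityFramework.Extensions adds BulkInsert and related extension methods.

public static class OptimizedInsert
{
    // BulkInsert bypasses the EF change tracker and sends all rows to
    // SQL Server in a single bulk operation instead of row-by-row INSERTs.
    public static void Insert(List<Person> people)
    {
        using (var context = new TestContext())
        {
            context.BulkInsert(people);
        }
    }
}
```

The library also offers `BulkSaveChanges` as a drop-in replacement for `SaveChanges`, which batches whatever inserts, updates, and deletes are already tracked by the context.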
Other test code:
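The test harness is likewise missing. A timing loop along these lines (a sketch, using the illustrative types above) would produce the averages reported earlier:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;

public static class InsertBenchmark
{
    // Times one insert strategy over a fixed data set and prints the result.
    public static void Run(string label, Action<List<Person>> insert)
    {
        // Generate the same 3,814 test rows used in the comparison.
        var people = Enumerable.Range(1, 3814)
            .Select(i => new Person { Name = "Person " + i })
            .ToList();

        var watch = Stopwatch.StartNew();
        insert(people);
        watch.Stop();

        Console.WriteLine("{0}: {1} ms", label, watch.ElapsedMilliseconds);
    }
}

// Usage:
// InsertBenchmark.Run("Before optimization", UnoptimizedInsert.Insert);
// InsertBenchmark.Run("After optimization", OptimizedInsert.Insert);
```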
Optimization Scheme:
We use the third-party extension Z.EntityFramework.Extensions (official homepage: http://entityframework-extensions.net/).
Introduction:
Entity Framework: Bulk Insert, BulkSaveChanges, Bulk Update, Bulk Delete, Bulk Merge, and Bulk Sync.
Supports: SQL Server, SQL Azure, SQL Compact, Oracle, MySQL, SQLite, and PostgreSQL.
This library is not free; the trial build always expires at the end of the current month.
nuget install command:
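The command itself was lost from the post. The package is published on NuGet under the same name, so the Package Manager Console command is:

```
PM> Install-Package Z.EntityFramework.Extensions
```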