Introduction

Importing data from Excel into a SQL Server database is a common task in many applications. Whether you are dealing with small datasets or large volumes of data, understanding the available methods helps you choose the most efficient approach for your needs. In this blog post, we will explore three methods for importing Excel data into SQL Server using ASP.NET: Individual Insert, SqlBulkCopy, and Batch Bulk Insertion.

Prerequisites

Before diving into the implementation, ensure you have the following prerequisites set up:

Visual Studio: Install Visual Studio 2019 or later.

SQL Server: Set up a SQL Server instance.

ASP.NET Project: Create an ASP.NET Web Application (.NET Framework or .NET Core).

Libraries: Add the ExcelDataReader and ExcelDataReader.DataSet NuGet packages.
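The two packages used throughout this post are ExcelDataReader and ExcelDataReader.DataSet. As a reference point, the corresponding .csproj entries might look like this (version numbers are illustrative):

```xml
<ItemGroup>
  <!-- Reads .xls/.xlsx streams -->
  <PackageReference Include="ExcelDataReader" Version="3.6.0" />
  <!-- Adds the AsDataSet() extension used below -->
  <PackageReference Include="ExcelDataReader.DataSet" Version="3.6.0" />
</ItemGroup>
```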

Method 1: Individual Insert

The Individual Insert method involves reading each row from the Excel file and inserting it into the SQL Server database one at a time. While this method is straightforward, it can be slow and inefficient for large datasets due to the overhead of multiple database calls.

Step-by-Step Guide

  1. Read the Excel File: Use the ExcelDataReader and ExcelDataReader.DataSet libraries to read the data from the Excel sheet.
  2. Open a Database Connection: Establish a connection through your EF Core DbContext.
  3. Insert Rows Individually: Loop through the rows in the Excel file and insert each one into the database.
public Task<DataSet> ReadExcelFileAsync(Stream fileStream)
{
    using (var reader = ExcelReaderFactory.CreateReader(fileStream))
    {
        // Treat the first row of the sheet as column headers.
        var result = reader.AsDataSet(new ExcelDataSetConfiguration
        {
            ConfigureDataTable = _ => new ExcelDataTableConfiguration
            {
                UseHeaderRow = true
            }
        });

        return Task.FromResult(result);
    }
}
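One gotcha worth noting before using ReadExcelFileAsync: on .NET Core and .NET 5+, ExcelDataReader needs the legacy Windows code pages registered before it can decode older .xls files. A minimal sketch of the one-time registration:

```csharp
using System;
using System.Text;

// ExcelDataReader decodes legacy .xls files with the Windows-1252 code page,
// which is not available by default on .NET Core / .NET 5+.
// Register it once at application startup (e.g. at the top of Program.cs).
Encoding.RegisterProvider(CodePagesEncodingProvider.Instance);
```

Without this, reading an .xls file typically throws a NotSupportedException about the missing code page.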

public async Task<APIResponseDto<string>> InsertDataToSqlInsertAsync(IFormFile file)
{
    var response = new APIResponseDto<string>();
    try
    {
        // If needed, delete the existing master-data entries for your table.
        var existingEntries = _context.Orders.ToList();
        if (existingEntries.Count > 0)
        {
            _context.Orders.RemoveRange(existingEntries);
            await _context.SaveChangesAsync();
        }

        // Reseed identity column
        await ReseedIdentityAsync();

        // Read the Excel file using the ReadExcelFileAsync method
        var dataSet = await ReadExcelFileAsync(file.OpenReadStream());
        var dataTable = dataSet.Tables[0];

        var orderEntries = dataTable.AsEnumerable().Select(row => new Order
        {
            CustomerName = row.Field<string>("CustomerName"),
            CustomerId = row["CustomerId"] != DBNull.Value ? Convert.ToInt32(row["CustomerId"]) : (int?)null,
            CustomerEmail = row.Field<string>("CustomerEmail"),
            OrderDate = ConvertToDateTime(row["OrderDate"]),
            ShippedDate = ConvertToDateTime(row["ShippedDate"]),
            TotalAmount = ConvertToDecimal(row["TotalAmount"]),
            Discount = ConvertToDecimal(row["Discount"]),
            IsShipped = row["IsShipped"] != DBNull.Value && Convert.ToBoolean(row["IsShipped"]),
            IsPaid = row["IsPaid"] != DBNull.Value ? Convert.ToBoolean(row["IsPaid"]) : (bool?)null
        }).ToList();

        _context.Orders.AddRange(orderEntries);
        await _context.SaveChangesAsync();

        response.HasError = false;
        response.ResponseCode = ResponseCode.Success;
        response.ResponseMessage = "Data inserted successfully";
        response.Result = "Success";
    }
    catch (Exception ex)
    {
        response.HasError = true;
        response.ResponseMessage = $"Error inserting data: {ex.Message}";
        response.Result = null;
    }

    return response;
}

private DateTime? ConvertToDateTime(object value)
{
    if (value == DBNull.Value || value == null)
        return null;

    if (DateTime.TryParse(value.ToString(), out DateTime dateValue))
        return dateValue;

    return null;
}
private decimal ConvertToDecimal(object value)
{
    if (value == DBNull.Value || value == null)
        return 0m;

    if (value is double)
        return (decimal)(double)value;

    return Convert.ToDecimal(value);
}
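Excel frequently stores dates as OLE Automation serial numbers (doubles), which DateTime.TryParse cannot handle. If your sheets produce numeric date cells, a hedged variant of the converter above might look like this (ConvertCellToDateTime is a hypothetical name, not part of the code in this post):

```csharp
using System;

// Excel stores dates as OLE Automation (OADate) serial numbers, so a numeric
// cell may need DateTime.FromOADate rather than DateTime.TryParse.
static DateTime? ConvertCellToDateTime(object value)
{
    if (value == null || value == DBNull.Value)
        return null;

    if (value is DateTime dt)      // ExcelDataReader usually types date cells already
        return dt;

    if (value is double oaDate)    // raw Excel serial number
        return DateTime.FromOADate(oaDate);

    return DateTime.TryParse(value.ToString(), out var parsed) ? parsed : (DateTime?)null;
}
```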

// Reseed the identity column so the next inserted row gets Id = 1
private async Task ReseedIdentityAsync()
{
    var sql = "DBCC CHECKIDENT ('Orders', RESEED, 0);";
    await _context.Database.ExecuteSqlRawAsync(sql);
}
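For reference, the column mappings and the DBCC CHECKIDENT call assume an Orders table shaped roughly like this (a hypothetical schema inferred from the code; adjust types and lengths to your data):

```sql
CREATE TABLE Orders (
    Id            INT IDENTITY(1,1) PRIMARY KEY,
    CustomerName  NVARCHAR(200)  NULL,
    CustomerId    INT            NULL,
    CustomerEmail NVARCHAR(320)  NULL,
    OrderDate     DATETIME2      NULL,
    ShippedDate   DATETIME2      NULL,
    TotalAmount   DECIMAL(18, 2) NULL,
    Discount      DECIMAL(18, 2) NULL,
    IsShipped     BIT            NOT NULL,
    IsPaid        BIT            NULL
);
```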
[Screenshot: the individual insert endpoint in action, and the resulting rows saved in the database table.]

Method 2: SqlBulkCopy

The SqlBulkCopy method uses the SqlBulkCopy class to efficiently bulk load a large number of rows into the SQL Server database. This method is significantly faster than individual inserts, making it ideal for large datasets.

Step-by-Step Guide

  1. Read the Excel File into a DataTable: Use a library like ExcelDataReader to populate a DataTable.
  2. Bulk Insert into SQL Server: Use the SqlBulkCopy class to insert the DataTable into the database.
public async Task<APIResponseDto<string>> InsertDataWithBulkCopy(IFormFile file)
{
    var response = new APIResponseDto<string>();
    try
    {
        var dataSet = await ReadExcelFileAsync(file.OpenReadStream());
        string connectionString = _configuration.GetConnectionString("DefaultConnection");
        await using (var connection = new SqlConnection(connectionString))
        {
            var existingEntries = _context.Orders.ToList();
            if (existingEntries.Count > 0)
            {
                _context.Orders.RemoveRange(existingEntries);
                await _context.SaveChangesAsync();
            }
            // Reseed identity column
            await ReseedIdentityAsync();

            await connection.OpenAsync();
            var dataTable = dataSet.Tables[0];

            int rowCount = dataTable.Rows.Count; // Count rows in DataTable

            using (var bulkCopy = new SqlBulkCopy(connection))
            {
                bulkCopy.DestinationTableName = "Orders";
                foreach (DataColumn column in dataTable.Columns)
                {
                    // Both parameters are the same because we're assuming that the column names
                    // in the DataTable are identical to the column names in the SQL Server table.
                    bulkCopy.ColumnMappings.Add(column.ColumnName, column.ColumnName);
                }
                await bulkCopy.WriteToServerAsync(dataTable);

                response.HasError = false;
                response.ResponseCode = ResponseCode.Success;
                response.ResponseMessage = $"Data inserted successfully. Total rows inserted: {rowCount}.";
                response.Result = "Success";
            }
        }
    }
    catch (Exception ex)
    {
        response.HasError = true;
        response.ResponseMessage = $"Error inserting data: {ex.Message}";
        response.Result = null;
    }
    return response;
}
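Because the column mappings assume the sheet's header names match the table's column names exactly, any mismatch surfaces as a fairly opaque SqlBulkCopy exception. As a sketch, a small pre-check (FindMissingColumns is a hypothetical helper, not part of the code above) can reject a bad upload with a clear message before the bulk copy runs:

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Linq;

// Return the expected column names that are missing from the uploaded sheet,
// so the request can be rejected with a clear message before bulk copy runs.
static List<string> FindMissingColumns(DataTable sheet, IEnumerable<string> expectedColumns)
{
    return expectedColumns
        .Where(name => !sheet.Columns.Contains(name))
        .ToList();
}
```

Calling this with the table's known column list right after ReadExcelFileAsync lets you return a validation error instead of a generic bulk-copy failure.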
[Screenshot: the SqlBulkCopy endpoint in action.]

Method 3: Batch Bulk Insertion

The Batch Bulk Insertion method is a hybrid approach: instead of inserting all rows at once or one by one, you insert the data in fixed-size batches. This balances memory usage and performance.

Step-by-Step Guide

  1. Read the Excel File: Use a library like ExcelDataReader to populate a DataSet.
  2. Insert Data in Batches: Collect rows into batches and insert each batch using SqlBulkCopy.
public async Task<APIResponseDto<string>> InsertDataWithBatchBulkCopy(IFormFile file, int batchSize)
{
    var response = new APIResponseDto<string>();
    string connectionString = _configuration.GetConnectionString("DefaultConnection");

    try
    {
        await using (var connection = new SqlConnection(connectionString))
        {
            var existingEntries = _context.Orders.ToList();
            if (existingEntries.Count > 0)
            {
                _context.Orders.RemoveRange(existingEntries);
                await _context.SaveChangesAsync();
            }
            // Reseed identity column
            await ReseedIdentityAsync();

            await connection.OpenAsync();

            var dataSet = await ReadExcelFileAsync(file.OpenReadStream());
            var dataTable = dataSet.Tables[0];

            var totalRows = dataTable.Rows.Count;
            int startRow = 0;

            while (startRow < totalRows)
            {
                using (var bulkCopy = new SqlBulkCopy(connection))
                {
                    bulkCopy.DestinationTableName = "Orders";
                    bulkCopy.BatchSize = batchSize;

                    // Define column mappings dynamically
                    ConfigureBulkCopyColumnMappings(bulkCopy, dataTable);

                    var batchTable = dataTable.Clone();

                    // Add rows to batchTable for the current batch
                    for (int i = startRow; i < startRow + batchSize && i < totalRows; i++)
                    {
                        batchTable.ImportRow(dataTable.Rows[i]);
                    }

                    try
                    {
                        // Perform the bulk copy operation
                        await bulkCopy.WriteToServerAsync(batchTable);
                    }
                    catch (Exception ex)
                    {
                        response.HasError = true;
                        response.ResponseMessage = $"Error during bulk copy operation: {ex.Message}";
                        response.Result = null;
                        return response; 
                    }
                }

                // Move to the next batch 
                startRow += batchSize;
            }

            // Success response
            response.HasError = false;
            response.ResponseCode = ResponseCode.Success;
            response.ResponseMessage = $"Data inserted successfully. Total rows processed: {totalRows}.";
            response.Result = "Success";
        }
    }
    catch (Exception ex)
    {
        response.HasError = true;
        response.ResponseMessage = $"Error inserting data: {ex.Message}";
        response.Result = null;
    }

    return response;
}

private void ConfigureBulkCopyColumnMappings(SqlBulkCopy bulkCopy, DataTable dataTable)
{
    foreach (DataColumn column in dataTable.Columns)
    {
        // Map the columns assuming the column names in the DataTable are identical to those in the SQL Server table
        bulkCopy.ColumnMappings.Add(column.ColumnName, column.ColumnName);
    }
}
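The batching loop above can also be factored into a reusable iterator that yields one Clone()d batch table at a time. A minimal sketch (EnumerateBatches is a hypothetical helper) using the same Clone()/ImportRow pattern:

```csharp
using System;
using System.Collections.Generic;
using System.Data;

// Yield successive batch tables of at most batchSize rows each, using the
// same Clone()/ImportRow pattern as the loop above.
static IEnumerable<DataTable> EnumerateBatches(DataTable source, int batchSize)
{
    if (batchSize <= 0) throw new ArgumentOutOfRangeException(nameof(batchSize));

    for (int start = 0; start < source.Rows.Count; start += batchSize)
    {
        var batch = source.Clone(); // same schema, no rows
        for (int i = start; i < start + batchSize && i < source.Rows.Count; i++)
            batch.ImportRow(source.Rows[i]);
        yield return batch;
    }
}
```

With this in place, the while-loop in InsertDataWithBatchBulkCopy reduces to a foreach over EnumerateBatches(dataTable, batchSize).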
[Screenshot: the batch bulk copy endpoint in action.]

Conclusion

Each method for importing Excel data into SQL Server has its strengths:

Individual Insert: The simplest to implement and easiest to debug, but the per-row overhead makes it practical only for small datasets.

SqlBulkCopy: Loads the entire sheet in a single bulk operation, making it the fastest option for large datasets.

Batch Bulk Insertion: Splits the data into fixed-size batches, balancing memory usage and performance for very large files.

Choosing the right method depends on your dataset size, performance needs, and memory constraints. By understanding these methods, you can select the most efficient approach for your application. Happy coding.

