Friday, July 15, 2016

How to pass parameters to the DbContext.Database.ExecuteSqlCommand method?

Depending on your underlying database provider, you can use either of the following:
Database.ExecuteSqlCommand(
    "exec SetElementFrequency {0}, {1}",
    elementType, frequency); 
or
Database.ExecuteSqlCommand("exec SetElementFrequency ?, ?", elementType, frequency); 
You may also pass elementType and frequency as DbParameter-derived objects (for example SqlParameter) to provide your own names via the ParameterName property.
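For example, with the SQL Server provider you could pass SqlParameter objects and reference them by name. A minimal sketch, assuming the SetElementFrequency stored procedure from the example above (the parameter names here are illustrative):

```csharp
// Named parameters; ParameterName is set via the constructor
var typeParam = new SqlParameter("@elementType", elementType);
var freqParam = new SqlParameter("@frequency", frequency);

Database.ExecuteSqlCommand(
    "exec SetElementFrequency @elementType, @frequency",
    typeParam, freqParam);
```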

DirectoryInfo.GetFiles beginning with


I've come across some strange behavior trying to get files that start with a certain string.

Would someone please give a working example of this:

I want to get all files in a directory that begin with a certain string and also have the .xml extension.

for example:

 apples_01.xml
 apples_02.xml
 pears_03.xml
I want to be able to get the files that begin with apples.

So far I have this code

 DirectoryInfo taskDirectory = new DirectoryInfo(this.taskDirectoryPath);
 FileInfo[] taskFiles = taskDirectory.GetFiles("*.xml");



Solution:

FileInfo[] taskFiles = taskDirectory.GetFiles("apples*.xml");
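GetFiles accepts a search pattern in which * matches zero or more characters and ? matches exactly one, so the fixed snippet becomes (a sketch, assuming this.taskDirectoryPath points at a valid directory):

```csharp
DirectoryInfo taskDirectory = new DirectoryInfo(this.taskDirectoryPath);

// Matches apples_01.xml and apples_02.xml, but not pears_03.xml
FileInfo[] taskFiles = taskDirectory.GetFiles("apples*.xml");

foreach (FileInfo file in taskFiles)
    Console.WriteLine(file.Name);
```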




Friday, July 8, 2016

Bulk Insert from Generic List into SQL Server with minimum lines of code

Entity Framework, and even classic ADO.NET, is very slow when you need to perform a large number of inserts into the database.
Executing INSERT INTO 1,000 times is inefficient because every INSERT operation also requires SQL Server to write to the transaction log.
A nice workaround for this is to use BULK INSERT or the SqlBulkCopy class in C#.
However, the SqlBulkCopy method WriteToServer does not accept List<T>, IEnumerable<T>, or IList as a parameter.

Implement extension method AsDataTable for IEnumerable<T>

A nice workaround is to implement an AsDataTable extension method for IEnumerable<T>, because WriteToServer accepts a DataTable as a parameter.
Here is a simple implementation:
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;

public static class IEnumerableExtensions
{
    public static DataTable AsDataTable<T>(this IEnumerable<T> data)
    {
        PropertyDescriptorCollection properties = TypeDescriptor.GetProperties(typeof(T));
        var table = new DataTable();

        // One column per public property; unwrap Nullable<T> so the column
        // gets the underlying type and DBNull can represent null values.
        foreach (PropertyDescriptor prop in properties)
            table.Columns.Add(prop.Name, Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType);

        foreach (T item in data)
        {
            DataRow row = table.NewRow();
            foreach (PropertyDescriptor prop in properties)
                row[prop.Name] = prop.GetValue(item) ?? DBNull.Value;
            table.Rows.Add(row);
        }
        return table;
    }
}
After this, we can use the extension method to bulk insert data from a List<T> into SQL Server.
var listPerson = new List<Person>
    {
         new Person() {Id = 1},
         new Person() {Id = 2}
    };
 
using (var connection = new SqlConnection(ConfigurationManager.ConnectionStrings["SomeConnectionString"].ConnectionString))
{
    connection.Open();

    using (SqlTransaction transaction = connection.BeginTransaction())
    using (var bulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, transaction))
    {
        bulkCopy.BatchSize = 100;
        bulkCopy.DestinationTableName = "dbo.Person";
        try
        {
            bulkCopy.WriteToServer(listPerson.AsDataTable());
            // Commit only after the write succeeds; committing after a
            // rollback would throw.
            transaction.Commit();
        }
        catch (Exception)
        {
            transaction.Rollback();
            throw;
        }
    }
}
The optimal BatchSize depends on the number of items you need to insert, the row size, and network bandwidth and latency.
After doing some testing, I can recommend a BatchSize of 100 when you need to insert a few thousand items into the database.
The best way to find the optimal BatchSize value is to experiment yourself.
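One way to experiment is to time WriteToServer for a few candidate batch sizes. A sketch, reusing the listPerson and dbo.Person table from the example above (connectionString is assumed to hold the same connection string):

```csharp
foreach (int batchSize in new[] { 50, 100, 500, 1000 })
{
    var stopwatch = System.Diagnostics.Stopwatch.StartNew();

    using (var connection = new SqlConnection(connectionString))
    using (var bulkCopy = new SqlBulkCopy(connection))
    {
        connection.Open();
        bulkCopy.BatchSize = batchSize;
        bulkCopy.DestinationTableName = "dbo.Person";
        bulkCopy.WriteToServer(listPerson.AsDataTable());
    }

    stopwatch.Stop();
    Console.WriteLine("BatchSize {0}: {1} ms", batchSize, stopwatch.ElapsedMilliseconds);
}
```

Run it against a realistic row count; a timing from a two-row list tells you nothing about a few-thousand-row insert.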