  1. #1
    SitePoint Evangelist praetor's Avatar
    Join Date
    Aug 2005
    Posts
    479
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)

    Prepared statements with SQL Server 2008

    I need to do lots of updates/inserts of small pieces of data (about 1 KB each) in one transaction. However, I suspect that doing 1,000 separate updates is much slower than doing one update with 1 MB of data. The table would have only 2 columns (id, data).

    But I don't know how to implement it. Using prepared statements means I would have to do all those small updates individually, and I can't send SQL Server an array of data.

    Any advice would be welcome.

  2. #2
    SitePoint Author
    wwb_99's Avatar
    Join Date
    May 2003
    Location
    Washington, DC
    Posts
    10,623
    Mentioned
    4 Post(s)
    Tagged
    0 Thread(s)
    You would have to test to be sure, and the local conditions on your server could definitely play a big role, but I'd start with something like:

    Code:
    public void DoBigHonkingUpdate(List<DataItem> data)
    {
        using (DbConnection conn = WhateverFactoryIamUsingToday.GetConnection())
        {
            using (DbCommand cmd = conn.CreateCommand())
            {
                cmd.CommandText = "INSERT INTO SomeTable (ID, Data) VALUES (@ID, @Data)";
                // Provider-agnostic parameter setup; note the data column is binary, not int
                DbParameter idParam = cmd.CreateParameter();
                idParam.ParameterName = "@ID";
                idParam.DbType = DbType.Int32;
                cmd.Parameters.Add(idParam);
                DbParameter dataParam = cmd.CreateParameter();
                dataParam.ParameterName = "@Data";
                dataParam.DbType = DbType.Binary;
                cmd.Parameters.Add(dataParam);
                conn.Open();
                using (DbTransaction tran = conn.BeginTransaction())
                {
                    cmd.Transaction = tran;
                    try
                    {
                        // Re-use the one command; only the parameter values change per row
                        foreach (DataItem item in data)
                        {
                            idParam.Value = item.ID;
                            dataParam.Value = item.Data;
                            cmd.ExecuteNonQuery();
                        }
                        tran.Commit();
                    }
                    catch
                    {
                        tran.Rollback();
                        throw;
                    }
                }
            }
        }
    }
    [nb: code written off cuff, might not compile and is meant for example purposes only]

    Pretty ugly, but munging data usually is. Basically, the above opens a connection and creates a single command object, which you re-use repeatedly within a single database transaction. SQL Server's query optimizer can take advantage of the repeated parameterized query on its end of the pond, making it behave essentially like a prepared statement.

  3. #3
    SitePoint Evangelist praetor's Avatar
    Thank you for the suggestion. I was thinking of using a sproc with 2 params: a string made up of the ids and a byte[] made up of all the data, and then splitting them inside SQL Server to build the queries.

    What do you think about this idea, performance-wise?

  4. #4
    wwb_99's Avatar
    Probably not: SQL Server is for munging data, not splitting strings, and that approach is going to be much more painful to debug and maintain in the long term. Any way you carry it, you need to send 1 MB of data; split up or in one big chunk, it probably won't matter unless the networking situation there is fubar.
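    For what it's worth, SQL Server 2008 does add a cleaner alternative to string-splitting: table-valued parameters, which let you ship the whole row set to the server as a single parameter in one round trip. A minimal sketch, assuming a user-defined table type exists on the server (the type, table, and column names here are made up for illustration):

    ```csharp
    // Assumes a table type created on the server beforehand, e.g.:
    //   CREATE TYPE dbo.IdDataList AS TABLE (ID int, Data varbinary(1000));
    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;

    public static void BulkInsert(string connString, IEnumerable<KeyValuePair<int, byte[]>> rows)
    {
        // Build an in-memory table whose columns match the server-side table type
        var table = new DataTable();
        table.Columns.Add("ID", typeof(int));
        table.Columns.Add("Data", typeof(byte[]));
        foreach (var row in rows)
            table.Rows.Add(row.Key, row.Value);

        using (var conn = new SqlConnection(connString))
        using (var cmd = conn.CreateCommand())
        {
            // One round trip: the whole set arrives as a single structured parameter
            cmd.CommandText = "INSERT INTO SomeTable (ID, Data) SELECT ID, Data FROM @Rows";
            SqlParameter p = cmd.Parameters.AddWithValue("@Rows", table);
            p.SqlDbType = SqlDbType.Structured;
            p.TypeName = "dbo.IdDataList";
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
    ```

    The DataTable's columns have to match the server-side table type's columns in order and type, and this ties you to the SqlClient provider rather than the generic DbCommand interfaces.
    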

  5. #5
    SitePoint Evangelist praetor's Avatar
    Would a CLR sproc be better?

  6. #6
    wwb_99's Avatar
    It might be, but materially faster? Probably not. Moreover, you would then have to deal with deploying SQL CLR stuff, which is a special form of hell. What is the operational goal with the speed here? There are a number of ways to skin this cat.

  7. #7
    SitePoint Evangelist praetor's Avatar
    Well, I've thought about it, and perhaps I can save the data less often (once on Application_End and once every few hours just in case), so the update speed won't matter very much.

    Thank you for the help.

  8. #8
    wwb_99's Avatar
    No problem. One word of warning--don't rely upon Application_End. If things crash hard, it will *never* fire.
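    A timer-driven save keeps you independent of Application_End entirely; a hard crash then loses at most one interval's worth of changes. A sketch, not a tested implementation, where SaveAll is a stand-in for whatever persistence routine you end up with:

    ```csharp
    using System;
    using System.Threading;

    // Illustrative periodic-save scaffold: persist on a fixed interval instead of
    // relying on Application_End, which never fires on a hard crash.
    public class PeriodicSaver : IDisposable
    {
        private readonly Timer _timer;
        private readonly Action _saveAll;

        public PeriodicSaver(Action saveAll, TimeSpan interval)
        {
            _saveAll = saveAll;
            _timer = new Timer(_ => Save(), null, interval, interval);
        }

        private void Save()
        {
            try { _saveAll(); }
            catch { /* log and carry on; the next tick retries */ }
        }

        public void Dispose()
        {
            _timer.Dispose();
            Save(); // best-effort final save on a clean shutdown
        }
    }
    ```
    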

  9. #9
    SitePoint Evangelist praetor's Avatar
    Yeah, that's why I'll save from time to time but not very often.


Bookmarks

Posting Permissions

  • You may not post new threads
  • You may not post replies
  • You may not post attachments
  • You may not edit your posts
  •