  1. #1
    Afrika

    Calling an SP MULTIPLE times from an ASP page

    hello,
    How many times can I call an SP from an ASP page to insert the variables from the user's options? Using MS SQL.

    I have an application that loops every time a user enters multiple options in my ASP page, separated by commas.

    I'm just wondering if it's tidy and efficient to loop with an SP, or if there is a neater way of executing the above.

    thanks
    afrika

  2. #2
    SitePoint Guru asterix
    There is no limit to the number of times you can call an SP from an ASP page, but it is much more efficient if you just call the SP once and pass in a longer string separated by commas.

    Let's say the input string is "george, fred, mary". You would call the SP and pass the whole string in as one of the parameters. Then, in the SP, you split the value and loop over each element:

    Code:
    use tempdb
     Go
     
     create procedure ProcessArray (
     	@InArray as Varchar(8000), @Delimter Char(1)
     )
     AS
     
     --declarations
     declare @Element Varchar(8000)
     declare @continue int
     declare @cPos int
     
     --
     set @continue = 1
     
     --add a trailing delimiter to the array
     select @InArray = @InArray + @Delimter
     
     --create a temporary table to hold each element
     create table #tmpArray (idx smallint identity(0,1) Primary Key, elementValue varchar(8000))
     
     	WHILE (@continue >= 1)
     	BEGIN
     		select @cPos = CHARINDEX(@Delimter, @InArray)
     		select @Element = Rtrim(Ltrim(SUBSTRING(@InArray,1, @cPos -1)))
     		
     		BEGIN
     			INSERT #tmpArray (elementValue) VALUES (@Element)
     		END
     		
     		select @InArray = SUBSTRING(@InArray, @cPos + 1, DATALENGTH(@InArray))
     		select @continue = CHARINDEX(@Delimter, @InArray)
     	END
     
     --Return the element values, we could do some more work on the values here
     select * from #tmpArray
     drop table #tmpArray
     GO
     
     --Run the stored procedure
     Declare @RC int
     EXEC @RC = ProcessArray @InArray = 'george, fred, mary', @Delimter = ','
     Go
     
     --get rid of the clutter, it was just an example
     drop procedure ProcessArray
     GO

    Which gives you this resultset

    Code:
     0 george	
     1 fred	
     2 mary

  3. #3
    Afrika
    Very, very good,
    thanks a lot asterix.

    That actually sounds simpler than what I was doing; I was working with a table-based CSV script. It's much neater to use an SP.

  4. #4
    SitePoint Guru puco
    I'm not sure that creating temp tables for such things is a good idea. IIRC temporary tables have quite a big overhead and are not very scalable. I would rather run the SP multiple times from the middleware than rely on temp tables.

    You can also split the field using a UDF and insert the data using
    Code:
    insert into [tab] (a1, a2, a3) select 'v1', 'v2', Value from dbo.Split ('george, fred, mary', ',')
    That should have lower overhead than a temp table. Or you can modify asterix's code to not create a temp table, but just to loop, cutting off the first part of the string until it is empty.
    Martin Pernecky

  5. #5
    SitePoint Guru asterix
    Quote Originally Posted by puco
    I'm not sure that creating temp tables for such things is a good idea. IIRC temporary tables have quite a big overhead and are not very scalable. I would rather run the SP multiple times from the middleware than rely on temp tables.

    You can also split the field using a UDF and insert the data using
    Code:
     insert into [tab] (a1, a2, a3) select 'v1', 'v2', Value from dbo.Split ('george, fred, mary', ',')
    That should have lower overhead than a temp table. Or you can modify asterix's code to not create a temp table, but just to loop, cutting off the first part of the string until it is empty.
    You are joking, right?

    You think that preparing an ADO call, executing it across a network or pipe, having SQL Server compile and analyze the SQL and generate a query plan, do I/O to retrieve and sort records, then bundle the records up and marshal them back to the client N times is somehow faster than doing all that once, even if a temp table is used?

    Please post your test code and performance timings.

    Also, how do you think the function named dbo.Split actually works? I will bet you some rep that it works by using temp tables.

  6. #6
    SitePoint Guru puco
    Quote Originally Posted by asterix
    You are joking, right?

    You think that preparing an ADO call, executing it across a network or pipe, having SQL Server compile and analyze the SQL and generate a query plan, do I/O to retrieve and sort records, then bundle the records up and marshal them back to the client N times is somehow faster than doing all that once, even if a temp table is used?

    Please post your test code and performance timings.

    Also, how do you think the function named dbo.Split actually works? I will bet you some rep that it works by using temp tables.
    OK, now you've made me curious; I'm going to test it right away. BOL states that the table data type has less overhead than temporary tables. As for the perf. data, I'll test and post it in a few hours.
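
    For reference, the two forms being compared look like this (a minimal sketch; the table and column names here are just placeholders):
    Code:
    -- table data type (table variable): declared, not created; scoped to the
    -- current batch/function, cleaned up automatically, no statistics kept
    declare @parts table (i int identity(1,1), part varchar(100))
    insert into @parts (part) values ('george')
    select * from @parts

    -- temporary table: a real object in tempdb that you create and drop
    create table #parts (i int identity(1,1), part varchar(100))
    insert into #parts (part) values ('george')
    select * from #parts
    drop table #parts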
    Martin Pernecky

  7. #7
    Afrika
    Battle line is drawn

    Guess I am the referee
    :-)

  8. #8
    SitePoint Guru puco
    I analyzed only the SQL Server performance; I haven't analyzed the multiple ADO calls yet. So I'm posting my test code and the results from Query Analyzer (Client Statistics) here. My configuration is SQL Server 2000 Developer, running on a notebook: Pentium-M 1.7, 768MB RAM, 5400RPM HDD. I had three scripts: InsertMultinames1, InsertMultinames1a and InsertMultinames2.

    InsertMultinames1 is your modified script: it doesn't use a temp table but inserts directly into the test table. I thought this could have some overhead when inserting into a table with many records.

    InsertMultinames1a is your original script. The last statement dumps the content of the temp table into my test table. This should mitigate the overhead of inserting into a populated table (IMO).

    InsertMultinames2 is my proposed method, with a Split function using a table data type.

    Here is the complete listing (long):
    Code:
    create table Names
    (
    	nid int not null identity (1,1) primary key,
    	n varchar (100)
    )
    go
    
    create function Split
    (
    	@string nvarchar(4000),
    	@delimiter nvarchar(4000) = N','
    )
    returns @parts table (i int identity(1,1), part nvarchar (4000))
    as
    begin
    	declare @index int, @len int, @total int, @currid int
    	set @index = 0
    	set @len = 1
    	set @total = len (@string)
    
    	if (charindex (@delimiter, @string, 0) = 0)
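    	-- no delimiter at all: the whole string is a single element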
    	begin
    		insert into @parts values (@string)
    		return
    	end
    
    	while (@len >= 0)
    	begin
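    		-- @len = characters from @index to the next delimiter; negative means none left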
    		set @len = charindex (@delimiter, @string, @index) - @index
    
    		if (@len < 0)
    			insert into @parts values (substring (@string, @index, @total-@index+1))
    		else
    			insert into @parts values (substring (@string, @index, @len))
    
    
    		set @index = @index + @len + len (@delimiter)
    	end
    	return
    end
    go
    
    create procedure InsertMultiNames1 (@InArray as Varchar(8000))
    AS
     
     --declarations
     declare @Delimter Char(1)
     declare @Element Varchar(8000)
     declare @continue int
     declare @cPos int
     
     --
     set @continue = 1
     set @delimter = ','
    
     --add a trailing delimiter to the array
     select @InArray = @InArray + @Delimter
     
     --loop through each element
     WHILE (@continue >= 1)
     BEGIN
     select @cPos = CHARINDEX(@Delimter, @InArray)
     select @Element = Rtrim(Ltrim(SUBSTRING(@InArray,1, @cPos -1)))
     
     BEGIN
     INSERT Names (n) VALUES (@Element)
     END
     
     select @InArray = SUBSTRING(@InArray, @cPos + 1, DATALENGTH(@InArray))
     select @continue = CHARINDEX(@Delimter, @InArray)
     END
    GO
     
    create procedure InsertMultiNames1a (@InArray as Varchar(8000))
    AS
     
     --declarations
     declare @Delimter Char(1)
     declare @Element Varchar(8000)
     declare @continue int
     declare @cPos int
     
     --
     set @continue = 1
     set @delimter = ','
    
     --add a trailing delimiter to the array
     select @InArray = @InArray + @Delimter
     
    create table #tmpArray (idx smallint identity(0,1) Primary Key, elementValue varchar(8000))
     
     WHILE (@continue >= 1)
     BEGIN
     select @cPos = CHARINDEX(@Delimter, @InArray)
     select @Element = Rtrim(Ltrim(SUBSTRING(@InArray,1, @cPos -1)))
     
     BEGIN
     INSERT #tmpArray (elementValue) VALUES (@Element)
     END
     
     select @InArray = SUBSTRING(@InArray, @cPos + 1, DATALENGTH(@InArray))
     select @continue = CHARINDEX(@Delimter, @InArray)
     END
     
     --Return the element values, we could do some more work on the values here
     insert into Names (n) select elementValue from #tmpArray
     drop table #tmpArray
    GO
     
    create procedure InsertMultiNames2 (@InArray as Varchar(8000))
    AS
    insert into Names (n) select part from dbo.Split (@InArray, ',')
    GO
    
    -- test with 16 names
    execute InsertMultinames1 'joe,jane,george,tim,joe,jane,george,tim,joe,jane,george,tim,joe,jane,george,tim'
    execute InsertMultinames1a 'joe,jane,george,tim,joe,jane,george,tim,joe,jane,george,tim,joe,jane,george,tim'
    execute InsertMultinames2 'joe,jane,george,tim,joe,jane,george,tim,joe,jane,george,tim,joe,jane,george,tim'
    
    -- test with 4 names
    execute InsertMultinames1 'joe,jane,george,tim'
    execute InsertMultinames1a 'joe,jane,george,tim'
    execute InsertMultinames2 'joe,jane,george,tim'
    
    -- management
    delete Names
    select count (*) from Names
    insert Names (n) select n from Names
    insert Names (n) select n from NBAK
    select * into NBAK from Names
    select * into Names from Names
    truncate table Names
    I ran three test sets. The first was inserting 16 names separated by ',' into an empty table (truncated before each run). The second set was inserting 4 names. The third was inserting 16 names into a table that had approx 40k rows; the table was copied into a backup table, truncated, and all the data was reinserted from the backup table before each run.

    I made only one run for each test, so the results are not statistically representative. I posted the actual values from Query Analyzer, not averages.

    --- Original comment ----
    The results are in the attachment. From the results (which are not very objective, IMO) it appears that my solution performs the best. The problems with your solution are that 1. the temp table has quite a high overhead compared to the table data type; 2. you have many separate inserts into the temp/normal table, which is quite intensive since the indexes must be updated. IIRC this doesn't happen with table data type variables.

    --- Revised comment 1 ---
    The time statistics look best for method 1 (no temp table, only iteration), which is quite understandable. The differences from 1a are practically non-existent on my machine. Solution 2, on the other hand, has very low Application Profile statistics (inserts, rows affected etc.), so maybe this could result in better response times with more complex tables.

    I would like your comments on this test, or your own test results, asterix.
    Attached Files
    Martin Pernecky

  9. #9
    SitePoint Guru asterix
    Fantastic!
    Give me some time to check this over...

  10. #10
    SitePoint Guru asterix
    OK, here goes...

    Your SProc InsertMultiNames1a...
    It doesn't actually need a temp table, because on each iteration you can simply perform the activity directly (insert the element into the Names table). Fair enough; if I coded an SProc which takes a delimited list I would do the same, as there is no need to return the element list to the client at all.

    Your Split function:
    Nice; instead of creating a #tempTable you create a table variable. Essentially, however, your Split function and my SProc do the same thing: they use a table to simulate an array. IIRC table variables outperform temp tables for small data sizes (as we have here), but ultimately, if the data size increases, the DB engine will have to page the data out to disk (eventually). This is probably the best way to do it, since it gives you nice modular code which can be reused easily.

    Your SProc InsertMultiNames2
    is very minimalist, and is based mainly on your Split function.

    But I thought that the point was the following:
    calling an SProc n times from the client, once for each element in the list, is a whole lot slower than calling an SProc once and iterating through the list within the SProc. Your examples, unfortunately, don't show that.

    But thanks, you put a lot of work into that analysis!

    Here is the kind of client script that could be used to measure relative performance:

    (VBScript; save as test.vbs and run it)
    Code:
       Dim rst, conn, cmd, connString
       
       connString = "Provider=sqloledb;Data Source=asterix;Initial Catalog=tempdb;Integrated Security=SSPI;"
       
       set conn = CreateObject("ADODB.Connection")
       set cmd = CreateObject("ADODB.Command")
       set rst  = CreateObject("ADODB.Recordset")
       
       
       conn.open(connString)
       rst.open "select * from names", connString
       set cmd.ActiveConnection = conn
       
       
       strCommand = ""
       strSprocName = "InsertMultinames1"
       for n = 1 to 999
       	strCommand = strCommand & n & ","
       next
       cmd.CommandText = "execute " & strSprocName & "'" & strCommand & "1000'"
       cmd.Execute
       
       for n = 1 to 1000
       	cmd.CommandText = "execute " & strSprocName & "'" & n & "'"
       	cmd.Execute
       next
       
       
       strCommand = ""
       strSprocName = "InsertMultinames1a"
       for n = 1 to 999
       	strCommand = strCommand & n & ","
       next
       cmd.CommandText = "execute " & strSprocName & "'" & strCommand & "1000'"
       cmd.Execute
       
       for n = 1 to 1000
       	cmd.CommandText = "execute " & strSprocName & "'" & n & "'"
       	cmd.Execute
       next
       
       msgbox("finished :)")
       
       cmd.CommandText = "delete from Names"
       cmd.Execute
       rst.close
       conn.close

  11. #11
    SitePoint Guru puco
    OK, more results: I ran the test you proposed (the individual commands through ADO). My tests are called ADO1, ADO1a, ADO2 and ADOSEP. They were realized using Application Center Test with the following settings:
    Simultaneous Browser Connections: 100
    Warmup: 5s
    Runtime: 1:00 minute

    I measured avg RPS and Total Requests Processed. This is, for me, a much more representative environment, since the web receives many concurrent requests, and I felt that creating temp tables might not be very scalable, as I said at the start. All the tests were simple aspx pages opening a db connection, running the command and closing the connection.

    A sample test page:
    Code:
    <%@ Page Language="C#" AutoEventWireup="true" %>
    <script runat="server">
    
    private void Page_Load (object sender, EventArgs e)
    {
    	using (System.Data.SqlClient.SqlConnection cnn = new System.Data.SqlClient.SqlConnection ("Data Source=localhost;Initial Catalog=tempdb;Integrated Security=SSPI;Persist Security Info=True"))
    	{
    		cnn.Open ();
    		new System.Data.SqlClient.SqlCommand ("execute InsertMultiNames1 'joe,john,dick,pat', ','", cnn).ExecuteNonQuery ();
    	}
    }
    
    </script>
    ADO1: runs InsertMultiNames1 with 4 names (iterates through list & inserts)
    ADO1a: runs InsertMultiNames1a with 4 names (creates a temp table)
    ADO2: runs InsertMultiNames2 with 4 names (using Split function)
    ADOSEP: runs 4 or 20 separate commands:

    Code:
    <%@ Page Language="C#" AutoEventWireup="true" %>
    <script runat="server">
    
    private void Page_Load (object sender, EventArgs e)
    {
    	using (System.Data.SqlClient.SqlConnection cnn = new System.Data.SqlClient.SqlConnection ("Data Source=localhost;Initial Catalog=tempdb;Integrated Security=SSPI;Persist Security Info=True"))
    	{
    		cnn.Open ();
    		new System.Data.SqlClient.SqlCommand ("insert into Names (n) values ('pat')", cnn).ExecuteNonQuery ();
    		new System.Data.SqlClient.SqlCommand ("insert into Names (n) values ('dick')", cnn).ExecuteNonQuery ();
    		new System.Data.SqlClient.SqlCommand ("insert into Names (n) values ('john')", cnn).ExecuteNonQuery ();
    		new System.Data.SqlClient.SqlCommand ("insert into Names (n) values ('george')", cnn).ExecuteNonQuery ();
    		new System.Data.SqlClient.SqlCommand ("insert into Names (n) values ('pat')", cnn).ExecuteNonQuery ();
    		new System.Data.SqlClient.SqlCommand ("insert into Names (n) values ('dick')", cnn).ExecuteNonQuery ();
    		new System.Data.SqlClient.SqlCommand ("insert into Names (n) values ('john')", cnn).ExecuteNonQuery ();
    		new System.Data.SqlClient.SqlCommand ("insert into Names (n) values ('george')", cnn).ExecuteNonQuery ();
    		new System.Data.SqlClient.SqlCommand ("insert into Names (n) values ('pat')", cnn).ExecuteNonQuery ();
    		new System.Data.SqlClient.SqlCommand ("insert into Names (n) values ('dick')", cnn).ExecuteNonQuery ();
    		new System.Data.SqlClient.SqlCommand ("insert into Names (n) values ('john')", cnn).ExecuteNonQuery ();
    		new System.Data.SqlClient.SqlCommand ("insert into Names (n) values ('george')", cnn).ExecuteNonQuery ();
    		new System.Data.SqlClient.SqlCommand ("insert into Names (n) values ('pat')", cnn).ExecuteNonQuery ();
    		new System.Data.SqlClient.SqlCommand ("insert into Names (n) values ('dick')", cnn).ExecuteNonQuery ();
    		new System.Data.SqlClient.SqlCommand ("insert into Names (n) values ('john')", cnn).ExecuteNonQuery ();
    		new System.Data.SqlClient.SqlCommand ("insert into Names (n) values ('george')", cnn).ExecuteNonQuery ();
    		new System.Data.SqlClient.SqlCommand ("insert into Names (n) values ('pat')", cnn).ExecuteNonQuery ();
    		new System.Data.SqlClient.SqlCommand ("insert into Names (n) values ('dick')", cnn).ExecuteNonQuery ();
    		new System.Data.SqlClient.SqlCommand ("insert into Names (n) values ('john')", cnn).ExecuteNonQuery ();
    		new System.Data.SqlClient.SqlCommand ("insert into Names (n) values ('george')", cnn).ExecuteNonQuery ();
    	}
    }
    
    </script>
    Here are the results:
    Code:
    ADO1:       500RPS   29000 total
    ADO1a:      370RPS   21500 total
    ADO2:       370RPS   21600 total
    ADOSEP(4):  430RPS   26000 total
    ADOSEP(20): 150RPS    9000 total
    So, as I said, running the commands separately is OK for a small number of items, since it should be roughly equal to the temp table (1a) solution for fewer than approx 8 or 10 items. To my astonishment, the difference between the temp table and the table data type is negligible (100 requests in one minute in favour of the table data type). The iteration is the clear winner, but it is only 10% faster than the individual inserts for 4 items. I guess the margin would grow as the number of items increases, though.
    Martin Pernecky

  12. #12
    SitePoint Guru puco
    One other test: I concatenated the inserts from ADOSEP into one long batch of inserts. Using 4 inserts (inserting 4 names), the script ran at over 610RPS, totalling over 35000 requests. So it is 20% faster than the iterative approach when inserting 4 items.

    With 10 inserts concatenated, the script still runs faster than ADO1 (500RPS).
    Martin Pernecky

  13. #13
    SitePoint Guru asterix
    Fantastic work, Martin! Astonishing results.

    So, to sum up: individual inserts are faster for a few items (approx 10 or fewer) than using a stored procedure to split and insert the data list.

    If you need to insert a data list with lots of values, it is better to use a server-side T-SQL solution. But I guess that situation is rare in web applications; most people are likely to enter a small number of elements into a web form.

    It is seemingly unimportant whether you choose a table variable or a #temp table (the underlying mechanism is the same).

  14. #14
    Afrika
    Hi guys,
    I really do appreciate your time and effort. BRILLIANT WORK DONE. However, my anticipated data is much more than 10 inserts, so I guess asterix's script would work better in this case.

    You are right that most people insert small amounts of data; contrary to that, however, we have a lot of bulk-blast SMS clients on our site, so using the server-side T-SQL would work better.

    So I guess we would use the temp table.

    Once again, thanks for your advice.

    Afrika

  15. #15
    Afrika
    If I may ask, how do you implement this code? In an ASP page?

    I notice:
    1. You create an SP, which you drop later.
    2. You create a table, which you also drop later.

    Why drop the SP and the table? Can't they be rerun over and over again?

    Also, please explain this line of code. I notice @RC is declared as int; why is it declared at the end of the code? Would it not be neater to declare it at the start, with the other variables?

    2. How do you SET a variable @RC to be equal to the original SP and execute @RC?

    Code:
    EXEC @RC = ProcessArray @InArray = 'george, fred, mary', @Delimter = ','



    Code:
    use tempdb
    Go

    create procedure ProcessArray (
    @InArray as Varchar(8000), @Delimter Char(1)
    )
    AS

    --declarations
    declare @Element Varchar(8000)
    declare @continue int
    declare @cPos int

    --
    set @continue = 1

    --add a trailing delimiter to the array
    select @InArray = @InArray + @Delimter

    --create a temporary table to hold each element
    create table #tmpArray (idx smallint identity(0,1) Primary Key, elementValue varchar(8000))

    WHILE (@continue >= 1)
    BEGIN
    select @cPos = CHARINDEX(@Delimter, @InArray)
    select @Element = Rtrim(Ltrim(SUBSTRING(@InArray,1, @cPos -1)))

    BEGIN
    INSERT #tmpArray (elementValue) VALUES (@Element)
    END

    select @InArray = SUBSTRING(@InArray, @cPos + 1, DATALENGTH(@InArray))
    select @continue = CHARINDEX(@Delimter, @InArray)
    END

    --Return the element values, we could do some more work on the values here
    select * from #tmpArray
    drop table #tmpArray
    GO

    --Run the stored procedure
    Declare @RC int
    EXEC @RC = ProcessArray @InArray = 'george, fred, mary', @Delimter = ','
    Go

    --get rid of the clutter, it was just an example
    drop procedure ProcessArray
    GO

  16. #16
    SitePoint Guru puco
    RE 1: It's just sample code, designed to run once and clean up after itself. In your application you create the SP once and then run it using
    Code:
    EXECUTE ProcessArray @InArray = 'george, fred, mary', @Delimter = ','
    RE 2: The dropping of the table is part of the stored procedure; it has to be dropped so that the SP can be run several times in one connection.
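
    If you want to be defensive about it, you can also guard the drop (just a sketch, not part of the original procedure):
    Code:
    --only drop the temp table if it actually exists
    if object_id('tempdb..#tmpArray') is not null
        drop table #tmpArray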

    @RC is declared just before its use. The other variables (@InArray, @continue) are declared in the SP, and their scope is the SP.
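
    The EXEC @RC = ... syntax simply captures the procedure's integer return status into @RC (0 by default when the procedure completes successfully). A minimal sketch:
    Code:
    --capture the procedure's return status (0 by default on success)
    Declare @RC int
    EXEC @RC = ProcessArray @InArray = 'george, fred, mary', @Delimter = ','
    select @RC as ReturnStatus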
    Martin Pernecky

  17. #17
    Afrika
    Thanks puco,
    OK, but I would like to know how this would be implemented: in the ASP page, or as an SP?

    2. If the SP is dropped each time it runs, what about a scenario where it's being executed multiple times by different users?

    e.g. If it runs multiple times, would each request be queued until the previous execution is over? Because I noticed that a table is created, so if another user in the queue requests a transaction, would they conflict with each other?

  18. #18
    SitePoint Guru puco
    OK, we found out in this thread that the temporary table isn't needed. So you just create the following stored procedure in your database (using Enterprise Manager, Query Analyzer or whatever):

    Code:
    create procedure InsertNames (@InArray as Varchar(8000))
    AS
     
     --declarations
     declare @Delimter Char(1)
     declare @Element Varchar(8000)
     declare @continue int
     declare @cPos int
     
     --
     set @continue = 1
     set @delimter = ','
    
     --add a trailing delimiter to the array
     select @InArray = @InArray + @Delimter
     
     --loop through each element
     WHILE (@continue >= 1)
     BEGIN
     select @cPos = CHARINDEX(@Delimter, @InArray)
     select @Element = Rtrim(Ltrim(SUBSTRING(@InArray,1, @cPos -1)))
     
     BEGIN
     ********* INSERT Names (n) VALUES (@Element)
     END
     
     select @InArray = SUBSTRING(@InArray, @cPos + 1, DATALENGTH(@InArray))
     select @continue = CHARINDEX(@Delimter, @InArray)
     END
    GO
    where you replace the row beginning with ******* with an actual insert into your table of names. The @Element variable holds the current part.

    In your ASP page you then just call

    Code:
    execute InsertNames 'john,jim,dick,...'
    Martin Pernecky

  19. #19
    Afrika
    Quote Originally Posted by puco
    OK, we found out in this thread that the temporary table isn't needed. So you just create the following stored procedure in your database (using Enterprise Manager, Query Analyzer or whatever):
    I thought it was agreed that the temp table is not needed in cases where the number of values in the CSV string is low, but that in bulk conditions it's OK???

    Please confirm.

  20. #20
    SitePoint Guru puco
    Quote Originally Posted by afrika
    I thought it was agreed that the temp table is not needed in cases where the number of values in the CSV string is low, but that in bulk conditions it's OK???

    Please confirm.
    For a few items in the array (~5), the best thing is just to split the values in the web application and, using ADO, run all the inserts without any SPs (or use an SP to insert one row at a time):

    Code:
    insert into tab (name) values ('jack');
    insert into tab (name) values ('john');
    When you have more than that, use the SP I posted in #18. The temporary table is never needed. (Or maybe if you have a very complex DB schema, with many relationships or insert triggers on the table, then the approach with a temp table may be justified.)
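
    A minimal sketch of that one-row-at-a-time SP (the name InsertName is just a placeholder):
    Code:
    --a sketch of the one-row-at-a-time alternative; InsertName is a placeholder name
    create procedure InsertName (@n varchar(100))
    AS
    insert into Names (n) values (@n)
    GO

    --the page then calls it once per value:
    execute InsertName 'jack'
    execute InsertName 'john'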
    Martin Pernecky

  21. #21
    Afrika
    Thanks a lot Martin,
    very, very nice advice and EXCELLENT work done.

    No, we don't have a complex schema; it's just a simple loop for an SMS bulk-blast script, sending anywhere from as few as 5 to as many as 10,000 SMS at a go.

    Afrika

  22. #22
    Afrika
    I have tested both SPs and they both work very well.

    Excellent job done.

