How can I write the loop below with best-practice performance?

I'm working on SQL Server 2012.

I have a temp table that gets its data from Excel, and based on the data in that temp table I insert into another table inside a loop.

The temp table always holds a large amount of data: at least 5,000 rows, and often 10,000, 15,000, or more.

Each iteration needs to insert the next 5,000 rows from the temp table.

So I'm looking for the best approach to this in terms of speed and memory.

Also, if anything in my logic is incorrect, please tell me.

My query is below:

create table #Temp(
	DocumentPartID int identity(1,1),
	CompanyName VARCHAR(4000),
	[AffectedProduct] NVARCHAR(4000),
	[ReplacementPart] VARCHAR(4000),
	[ReplacementCompany] VARCHAR(4000),
	[Category] VARCHAR(4000),

	DocumentID int null,
	CompanyID VARCHAR(4000) null,
	PartID int null,
	ReplacementPartID int null,
	CategoryID int null,
	[Status] VARCHAR(4000) null
)



insert into #Temp
(
CompanyName ,
[AffectedProduct],
[ReplacementPart],
[ReplacementCompany],
[Category]
)  
values
('Nokia','RF1550','RF1550','HTS','HTS'),
('IPHONE','TF1545','TF1545','Corning Incorporated','HTS2')



DECLARE @MaxValue int = (select MAX(DocumentPartID) from #Temp)
DECLARE @Currentindex int = 0
DECLARE @Rows [dbo].[Type_ValidationInPut];

while @Currentindex < @MaxValue
begin

    -- start every iteration with an empty table variable
    DELETE @Rows

    -- first batch: affected products for the two category ids
    INSERT INTO @Rows
    (
        RowNumber,
        GivenPartNumber,
        GivenManufacturer
    )
    select TOP 5000 DocumentPartID, isnull(AffectedProduct, ''), isnull(CompanyName, '')
    FROM #Temp
    where (CategoryID = 517884 or CategoryID = 1110481)
      and DocumentPartID > @Currentindex
      and [Status] is null
    order by DocumentPartID

    -- second batch: replacement parts
    INSERT INTO @Rows
    (
        RowNumber,
        GivenPartNumber,
        GivenManufacturer
    )
    select TOP 5000 DocumentPartID, isnull(substring(ReplacementPart, 0, 70), ''), isnull(ReplacementCompany, '')
    FROM #Temp
    where DocumentPartID > @Currentindex
      and [Status] is null
      and ReplacementPart is not null
    order by DocumentPartID

    -- push the current batch to the validation table under one new BatchID
    DECLARE @NewID nvarchar(4000) = NEWID()
    insert into [ls30].[validation].[dbo].PartsData (BatchID, RowNumber, GivenPartNumber, GivenManufacturer)
    SELECT @NewID, 0, GivenPartNumber, GivenManufacturer from @Rows

    set @Currentindex = @Currentindex + 5000
    DELETE @Rows
end
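
If the main goal is speed, the WHILE loop can usually be dropped altogether: slice the DocumentPartID range into 5,000-wide chunks up front, generate one BatchID per chunk, and load everything with a single set-based INSERT. The sketch below reuses the #Temp and PartsData objects from the question; the #Batches temp table and the slice formula are illustrative additions, not part of the original code:

DECLARE @BatchSize int = 5000

-- one BatchID per 5,000-wide slice of DocumentPartID, materialised first so every
-- row in a slice gets the same GUID
select BatchNo, CAST(NEWID() as nvarchar(4000)) as BatchID
into #Batches
from (select distinct (DocumentPartID - 1) / @BatchSize as BatchNo from #Temp) d

-- single set-based insert instead of a batch-by-batch loop
insert into [ls30].[validation].[dbo].PartsData (BatchID, RowNumber, GivenPartNumber, GivenManufacturer)
select b.BatchID, 0, s.GivenPartNumber, s.GivenManufacturer
from
(
    select DocumentPartID, isnull(AffectedProduct, '') as GivenPartNumber, isnull(CompanyName, '') as GivenManufacturer
    from #Temp
    where (CategoryID = 517884 or CategoryID = 1110481) and [Status] is null

    union all

    select DocumentPartID, isnull(substring(ReplacementPart, 0, 70), ''), isnull(ReplacementCompany, '')
    from #Temp
    where [Status] is null and ReplacementPart is not null
) s
join #Batches b on b.BatchNo = (s.DocumentPartID - 1) / @BatchSize

Because PartsData sits behind the [ls30] linked server, it may still be worth keeping the insert batched; in that case the same idea with a range filter (DocumentPartID > @Currentindex and DocumentPartID <= @Currentindex + 5000) instead of TOP 5000 guarantees that no rows are skipped or read twice.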

Perhaps SQL Server 2012 has a function similar to MySQL's that imports data in seconds without having to iterate through a loop. Check this example and search for the SQL Server equivalent:
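
For what it is worth, SQL Server can also pull an Excel sheet in a single statement through OPENROWSET with the ACE OLE DB provider. This is only a sketch: it assumes the provider is installed on the server, 'Ad Hoc Distributed Queries' is enabled via sp_configure, and the file path, sheet name, and header names below are placeholders rather than anything from the original post:

-- read the whole worksheet in one statement instead of looping
insert into #Temp (CompanyName, AffectedProduct, ReplacementPart, ReplacementCompany, Category)
select CompanyName, AffectedProduct, ReplacementPart, ReplacementCompany, Category
from OPENROWSET(
        'Microsoft.ACE.OLEDB.12.0',
        'Excel 12.0;Database=C:\Data\PartsExport.xlsx;HDR=YES',   -- placeholder path
        'SELECT * FROM [Sheet1$]') as x

If exporting the sheet to CSV first is acceptable, BULK INSERT or the bcp utility does the same job without needing the Excel provider.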

Thank you for the reply.
I need to enhance my query and get good performance,
not to search for a tool that does it.
Can you help me?

Sorry, I do not use SQL Server and cannot help with your problem.

I use the LOAD DATA LOCAL INFILE approach every week, typically 14,000 rows, from a file on my computer (extracted from the office stand-alone database), with no need to upload it first. It just takes a few seconds.


Can you show me how to do that, if possible, please?

DELETE FROM mycounts
WHERE fdate >= '2019-01-01';

LOAD DATA LOCAL INFILE
  "C:/Users/dr john/documents/My Documents/logpile/AccountDataExport_2020-01-28.txt"
INTO TABLE mycounts;

DELETE FROM myflights
WHERE fdate >= '2019-01-01';

LOAD DATA LOCAL INFILE
  "C://Users/dr john/documents/My Documents/logpile/FlightDataExport_2020-01-28.txt"
INTO TABLE myflights;

The delete parts mean I don't need to remember when the last update was done; I just remove all the data back to a given start date (which stays the same for a year before being changed), then insert all the data from that start date onward.
Thousands of rows are inserted in a few seconds.
This is in MySQL; other variants of SQL will have their equivalent code.
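
In SQL Server the same pattern would be a DELETE back to the start date followed by a BULK INSERT of the exported file. A rough sketch, with the file path treated as a placeholder and the field terminator depending on how the file was exported:

-- clear everything back to the fixed start date, then reload the whole export in one statement
DELETE FROM mycounts
WHERE fdate >= '2019-01-01';

-- note: unlike LOAD DATA LOCAL INFILE, BULK INSERT reads the file from the SQL Server
-- machine, so the path below (a placeholder) must be visible to the server
BULK INSERT mycounts
FROM 'C:\Data\AccountDataExport_2020-01-28.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');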
I use HeidiSQL on my laptop to connect to the database and run the query. HeidiSQL is free to use.

hello fellow HeidiSQL user

may i offer my congratulations to your esteemed self for the wisdom you obviously possess as remarkably demonstrated by your keen choice of mysql interface application

Well, you did recommend HeidiSQL many years ago.