What is the best way to loop through large arrays

I have a list of subscribers in a table with a total of 400,000 rows.
The table structure has just four columns:
id | amount | balance | date

Now, the amount and balance column values are encrypted using ECB-mode encryption.

I want to increase the values of amount and balance by 2.5%.

I have two options:

  1. I can run a SQL statement to decrypt and adjust the balances in the table, but I don’t know how reliable that will be for that many rows, or how long the update will take to complete.

  2. Fetch all the data, loop through it, decrypt each value, adjust it, and write it back.

Option 2 looks very rough, but I still need to know: if I do go with it, what is the best way to loop without hitting a timeout?
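For option 1, the statement I have in mind would look roughly like this (a sketch only — it assumes the values were encrypted with MySQL’s AES_ENCRYPT() in its default ECB mode, and the key and the DECIMAL type are placeholders):

```sql
-- Sketch: decrypt, add 2.5%, re-encrypt, all server-side in one statement.
-- Assumes MySQL's AES functions (ECB mode by default); 'secret-key' and
-- DECIMAL(15,2) are placeholders for the real key and numeric type.
UPDATE subscribers
SET amount  = AES_ENCRYPT(CAST(AES_DECRYPT(amount,  'secret-key') AS DECIMAL(15,2)) * 1.025, 'secret-key'),
    balance = AES_ENCRYPT(CAST(AES_DECRYPT(balance, 'secret-key') AS DECIMAL(15,2)) * 1.025, 'secret-key');
```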

I have heard of using generators, but I don’t know how to use them or how effective they can be for a large dataset.
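From what I’ve read, a generator-based chunked loop would look roughly like this (a sketch, assuming PDO; `fetchSubscribers` and the chunk size are made up, and the demo uses an in-memory SQLite table standing in for the real MySQL one):

```php
<?php
// Sketch: a generator that streams rows in chunks so memory stays flat
// no matter how many rows the table holds.
function fetchSubscribers(PDO $pdo, int $chunkSize = 1000): Generator
{
    $lastId = 0;
    while (true) {
        // Keyset pagination (WHERE id > ?) stays fast on big tables,
        // unlike large OFFSETs, which get slower the deeper you page.
        $stmt = $pdo->prepare(
            'SELECT id, amount, balance FROM subscribers
             WHERE id > ? ORDER BY id LIMIT ' . $chunkSize
        );
        $stmt->execute([$lastId]);
        $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
        if (!$rows) {
            return; // no more rows: the generator finishes
        }
        foreach ($rows as $row) {
            $lastId = $row['id'];
            yield $row; // hand back one row at a time
        }
    }
}

// Demo: an in-memory SQLite table stands in for the real MySQL table.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE subscribers (id INTEGER PRIMARY KEY, amount TEXT, balance TEXT)');
for ($i = 1; $i <= 2500; $i++) {
    $pdo->exec("INSERT INTO subscribers (amount, balance) VALUES ('enc-a', 'enc-b')");
}

$count = 0;
foreach (fetchSubscribers($pdo, 1000) as $row) {
    $count++; // decrypt, adjust by 2.5%, re-encrypt, UPDATE ... WHERE id = ?
}
echo $count; // 2500
```

Each chunk is one query, so nothing ever holds all 400,000 rows in memory at once; the generator just hands the loop one row at a time.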

Are the database and the webserver on the same machine?

For particularly sensitive information like that, my impulse would always be to move it as few times as possible. That limits the exposure.

Yes, the database and webserver are on the same machine.

Actually, the amount and balance are encrypted at rest.

Yes, the data is encrypted. But what’s more protected, an envelope in a safe that gets moved from building to building in a truck, or an envelope in a safe that never sees the light of day?

In any case, if the database and webserver are running on the same machine, transportation isn’t the concern.

If you can run a SQL statement to decrypt and adjust, it’s probably going to be faster than importing 400,000 rows, decrypting, and adjusting, right?

The SQL server and webserver are running on the same machine, so they’re drawing on the same pool of machine resources for processing… I’m still inclined to let the SQL server do the lifting…

My observation too, but I’m just curious how well MySQL can handle such a large decrypt, calculate, and re-encrypt for 400,000 rows without errors.

I don’t know how to quantify “effective” or “proficient”, as those would be subjective evaluations, and even if I tried, the answer would depend entirely on your server. Running the most efficient software in the world on a potato isn’t going to meet any metric that could be produced.

On that note, my machine is a large part of the determining factor in how effective and proficient the results can be.

If that’s the case, given the expected workload, what machine or server spec should I be looking at for faster and better performance?

I don’t… really know how to answer that either.

More bigger numbers = gooder.

The question is about avoiding clamouring for high-end turbo servers when, by default, a normal, less expensive server could handle such a dataset without blinking.

A potato can handle it. Can it handle it quickly? No. What defines quick enough? Not anyone on this side of the computer screen.

In that case, this turns into a quest to find the right server by scaling up the moment slow responses or poor performance are detected.

Speed isn’t much of a concern for me, because slowness can easily be traced and fished out.

My major worry is wrong or incomplete calculations, seeing as this has to do with money figures.

That could cause a loss of funds to both the company and the user if the SQL fails to update the total number of rows as expected.

It’s just like when you write an update query: you can’t tell at first glance whether the update was successful, until a user submits a complaint that his data remained unchanged.

rowCount() will likely return only the number of rows affected.

Maybe comparing the total number of rows against rowCount() can give me a clue as to whether the update completed successfully.

But on the other hand: since I’m going to run that check in PHP and email the admin if something goes wrong, if the update fails because of a system or server crash, then rowCount() won’t return anything and my follow-up actions won’t run, because the server currently needs oxygen and CPR.
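Something like this is the check I’m imagining (a sketch; an in-memory SQLite table with plain numbers stands in for the real MySQL table, and the email-the-admin part is just an echo here):

```php
<?php
// Sketch: run the bulk update inside a transaction and compare
// rowCount() to the table's row count; on a mismatch (or a crash),
// roll back so no half-adjusted balances are left behind. On MySQL
// the SET clause would wrap the value in AES_DECRYPT()/AES_ENCRYPT().
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE subscribers (id INTEGER PRIMARY KEY, balance REAL)');
$pdo->exec('INSERT INTO subscribers (balance) VALUES (100.0), (200.0), (300.0)');

$pdo->beginTransaction();
try {
    $total = (int) $pdo->query('SELECT COUNT(*) FROM subscribers')->fetchColumn();

    $stmt = $pdo->prepare('UPDATE subscribers SET balance = balance * 1.025');
    $stmt->execute();
    $affected = $stmt->rowCount();

    if ($affected === $total) {
        $pdo->commit();
        echo "OK: $affected of $total rows updated\n";
    } else {
        // Partial coverage: undo everything rather than leave mixed data,
        // then alert the admin (an email would go here instead of echo).
        $pdo->rollBack();
        echo "MISMATCH: $affected of $total rows updated\n";
    }
} catch (Throwable $e) {
    // A crash mid-statement never commits, so balances stay untouched.
    $pdo->rollBack();
    throw $e;
}
```

One caveat: by default MySQL reports only rows whose values actually *changed* (a zero balance adjusted by 2.5% stays zero), so matched-vs-changed counting is worth checking before trusting the comparison. The transaction is what covers the crash scenario: if the server dies mid-update, nothing commits, so there is no half-finished state to detect.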