I have a function where I delete multiple rows in a database using a foreach loop.
Here is the code:
$ids = array(1, 3, 7, 8, 9, 32);
foreach ($ids as $id) {
    $mysqli->query("DELETE FROM tablename WHERE id = '$id'");
}
This works in deleting all the rows whose ids are in the array, but my question is: how effective is this when there are over 2000 ids? Can such a loop handle thousands of ids in an array without affecting the database?
Or is there a better way of handling this?
Thanks @m_hutley, but the array comes from checkboxes, so I don’t know in advance which ids the user will select. I programmatically collect the selected ids into an array and use it to run the loop.
Now, there IS a limit to query size - if memory serves, by default you can only send about 1 MB worth of query at once to most databases (max_allowed_packet in MySQL).
Is there something about those IDs in particular that makes you want to delete them? Like… are you trying to delete all rows where some condition is true?
Do you really expect a user to select 2000 checkboxes?
Implode() or join() will do the magic. I’m working on email lists, so there’s a real possibility of users having more than 20,000 list entries. If they want to delete some lists, I’ll allow them to select a maximum of 1000 ids and then run that bulk delete in batches.
If I limit their selection to 100 or 200, it’s a lot of work deleting an email list of 20,000.
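For anyone following along, here is a minimal sketch of that implode() approach: one DELETE with an IN () list instead of one query per id. It assumes an open mysqli connection in $mysqli, and tablename is just a placeholder for your real table.

```php
<?php
// Ids as collected from the submitted checkboxes.
$ids = array(1, 3, 7, 8, 9, 32);

// Force every value to an integer so the imploded list is safe to embed.
$ids = array_map('intval', $ids);

// Build one DELETE with an IN () list instead of a query per id.
$idList = implode(',', $ids);                          // "1,3,7,8,9,32"
$mysqli->query("DELETE FROM tablename WHERE id IN ($idList)");
```

The array_map('intval', …) step matters: because the ids come straight from user-submitted form data, casting them all to integers keeps the interpolated IN () list safe from injection.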
I mean, at that point, I’d think you’d just purge the entire list (DELETE FROM list_emails WHERE list_id = 1)…
A thousand checkboxes on a single page is going to be a very long page; I’d be concerned about readability for the client at that point. But you can still chunk the array into smaller arrays and then loop through the smaller arrays: multiple queries, but at least you control the ratio (instead of 1000 queries, you could do it in, say, 5, taking chunks of 200 entries at a time).
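A rough sketch of that chunking idea, assuming an open mysqli connection in $mysqli (the table name is again a placeholder):

```php
<?php
// 1000 selected ids become 5 queries of 200 each via array_chunk().
$ids = range(1, 1000);

foreach (array_chunk($ids, 200) as $chunk) {
    $chunk  = array_map('intval', $chunk);   // keep the IN () list safe
    $idList = implode(',', $chunk);
    $mysqli->query("DELETE FROM tablename WHERE id IN ($idList)");
}
```

array_chunk() handles the leftover group for you: if the user selected 1,050 ids, you’d get five chunks of 200 plus one of 50.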
Thanks a lot @m_hutley. I think I can go for a maximum of 200 in a single query, and the user can then run it as many times as he wants. Even MySQL has a row-view limit of 500, and the rest can be paginated just as @igor_g suggested, so I’ll go for 20, 50, 100, and 200 as the maximum row-view filters.
So if the user wants to delete more, he can put the table view in the 200 filter option and do his selection in batches of 200, depending on how many lists he wants deleted.
But then, is there a better PHP table-limit function with pagination that I can use?
jQuery or Bootstrap DataTables stop working once the client’s browser doesn’t have JavaScript or has it disabled.
Well, pagination via PHP is certainly doable - there are many examples out there. Basically you would put a LIMIT on your query, using PHP to keep track of the page size and current page. (The starting point of the limit is (current page - 1) * page size, and the limit size would be your page size.)
You’d need a second query to find out the number of records in the table. Your pagination ‘page bar’ is then a set of links 1…X, where X is the ceiling of the number of records divided by the page size.
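Something like this minimal sketch, assuming an open mysqli connection in $mysqli (the list_emails table and its columns are illustrative, not from your schema):

```php
<?php
// Minimal server-side pagination: no JavaScript required.
$pageSize    = 200;
$currentPage = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;

// Starting point of the LIMIT: (current page - 1) * page size.
$offset = ($currentPage - 1) * $pageSize;

// Both values are already integers, so they are safe to interpolate.
$result = $mysqli->query(
    sprintf("SELECT id, email FROM list_emails LIMIT %d, %d", $offset, $pageSize)
);

// Second query for the page bar: total pages = ceil(row count / page size).
$row        = $mysqli->query("SELECT COUNT(*) AS c FROM list_emails")->fetch_assoc();
$totalPages = (int) ceil($row['c'] / $pageSize);

// Page bar: links 1..$totalPages, e.g. ?page=3.
for ($p = 1; $p <= $totalPages; $p++) {
    echo '<a href="?page=' . $p . '">' . $p . '</a> ';
}
```

With 20,000 records and a page size of 200, that gives a page bar of 100 links, and each page request only ever pulls 200 rows from the database.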