Time to iterate through array

In general, how long would it take to iterate through an array using a ForEach loop if the array contained…

100 records?

1,000 records?

10,000 records?

100,000 records?

1,000,000 records?

Is this something I even need to worry about?
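One way to answer the question empirically is a quick micro-benchmark. This is a hedged sketch (not from the thread): it times a bare `foreach` over arrays of exactly the sizes asked about. Absolute numbers depend entirely on the machine and PHP version, so treat them as relative.

```php
<?php
// Time a foreach over arrays of various sizes. The array is built
// before the timer starts, so only the iteration itself is measured.
foreach ([100, 1000, 10000, 100000, 1000000] as $size) {
    $data = range(1, $size);      // build the array up front
    $start = microtime(true);
    $sum = 0;
    foreach ($data as $value) {   // the loop being measured
        $sum += $value;
    }
    $elapsed = microtime(true) - $start;
    printf("%8d records: %.4f seconds\n", $size, $elapsed);
}
```

On typical hardware even the million-element loop finishes in a fraction of a second, which foreshadows the answer given below: the iteration itself is rarely the bottleneck.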



That depends on where the array is coming from (database, file, whatever…), how many fields are in each record, the total size of the array, etc. etc.

And yes, it’s definitely something to keep an eye on.

Some background on my question…

For each Comment posted on my website, I am running a query on the article_comment_review table to see if the current logged-in member has already done a review or not. (I stick the results in an array.)

Originally my plan was to run one query that would bring back all Reviews for Article #1, and then stick them in an array. Then down in my HTML section, as I displayed each Comment, I would iterate through this array and check whether there was a Review for the current Comment and current User. If so, then I would hide the Survey; otherwise, display it.

Follow me?

My fear is this…

If a given Article had 1,000 Comments, and 10,000 Users “reviewed” each Comment, then my array would be 10 Million records!!

And so that is what is behind my OP.

I suppose I could run a query for each Comment, but that seems more costly than just returning a recordset/array that is reasonably close to what I am checking.

Follow me?

What should I do?



This should give you a nudge in the right direction (scroll down to the bottom) –


Also, if you happen to be on PHP 5.5 or newer and you are concerned about potentially huge data sets, this is the perfect time to use a generator.
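As a hedged illustration of the generator idea (the function name `fetchRows` is made up, and the demo uses a throwaway in-memory SQLite table rather than the poster's schema):

```php
<?php
// A generator that yields one row at a time, so memory use stays flat
// no matter how many rows the query returns.
function fetchRows(PDO $pdo, string $sql): Generator
{
    $stmt = $pdo->query($sql);
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        yield $row;
    }
}

// Demo data in an in-memory SQLite database (assumes the sqlite driver).
$pdo = new PDO('sqlite::memory:');
$pdo->exec("CREATE TABLE fruit (name TEXT, colour TEXT)");
$pdo->exec("INSERT INTO fruit VALUES ('apple', 'red'), ('pear', 'green')");

foreach (fetchRows($pdo, "SELECT name FROM fruit ORDER BY name") as $row) {
    echo $row['name'], "\n";
}
```

The caller still writes a plain `foreach`, but only one row exists in PHP memory at a time instead of the whole result set.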


Right, but the counter to that is… What kind of performance hit do I take if I have to iterate through an array of (scroll to my OP)? :slight_smile:



If you use generators, you take practically no performance hit; you’d need a huge amount of data (or a really crappy server) to bog it down.
Otherwise, it’s not iterating through an array that causes issues; it’s gathering the data that is slow (and that is what makes generators so fast compared to a regular loop).

For something like that, I would add a table for user reviews, with columns something like “user_id” and “comment_id”. When a user reviews a comment (or whatever), add that user/comment ID combo to the table. Then you can do a quick check to see if a user has reviewed that comment in a fashion similar to “SELECT * FROM reviews WHERE user_id = $user_id AND comment_id = $comment_id”. Of course, you could also run “SELECT comment_id FROM reviews WHERE user_id = $user_id” to get a list of all comments that user has already reviewed.

Lol, don’t copy/paste any of that as it’s not functional; it’s just to give a general idea of the process.
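For the record, a runnable version of that idea might look like the sketch below. The table and column names are stand-ins, the data lives in an in-memory SQLite table, and a prepared statement replaces the interpolated `$user_id` from the pseudo-SQL above:

```php
<?php
// Stand-in schema: one row per (user, comment) review pair.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec("CREATE TABLE reviews (user_id INTEGER, comment_id INTEGER)");
$pdo->exec("INSERT INTO reviews VALUES (7, 101), (7, 102)");

// Bound placeholders instead of string-interpolated IDs.
$stmt = $pdo->prepare(
    "SELECT 1 FROM reviews WHERE user_id = ? AND comment_id = ?"
);

$stmt->execute([7, 101]);
var_dump((bool) $stmt->fetchColumn());  // user 7 did review comment 101

$stmt->execute([7, 999]);
var_dump((bool) $stmt->fetchColumn());  // no such review
```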

I don’t know what that means… :-/


I think the heat is warping my brain today… :frowning:

While waiting on your responses, I came to a similar obvious conclusion!

My article_comment_review table looks like this…

- id
- article_id
- commenter_id
- comment_created_on
- reviewer_id
- (review fields here)

So I could write a query like this…

Show all records for Article #1 where the Reviewer is the current logged-in Member.

With that approach, the largest the recordset would be is the number of Comments the given Member has reviewed for the Article, right?

(And I don’t expect there to be more than 500 Comments - among all Members - per Article.)
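The query described above can be sketched as follows, against a stripped-down stand-in for the `article_comment_review` table (the real schema has more columns than shown here):

```php
<?php
// Minimal stand-in for article_comment_review, in in-memory SQLite.
$pdo = new PDO('sqlite::memory:');
$pdo->exec("CREATE TABLE article_comment_review (
    id INTEGER PRIMARY KEY,
    article_id INTEGER,
    commenter_id INTEGER,
    reviewer_id INTEGER
)");
$pdo->exec("INSERT INTO article_comment_review
            (article_id, commenter_id, reviewer_id)
            VALUES (1, 10, 7), (1, 11, 7), (1, 10, 8), (2, 12, 7)");

// All reviews on Article #1 by the logged-in member (#7 here).
$stmt = $pdo->prepare(
    "SELECT commenter_id FROM article_comment_review
      WHERE article_id = ? AND reviewer_id = ?"
);
$stmt->execute([1, 7]);
$reviewed = $stmt->fetchAll(PDO::FETCH_COLUMN);

echo count($reviewed), "\n";   // only that member's reviews come back
```

The result set is bounded by how many comments one member reviewed on one article, not by the total review count.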

How does all of that sound? :-/



Sorry, you responded too quickly, lol… I edited that post and added a link to the PHP docs.

That’s correct, if you are only intending to show that particular member’s comments.
Then when you want to show all comments, you reuse this table and just omit the “where the Reviewer is the current logged-in Member” part.

Oh, okay.

Fancy trick there.

Probably more than I need this time.

Let me know what you think of my latest solutions.



Not fancy at all… you’re going to be foreach’ing the array anyway. Just remember: it’s the gathering and structuring of data that takes all the time.
But if you’re dead set against using one, then consider paginating the data (i.e., fetching 20 rows at a time rather than all of it at once) if having a lot of records returned is a concern.
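A hedged sketch of that pagination idea, using `LIMIT`/`OFFSET` against a throwaway SQLite table (the table name and page size are assumptions):

```php
<?php
// Throwaway demo table with 45 rows.
$pdo = new PDO('sqlite::memory:');
$pdo->exec("CREATE TABLE comments (id INTEGER PRIMARY KEY, body TEXT)");
for ($i = 1; $i <= 45; $i++) {
    $pdo->exec("INSERT INTO comments (body) VALUES ('comment $i')");
}

$page    = 2;                    // hypothetical current page (1-based)
$perPage = 20;
$offset  = ($page - 1) * $perPage;

// Bind LIMIT/OFFSET as integers; many drivers reject them as strings.
$stmt = $pdo->prepare(
    "SELECT id FROM comments ORDER BY id LIMIT :limit OFFSET :offset"
);
$stmt->bindValue(':limit', $perPage, PDO::PARAM_INT);
$stmt->bindValue(':offset', $offset, PDO::PARAM_INT);
$stmt->execute();
$ids = $stmt->fetchAll(PDO::FETCH_COLUMN);

echo count($ids), " rows, first id ", $ids[0], "\n";
```

Each request touches at most one page of rows, so the array size stays constant no matter how many comments exist.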

How much extra work/overhead am I creating by sticking query results into an array, so I can work with the array instead?


Well, if you’re using PDO, it’s as simple as using the fetchAll method.

VIA: http://php.net/manual/en/pdostatement.fetchall.php

$sth = $dbh->prepare("SELECT name, colour FROM fruit");
$sth->execute();

/* Fetch all of the remaining rows in the result set */
print("Fetch all of the remaining rows in the result set:\n");
$result = $sth->fetchAll();

You lost me…

I last asked how much overhead there was in taking the result set from my Prepared Statement and sticking it into an array…


Best thing to do is just paginate the results; that takes care of all the performance concerns, and users won’t have to scroll through a giant wall of comments.

Right, but that has nothing to do with what this thread is about…


Well, when you grab results from the database, prepared statements or otherwise, you’re storing those results either in an object or an array to process them. To the best of my knowledge there isn’t much (if any) difference in overhead between objects and arrays. Both are just storing that data in memory while you work with it, clearing it either when you unset the obj/array or when the script ends. So as far as performance goes, storing it in an array should be just fine.

It has everything to do with it.
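The memory point in the post above can be sketched directly: an array of rows holds memory only while it exists, and `unset()` (or the end of the script) gives it back. Sizes below are illustrative, not from the thread.

```php
<?php
$before = memory_get_usage();

// Simulate a fetched result set of 10,000 associative rows.
$rows = [];
for ($i = 0; $i < 10000; $i++) {
    $rows[] = ['id' => $i, 'name' => "user$i"];
}
$held = memory_get_usage() - $before;   // bytes held by the array

unset($rows);                           // release the array
$after = memory_get_usage();

var_dump($held > 0);                    // the array did cost memory
var_dump($after < $before + $held);     // most of it came back on unset
```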