Now, both 0 and 1 here are IDs from a database table. I use a foreach loop in PHP
to render a certain number of items. My question is mostly about performance:
is it good practice to run another query and parse additional results using the IDs I get from my first result?
I know I would do something like this to query for more data using the IDs in the group-concatenated result:
"SELECT ..., column FROM table WHERE id IN ($result)"
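A minimal sketch of that second query, assuming hypothetical table/column names and a PDO connection — the key point is binding one placeholder per ID rather than interpolating the GROUP_CONCAT string directly into the SQL:

```php
<?php
// Sketch only: table_name/column_name are placeholders for your schema.
// Build "?,?,?" — one bound placeholder per ID, so the list is passed safely.
function buildInClause(array $ids): string
{
    return implode(',', array_fill(0, count($ids), '?'));
}

// Usage against a PDO connection (assumed to exist as $pdo):
//   $ids  = explode(',', $groupConcatResult);   // e.g. "3,7,12"
//   $stmt = $pdo->prepare(
//       "SELECT id, column_name FROM table_name WHERE id IN (" . buildInClause($ids) . ")"
//   );
//   $stmt->execute($ids);
//   $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
```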
Are you asking whether it is more efficient to use GROUP_CONCAT to create the list of items (a string), or to build the list with a while loop in PHP? If so, GROUP_CONCAT will be the more efficient solution. Note, though, that if you pull thousands of records, it is a good idea to run multiple queries, pulling for example 1000 records in each, to limit the size of the GROUP_CONCAT string. The exact number of records will depend on the size of the column you put into the string, so run some tests to find your sweet spot for best performance.
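The batching idea above can be sketched like this — a thin wrapper around `array_chunk`, with the 1000-record batch size being just a starting point to benchmark against, not a recommendation from the thread:

```php
<?php
// Sketch: split a large ID list into batches so each query
// (and its GROUP_CONCAT string) stays a manageable size.
function chunkIds(array $ids, int $batchSize = 1000): array
{
    return array_chunk($ids, $batchSize);
}

// foreach (chunkIds($allIds) as $batch) {
//     // run one query per batch, e.g. WHERE id IN (...one placeholder per ID)
// }
```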
Or are you asking whether it is more efficient to use a JOIN in the first query instead of doing additional queries in a loop over the GROUP_CONCAT result? If so, using a JOIN in the first query and processing the results with a while loop will give significantly better performance in most cases.
Note, though, that in rare cases, if you have many joins on a table or the joins are very inefficient, it can be more efficient to split the work into several queries. This is something you will notice when profiling the query.
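For illustration, the single-JOIN shape looks roughly like this — every table and column name here is hypothetical, since the actual schema isn't in the thread:

```php
<?php
// Sketch: one JOIN replaces a query-per-ID inside the loop.
// Table/column names are made up — adapt to your schema.
$sql = "SELECT t.id, t.column_name, d.detail_column
          FROM table_name t
         INNER JOIN detail_table d ON d.table_id = t.id";

// With a PDO connection you would then stream the combined result set once:
//   $stmt = $pdo->query($sql);
//   while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
//       // process each joined row — no extra round trip per ID
//   }
```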
The answer depends on how much traffic the website will have, how many records have to be processed, and how the result will be displayed.
The SQL solution in the thread you linked to will be faster than a PHP solution in most cases.
Once you also add some caching, traffic should not cause a performance issue. When you have too many matching records, the approach will have to change, but that would mean the site has become successful, so ironically it will be a good problem to have at that point. (With that said, I recommend putting this feature in its own class or method, so it's easy to refactor how it works down the road.)
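One way to follow that "own class or method" advice, sketched under assumptions (the class name, the injected loader callable, and the per-request array cache are all mine, not from the thread) — the point is that both the query and the cache backend become swappable later:

```php
<?php
// Sketch: isolate feed building so the query and caching strategy can be
// refactored independently. The array cache is per-request only; in
// production you would swap in a shared backend (APCu, Redis, ...).
class FeedBuilder
{
    private array $cache = [];

    public function __construct(private $loader) {}

    public function feedFor(string $userId): array
    {
        // Only invoke the loader (i.e. run the query) on a cache miss.
        return $this->cache[$userId] ??= ($this->loader)($userId);
    }
}
```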
From what I understand, you want to display a news feed of activity from the users you follow.
If this is the case, I would suggest changing your query to this:
SELECT ... FROM stream
INNER JOIN follows ON ...
WHERE follows.user_id = '0'
ORDER BY stream.user_id, stream.stream_date DESC
This will give you all of the data, sorted by user and date. You then parse it on the PHP side before displaying it: group the rows by user_id and date, combining all the items that match, and so on.
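The PHP-side grouping could look like this — a sketch assuming the result rows carry `user_id` and `stream_date` fields as in the query above; the day-level bucketing is my choice of granularity, not something the thread specifies:

```php
<?php
// Sketch: bucket the sorted rows from the JOIN by user, then by day,
// so each user's items sit together for display.
function groupFeed(array $rows): array
{
    $feed = [];
    foreach ($rows as $row) {
        $day = substr($row['stream_date'], 0, 10); // "YYYY-MM-DD"
        $feed[$row['user_id']][$day][] = $row;
    }
    return $feed;
}
```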
The Stack Exchange issue is slightly different from what you describe now. If I understand your requirements correctly, the suggestion above will by far be the most performant. Then, of course, add a cache layer on top as well, so you do not recalculate this on every page load.
On the Stack Exchange topic, they join users who have purchased the same products. In other words, if user A and user B bought the same product, the news feed would say “User A and User B bought product A and product D”, and then below it “User A bought product B”, since User B did not purchase that one.
In the query you have on SQL Fiddle, you only group the records per user you follow. That makes the entire query useless, and expensive for what it does, since you need to pull those records anyway to find out what they did.
If the GROUP_CONCAT value alone had been enough to tell you what they did, the query would be useful; but since it isn't, it is more efficient to run a query similar to the one I mention, joining the tables you need to get the full result set, since you would have to do that after the SQL Fiddle query anyway.
What about grouping a user's activities? E.g. UserA posted two posts, and then within that <li></li>
list two divs that represent those posts in DESC order. Should this be done with PHP, or is there a faster way?
It will make more sense, in terms of human-readable code, to process each person individually and look up each person's activity separately. But on a high-traffic/high-activity site, it might make sense to fetch all the data in one large array and then parse the array accordingly; it's just going to be less clean to review and harder to maintain.
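As for rendering the grouped activities from the earlier question, a small PHP helper is enough — this is a sketch with a hypothetical function name, and it assumes the posts array is already sorted DESC by date:

```php
<?php
// Sketch: one <li> per user, one nested <div> per post.
// Assumes $posts is already sorted newest-first.
function renderUserActivities(string $userName, array $posts): string
{
    $html = '<li>' . htmlspecialchars($userName);
    foreach ($posts as $post) {
        $html .= '<div>' . htmlspecialchars($post) . '</div>';
    }
    return $html . '</li>';
}
```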
That’s a little different - if you look, both of those queries are against the same table, so the one query makes sense, and the indexing of the array isn’t too bad. It has nothing to do with querying within a foreach loop…