Optimizing a while loop that gets info from a DB

A while loop I am using goes through each row in a database table and puts it in an HTML table. Fairly simple. But the contents of two columns in the HTML table come from other database tables, so I have to run at least 2 additional MySQL queries WITHIN the loop. The HTML table is about 1,500 rows right now, and each row requires at least 2 additional MySQL queries, so there are at least 3,000 separate queries on this page. Is there a way to optimize this? (I don't want to use pagination.)

You do not want to do db queries within loops.

However, what you are describing is too general to help with; you're better off showing code.

Maybe you can join your inner queries to your original query. Can you post some code?

Sure. This is a slimmed-down version just to illustrate the problem, since the real table is so big. I actually get many more fields from each DB table, and I used made-up data that is easier to understand for illustration.

$sql = "SELECT id, name, age FROM users";

$result = mysqli_query($link, $sql);

echo "<table>";

while ($row = mysqli_fetch_assoc($result))
{
	echo "<tr>";
	echo "<td>" . $row['id'] . "</td>";
	echo "<td>" . $row['name'] . "</td>";
	echo "<td>" . $row['age'] . "</td>";

	$sql2 = "SELECT state, zipcode FROM location WHERE name = '{$row['id']}'";

	$result2 = mysqli_query($link, $sql2);

	$row2 = mysqli_fetch_assoc($result2);

	echo "<td>" . $row2['state'] . "</td>";
	echo "<td>" . $row2['zipcode'] . "</td>";

	$sql3 = "SELECT capital FROM statecapitals WHERE state = '{$row2['state']}'";

	$result3 = mysqli_query($link, $sql3);

	$row3 = mysqli_fetch_assoc($result3);

	echo "<td>" . $row3['capital'] . "</td>";
	echo "</tr>";
}

echo "</table>";

Thanks for any help!

You need to join the tables together, like this:

SELECT u.id, u.name, u.age, loc.state, loc.zipcode, sc.capital FROM users u
LEFT JOIN location loc ON loc.name = u.id
LEFT JOIN statecapitals sc ON sc.state = loc.state

Then your $row will have all the values you need already.
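To make that concrete, here is a minimal sketch of the rewritten loop using the joined query. It assumes $link is an open mysqli connection and that the table and column names match the schema shown earlier in the thread:

```php
<?php
// One joined query replaces the two per-row lookups, so the loop
// runs a single query total instead of ~3,000.
$sql = "SELECT u.id, u.name, u.age, loc.state, loc.zipcode, sc.capital
        FROM users u
        LEFT JOIN location loc ON loc.name = u.id
        LEFT JOIN statecapitals sc ON sc.state = loc.state";

$result = mysqli_query($link, $sql);

echo "<table>";

while ($row = mysqli_fetch_assoc($result))
{
	// All six values now come from the single joined row.
	echo "<tr>";
	echo "<td>" . $row['id'] . "</td>";
	echo "<td>" . $row['name'] . "</td>";
	echo "<td>" . $row['age'] . "</td>";
	echo "<td>" . $row['state'] . "</td>";
	echo "<td>" . $row['zipcode'] . "</td>";
	echo "<td>" . $row['capital'] . "</td>";
	echo "</tr>";
}

echo "</table>";
```

One thing to watch for: because these are LEFT JOINs, users with no matching location row will still appear, but their state, zipcode, and capital values will be NULL (empty cells in the HTML).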

That worked wonders. Script execution is down by ~half a second for each 1000 rows. Thanks!