I switched from file_get_contents to cURL for communicating with the Twitter API to fetch the user's latest tweets.
cURL is great since it doesn't produce an error when the fetch fails to return anything, but I'm wondering what I can do when cURL doesn't return anything either.

It works most of the time, but every time I refresh my webpage it calls the function (below) to fetch the data again, and sometimes nothing comes back at all, so I'm left with a blank space where my tweets should be.

Here's the function I call to fetch the data. Of course, this is called every time I refresh my webpage. Using a caching system is sometimes worse, because it will cache the pages at a time when they didn't have the Twitter data.

Code PHP:
//Gets the user's latest Tweets from Twitter.
function get_latest_tweets()
{
	$options = get_option('website_options');
	$userID  = $options['twitter_user'];
	$url     = 'http://twitter.com/statuses/user_timeline/' . $userID . '.xml?count=3';

	$ch = curl_init();
	curl_setopt($ch, CURLOPT_URL, $url);
	curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
	curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
	$file_contents = curl_exec($ch);
	curl_close($ch);

	//curl_exec() returns false on failure; bail out instead of passing
	//an empty string to SimpleXMLElement, which would throw an exception.
	if ($file_contents === false || $file_contents === '') {
		return;
	}

	$xml = new SimpleXMLElement($file_contents);
	foreach ($xml->status as $status) {
		$tweet = $status->text;
		$id    = $status->id;
		$date  = $status->created_at;
		//Format the Tweet
		$formatted_date = date("M jS g:i a", strtotime($date));
		echo "<p class='tweet'>" . $tweet . "<a href='http://twitter.com/" . $userID . "/status/" . $id . "'> - " . $formatted_date . "</a></p>";
	}
}

Is there anything simple I could do about this? I thought of storing the returned Tweets in an array until a certain timeframe passes and then refreshing that array, but I'm not sure whether the failures are random or whether Twitter is throttling my requests.
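One way to sketch that idea, so a failed fetch never leaves a blank space: cache the raw XML to a file, treat it as fresh for a fixed window, and only overwrite the cache when the fetch actually succeeds; on failure, serve the stale copy. This is a minimal illustration, not your exact setup: the `get_tweets_with_fallback` name, the cache-file path, and the 10-minute lifetime are all assumptions you would adapt.

```php
<?php
// Sketch: cache only successful fetches, fall back to the last good copy.
// $fetch is a callable wrapping the existing cURL call; it should return
// the raw XML string, or false/'' on failure (as curl_exec does).
function get_tweets_with_fallback(callable $fetch, $cache_file, $max_age = 600)
{
    // Serve the cached copy while it is still fresh.
    $is_fresh = file_exists($cache_file)
        && (time() - filemtime($cache_file)) < $max_age;
    if ($is_fresh) {
        return file_get_contents($cache_file);
    }

    $data = $fetch(); // e.g. the cURL request for the timeline XML
    if ($data !== false && $data !== '') {
        // Only a successful fetch overwrites the cache, so a failed
        // request can never "cache the blank page".
        file_put_contents($cache_file, $data);
        return $data;
    }

    // Fetch failed: serve the stale copy rather than nothing at all.
    return file_exists($cache_file) ? file_get_contents($cache_file) : '';
}
```

You would then feed the returned string to SimpleXMLElement as before; the blank-space case only remains when there has never been a single successful fetch.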

Thanks for the help.