How can I work with crawlers like Google and Bing when my sitemap contains URLs that will expire in 1 week?
Scenario:
Consider a hotel booking URL that is only valid for booking a hotel within 1 week.
After that date, the URL is no longer valid and no booking can be made through it.
Normally, though, Google will not re-crawl the sitemap every week, let alone every day.
What is the best way to handle this case? I want a fresh list of hotels that can be booked within 1 week, maintained indefinitely (for every running week).
Is there any reason why the URLs should change each week? If I’ve understood you right, you will have a page showing hotels where rooms are available within one week from today, another page where rooms will be available within two weeks from today, and so on. Within each of those time frames, can you not simply use the same URL?
So, your URLs might be:
mysuperbookingservice/1week
mysuperbookingservice/2weeks
etc.
Each time a visitor requests a hotel room within a given number of weeks, your server-side code would query your database, generate an HTML page that contains the required data, and serve it from the URL in question. That way, the URLs will never change, and you will have no problem with search bots.
Also, don’t get over-obsessed with sitemaps. A sitemap is only needed if you have pages which can’t be reached through your normal navigation. Whether or not you have a sitemap won’t make any difference to the particular issue you described.
It is a legacy system that searches availability for booking by date (an individual date is required).
I needed sitemaps here because the website has no "browse" feature.
It behaves more like an application, where the user has to supply the dates.
So I wanted to produce sitemaps with the date parameters in the URLs.
The URLs currently look like details.php?date_from=YYYY-MM-DD (plus other parameters), with dynamic values in the dates.
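If the date-parameterised URLs must stay, one workable approach is to regenerate the sitemap on a schedule (e.g. a daily cron job) so it only ever lists URLs for upcoming weeks. A minimal sketch in Python, assuming a hypothetical base URL and only the date_from parameter:

```python
from datetime import date, timedelta
from xml.sax.saxutils import escape

# Hypothetical host; the real site's other query parameters are omitted.
BASE = "https://example.com/details.php"

def weekly_sitemap(start: date, weeks: int = 4) -> str:
    """Build a minimal sitemap listing one date-parameterised URL per
    upcoming week. Regenerating it on each run keeps entries from
    pointing at dates that have already expired."""
    entries = []
    for i in range(weeks):
        d = start + timedelta(weeks=i)
        loc = f"{BASE}?date_from={d.isoformat()}"
        entries.append(
            f"  <url><loc>{escape(loc)}</loc>"
            f"<lastmod>{start.isoformat()}</lastmod></url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>"
    )
```

Note that this only keeps the sitemap itself fresh; crawlers still decide their own revisit schedule, which is why the stable-URL approach in the answer above is the more robust fix.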