I am trying to find the best way to achieve this:
I need to create a web crawler that continuously crawls a page, extracts its links, then crawls those links, and so on. I was wondering if PHP would be a good choice for this. Can a PHP script be written to run forever (maybe by setting the timeout to infinite)? Would that be a good idea? Are there better options for this kind of thing?
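Roughly, this is the kind of loop I have in mind (just a sketch, run from the CLI; the seed URL is a placeholder and there is no politeness/robots.txt handling yet):

```php
<?php
// Sketch of a long-running crawl loop (illustrative, not production code).
set_time_limit(0);  // disable PHP's execution time limit (CLI defaults to 0 anyway)

$queue   = ['https://example.com/'];  // placeholder seed URL
$visited = [];

while ($queue) {
    $url = array_shift($queue);        // take the next URL (breadth-first)
    if (isset($visited[$url])) {
        continue;                      // skip pages we've already crawled
    }
    $visited[$url] = true;

    $html = @file_get_contents($url);  // fetch the page; suppress warnings on failure
    if ($html === false) {
        continue;
    }

    // Parse the HTML and collect href attributes from <a> tags
    $dom = new DOMDocument();
    @$dom->loadHTML($html);            // suppress warnings from malformed HTML
    foreach ($dom->getElementsByTagName('a') as $a) {
        $link = $a->getAttribute('href');
        if ($link !== '' && !isset($visited[$link])) {
            $queue[] = $link;          // enqueue newly discovered links
        }
    }
}
```

This would keep running as long as the queue is non-empty, but I am unsure whether a single PHP process is the right vehicle for something meant to run indefinitely.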