There is nothing in robots.txt that should cause a problem. A site: search shows three pages indexed on Google, which matches your sitemap, so no issue there.
It looks more like an Ahrefs-side problem.
An HTTP 503 often means a rate limit was hit. Do you know whether you have rate limiting anywhere, either in your code, in the web server, or in anything in front of it (like a load balancer)?
It wouldn’t surprise me if Ahrefs hits such a limit, as their bot is pretty poorly coded.
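One quick way to test this is to request your site with AhrefsBot's user-agent string and see whether the server answers 503 for that crawler specifically. A minimal sketch using only the Python standard library; the URL is a placeholder for your own site, and the user-agent string is the one Ahrefs publishes for its bot:

```python
import urllib.request
import urllib.error

# Published AhrefsBot user-agent string (version number may differ).
AHREFS_UA = "Mozilla/5.0 (compatible; AhrefsBot/7.0; +http://ahrefs.com/robot/)"

def classify(status: int, headers: dict) -> str:
    """Interpret a response the way a crawler would."""
    if status == 503:
        # A Retry-After header alongside 503 is a strong hint of rate limiting.
        retry = headers.get("Retry-After")
        return f"rate limited / unavailable (Retry-After: {retry})"
    if status == 200:
        return "served normally"
    return f"unexpected status {status}"

def probe(url: str) -> str:
    """Fetch the URL pretending to be AhrefsBot and classify the result."""
    req = urllib.request.Request(url, headers={"User-Agent": AHREFS_UA})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return classify(resp.status, dict(resp.headers))
    except urllib.error.HTTPError as e:
        # 4xx/5xx responses arrive here; classify them the same way.
        return classify(e.code, dict(e.headers))

if __name__ == "__main__":
    print(probe("https://example.com/"))  # replace with your own URL
```

If the same URL returns 200 with a normal browser user agent but 503 with the bot one, the limit (or a bot-blocking rule) is on your side rather than Ahrefs'.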