Hi all, I haven't come across this before, so I'm hoping someone can help.
Two weeks ago, I noticed a client's PHP website had been hacked. I quickly found the rogue JS file and removed it, which took down all of the injected Japanese pages (approximately 5,000 additional pages). All of those pages now return 404s.
However, searches in Google still return these results OVER TWO WEEKS LATER.
After one week, I edited the robots.txt file to block these pages. I can verify in Google Search Console that the 'extra pages' (which are now removed and no longer exist) are blocked.
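For reference, the rule I added to robots.txt looks roughly like this. The `/jp/` prefix is just a placeholder I'm using for this post, not the actual URL pattern of the hacked pages:

```
User-agent: *
Disallow: /jp/
```

`User-agent` and `Disallow` are the standard robots.txt directives, and Search Console's tester reports the affected URLs as blocked under this rule.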
Does anyone have any idea what I'm missing? Why are these removed pages still showing in Google, and why is Google not respecting the Disallow instructions in robots.txt?