Your robots.txt is misconfigured - the "Allow" directive is only needed to counter a specific "Disallow" directive; since you have no Disallow, that line is redundant.
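For what it's worth, Allow only earns its keep in a pattern like this (the paths below are made-up examples, not anything from your actual site):

User-agent: *
Disallow: /images/
Allow: /images/logo.png

With no Disallow anywhere in the file, an Allow line changes nothing.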
Dude - you do realize that robots.txt files are only a recommendation? Well-behaved crawlers honor them, but nothing forces a bot to.
And what "unnecessary bandwidth wastage" are you talking about, in the name of JC?
Your website would need millions upon millions of visitors every month to actually burn through your bandwidth, and besides, 99.99% of hosts nowadays offer unlimited bandwidth…
Now, if you use a robots.txt file and you don't know how to… then you can only screw your progress up.
A robots.txt file tells Google which of your pages not to crawl - not which ones not to "index". A blocked URL can still show up in the index if other pages link to it; keeping a page out of search results takes a noindex signal on the page itself.
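Roughly, the difference looks like this (/private/ here is just a placeholder path):

User-agent: *
Disallow: /private/
# Compliant bots won't fetch anything under /private/,
# but URLs in there can still be indexed if other sites link to them.
# To keep a page out of the index, let it be crawled and serve it
# with a noindex robots meta tag or X-Robots-Tag header instead.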
Unless you know exactly which directories to block, don't touch it.
If you still wanna play with robots.txt files, just think about all the directories you don't need crawled and add them.
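If you do go down that road, the usual shape is something like this (the directory names are only examples, swap in whatever you actually don't want crawled):

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /admin/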