Is it bad to block filetypes from Googlebot?


Apologies if this is the wrong forum, but I really don’t know where else to post this…

So, in my robots.txt I disallow crawling of a folder that contains a particular file type (OK, I'll admit it to everyone: it's full of SWF files).
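For reference, the rule looks something like this (`/swf/` is just a placeholder, not my actual folder name):

```
User-agent: *
Disallow: /swf/
```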

A. I don’t want them indexed, since Google considers Flash bad. My thinking was that if I block access to the folder, Googlebot won’t know what’s in there and won’t ding my site for having Flash.

B. But when a page that links to content in that folder is crawled, Search Console reports blocked content. Is this safe to ignore?

I’ve also been trying to come up with a strategy where the mobile crawler can’t see content that links to SWF files while the desktop crawler still can, but that’s probably a topic for another post…