Google Custom Search

Is it possible to exclude content on a page from the search crawl? For example, the text in a standard footer?

Yes, you can exclude pages by using a robots.txt file. It is the most widely used method for asking search engines not to index parts of your site. Note, however, that robots.txt works at the URL level, not on sections within a page, so footer text can only be excluded this way if it is served from its own URL (for example, an include file loaded separately).

Web definition of a robots.txt file:
A “robots.txt” file is a text file placed on your server which contains a list of robots and “disallows” for those robots. Each disallow will prevent any address that starts with the disallowed string from being accessed.
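As a minimal sketch of the format described above (the user-agent names and the `/footer-includes/` path are hypothetical), a robots.txt file placed at the root of your server might look like this:

```
# Applies to all crawlers
User-agent: *
Disallow: /footer-includes/

# Applies only to Googlebot
User-agent: Googlebot
Disallow: /private/
```

Each `Disallow` rule blocks every URL whose path starts with the given string, so `/footer-includes/footer.html` would be covered by the first rule.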

Unfortunately the “official” terminology used is misleading.

A disallow does NOT prevent search bots from crawling those pages. It simply informs honest bots (those that bother to check the robots.txt file) which pages you don't want them to crawl.
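The "honest bot" behavior above can be sketched with Python's standard `urllib.robotparser`, which is the kind of check a well-behaved crawler performs before fetching a URL. The rules and URLs here are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content: ask all bots to skip /footer-includes/
rules = """\
User-agent: *
Disallow: /footer-includes/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A well-behaved crawler consults the parsed rules before each fetch
print(parser.can_fetch("Googlebot", "https://example.com/footer-includes/footer.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))  # True
```

Nothing forces a crawler to run this check; a bot that never calls `can_fetch` (or ignores its result) can still request the disallowed URL.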

It is more a suggestion, a voicing of your wishes, than an actual blocker.