Blocking auto-generated pages with robots.txt


I am having problems blocking auto-generated pages that create duplicate content on my site. I have a question-and-answer feature that generates a new page for each question and answer. However, it also adds several additional copies of that page.

For example, the following two URLs are created for the same page:

/ask-a-trustee/6300/rights-unsecured-creditor-court-appointed-bankruptcy-company/
/ask-a-trustee/?show=6301

I would like to make sure I block this properly with robots.txt. Of course, I want the first (pretty) URL to be allowed / not disallowed, and the ?show duplicate to be BLOCKED.

If I add Disallow: /ask-a-trustee/?show to my robots.txt file, will that properly block the duplicate page at ?show=6301 while still allowing /ask-a-trustee/6300/rights-unsecured-creditor-court-appointed-bankruptcy-company/ to be crawled by Google and other search engines?
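You can sanity-check a rule like this locally before deploying it. The sketch below uses Python's standard urllib.robotparser against the exact Disallow line from the question (the example.com domain is just a placeholder):

```python
from urllib.robotparser import RobotFileParser

# The robots.txt rule being asked about, fed in directly as lines.
rules = [
    "User-agent: *",
    "Disallow: /ask-a-trustee/?show",
]

rp = RobotFileParser()
rp.parse(rules)

# The query-string duplicate should be blocked...
print(rp.can_fetch("*", "https://example.com/ask-a-trustee/?show=6301"))
# → False

# ...while the pretty URL stays crawlable.
print(rp.can_fetch("*", "https://example.com/ask-a-trustee/6300/rights-unsecured-creditor-court-appointed-bankruptcy-company/"))
# → True
```

Note that Disallow values are prefix matches, so /ask-a-trustee/?show blocks any URL beginning with that string (Google additionally supports * and $ wildcards). The clean /ask-a-trustee/6300/… URL does not start with that prefix, so it remains allowed.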

Thanks in advance!

Friend, why are you not using a 301 redirect?
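A 301 would consolidate the duplicates rather than just hide them from crawlers. As a rough sketch, assuming an Apache server with mod_rewrite and assuming that an ID-only URL like /ask-a-trustee/6301/ resolves on this site (the real canonical includes the slug, so the application may need to finish the redirect), an .htaccess rule could look like:

```
# Hypothetical sketch: 301 the ?show= duplicate to the ID-based URL.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^show=([0-9]+)$
# Trailing "?" drops the query string from the redirect target.
RewriteRule ^ask-a-trustee/$ /ask-a-trustee/%1/? [R=301,L]
```

Unlike a robots.txt block, a 301 passes link signals to the canonical URL; a rel="canonical" tag on the duplicates would be another option if redirecting is not feasible.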