Blocking auto generated pages with robots.txt

Hello,

I am having problems blocking auto-generated pages that create duplicate content on my site. I have a question-and-answer feature that generates a new page for each question and answer. However, it also creates several additional copies of each page.

For example, the following URLs are both created for the same page:
www.mysite.com/ask-a-trustee/6300/rights-unsecured-creditor-court-appointed-bankruptcy-company/
www.mysite.com/ask-a-trustee/6300/rights-unsecured-creditor-court-appointed-bankruptcy-company?show=6301

I would like to make sure I block this properly with robots.txt. Of course, I want
www.mysite.com/ask-a-trustee/6300/rights-unsecured-creditor-court-appointed-bankruptcy-company/
to remain allowed (not disallowed) and to BLOCK
www.mysite.com/ask-a-trustee/6300/rights-unsecured-creditor-court-appointed-bankruptcy-company?show=6301

If I add Disallow: /ask-a-trustee/?show to my robots.txt file, will that properly block the extra page with ?show=6301, while still allowing /ask-a-trustee/6300/rights-unsecured-creditor-court-appointed-bankruptcy-company/ to be crawled by Google and other search engines?
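For reference, robots.txt matching is prefix-based, so Disallow: /ask-a-trustee/?show only matches URLs whose path literally begins with /ask-a-trustee/?show, which would not catch the ?show=6301 URLs above. A sketch of the wildcard form, assuming the crawler supports the * wildcard (Google, Bing, and other major engines do):

```
User-agent: *
# Block any URL under /ask-a-trustee/ whose query string contains show=
Disallow: /ask-a-trustee/*?show=
# The clean question pages are not matched and remain crawlable by default
```

This is only a sketch; it is worth verifying the rule against your actual URLs with a robots.txt testing tool before relying on it.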

Thanks in advance!

Friend, why are you not using a 301 redirect?
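To illustrate the 301 suggestion: a minimal, hypothetical .htaccess sketch, assuming the site runs on Apache with mod_rewrite enabled (neither is confirmed in the thread), that would redirect the ?show= variants to the clean URL instead of just blocking them:

```
RewriteEngine On
# If the query string contains show=..., 301-redirect to the same
# path with the query string stripped (the trailing "?" drops it)
RewriteCond %{QUERY_STRING} (^|&)show=
RewriteRule ^(ask-a-trustee/.*)$ /$1? [R=301,L]
```

Unlike a robots.txt block, a 301 also consolidates any link signals from the duplicate URLs onto the clean page.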