Disable subdomain indexing via robots.txt on a virtual host

I am adding a subdomain to my website, for example:

example.com - test.example.com

but I am using a single robots.txt file, because it is only a virtual subdomain.

So how, in one robots.txt file, do I allow indexing of example.com but disallow test.example.com at the same time?

Because when I search for “contact us”, the results include both example.com/contact-us and test.example.com/contact-us.

I want only the example.com results to appear; test.example.com should not be indexed, so that it stops showing up in search results.

I have already removed the links in Google Search Console, but I still have to edit robots.txt.

Google recommends using either a “noindex” meta tag or a password-protected directory to keep pages out of the index; a robots.txt Disallow only blocks crawling, not indexing.

https://support.google.com/webmasters/answer/6062608?hl=en
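
A noindex can be delivered either as a meta tag in the HTML or as an HTTP response header. As a minimal sketch, assuming Apache with mod_setenvif and mod_headers enabled (the variable name IS_TEST_HOST is just a label I chose), the header can be sent only for the test subdomain:

# flag requests whose Host header is the test subdomain (assumes mod_setenvif)
SetEnvIfNoCase Host ^test\.example\.com$ IS_TEST_HOST
# send the noindex header only for flagged requests (assumes mod_headers)
Header set X-Robots-Tag "noindex, nofollow" env=IS_TEST_HOST

Note that for a noindex (tag or header) to take effect, Google must be able to crawl the page, so this cannot be combined with a blanket Disallow for the same host.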

Is this a good solution?

I added these lines to my .htaccess (note that in a per-directory .htaccess the RewriteRule pattern is matched without the leading slash, so /robots.txt would never match):
RewriteEngine On
RewriteCond %{HTTP_HOST} ^test\.example\.com$ [NC]
RewriteRule ^robots\.txt$ /subdomainRobots.txt [L]
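
As a quick check (assuming curl is available), the two hosts should now return different files:

curl -s http://example.com/robots.txt
curl -s http://test.example.com/robots.txt

The first should return the normal robots.txt and the second the contents of subdomainRobots.txt.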

and in subdomainRobots.txt I disallowed everything.
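
That is, presumably the standard blanket disallow:

User-agent: *
Disallow: /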

Both the main domain and the subdomain point to the same files and the same robots.txt.

The subdomain is only a different URL name; the website behind it is the same.

So I cannot add meta tags, because they would affect both URLs, the original and the subdomain.

Can you please check my other comment as a solution? Is it helpful?

If the same content is accessible at two different URLs, then the right approach is to use canonical URLs.

https://support.google.com/webmasters/answer/139066?hl=en
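
As a sketch (the contact page path is just the example from above): each page can carry a canonical link pointing at its main-domain URL, and because the href is absolute, the identical HTML is safe to serve on both hosts:

<link rel="canonical" href="https://example.com/contact-us">

Unlike a noindex tag, this does not hurt the main domain; Google should consolidate the test.example.com URLs onto example.com.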