I want to block the My Account page, Basket (cart) page, Checkout page, etc. in robots.txt so they are not crawled by search engines.
We have 20 languages and slugs are translated.
My question is: is it true that wildcards can be used even when slugs are translated? According to ChatGPT, using wildcards is possible and better (safer), but I cannot find this specifically described in your documentation.
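For context, this is the kind of wildcard rule set I have in mind. The slugs and the language-prefix URL structure here are just examples of my setup, not something from your documentation:

```
User-agent: *
# The * wildcard matches any language prefix (/en/, /fr/, /de/, ...),
# but only if the slug segment itself is the same in every language:
Disallow: /*/cart/
Disallow: /*/checkout/
Disallow: /*/my-account/
# If a slug is translated (e.g. /fr/panier/ instead of /fr/cart/),
# would each translated slug need its own Disallow line like this?
Disallow: /*/panier/
```

My uncertainty is exactly this last point: whether a wildcard can cover translated slug words, or only the variable parts around an untranslated slug.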