You can add pages to this file to have them explicitly excluded from crawling. A robots.txt file uses a standard called the Robots Exclusion Protocol, which tells search engine crawlers which parts of a site they should not visit. This site can easily generate the file for you from a list of pages to exclude. For example, if you wanted a specific page kept out of crawler results, you would list it under a Disallow directive. https://rankingseotoolz.com/robots-txt-generator
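As a minimal sketch of what such a generated file might look like (the path `/private-page/` is a hypothetical example, not one from this site):

```
# Example robots.txt using the Robots Exclusion Protocol.
# "User-agent: *" applies the rule to all crawlers;
# "Disallow" lists the path to exclude (hypothetical path shown).
User-agent: *
Disallow: /private-page/
```

The file is placed at the root of the site (e.g. `/robots.txt`), and compliant crawlers read it before fetching any other page.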