robots.txt file: order matters when you disallow all bots except some
If you want to exclude all bots from some pages, yet allow specific bots to visit even those pages, you need to be careful about the order of the directives in your robots.txt file.
For example, a robots.txt file containing these lines:
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /feed/
would fail to let Mediapartners-Google index even the /feed/ pages. To give Mediapartners-Google full access, you need to reverse the order above:
User-agent: *
Disallow: /feed/

User-agent: Mediapartners-Google
Disallow:
You should always put general directives at the top of the robots.txt file, then provide directives for specific bots below them.
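One way to sanity-check a robots.txt file before deploying it is Python's standard-library parser. The sketch below, using a hypothetical example.com URL, feeds the recommended ordering (general group first, specific bot last) to urllib.robotparser and asks which agents may fetch a /feed/ page; note that the empty Disallow line means "allow everything" for that group.

```python
from urllib import robotparser

# The recommended ordering: general directives first, then the
# specific group for Mediapartners-Google with an empty Disallow.
ROBOTS_TXT = """\
User-agent: *
Disallow: /feed/

User-agent: Mediapartners-Google
Disallow:
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())
rp.modified()  # mark the rules as freshly loaded so can_fetch() answers

url = "https://example.com/feed/latest"  # hypothetical feed URL
print(rp.can_fetch("Mediapartners-Google", url))  # True: empty Disallow allows all
print(rp.can_fetch("SomeOtherBot", url))          # False: falls back to the * group
```

Keep in mind this only tells you how Python's parser resolves the rules; individual crawlers may apply their own matching logic, which is exactly why a conservative ordering is worth keeping.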