# Production robots.txt file

User-agent: YandexBot
Crawl-delay: 10

# Baidu doesn't support Crawl-delay, but it's included anyway in case they ever do
User-agent: Baiduspider
Crawl-delay: 10

User-agent: Bingbot
Crawl-delay: 1

# www.apptus.com's crawler generates many exceptions
User-agent: apptus_tce
Disallow: /

# www.apptus.com's crawler generates many exceptions
User-agent: apptus_tce/1.2
Disallow: /

# MSRBOT causes some exceptions (it's a Microsoft Research test bot)
User-agent: MSRBOT
Disallow: /

# Twiceler causes exceptions
User-agent: twiceler
Disallow: /

# This crawler is generating exceptions
User-agent: VSynCrawler/1.0
Disallow: /

# This crawler comes with a bogus sc1 cookie
User-agent: iCrossing - Apollo - icrossing.com
Disallow: /

# Sitemap
Sitemap: https://www.shopstyle.ca/sitemap-index.https.xml

### CAREFUL with the whitespace - we can't have blank lines within this one big block for UA=*
User-agent: *
Disallow: /shop-instagram*
Disallow: /util/request-headers
Disallow: /*wenhao*
Disallow: /trend/
Disallow: /action/
Disallow: /search
Disallow: /*crawl=no*
Disallow: /users
Disallow: /lists
Disallow: /layouts
Disallow: /track
Disallow: /collective/
Disallow: /featured-looks
Disallow: /p/
Disallow: /l/