# Policy for running scrapers/crawlers:
#
# 1) Send a meaningful User-Agent header, which should include:
#    - the substring "bot"
#    - a name for your program
#    - your company name or a contact email
# 2) Handle errors (such as HTTP 429 or 5xx):
#    - use exponential backoff
#    - respect the Retry-After response header if present
#
# Failure to comply may result in your traffic being blocked.

User-agent: *
Disallow: /*/?*filter_*
Disallow: /*/cart/partial/
Disallow: /*/cart/products/
Disallow: /*/cart/toggle/
Disallow: /*/delivery/ajax/delivery-availability/
Disallow: /*/package/
Disallow: /*/recipes/*/print/
Disallow: /*/shopping-lists/ajax/0/products/
Disallow: /*/shopping-lists/ajax/new/
Disallow: /*/support/product-suggestion/
Disallow: /*/user/login/?
Disallow: /*/user/verify/?

# Block misbehaving bots
User-agent: MJ12bot
Disallow: /

Sitemap: https://www.mathem.se/sitemap.xml
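
# The policy above asks crawlers to identify themselves and to back off on
# 429/5xx responses. The commented Python sketch below illustrates one way a
# client could comply; it is not part of this site's requirements, and the
# "examplebot" name, contact address, and retry limits are hypothetical.
#
#   import time
#   import requests
#
#   # Hypothetical identifying User-Agent: contains "bot", a program name,
#   # and a contact address, as requested by the policy above.
#   HEADERS = {"User-Agent": "examplebot/1.0 (crawler@example.com)"}
#
#   def polite_get(url, max_retries=5):
#       """Fetch url, retrying on 429/5xx with exponential backoff and
#       honouring a numeric Retry-After header when the server sends one."""
#       delay = 1.0
#       for attempt in range(max_retries):
#           response = requests.get(url, headers=HEADERS, timeout=30)
#           if response.status_code == 429 or response.status_code >= 500:
#               retry_after = response.headers.get("Retry-After")
#               # Retry-After may also be an HTTP-date; this sketch only
#               # handles the simple seconds form.
#               if retry_after is not None and retry_after.isdigit():
#                   wait = float(retry_after)
#               else:
#                   wait = delay
#               time.sleep(wait)
#               delay *= 2  # exponential backoff between attempts
#               continue
#           return response
#       raise RuntimeError(f"giving up on {url} after {max_retries} attempts")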