# See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
#
# To ban all spiders from the entire site uncomment the next two lines:
# User-agent: *
# Disallow: /
# The following rule prevents search engines from crawling presentations on www.beautiful.ai.
# This is important because Google is unable to render the actual content of a presentation; all it sees is a blank page with a different title.
# Also, private presentations could contain sensitive data, so we want to make sure they are not crawled or indexed.
# This should also push Google and other search engines to crawl other, more valuable pages on Beautiful.ai more frequently.
User-agent: *
Disallow: /player
# It seems that Google has indexed some API routes, such as /api/user/permissions/:id.
# I'm not sure how this makes sense, but we can safely disallow /api as well.
Disallow: /api