The robots.txt file controls which pages of your application get crawled by search engine bots. In a robots.txt file
we can allow or disallow single pages or directories within the application.
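For example, a minimal robots.txt that blocks one directory and one page while allowing everything else could look like this (the paths are placeholders, not part of any real app):

```
User-agent: *
# block a whole directory
Disallow: /admin/
# block a single page
Disallow: /secret.html
Allow: /
```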
The robots.txt file lives at the root of your application. A simple, static solution in Rails is to put a robots.txt
file into the /public folder of your Rails app, but then you can't set the content of this file dynamically.
If you want a different file for your staging and production servers, or want some dynamic rules in your robots.txt,
then you need to generate this file with Rails. Make sure you remove or rename robots.txt in the /public folder first,
because files in /public are served directly and would shadow your dynamic version.
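The core of the dynamic approach is rendering an ERB template whose output depends on the environment. Here is a minimal, self-contained sketch of that idea; the `env` parameter is a stand-in for `Rails.env`, and in a real Rails app you would put the template in a view and render it from a controller action wired to `get "/robots.txt"`:

```ruby
require "erb"

# Hypothetical template: in production we only block /admin,
# everywhere else (staging, development) we block the whole site.
TEMPLATE = <<~ERB
  User-agent: *
  <% if env == "production" %>
  Disallow: /admin
  <% else %>
  Disallow: /
  <% end %>
ERB

# Render the robots.txt body for a given environment name.
# `env` stands in for Rails.env in this sketch.
def robots_txt(env)
  # trim_mode "<>" drops the newlines left behind by the <% %> tag lines
  ERB.new(TEMPLATE, trim_mode: "<>").result(binding)
end

puts robots_txt("staging")
```

In a Rails app you would return this from the controller with `render layout: false, content_type: "text/plain"` so crawlers receive a plain-text response.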