When Staging is enabled, robots.txt cannot be configured directly on the live site, because site pages cannot be configured there independently. However, the robots.txt configuration is not included in staging publishes either. The correct behavior would be for robots.txt to be published along with the pages.
Steps to reproduce:
1. Configure two Liferay DXP bundles for Remote Live staging; both bundles can run on the same machine.
2. For the default Site, go to Public Pages / Configure / Advanced.
3. In the Robots section, add:
4. Publish to the remote live site.
5. View the robots.txt file by appending /robots.txt to the site URL, e.g. http://localhost:9080/robots.txt
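For reference, the Robots field in step 3 accepts standard robots.txt directives. The actual content used in this report is not specified; a minimal hypothetical example would be:

```
# Hypothetical directives entered in the Robots field (step 3)
User-agent: *
Disallow: /private
```

After publishing, the same content should be served from the live site at /robots.txt.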
Expected result: The robots.txt content configured on the Staging site is transferred to the Live site.
Actual result: The robots.txt content is not transferred to the Live site.