A robots.txt file is a plain text file that consists of one or more rules that block or allow crawler access to parts of a site. It follows the Robots Exclusion Standard.
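As an illustration, a minimal robots.txt might look like the sketch below. The user agent and path are placeholders, not values required by Paligo:

    # Allow all crawlers, but block a hypothetical /internal/ section
    User-agent: *
    Disallow: /internal/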
Webmasters can create a robots.txt file and upload it to a Paligo HTML5 layout. When the layout is used for publishing, Paligo will include the robots.txt file in the output. The entire output can then be placed on a web server, so there is no need to add the file manually after publishing.
To upload a robots.txt file:
- Create a robots.txt file in a third-party application. You must name the file robots.txt.
- Select Layout in the top menu.
- Select the Layout you want to update or Create a Layout.
  Tip: You can copy the URL of the Layout Editor and paste it into a new tab in your browser. This can be useful if you frequently switch between your Paligo content and the Layout settings.
- Select CSS, JS, logos and other assets in the sidebar.
- Select Upload in the Add a robots.txt file section.
- Drag and drop your robots.txt file onto the upload dialog.
- Select Save.
When you use this layout to publish content, Paligo includes the robots.txt file in the output .zip file.
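If you want to confirm that the file made it into the published output, listing the contents of the .zip is enough. A minimal sketch in Python, assuming the archive is named output.zip (replace with the name of your published archive):

    import zipfile

    # Open the published archive and check whether robots.txt is present.
    with zipfile.ZipFile("output.zip") as archive:
        names = archive.namelist()
        has_robots = any(name.endswith("robots.txt") for name in names)
        print("robots.txt included:", has_robots)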