Unexpected robots.txt Content
The robots.txt file is used to inform search engines about which URLs they should attempt to index, and which they should ignore. It can be used to prevent specific paths or domains from being added to search engine indexes and showing up in public search results.
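For example, a minimal robots.txt that tells all crawlers to skip a hypothetical /private/ path, while leaving the rest of the site crawlable, looks like this:

```
User-agent: *
Disallow: /private/
```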
When a robots.txt file is accessed using a vanity servd.dev domain, we inject a special response which prevents search engines from performing any crawling. This is to prevent the vanity domain from being indexed and negatively impacting the SEO standing of your 'real' domains by being detected as duplicate content.
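The injected response is functionally equivalent to a blanket disallow rule, which in robots.txt syntax is:

```
User-agent: *
Disallow: /
```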
When your project is accessed using any domain other than those ending in servd.dev, the robots.txt file is processed as normal: either served as a static file from within your repo, or delegated to Craft and handled by a plugin such as SEOMatic.
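If you're unsure which version of the file a given domain is serving, you can fetch it directly and compare the responses (the domains below are placeholders for your own):

```
# Vanity domain: returns the injected crawl-blocking response
curl https://my-project.servd.dev/robots.txt

# Custom domain: returns your static file or Craft/SEOMatic output
curl https://www.example.com/robots.txt
```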