Editing Robots.txt

What is robots.txt?

Robots.txt is a file on your website that tells search engine crawlers (like Googlebot) which pages they’re allowed to access and which they should skip. Crawlers look for it at yoursite.com/robots.txt.

Vacation Labs automatically generates a default robots.txt for your website. You can add your own custom rules on top of this default.

Default robots.txt rules

Every Vacation Labs website includes these default rules, which cannot be removed:

User-agent: *
Disallow: /itineraries/bookings
Disallow: /search
Disallow: /*/dates_and_rates
Disallow: /*/date_based_upcoming_departures

Sitemap: https://www.yoursite.com/sitemap_index.xml

Here’s what each rule does:

  • Disallow: /itineraries/bookings — Blocks crawling of booking pages, which are transactional and shouldn’t appear in search results.
  • Disallow: /search — Blocks crawling of internal search result pages, which would otherwise fill the index with duplicate content that dilutes SEO value.
  • Disallow: /*/dates_and_rates — Blocks crawling of pricing pages, which change frequently and aren’t useful in search results. The * wildcard matches any path segment.
  • Disallow: /*/date_based_upcoming_departures — Blocks crawling of departure listing pages, which quickly become outdated.
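You can exercise these defaults locally with Python’s standard-library urllib.robotparser (a sketch; yoursite.com is the placeholder domain used above, and the example paths are illustrative). Note that this parser matches paths by literal prefix and does not interpret the * wildcard, so only the non-wildcard rules are checked here:

```python
from urllib.robotparser import RobotFileParser

# The default Vacation Labs rules (yoursite.com is a placeholder domain)
DEFAULT_RULES = """\
User-agent: *
Disallow: /itineraries/bookings
Disallow: /search
Disallow: /*/dates_and_rates
Disallow: /*/date_based_upcoming_departures
"""

parser = RobotFileParser()
parser.parse(DEFAULT_RULES.splitlines())

# Transactional and search pages are blocked for every crawler...
print(parser.can_fetch("Googlebot", "https://www.yoursite.com/search"))                # False
print(parser.can_fetch("Googlebot", "https://www.yoursite.com/itineraries/bookings"))  # False
# ...while ordinary content pages remain crawlable
print(parser.can_fetch("Googlebot", "https://www.yoursite.com/tours/goa-kayaking"))    # True
```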

How to add custom rules

  1. Go to SEO Center > Settings.
  2. Scroll to the Robots.txt section. You’ll see a text editor showing your current custom directives. (Screenshot: the robots.txt editor on the SEO Center Settings page, with the textarea highlighted.)
  3. Add your custom directives in the text editor.
  4. Click Save.
  5. Verify your changes by visiting yoursite.com/robots.txt in your browser.

Your custom directives are appended below the platform’s default rules. The defaults cannot be removed — this ensures important pages like booking forms and search results always remain blocked from crawlers.

Example custom rules

Block a specific crawler:

User-agent: AhrefsBot
Disallow: /

Block a specific page from all crawlers:

User-agent: *
Disallow: /pricing

Allow only specific search engines (the final catch-all group blocks every other crawler):

User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /

User-agent: *
Disallow: /
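To confirm which crawlers a rule set admits before saving it, you can test it locally; a minimal sketch with Python’s standard-library urllib.robotparser (bot names and the URL are illustrative):

```python
from urllib.robotparser import RobotFileParser

# Admit named crawlers; the catch-all group blocks everyone else
RULES = """\
User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(RULES.splitlines())

url = "https://www.yoursite.com/tours"
print(parser.can_fetch("Googlebot", url))   # True  (explicitly allowed)
print(parser.can_fetch("Bingbot", url))     # True  (explicitly allowed)
print(parser.can_fetch("AhrefsBot", url))   # False (caught by the catch-all group)
```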

Avoid blocking critical pages such as tour listings, collection pages, or other key content; blocking them reduces your visibility in search results and cuts organic traffic.