Web Server Robots.txt Information Disclosure

Introduction

The remote server contains a file named 'robots.txt' that is intended to prevent web 'robots' from visiting certain directories in a website for maintenance or indexing purposes. A malicious user may also be able to use the contents of this file to learn of sensitive documents or directories on the affected site and either retrieve them directly or target them for other attacks.
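
For example, a robots.txt file like the following (the paths are hypothetical and shown only for illustration) tells well-behaved crawlers to stay away from certain directories, but it also hands an attacker a convenient list of locations worth probing:

User-agent: *
Disallow: /admin/
Disallow: /backup/
Disallow: /internal-docs/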

How to Test

Check whether the file is publicly accessible by appending /robots.txt to the site's base URL (for example, https://example.com/robots.txt) and observing the response. If the file is returned, review its Disallow entries for paths that reveal sensitive documents or directories.
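
A minimal PHP sketch of this check is shown below; the target URL is a placeholder, and the script simply requests robots.txt and prints any Disallow entries it exposes:

<?php
// Sketch: fetch a site's robots.txt and list the paths it tries to hide.
// The URL below is a placeholder for the site being tested.
$url = 'https://example.com/robots.txt';

$context = stream_context_create(['http' => ['ignore_errors' => true, 'timeout' => 10]]);
$body = @file_get_contents($url, false, $context);

if ($body === false) {
    exit("robots.txt could not be retrieved\n");
}

// file_get_contents() populates $http_response_header for HTTP URLs.
$status = $http_response_header[0] ?? '';
if (strpos($status, '200') === false) {
    exit("robots.txt is not publicly available ($status)\n");
}

echo "robots.txt is publicly available. Disallowed paths:\n";
foreach (preg_split('/\r?\n/', $body) as $line) {
    if (stripos(trim($line), 'Disallow:') === 0) {
        echo trim($line) . "\n";   // each entry is a potential target for further testing
    }
}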

How to Fix

Review the contents of the site's robots.txt file and remove entries that point to sensitive documents or directories. If the goal is only to keep pages out of search engine indexes, use Robots META tags (or the equivalent X-Robots-Tag response header) instead of robots.txt entries, and/or adjust the web server's access controls so that the sensitive material cannot be retrieved by unauthenticated users (see the access-control sketch after the PHP example below).

HTML

Place the following tag in the <head> of pages that should not be indexed or followed:

<meta name="robots" content="noindex, nofollow">

PHP

Send the equivalent directive as an HTTP response header; header() must be called before any output is sent:

header('X-Robots-Tag: noindex, nofollow');
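
The remaining part of the fix, restricting access to the sensitive material itself, depends on the application. The sketch below is one possible approach, assuming a session flag named $_SESSION['authenticated'] (a hypothetical name standing in for the application's own login check); it refuses to serve the page to anyone who is not logged in, so listing the path in robots.txt no longer exposes anything reachable:

<?php
// Sketch: require an authenticated session before serving sensitive content.
// The session flag name is a hypothetical placeholder for the application's own check.
session_start();

if (empty($_SESSION['authenticated'])) {
    http_response_code(403);
    exit('Access denied');
}

// ... render the sensitive page for authenticated users only ...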

References

  1. https://www.searchenginejournal.com/best-practices-setting-meta-robots-tags-robots-txt/188655/#close
  2. https://searchengineland.com/google-to-stop-supporting-noindex-directive-in-robots-txt-319003