Robots.txt

Parser

synopsis: Specialized ptp.libptp.parser.AbstractParser classes for robots.txt files.
class ptp.tools.robots.parser.RobotsParser(pathname, filename='*robots.txt', light=False, first=True)[source]

Specialized robots.txt parser.

__tool__ = 'robots'
__format__ = 'txt'
classmethod is_mine(pathname, filename='*robots.txt', light=False, first=True)[source]

Check whether this parser can handle the report file.

Parameters:
  • pathname (str) – Path to the report directory.
  • filename (str) – Regex matching the report file.
  • light (bool) – True to only parse the ranking of the findings from the report.
  • first (bool) – Only process first file (True) or each file that matched (False).
Raises:
  • IOError – when the report file cannot be found.
  • OSError – when the report file cannot be found.
Returns: True if it supports the report, False otherwise.
Return type: bool
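The pathname/filename matching described above can be sketched with a small self-contained helper (a hypothetical function for illustration only; the name find_report_files and its exact semantics are assumptions, not part of ptp):

```python
import os
import re

def find_report_files(pathname, filename=r'.*robots\.txt', first=True):
    """Hypothetical sketch: list files in `pathname` whose names match the
    `filename` regex, mirroring the lookup behaviour documented above."""
    matches = []
    for name in sorted(os.listdir(pathname)):
        if re.match(filename, name):
            matches.append(os.path.join(pathname, name))
            if first:  # only keep the first matching report file
                break
    if not matches:
        # Mirrors the documented behaviour: a missing report raises IOError/OSError.
        raise IOError('no report file matching %r in %r' % (filename, pathname))
    return matches
```

With first=False every matching file in the directory is returned instead of only the first one.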

parse_metadata()[source]

Parse the metadata of the report.

Returns: The metadata of the report.
Return type: dict
parse_report()[source]

Parse the results of a robots.txt report.

Returns: List of dicts, each representing a vulnerability.
Return type: list
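The shape of the returned list can be sketched as follows. This is a hypothetical, self-contained re-implementation for illustration: the dict keys ('name', 'ranking') are assumptions, and the inline table copies the values documented in the Signatures section.

```python
# Ranking table copied from ptp.tools.robots.signatures.SIGNATURES.
SIGNATURES = {'/phpmyadmin': 1, '/admin': 1, '/backend': 1, '/private': 1,
              '/secret': 1, '/login': 1, '/logon': 1}

def parse_report(content):
    """Hypothetical sketch: turn robots.txt text into a list of dicts,
    one per Disallow entry that matches a known signature."""
    vulns = []
    for line in content.splitlines():
        line = line.strip()
        if line.lower().startswith('disallow:'):
            path = line.split(':', 1)[1].strip()
            if path in SIGNATURES:
                vulns.append({'name': path, 'ranking': SIGNATURES[path]})
    return vulns

report = "User-agent: *\nDisallow: /admin\nDisallow: /css\nDisallow: /secret\n"
print(parse_report(report))
```

Entries such as /css that have no signature are silently skipped; only ranked paths become vulnerability dicts.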

Signatures

synopsis: Robots.txt files may contain interesting Disallow entries. This module defines a ranking for them.
ptp.tools.robots.signatures.SIGNATURES = {'/phpmyadmin': 1, '/admin': 1, '/backend': 1, '/private': 1, '/secret': 1, '/login': 1, '/logon': 1}
Data: dict mapping each Disallow path to its rank.
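Ranking a candidate Disallow path then reduces to a dictionary lookup. A minimal sketch (the paths list and the filtering step are illustrative, not library code):

```python
# Ranking table as documented for ptp.tools.robots.signatures.SIGNATURES.
SIGNATURES = {'/phpmyadmin': 1, '/admin': 1, '/backend': 1, '/private': 1,
              '/secret': 1, '/login': 1, '/logon': 1}

# Hypothetical usage: keep only the Disallow paths that appear in the
# signature table, together with their rank.
paths = ['/admin', '/css', '/logon', '/img']
flagged = [(p, SIGNATURES[p]) for p in paths if p in SIGNATURES]
print(flagged)
```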