
Package: parsero (0.0+git20140929.e5b585a-4)

Audit tool for robots.txt of a site

Parsero is a free script written in Python which reads the robots.txt file of a web server over the network and examines the Disallow entries. Disallow entries tell search engines which directories or files hosted on a web server must not be indexed. For example, "Disallow: /portal/login" means that the content at www.example.com/portal/login is not allowed to be indexed by crawlers such as Google, Bing, or Yahoo. This is how an administrator can avoid sharing sensitive or private information with the search engines.

Parsero is useful for pentesters, ethical hackers and forensics experts. It can also be used for security testing.
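The core idea behind such an audit is simple: fetch robots.txt and collect the paths named in Disallow directives, since those often point at content the administrator considers sensitive. A minimal sketch of that parsing step (this is an illustration, not Parsero's actual code):

```python
# Sketch of the audit idea: extract the paths listed in Disallow
# directives from a robots.txt body. Hypothetical helper, not Parsero's
# real implementation.
def disallow_entries(robots_txt: str) -> list[str]:
    """Return the paths named in Disallow directives."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop trailing comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:  # an empty "Disallow:" permits everything
                paths.append(path)
    return paths

example = """User-agent: *
Disallow: /portal/login
Disallow: /admin  # private area
"""
print(disallow_entries(example))  # ['/portal/login', '/admin']
```

An audit tool would then request each of these paths to see whether they are reachable, which is where the interesting findings usually are.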

Download parsero

Download for all available architectures
Architecture  Package Size  Installed Size
all           11.5 kB       41.0 kB