How to Install and Uninstall parsero Package on Kali Linux
Last updated: December 18, 2024
1. Install "parsero" package
This guide shows you how to install parsero on Kali Linux:
$ sudo apt update
$ sudo apt install parsero
2. Uninstall "parsero" package
Please follow the instructions below to uninstall parsero on Kali Linux:
$ sudo apt remove parsero
$ sudo apt autoclean && sudo apt autoremove
3. Information about the parsero package on Kali Linux
Package: parsero
Version: 0.81~git20140929-0kali1
Architecture: all
Maintainer: Kali Developers
Installed-Size: 20
Depends: python3, python3-urllib3, python3-bs4
Homepage: https://github.com/behindthefirewalls/Parsero
Priority: optional
Section: utils
Filename: pool/main/p/parsero/parsero_0.81~git20140929-0kali1_all.deb
Size: 7080
SHA256: 0f04c4ccf62b7efca0d7a641556b5ca5695d6de4a636a1e7fbee3a9d086006f5
SHA1: ff3360c4add37a53d87714403262c2d48bfed669
MD5sum: 719186d8ff8a2b4c5282c4418ff403ca
Description: Robots.txt audit tool
Parsero is a free script written in Python which reads the robots.txt file of
a web server and inspects the Disallow entries. Disallow entries tell search
engines which directories or files hosted on a web server must not be
indexed. For example, "Disallow: /portal/login" means that the content at
www.example.com/portal/login is not to be indexed by crawlers such as
Google, Bing, or Yahoo. This is how an administrator keeps sensitive or
private information out of the search engines.
Description-md5:
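The description above can be illustrated with a short sketch of the first step parsero automates: pulling the Disallow entries out of a robots.txt file. This is a minimal, illustrative snippet, not parsero's actual code or API; the `parse_disallow` helper and the sample robots.txt are made up for the example.

```python
def parse_disallow(robots_txt: str) -> list[str]:
    """Return the paths named in Disallow directives of a robots.txt body."""
    paths = []
    for line in robots_txt.splitlines():
        # robots.txt comments start with '#'; strip them and surrounding whitespace
        line = line.split("#", 1)[0].strip()
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:  # an empty Disallow disallows nothing
                paths.append(path)
    return paths

# Hypothetical robots.txt content for demonstration
sample = """\
User-agent: *
Disallow: /portal/login
Disallow: /admin/   # internal area
Disallow:
"""

print(parse_disallow(sample))  # → ['/portal/login', '/admin/']
```

parsero goes further than this sketch: it also requests each disallowed path and reports which ones the server actually serves, since those are the entries most likely to leak content the administrator meant to hide.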