How to Install and Uninstall libstring-tokenizer-perl Package on Ubuntu 21.10 (Impish Indri)

Last updated: May 05, 2024

1. Install "libstring-tokenizer-perl" package

Here is a brief guide showing how to install the libstring-tokenizer-perl package on Ubuntu 21.10 (Impish Indri):

$ sudo apt update
$ sudo apt install libstring-tokenizer-perl
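
To confirm the installation succeeded, you can query dpkg for the package's status; the Status line should read "install ok installed":

$ dpkg -s libstring-tokenizer-perl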

2. Uninstall "libstring-tokenizer-perl" package

This is a short guide on how to uninstall libstring-tokenizer-perl on Ubuntu 21.10 (Impish Indri):

$ sudo apt remove libstring-tokenizer-perl
$ sudo apt autoclean && sudo apt autoremove
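
If you also want to delete the package's configuration files along with the package itself, use purge instead of remove:

$ sudo apt purge libstring-tokenizer-perl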

3. Information about the libstring-tokenizer-perl package on Ubuntu 21.10 (Impish Indri)

Package: libstring-tokenizer-perl
Architecture: all
Version: 0.06-1
Priority: optional
Section: universe/perl
Origin: Ubuntu
Maintainer: Ubuntu Developers
Original-Maintainer: Debian Perl Group
Bugs: https://bugs.launchpad.net/ubuntu/+filebug
Installed-Size: 36
Depends: perl
Filename: pool/universe/libs/libstring-tokenizer-perl/libstring-tokenizer-perl_0.06-1_all.deb
Size: 13434
MD5sum: 85395379303d96fd1ddb01efcbc566bd
SHA1: 3a830ec641b098267236fba830bf4687ff91f666
SHA256: 17637155a2f032d819f4cf26a26569361fad1ef686aac1bbcf51a53540fcc40c
SHA512: 9ca0095033e0356f7a5ecded3b48d193d9a924d9e177421eb01db9ea58e378d6929c1ee0699d118359ac0aef4ee9afe23388258febac3d5626ce42bc7c934db0
Homepage: https://metacpan.org/release/String-Tokenizer
Description-en: simple string tokenizer
String::Tokenizer is a simple string tokenizer which takes a string and splits
it on whitespace. It also optionally takes a string of characters to use as
delimiters, and returns them with the token set as well. This allows for
splitting the string in many different ways.
.
This is a very basic tokenizer, so more complex needs should be addressed
either with a custom-written tokenizer or by post-processing the output
generated by this module. Basically, this will not fill everyone's needs, but
it spans a gap between a simple split / /, $string and the other options that
involve much larger and more complex modules.
.
Also note that this is not a lexical analyzer. Many people confuse
tokenization with lexical analysis. A tokenizer merely splits its input into
specific chunks; a lexical analyzer classifies those chunks. Sometimes these
two steps are combined, but not here.
Description-md5: f418e575fe22d78a7cce08a624b4abb0
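
As a quick illustration of the module in use, here is a minimal sketch based on the constructor and getTokens() interface documented on the module's CPAN page (linked above); the sample string and the exact output shown in the comment are assumptions for demonstration:

#!/usr/bin/perl
use strict;
use warnings;
use String::Tokenizer;

# Tokenize an arithmetic expression. The second argument is a string of
# characters to treat as delimiters; they are returned as tokens too.
my $tokenizer = String::Tokenizer->new("(5 + (100 * 20))", '+*()');

# getTokens() returns the token list; whitespace is discarded by default.
my @tokens = $tokenizer->getTokens();

print join("|", @tokens), "\n";   # expected: (|5|+|(|100|*|20|)|)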