How to Install and Uninstall ensmallen-devel.x86_64 Package on CentOS Stream 8
Last updated: November 01, 2024
1. Install "ensmallen-devel.x86_64" package
In this section, we explain the steps required to install the ensmallen-devel.x86_64 package on CentOS Stream 8:
$ sudo dnf update
$ sudo dnf install ensmallen-devel.x86_64
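Note that the package is published in the EPEL repository (see the package information in section 3), so on a minimal system you may first need to run `sudo dnf install epel-release`. Once `dnf install` finishes, a quick way to confirm the installation is to check that the main header exists, since a header-only -devel package simply places files under /usr/include. A minimal sketch, assuming the EPEL package installs its main header as /usr/include/ensmallen.hpp:

```shell
# Sanity check after installation: a header-only -devel package just
# drops headers under /usr/include, so the main header existing is a
# good signal that the install succeeded.
# Assumption: /usr/include/ensmallen.hpp is the path used by the EPEL rpm.
check_header() {
  if [ -f "$1" ]; then
    echo "installed"
  else
    echo "missing"
  fi
}

check_header /usr/include/ensmallen.hpp
```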
2. Uninstall "ensmallen-devel.x86_64" package
This is a short guide on how to uninstall ensmallen-devel.x86_64 on CentOS Stream 8:
$ sudo dnf remove ensmallen-devel.x86_64
$ sudo dnf autoremove
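To confirm the removal worked, you can query the RPM database: `rpm -q` exits with status 0 when a package is installed and non-zero when it is not. A small sketch of such a post-removal check (the wrapper function is only for illustration):

```shell
# rpm -q exits 0 if the package is installed and non-zero otherwise,
# so the exit status alone tells you whether the removal succeeded.
is_installed() {
  rpm -q "$1" >/dev/null 2>&1
}

if is_installed ensmallen-devel; then
  echo "ensmallen-devel is still installed"
else
  echo "ensmallen-devel has been removed"
fi
```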
3. Information about the ensmallen-devel.x86_64 package on CentOS Stream 8
Last metadata expiration check: 5:22:42 ago on Sun Feb 25 03:03:59 2024.
Available Packages
Name : ensmallen-devel
Version : 2.19.0
Release : 2.el8
Architecture : x86_64
Size : 214 k
Source : ensmallen-2.19.0-2.el8.src.rpm
Repository : epel
Summary : Header-only C++ library for efficient mathematical optimization
URL : https://www.ensmallen.org
License : BSD
Description : ensmallen is a header-only C++ library for efficient mathematical optimization.
: It provides a simple set of abstractions for writing an objective function to
: optimize. It also provides a large set of standard and cutting-edge optimizers
: that can be used for virtually any mathematical optimization task. These
: include full-batch gradient descent techniques, small-batch techniques,
: gradient-free optimizers, and constrained optimization.
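If you only need a single field from the `dnf info` output above, standard text tools can extract it. A small sketch using awk, shown here against the output captured above so it runs without network access; on a live system you would pipe `dnf info ensmallen-devel.x86_64` through the same awk command:

```shell
# Extract the Version field from `dnf info`-style output.
# The variable below reuses the output shown above; on a real system,
# pipe `dnf info ensmallen-devel.x86_64` through the same awk instead.
info='Name         : ensmallen-devel
Version      : 2.19.0
Release      : 2.el8
Architecture : x86_64'

version=$(printf '%s\n' "$info" | awk -F' *: *' '/^Version/ {print $2}')
echo "$version"
```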