How to Install and Uninstall ensmallen-devel.x86_64 Package on CentOS 8 / RHEL 8
Last updated: November 14, 2024
1. Install "ensmallen-devel.x86_64" package
Here is a brief guide showing how to install ensmallen-devel.x86_64 on CentOS 8 / RHEL 8:
$ sudo dnf update
$ sudo dnf install ensmallen-devel.x86_64
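Note that ensmallen-devel is shipped in the EPEL repository (see "Repository : epel" in the package details in section 3), so on a fresh CentOS 8 / RHEL 8 system you may need to enable EPEL before the install command above can find the package. A minimal sketch:

```shell
# Enable the EPEL repository first if it is not already enabled;
# ensmallen-devel is published there rather than in the base repos.
sudo dnf install epel-release

# Then install the development headers as shown above.
sudo dnf install ensmallen-devel.x86_64
```

If `dnf install ensmallen-devel.x86_64` reports "No match for argument", a missing EPEL repository is the most likely cause.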
2. Uninstall "ensmallen-devel.x86_64" package
This section shows how to uninstall ensmallen-devel.x86_64 on CentOS 8 / RHEL 8:
$ sudo dnf remove ensmallen-devel.x86_64
$ sudo dnf autoremove
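If you want to confirm the package is really gone after removal, rpm can query the installed-package database. A quick check, assuming an RPM-based system:

```shell
# rpm -q prints the installed name-version-release string, or reports
# "package ensmallen-devel is not installed" (with a non-zero exit
# status) once removal has succeeded.
rpm -q ensmallen-devel
```

`dnf autoremove` additionally cleans up any dependencies that were pulled in for ensmallen-devel and are no longer needed by other packages.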
3. Information about the ensmallen-devel.x86_64 package on CentOS 8 / RHEL 8
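The details below correspond to the output of dnf's `info` subcommand; you can reproduce them on your own machine with:

```shell
# Query repository metadata for the package without installing it.
dnf info ensmallen-devel.x86_64
```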
Last metadata expiration check: 1 day, 4:18:12 ago on Sun May 9 13:03:46 2021.
Available Packages
Name : ensmallen-devel
Version : 2.14.2
Release : 1.el8
Architecture : x86_64
Size : 188 k
Source : ensmallen-2.14.2-1.el8.src.rpm
Repository : epel
Summary : Header-only C++ library for efficient mathematical optimization
URL : https://www.ensmallen.org
License : BSD
Description : ensmallen is a header-only C++ library for efficient mathematical optimization.
: It provides a simple set of abstractions for writing an objective function to
: optimize. It also provides a large set of standard and cutting-edge optimizers
: that can be used for virtually any mathematical optimization task. These
: include full-batch gradient descent techniques, small-batch techniques,
: gradient-free optimizers, and constrained optimization.