How to Install and Uninstall ensmallen-devel.x86_64 Package on AlmaLinux 9
Last updated: November 25, 2024
1. Install "ensmallen-devel.x86_64" package
Please follow the step-by-step instructions below to install ensmallen-devel.x86_64 on AlmaLinux 9:
$ sudo dnf update
$ sudo dnf install ensmallen-devel.x86_64
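Note that ensmallen-devel comes from the EPEL repository (see the package information in section 3 below). If EPEL is not yet enabled on your system, enable it first:
$ sudo dnf install epel-release
After the installation finishes, you can confirm that the package is present:
$ rpm -q ensmallen-devel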
2. Uninstall "ensmallen-devel.x86_64" package
Please follow the instructions below to uninstall ensmallen-devel.x86_64 on AlmaLinux 9:
$ sudo dnf remove ensmallen-devel.x86_64
$ sudo dnf autoremove
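The autoremove step deletes dependencies that were pulled in for ensmallen-devel but are no longer needed by any other package. To confirm that the package is gone, query the RPM database; it should report that the package is not installed:
$ rpm -q ensmallen-devel
package ensmallen-devel is not installed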
3. Information about the ensmallen-devel.x86_64 package on AlmaLinux 9
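The output below was produced by querying dnf; you can reproduce it with a command such as:
$ dnf info ensmallen-devel.x86_64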
Last metadata expiration check: 0:39:12 ago on Wed Mar 13 07:41:12 2024.
Available Packages
Name : ensmallen-devel
Version : 2.19.0
Release : 2.el9
Architecture : x86_64
Size : 195 k
Source : ensmallen-2.19.0-2.el9.src.rpm
Repository : epel
Summary : Header-only C++ library for efficient mathematical optimization
URL : https://www.ensmallen.org
License : BSD
Description : ensmallen is a header-only C++ library for efficient mathematical optimization.
: It provides a simple set of abstractions for writing an objective function to
: optimize. It also provides a large set of standard and cutting-edge optimizers
: that can be used for virtually any mathematical optimization task. These
: include full-batch gradient descent techniques, small-batch techniques,
: gradient-free optimizers, and constrained optimization.
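Because ensmallen is header-only, installing ensmallen-devel is all you need to compile against it, provided the Armadillo library it builds on is also available (on AlmaLinux 9 the armadillo-devel package can be installed from EPEL the same way if it is not pulled in automatically). The following is a minimal sketch of a program that minimizes a one-dimensional quadratic with ensmallen's L-BFGS optimizer; the QuadraticFunction class and the file name example.cpp are only illustrative:

#include <cmath>
#include <iostream>
#include <ensmallen.hpp>  // also pulls in Armadillo

// Illustrative objective: f(x) = (x - 3)^2, minimized at x = 3.
// ensmallen's gradient-based optimizers expect Evaluate() and
// Gradient() methods with these signatures.
class QuadraticFunction
{
 public:
  double Evaluate(const arma::mat& x)
  {
    return std::pow(x(0) - 3.0, 2);
  }

  void Gradient(const arma::mat& x, arma::mat& gradient)
  {
    gradient.set_size(1, 1);
    gradient(0) = 2.0 * (x(0) - 3.0);
  }
};

int main()
{
  QuadraticFunction f;
  arma::mat x = arma::zeros<arma::mat>(1, 1);  // starting point x = 0

  ens::L_BFGS optimizer;                       // quasi-Newton optimizer
  const double minimum = optimizer.Optimize(f, x);

  std::cout << "f(x) = " << minimum << " at x = " << x(0) << std::endl;
  return 0;
}

Compile it against Armadillo and run it; the printed x should be close to 3:
$ g++ -std=c++14 example.cpp -o example -larmadillo
$ ./example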