How to Install and Uninstall ensmallen-devel.x86_64 Package on Fedora 36
Last updated: November 26, 2024
1. Install "ensmallen-devel.x86_64" package
In this section, we explain the steps needed to install the ensmallen-devel.x86_64 package on Fedora 36. First, refresh the package metadata and apply any pending updates:
$ sudo dnf update
Then install the package:
$ sudo dnf install ensmallen-devel.x86_64
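Once the package is installed, you can verify that the headers work with a small test program. The sketch below is only illustrative (the file name example.cpp and the build flags are assumptions, not from this page): it minimizes the two-dimensional Rosenbrock function with ensmallen's L-BFGS optimizer. ensmallen builds on the Armadillo linear-algebra library, so you may also need the armadillo-devel package.

// example.cpp - minimal check that the installed ensmallen headers compile.
// Build and run (flags are illustrative):
//   g++ -std=c++14 example.cpp -o example -larmadillo && ./example
#include <cmath>
#include <ensmallen.hpp>

// The two-dimensional Rosenbrock function; its minimum is at (1, 1).
class RosenbrockFunction
{
 public:
  double Evaluate(const arma::mat& x)
  {
    return 100.0 * std::pow(x(1) - x(0) * x(0), 2) + std::pow(1.0 - x(0), 2);
  }

  void Gradient(const arma::mat& x, arma::mat& g)
  {
    g.set_size(2, 1);
    g(0) = -400.0 * x(0) * (x(1) - x(0) * x(0)) - 2.0 * (1.0 - x(0));
    g(1) = 200.0 * (x(1) - x(0) * x(0));
  }
};

int main()
{
  RosenbrockFunction f;
  arma::mat coordinates("-1.2; 1.0"); // starting point
  ens::L_BFGS optimizer;
  optimizer.Optimize(f, coordinates);
  coordinates.print("minimum found at:"); // should be close to (1, 1)
  return 0;
}

If the program prints coordinates close to (1, 1), the development headers are working correctly.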
2. Uninstall "ensmallen-devel.x86_64" package
Please follow the instructions below to uninstall ensmallen-devel.x86_64 on Fedora 36:
$ sudo dnf remove ensmallen-devel.x86_64
Then remove any dependencies that were installed along with ensmallen-devel.x86_64 and are no longer needed by other packages:
$ sudo dnf autoremove
3. Information about the ensmallen-devel.x86_64 package on Fedora 36
Output like the following comes from running dnf info ensmallen-devel.x86_64:
Last metadata expiration check: 2:47:12 ago on Thu Sep 8 02:05:26 2022.
Available Packages
Name : ensmallen-devel
Version : 2.17.0
Release : 2.fc36
Architecture : x86_64
Size : 189 k
Source : ensmallen-2.17.0-2.fc36.src.rpm
Repository : fedora
Summary : Header-only C++ library for efficient mathematical optimization
URL : https://www.ensmallen.org
License : BSD
Description : ensmallen is a header-only C++ library for efficient mathematical optimization.
: It provides a simple set of abstractions for writing an objective function to
: optimize. It also provides a large set of standard and cutting-edge optimizers
: that can be used for virtually any mathematical optimization task. These
: include full-batch gradient descent techniques, small-batch techniques,
: gradient-free optimizers, and constrained optimization.
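To illustrate the small-batch optimizers mentioned in the description, here is a rough sketch (under the same illustrative build assumptions as the earlier example) that fits a least-squares line with ens::Adam. It implements ensmallen's separable-function interface: Evaluate() and Gradient() over a batch of points, plus NumFunctions() and Shuffle().

// fit_line.cpp - sketch: least-squares line fit with ensmallen's Adam.
// Build (flags are illustrative):
//   g++ -std=c++14 fit_line.cpp -o fit_line -larmadillo
#include <ensmallen.hpp>

// Separable objective: one squared error per data point, so Adam can
// work on mini-batches of points.
class LineFit
{
 public:
  LineFit(const arma::vec& x, const arma::vec& y) : x(x), y(y) { }

  size_t NumFunctions() const { return x.n_elem; }

  void Shuffle()
  {
    const arma::uvec order = arma::randperm(x.n_elem);
    const arma::vec xs = x.elem(order), ys = y.elem(order);
    x = xs;
    y = ys;
  }

  // Sum of squared residuals over the batch [begin, begin + batchSize).
  double Evaluate(const arma::mat& p, const size_t begin, const size_t batchSize)
  {
    double loss = 0.0;
    for (size_t i = begin; i < begin + batchSize; ++i)
    {
      const double r = p(0) * x(i) + p(1) - y(i);
      loss += r * r;
    }
    return loss;
  }

  void Gradient(const arma::mat& p, const size_t begin, arma::mat& g,
                const size_t batchSize)
  {
    g.zeros(2, 1);
    for (size_t i = begin; i < begin + batchSize; ++i)
    {
      const double r = p(0) * x(i) + p(1) - y(i);
      g(0) += 2.0 * r * x(i);
      g(1) += 2.0 * r;
    }
  }

 private:
  arma::vec x, y;
};

int main()
{
  // Synthetic data from the line y = 2x + 1.
  const arma::vec x = arma::linspace(0.0, 10.0, 100);
  const arma::vec y = 2.0 * x + 1.0;

  LineFit f(x, y);
  arma::mat params(2, 1, arma::fill::zeros);

  ens::Adam optimizer(0.05, 10); // step size 0.05, batch size 10
  optimizer.Optimize(f, params);

  params.print("fitted [slope; intercept]:");
  return 0;
}

The printed slope and intercept should end up near 2 and 1.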