How to Install and Uninstall ensmallen-devel.i686 Package on Fedora 35
Last updated: November 01, 2024
1. Install the "ensmallen-devel.i686" package
In this section, we explain the steps needed to install the ensmallen-devel.i686 package on Fedora 35.
First, refresh the repository metadata and bring installed packages up to date:

$ sudo dnf update
Then install the package:

$ sudo dnf install ensmallen-devel.i686
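Once the installation finishes, you can confirm that the package is present with rpm's query mode (dnf list installed ensmallen-devel.i686 works as well):

$ rpm -q ensmallen-devel.i686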
2. Uninstall the "ensmallen-devel.i686" package
To uninstall ensmallen-devel.i686 from Fedora 35, run:
$ sudo dnf remove ensmallen-devel.i686
Optionally, remove dependencies that were pulled in with the package and are no longer needed by anything else:

$ sudo dnf autoremove
3. Information about the ensmallen-devel.i686 package on Fedora 35
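The output below is what dnf's info subcommand reports for the package; you can reproduce it with:

$ dnf info ensmallen-devel.i686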
Last metadata expiration check: 4:37:38 ago on Wed Sep 7 02:25:42 2022.
Available Packages
Name : ensmallen-devel
Version : 2.17.0
Release : 1.fc35
Architecture : i686
Size : 190 k
Source : ensmallen-2.17.0-1.fc35.src.rpm
Repository : fedora
Summary : Header-only C++ library for efficient mathematical optimization
URL : https://www.ensmallen.org
License : BSD
Description : ensmallen is a header-only C++ library for efficient mathematical optimization.
: It provides a simple set of abstractions for writing an objective function to
: optimize. It also provides a large set of standard and cutting-edge optimizers
: that can be used for virtually any mathematical optimization task. These
: include full-batch gradient descent techniques, small-batch techniques,
: gradient-free optimizers, and constrained optimization.
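Since ensmallen is header-only, installing ensmallen-devel gives you everything you need at compile time; its one hard dependency is Armadillo (provided by the armadillo-devel package on Fedora). As a quick check that the headers work, here is a minimal sketch of the optimizer API the description refers to. The QuadraticFunction objective, the starting point, and the choice of the L-BFGS optimizer are illustrative choices of ours, not anything shipped by the package:

#include <cmath>
#include <ensmallen.hpp>

// Simple differentiable objective: f(x) = (x0 - 3)^2 + (x1 + 1)^2,
// which has its minimum at (3, -1).
class QuadraticFunction
{
 public:
  double Evaluate(const arma::mat& x)
  {
    return std::pow(x(0) - 3.0, 2) + std::pow(x(1) + 1.0, 2);
  }

  void Gradient(const arma::mat& x, arma::mat& g)
  {
    g.set_size(2, 1);
    g(0) = 2.0 * (x(0) - 3.0);
    g(1) = 2.0 * (x(1) + 1.0);
  }
};

int main()
{
  QuadraticFunction f;
  arma::mat coordinates = arma::zeros(2, 1);  // start the search at (0, 0)

  ens::L_BFGS optimizer;  // any of ensmallen's differentiable-function optimizers would do
  optimizer.Optimize(f, coordinates);

  coordinates.print("minimum found at:");
  return 0;
}

Assuming armadillo-devel is also installed, this should compile with something like:

$ g++ -std=c++14 example.cpp -o example -larmadillo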