How to Install and Uninstall python3-opt-einsum Package on openSUSE Leap
Last updated: December 25, 2024
1. Install "python3-opt-einsum" package
Please follow the instructions below to install python3-opt-einsum on openSUSE Leap:
$ sudo zypper refresh
$ sudo zypper install python3-opt-einsum
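After installation you can verify that the module imports cleanly from the system Python. This is a quick sanity check, not part of the package's documentation; the version printed should match the one shown in the package information below:

```python
# Verify that the opt_einsum module is importable after installation.
import opt_einsum

# The version string should match the installed package (3.1.0 here).
print(opt_einsum.__version__)
```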
2. Uninstall "python3-opt-einsum" package
Please follow the instructions below to uninstall python3-opt-einsum on openSUSE Leap:
$ sudo zypper remove python3-opt-einsum
3. Information about the python3-opt-einsum package on openSUSE Leap
Information for package python3-opt-einsum:
-------------------------------------------
Repository : Main Repository
Name : python3-opt-einsum
Version : 3.1.0-bp155.2.12
Arch : noarch
Vendor : openSUSE
Installed Size : 557.3 KiB
Installed : No
Status : not installed
Source package : python-opt-einsum-3.1.0-bp155.2.12.src
Upstream URL : https://github.com/dgasmith/opt_einsum
Summary : Optimizing numpys einsum function
Description :
Optimized einsum can significantly reduce the overall execution time of einsum-like expressions (e.g.,
`np.einsum`, `dask.array.einsum`, `pytorch.einsum`, `tensorflow.einsum`)
by optimizing the expression's contraction order and dispatching many
operations to canonical BLAS, cuBLAS, or other specialized routines. Optimized
einsum is agnostic to the backend and can handle NumPy, Dask, PyTorch,
Tensorflow, CuPy, Sparse, Theano, JAX, and Autograd arrays as well as potentially
any library which conforms to a standard API. See the
[**documentation**](http://optimized-einsum.readthedocs.io) for more
information.
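As a brief illustration of what the package provides (the array names and shapes below are made up for this sketch), `opt_einsum.contract` accepts the same subscript notation as `np.einsum` while choosing an optimized contraction order:

```python
import numpy as np
import opt_einsum

# Example operands; shapes chosen only for illustration.
a = np.random.rand(10, 20)
b = np.random.rand(20, 5)

# contract() is a drop-in replacement for np.einsum, but it optimizes
# the contraction order and may dispatch to BLAS routines.
result = opt_einsum.contract('ij,jk->ik', a, b)

# For larger expressions, contract_path() reports the chosen
# contraction order and its estimated cost before running it.
c = np.random.rand(5, 8)
path, info = opt_einsum.contract_path('ij,jk,kl->il', a, b, c)
print(info)
```

For a two-operand product like the one above the result is simply the matrix product `a @ b`; the optimizer pays off on longer chains of operands, where contraction order can change the cost by orders of magnitude.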