How to Install and Uninstall python310-azure-ai-contentsafety Package on openSUSE Tumbleweed
Last updated: November 16, 2024
1. Install "python310-azure-ai-contentsafety" package
Please follow the step-by-step instructions below to install python310-azure-ai-contentsafety on openSUSE Tumbleweed. First refresh the repository metadata, then install the package:
$ sudo zypper refresh
$ sudo zypper install python310-azure-ai-contentsafety
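To verify the installation, you can check that the module imports under Python 3.10, the interpreter this package targets. A quick sanity check might look like:

$ python3.10 -c "import azure.ai.contentsafety; print('import OK')"

If the command prints "import OK", the library is available to Python 3.10.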
2. Uninstall "python310-azure-ai-contentsafety" package
To uninstall python310-azure-ai-contentsafety from openSUSE Tumbleweed, run:
$ sudo zypper remove python310-azure-ai-contentsafety
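To confirm the removal, you can query the RPM database; once the package is gone, rpm will report that it is not installed:

$ rpm -q python310-azure-ai-contentsafety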
3. Information about the python310-azure-ai-contentsafety package on openSUSE Tumbleweed
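The details below match what zypper itself reports for the package; you can reproduce this output on your own system with:

$ zypper info python310-azure-ai-contentsafety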
Information for package python310-azure-ai-contentsafety:
---------------------------------------------------------
Repository : openSUSE-Tumbleweed-Oss
Name : python310-azure-ai-contentsafety
Version : 1.0.0-1.1
Arch : noarch
Vendor : openSUSE
Installed Size : 546.4 KiB
Installed : No
Status : not installed
Source package : python-azure-ai-contentsafety-1.0.0-1.1.src
Upstream URL : https://github.com/Azure/azure-sdk-for-python
Summary : Microsoft Azure AI Content Safety Client Library for Python
Description :
Azure AI Content Safety detects harmful user-generated and AI-generated content in
applications and services. Content Safety includes text and image APIs that allow
you to detect material that is harmful.
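Once installed, the library can be called from Python 3.10 to analyze text for harmful content. The following is a minimal sketch based on the 1.0.0 client API; the endpoint, key, and sample text are placeholders you must replace with values from your own Azure Content Safety resource:

from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeTextOptions
from azure.core.credentials import AzureKeyCredential

# Placeholder endpoint and key -- replace with your Content Safety resource values.
endpoint = "https://<your-resource>.cognitiveservices.azure.com/"
key = "<your-key>"

# Create the client and submit a piece of text for analysis.
client = ContentSafetyClient(endpoint, AzureKeyCredential(key))
response = client.analyze_text(AnalyzeTextOptions(text="Sample text to analyze"))

# Each entry reports a harm category (e.g. Hate, SelfHarm, Sexual, Violence)
# together with a severity level.
for item in response.categories_analysis:
    print(item.category, item.severity)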