# Data Poisoning

**IRI:** https://folio.openlegalstandard.org/RBMj5dbvLFgFGrPZ3MvFGYl

## Labels

**Alternative Labels:**

- Adversarial Data Manipulation
- Dataset Compromise
- Malicious Data Tampering
- Poisoning Attack
- Training Data Corruption

## Definition

Data Poisoning is an adversarial attack in which malicious actors intentionally alter or inject records into the training data of a machine learning model to compromise its integrity and performance, causing the model to learn incorrect patterns or make erroneous predictions.
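The mechanism can be illustrated with a minimal, self-contained sketch. All names and numbers below are illustrative assumptions, not part of this ontology entry: a toy 1-D nearest-centroid classifier is trained on clean data, then retrained after an attacker injects mislabeled points, dragging one class centroid into the other class's region and degrading test accuracy.

```python
import random

def train_centroids(data):
    """Toy 1-D classifier: the 'model' is just the mean x per class."""
    sums, counts = {}, {}
    for x, y in data:
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def predict(centroids, x):
    """Assign x to the class whose centroid is nearest."""
    return min(centroids, key=lambda y: abs(x - centroids[y]))

def accuracy(centroids, data):
    return sum(predict(centroids, x) == y for x, y in data) / len(data)

rng = random.Random(0)
# Two well-separated classes: class 0 around x=0, class 1 around x=5.
train = [(rng.gauss(0, 1), 0) for _ in range(100)] + \
        [(rng.gauss(5, 1), 1) for _ in range(100)]
test  = [(rng.gauss(0, 1), 0) for _ in range(50)] + \
        [(rng.gauss(5, 1), 1) for _ in range(50)]

# Poisoning step: inject many points that sit in class 0's region
# but carry the class 1 label, pulling the learned class-1 centroid
# toward x=0 and corrupting the decision boundary.
poison = [(0.0, 1)] * 300

clean_model    = train_centroids(train)
poisoned_model = train_centroids(train + poison)

clean_acc    = accuracy(clean_model, test)
poisoned_acc = accuracy(poisoned_model, test)
```

Real attacks target far more complex models and are often stealthier (small perturbations rather than bulk injection), but the failure mode is the same: the model faithfully fits corrupted data, so its learned decision rule no longer reflects the true classes.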

## Sub Class Of

- https://folio.openlegalstandard.org/RBHMad8oNmYXkYHOHZLCgqv

**Deprecated:** False

