The 8088
arXiv cs.LG AI Research 11h ago

An Analysis of Active Learning Algorithms using Real-World Crowd-sourced Text Annotations

★★☆☆☆ significance 2/5

This research examines how active learning algorithms perform when given imperfect or noisy labels from real-world crowd workers. The study compares eight common active learning techniques on text classification datasets to understand how human error and annotators' refusal to label affect model training.
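For readers unfamiliar with the setup being studied, the sketch below shows a minimal pool-based active learning loop with a simulated noisy annotator. The dataset, noise rate, skip probability, and uncertainty-sampling strategy are illustrative assumptions, not the paper's actual experimental design or any of its eight techniques.

```python
# Minimal sketch: uncertainty-sampling active learning with a simulated
# crowd worker who sometimes mislabels and sometimes refuses to label.
# All rates and the dataset choice are assumptions for illustration.
import numpy as np
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Small binary text-classification pool (assumed dataset).
data = fetch_20newsgroups(subset="train",
                          categories=["sci.med", "sci.space"],
                          remove=("headers", "footers", "quotes"))
X = TfidfVectorizer(max_features=5000).fit_transform(data.data)
y_true = np.array(data.target)

def noisy_annotator(i, flip_rate=0.2, skip_rate=0.1):
    """Simulated crowd worker: may refuse or return a flipped label."""
    if rng.random() < skip_rate:
        return None                      # refusal to label
    label = int(y_true[i])
    if rng.random() < flip_rate:
        label = 1 - label                # human error: flipped label
    return label

# Clean seed set (assumed), remaining items form the unlabeled pool.
seed = rng.choice(len(y_true), size=20, replace=False).tolist()
labeled = list(seed)
labels = [int(y_true[i]) for i in seed]
pool = [i for i in range(len(y_true)) if i not in set(seed)]

clf = LogisticRegression(max_iter=1000)
for _ in range(10):
    clf.fit(X[labeled], labels)
    # Uncertainty sampling: query the pool item closest to p = 0.5.
    probs = clf.predict_proba(X[pool])[:, 1]
    query = pool[int(np.argmin(np.abs(probs - 0.5)))]
    pool.remove(query)
    label = noisy_annotator(query)
    if label is None:
        continue                         # skipped items yield no label
    labeled.append(query)
    labels.append(label)
```

The design point the paper probes is visible even in this toy loop: the items uncertainty sampling selects are exactly the ones a noisy or reluctant annotator is likely to get wrong or skip, so selection strategy and label noise interact rather than compose independently.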

Why it matters Reliable model performance depends on understanding how algorithmic selection interacts with the inherent noise of human-provided data.
Read the original at arXiv cs.LG

Tags

#active learning #crowdsourcing #machine learning #text classification #noisy labels
