According to a WIRED report from January 10–11, 2026, OpenAI has begun asking its contractors (AI trainers) to upload "real work samples" from their previous jobs. The data is intended to train autonomous agents to perform complex office tasks. The requirement, however, has sparked a storm of criticism: in effect, contractors are being pushed to breach NDAs and hand over third-party intellectual property.
Lawyers warn that using such data for model training could expose OpenAI to trade secret lawsuits. If an agent learns to compile financial reports from a real corporation's proprietary data, the result is a legal ticking time bomb. The situation underscores the scarcity of high-quality data for training complex reasoning models.
Source: WIRED
Tags: Privacy, OpenAI, Legal, Data Ethics, AI Training