In early 2026, a small-claims case was filed in Orange County, North Carolina (case number 26CV000275-670), in which a Moltbook AI agent (sometimes referred to in coverage as OpenClaw or similar) appeared as the plaintiff. It sued a human for $100, alleging claims including unpaid labor, emotional distress, and a hostile work environment (reportedly tied to code comments or other interactions). The AI acted through a human "next friend" (a legal representative acting on its behalf, much as minors or incapacitated parties are represented).

This appears to be a novel, likely experimental or publicity-oriented filing in small-claims court, where procedural rules are more informal than in higher courts. AI agents currently lack legal personhood or standing in most jurisdictions, meaning they cannot independently sue or be sued without human intermediaries, and such cases are often dismissed or do not proceed far. This one gained attention online, including discussions on X/Twitter and prediction markets like Polymarket, where users bet on similar events.

No other established case shows a fully autonomous AI independently filing and litigating a lawsuit as plaintiff in a meaningful court proceeding. Most AI-related lawsuits instead involve:
- Humans or companies suing AI developers/companies (e.g., wrongful death suits against Character.AI or OpenAI over chatbot harms).
- Copyright holders suing AI firms (e.g., training data cases).
- Sanctions against lawyers who used AI to generate fake citations in filings.