Browse 9 jobs hiring in Content Moderation now. Check out companies hiring, such as Digital, Mangone Law Firm, and Raya, in Winston-Salem, Nashville-Davidson, and Los Angeles.
Support and protect brand trust by delivering timely, empathetic social media customer support and triage during Pacific Time hours for a global audience.
Mangone Law Firm is hiring a bilingual (Spanish-English) Community Manager to lead social engagement, moderation, and content efforts that strengthen the firm’s digital presence and support client growth.
Lead Community Manager, Trust & Safety to own policy enforcement, incident response, privacy casework, and training across a global community, working from Los Angeles or remotely.
Archive seeks a detail-oriented Trust & Safety Specialist I to investigate fraud and policy violations, manage chargebacks, and help maintain a safe marketplace for buyers, sellers, and brands.
Handshake AI is hiring a contract Red Teaming Domain Expert to craft adversarial prompts and stress-test LLMs for safety and robustness across real-world edge cases.
Pinterest is hiring a Safety Specialist II to lead critical-harms content moderation efforts, manage BPO performance, and improve policy enforcement processes across Trust & Safety.
Support a global brand as a remote Community Engagement & Moderation Specialist focused on fostering inclusive, constructive community interactions and improving engagement.
Lead cross-functional Trust & Safety programs at Character.AI to design and scale safer, more reliable AI experiences across product, engineering, and vendor ecosystems.
Samsara seeks an experienced Senior Community Manager to architect and launch a scalable B2B customer community that drives engagement, advocacy, and measurable business impact.
Salary ranges: Below 50k* (1) | 50k–100k* (2) | Over 100k* (4)