Crisis & Response Management Specialist / Data Labeling Analyst
- Location: Dublin, onsite 100% (5 days per week)
- Duration: 11 months
- Salary: €73,000/annum pro rata
We are seeking a highly skilled and detail-oriented Content Moderation and Quality Assurance Specialist. The successful candidate will be responsible for training cutting-edge AI models for content moderation, ensuring the quality and accuracy of content moderation decisions, creating golden sets, calibrating vendor sites, and managing vendor relationships.
This role will involve exposure to potentially graphic and/or objectionable content, including but not limited to graphic images, videos and writings, offensive or derogatory language, and other potentially objectionable material such as child exploitation, graphic violence, animal abuse, self-injury, and other content which may be considered offensive or disturbing.
Key Responsibilities:
- Content Moderation Quality Assurance: Review and evaluate content moderation decisions made by AI to ensure accuracy, consistency, and adherence to policies and guidelines.
- Golden Set Creation: Develop and maintain golden sets of content to test and calibrate AI and vendor performance, ensuring that they meet our quality standards.
- Vendor Site Calibration: Collaborate with vendor sites to identify areas for improvement, provide feedback, and implement calibration plans to enhance their performance.
- Identify opportunities to improve AI and content moderation processes, workflows, and tools, and collaborate with internal stakeholders to implement changes.
- Analyze data and metrics to identify trends, patterns, and insights that inform quality assurance strategies and vendor performance.
- Participate in training and development programs to stay up-to-date on policies, procedures, and industry best practices.
- Address sensitive content issues, including but not limited to graphic images, videos and writings, offensive or derogatory language, and other objectionable material as needed.
- Anticipate the societal effects of different forms of online speech and behavior, and test hypotheses and assumptions about online activity in an operations setting.
Requirements:
- 4+ years of experience in content moderation, quality assurance, or a related field.
- Strong understanding of content moderation policies, procedures, and guidelines.
- Experience reviewing AI-generated conversations.
- Experience in online customer support (chatbots).
- Excellent analytical, problem-solving, and communication skills.
- Ability to work independently and collaboratively as part of a team.
- Strong attention to detail and ability to maintain accuracy in a fast-paced environment.
- Experience working with vendors or external partners preferred.
- Familiarity with data analysis and reporting tools preferred.