Most job postings are the same (and can be pretty boring, right?!). That's why we want to start out by telling you what's in it for you:
- We have an amazing platform that maximizes revenue for thousands of healthcare organizations across the country!
- We embrace diversity in a serious way! We are committed to building a team that represents a variety of backgrounds, perspectives, and skills. The more inclusive we are, the better our work will be.
- We celebrate and promote career growth and advancement.
- We have an awesome on-demand learning program.
- We do fun stuff like remote cooking, yoga, and mixology classes because we like to have fun!
- We have an awesome benefits package with medical, dental, and vision coverage plus a 401(k) with company match.
- We have an unlimited vacation policy - that's right, take vacation when you want and come back to work refreshed!
- We have cool peer-nominated awards and recognition because we like to celebrate our employees!
We are currently seeking a Data Operations Engineer to join our team! The Data Operations Engineer helps build and operate reliable data pipelines that load customer data into the MDaudit™ platform. You’ll own day-to-day pipeline operations (monitoring, SLAs, incident response, runbooks), perform fixes and small enhancements, and automate repetitive tasks to improve stability and speed. You’ll collaborate with customer-facing, product, and support teams to ensure accurate, on-time data delivery and smooth handoffs. As needed, you’ll assist with data governance and observability. Occasional travel (up to 15%) may be required.
ESSENTIAL DUTIES AND RESPONSIBILITIES
- Monitor daily jobs, SLAs, stages/queues, and alerts; triage and resolve incidents with clear runbooks and ticket updates.
- Perform root-cause analysis, document post-incident notes, and drive durable fixes; maintain an ops backlog.
- Execute releases and environment promotions following change control; validate and communicate outcomes.
- Manage extracts/loads/exports and verify row counts, manifests, and file integrity.
- Implement small pipeline enhancements and automate health checks.
- Contribute to performant data models and transformations supporting reliability and speed.
- Maintain SOPs/runbooks, mapping specs, and data dictionaries.
- Add/review quality checks and observability dashboards/alerts.
- Support data retention tasks in line with security/compliance policies.
- Provide concise status updates to stakeholders; support customer and solutions teams on questions.
- Other duties as needed.
Nice to have – BI Development
- Support basic semantic modeling and dashboarding (Power BI, ThoughtSpot, or Tableau), define/validate KPI logic, and assist with data onboarding for analytics users.
QUALIFICATIONS
- Bachelor’s degree in a related field or equivalent experience.
- 2–4 years in data engineering/analytics engineering with cloud warehouses and data pipelines.
- Strong SQL (T-SQL/Snowflake SQL) and at least 1–2 years of Python for data tasks and automation.
- Hands-on with Snowflake and AWS S3; Azure experience a plus.
- Experience with orchestration of ETL processes (Airflow preferred).
- Familiarity with operational concepts: SLAs, incident management, change control, basic cost governance.
- Healthcare data familiarity (EDI 835/837, EHR) preferred but not required.
- Exposure to SSO/identity and basic AWS IAM concepts a plus.
- Experience with command line tools (AWS CLI, PowerShell) a plus.
REQUIRED SKILLS
- Collaborative, customer-centric communicator who documents clearly.
- Strong troubleshooting and analytical skills; high attention to detail.
- Comfortable in a fast-paced environment managing multiple workstreams.
- Experience with SaaS applications and working with technical/non-technical stakeholders.