About Veryon
Veryon is the leading global provider of aviation software and information services, trusted by 5,000+ customers, 75,000 maintenance professionals, and 100+ OEMs across nearly 150 countries. Our mission is to deliver smarter, predictive technology solutions, including our AI platform, Veryon AIRE, which maximizes aircraft uptime and operational efficiency for the world’s most demanding aviation organizations.
At Veryon, we are an AI-forward company focused on driving innovation and efficiency through emerging technologies. We prioritize hiring individuals who embrace AI, think creatively about its application, and are excited to continuously evolve alongside it.
About the Role
The Data Engineer is responsible for building and maintaining reliable, governed data pipelines that support finance-critical and revenue-critical workflows across Veryon.
This role ensures that data across systems is accurate, consistent, and structured to enable analytics, reporting, and AI-ready data foundations.
The position plays a key role in enabling data reliability, scalability, and governance across enterprise systems, ensuring stakeholders across the organization can trust and use data effectively.
Key Responsibilities
- Build and maintain data pipelines across CRM, billing, and accounting systems
- Ensure data reconciliation, consistency, and accuracy across multiple enterprise platforms
- Design and maintain core data models for revenue, customer, and financial datasets
- Build and optimize semantic data layers to unify data across disparate systems
- Implement data quality checks, monitoring, and validation frameworks
- Configure, maintain, and optimize ETL/ELT pipelines for multiple data sources
- Support analytics and AI data foundations to enable downstream reporting and automation use cases
- Provide administration and support for enterprise BI tools and reporting platforms
- Collaborate with cross-functional teams to ensure data availability, integrity, and usability
Required Qualifications
- Bachelor’s degree in Data Science, Computer Science, Engineering, Big Data, or related field (Master’s preferred)
- 3–6+ years of experience in Data Engineering or Analytics Engineering roles
- Strong expertise in SQL and data modeling (star/snowflake schemas, dimensional modeling)
- Experience building and maintaining data pipelines (ETL/ELT workflows)
- Hands-on experience with cloud data warehouse/lake platforms (e.g., Snowflake, BigQuery, Redshift, Databricks)
- Experience working with finance or revenue systems data (billing, accounting, or ERP systems such as Oracle or similar)
- Experience integrating and working with CRM data systems (Salesforce preferred)
- Strong understanding of data governance, validation, and quality frameworks
- Ability to work in cross-functional, remote or distributed teams
- Strong problem-solving skills with attention to data accuracy and reliability
- Ability to communicate technical concepts to non-technical stakeholders
Preferred Skills
- Exposure to administration and support of BI tools (Power BI, Tableau, Looker, etc.)
- Familiarity with data orchestration tools (Airflow, dbt, Prefect, or similar)
- Understanding of data lakehouse architectures and modern data stacks
- Exposure to machine learning data pipelines or AI/ML readiness frameworks
- Experience supporting revenue operations or finance analytics use cases
- Knowledge of data observability and monitoring tools
AI Competency Expectations
- Understanding of how data pipelines support AI/ML models and predictive analytics
- Experience enabling AI-ready datasets for automation or analytics use cases
- Familiarity with embedding structured data for AI consumption (semantic layers, feature stores)
Our Core Values
- Fueled by Customers: Customers are at the core of every decision.
- Win Together: Collaboration is our competitive edge.
- Make It Happen: No excuses. Just outcomes.
- Innovate to Elevate: We boldly challenge what’s standard and lift what’s possible.