About 11:59
We are 11:59, a professional services firm that helps forward-thinking public and commercial sector clients unlock the full potential of their transformation and modernization efforts. With a team led by former Big 4 consulting executives, we bring the Art of the Possible to life, delivering unbiased strategies that are independent of resource constraints while generating real value and advantage from what is possible today, so that our clients are ready for tomorrow.
As a firm that’s built on the curiosity of our experts, we employ an integrated, agile talent model to deliver modern technology. Our clients hire us for our shared commitment to their mission, the collaborative approach of our team members, and our track record of delivering breakthrough solutions and results.
At 11:59, we don’t just showcase the art of the possible; we deliver it.
Job Description:
As the Senior Data Engineering Manager, your primary focus will be leading the design, development, and management of data infrastructure on the Databricks platform within an AWS or Azure cloud environment.
At a critical time of growth for our business, 11:59 is seeking a Senior Manager, Data Engineering for the Data Analytics service line on our Digital Transformation Team. You bring deep industry experience, an insatiable appetite to exceed client expectations, a proven background in production deployments of large-scale data transformation projects, and a track record of exceptional customer service.
The Senior Manager will manage key client technical projects and workstreams, coordinating the work of junior engineers and often working alongside them. They will also partner closely with business analysts and project managers to deliver projects on time, within budget, within scope, and with high customer satisfaction.
Job Responsibilities:
- Create technical, functional, and operational documentation for data pipelines and applications
- Work effectively in an Agile Scrum environment (JIRA / Azure DevOps)
- Use business requirements to drive the design of data solutions/applications and technical architecture
- Work with other developers, designers, and architects (local and remote) to ensure data applications meet requirements and achieve performance, data security, and analytics goals
- Anticipate, identify, track, and resolve issues and risks affecting delivery
- Lead the configuration, build, and testing of data applications and technical architecture
- Establish and grow a data engineering framework to ensure the reliability, scalability, quality, and efficiency of data pipelines, storage, processing, and integration
- Coordinate and participate in structured peer reviews / walkthroughs / code reviews
- Provide application/technical support
- Maintain and/or update technical and/or industry knowledge and skills through continuous learning activities
- Mentor and develop junior engineers
Required Qualifications:
- B.S. in Computer Science/Engineering or a relevant field; Master’s degree preferred
- 12+ years of experience in the IT industry; consulting experience preferred
- 5+ years of hands-on experience in data engineering/ETL using Databricks on AWS / Azure cloud infrastructure and functions
- Expert understanding of data warehousing concepts (dimensional/star schema, SCD Type 2, Data Vault, denormalized models) and experience implementing highly performant data ingestion pipelines from multiple sources
- Expert level skills with Python / PySpark and SQL
- Demonstrable experience developing and nurturing a data engineering framework, including package/dependency management tools (e.g., Poetry), testing and code-quality tooling (e.g., Pytest, pytest-cov, Pylint), and load testing
- Experience with CI/CD on Databricks using tools such as Jenkins, GitHub Actions, and Databricks CLI
- Experience integrating end-to-end Databricks pipelines that move data from source systems to target data repositories, ensuring data quality and consistency are maintained throughout
- Strong understanding of Data Management principles (quality, governance, security, privacy, life cycle management, cataloguing)
- Experience evaluating the performance and applicability of multiple tools against customer requirements
- Experience working within an Agile delivery / DevOps methodology to deliver proofs of concept and production implementations in iterative sprints
- Experience with Delta Lake, Unity Catalog, Delta Sharing, Delta Live Tables (DLT)
- Hands-on experience developing batch and streaming data pipelines
- Able to work independently
- Excellent oral and written communication skills
- Nice to have: Databricks certifications and AWS Solutions Architect certification
- Nice to have: experience in Power BI/Tableau/QuickSight
- Nice to have: experience with AWS Redshift, Snowflake, Azure Synapse
How you’ll grow:
At 11:59, our professional development plan supports individuals at every stage of their careers in recognizing and applying their strengths to perform at their best every day. Whether you are an entry-level employee or a senior leader, we believe strongly in the power of continuous learning and provide a range of opportunities to build skills and gain practical experience in a dynamic, rapidly evolving global technical landscape.
Our approach includes both on-the-job learning experiences and formal development programs, ensuring that our professionals have ample opportunities for growth throughout their entire career journey. We are committed to fostering a culture of continuous improvement, where every individual can thrive, reach their fullest potential, and deliver the art of the possible.
Why work with us?
- Competitive pay
- Health, dental, vision, and life insurance
- Unlimited Paid Time Off
- 401(k) matching
- Fully remote position