11:59 is a business and technology consulting firm focused on delivering mission-critical work for private and public sector organizations.
Led by former Big 4 consulting executives, we have a deep bench of technology experts whose sole purpose is to help take clients’ business and digital transformation objectives from strategy through execution. For nearly 20 years, we have guided forward-thinking clients to discover their full potential and have delivered hundreds of projects and billions of dollars’ worth of project value to our clients.
Our culture and our values—including curiosity, collaboration, integrity, commitment, and respect—are core to who we are at 11:59. We take these ideals very seriously, and they guide us in our purpose to help our customers reach their full potential and focus on their mission-driven work. Every touchpoint with a prospect or client is further guided by our client experience model, and we care deeply about curating an elevated, outstanding experience.
As the Data Engineer, your primary focus will be on leading and overseeing the design, development, and management of the data infrastructure on the Databricks platform within an AWS / Azure cloud environment.
At a critical time of growth for our business, 11:59 is seeking a Data Engineer for the Data Analytics service line on our Digital Transformation Team. You bring a proven background in production deployments of large-scale data transformation projects and a track record of exceptional customer service.
The Data Engineer will work within key client technical projects and workstreams, often partnering with and working alongside more junior engineers. He or she will also partner closely with business analysts and project managers to complete projects on time, within budget, within scope, and with high customer satisfaction.
- Create technical, functional, and operational documentation for data pipelines and applications.
- Work effectively in an Agile Scrum environment (JIRA / Azure DevOps).
- Use business requirements to drive the design of data solutions/applications and technical architecture.
- Work with other developers, designers, and architects (local and remote) to ensure data applications meet requirements as well as performance, data security, and analytics goals.
- Anticipate, identify, track, and resolve issues and risks affecting delivery.
- Lead the configuration, build, and testing of data applications and technical architecture.
- Establish and grow a data engineering framework to ensure the reliability, scalability, quality, and efficiency of data pipelines, storage, processing, and integration.
- Coordinate and participate in structured peer reviews / walkthroughs / code reviews.
- Provide application/technical support.
- Maintain and/or update technical and/or industry knowledge and skills through continuous learning activities.
- Mentor and develop junior engineers.
- B.S. in Computer Science/Engineering or relevant work experience; Master's degree preferred.
- 5+ years of experience in the IT industry; consulting experience preferred.
- 3+ years of hands-on experience in data engineering/ETL using Databricks on AWS / Azure cloud infrastructure and functions.
- Deep understanding of data warehousing concepts (dimensional/star schema, SCD Type 2, Data Vault, denormalized, One Big Table (OBT)) and experience implementing highly performant data ingestion pipelines from multiple sources.
- Deep skills in Python/PySpark and SQL.
- Demonstrable experience developing and nurturing a data engineering framework, including package/dependency management tools (e.g., Poetry), testing (e.g., Pytest, pytest-cov), linting (e.g., Pylint), and load testing.
- Experience with CI/CD on Databricks using tools such as Jenkins, GitHub Actions, and the Databricks CLI.
- Experience integrating end-to-end Databricks pipelines that take data from source systems to target data repositories, ensuring data quality and consistency are always maintained.
- Understanding of data management principles (quality, governance, security, privacy, life cycle management, cataloging).
- Experience evaluating the performance and applicability of multiple tools against customer requirements.
- Working within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints.
- Experience with Delta Lake, Unity Catalog, Delta Sharing, Delta Live Tables (DLT).
- Hands-on experience developing batch and streaming data pipelines.
- Able to work independently.
- Excellent oral and written communication skills.
- Nice to have: Databricks certifications and AWS Solutions Architect certification.
- Nice to have: experience with Power BI, Tableau, or QuickSight.
- Nice to have: experience with AWS Redshift, Snowflake, or Azure Synapse.
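To give a flavor of the warehousing concepts above, here is a minimal, illustrative sketch of SCD Type 2 (slowly changing dimension) versioning in plain Python. The table, column names, and data are invented for the example; on Databricks this logic would typically be expressed as a Delta Lake MERGE rather than hand-rolled code.

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # sentinel "valid_to" marking the current row version

def apply_scd2(dimension, incoming, key, tracked, load_date):
    """Apply Type 2 history: expire changed rows and append new versions.

    dimension: list of dicts, each with 'valid_from'/'valid_to' columns
    incoming:  latest source snapshot (list of dicts, no validity columns)
    key:       natural-key column name
    tracked:   columns whose changes trigger a new row version
    """
    # Index the currently open row version per natural key.
    current = {r[key]: r for r in dimension if r["valid_to"] == HIGH_DATE}
    out = list(dimension)
    for row in incoming:
        existing = current.get(row[key])
        if existing is None:
            # Brand-new key: insert an open-ended row.
            out.append({**row, "valid_from": load_date, "valid_to": HIGH_DATE})
        elif any(existing[c] != row[c] for c in tracked):
            # Tracked attribute changed: close the old version, open a new one.
            existing["valid_to"] = load_date
            out.append({**row, "valid_from": load_date, "valid_to": HIGH_DATE})
    return out

# Example: a customer moves, and a new customer appears.
dim = [{"customer_id": 1, "city": "Austin",
        "valid_from": date(2023, 1, 1), "valid_to": HIGH_DATE}]
snapshot = [{"customer_id": 1, "city": "Dallas"},
            {"customer_id": 2, "city": "Reno"}]
result = apply_scd2(dim, snapshot, "customer_id", ["city"], date(2024, 6, 1))
```

After the run, `result` holds three rows: the expired Austin version, an open Dallas version, and an open row for the new customer, which is the full history a star-schema dimension is meant to preserve.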
How you’ll grow:
At 11:59, our professional development plan is dedicated to supporting individuals at all stages of their careers in recognizing and utilizing their strengths to achieve optimal performance every day. Whether you are an entry-level employee or a senior leader, we strongly believe in the power of continuous learning and provide a range of opportunities to enhance skills and gain practical experience in the dynamic and rapidly evolving global technical landscape.
Our approach includes both on-the-job learning experiences and formal development programs, ensuring that our professionals have ample opportunities for growth throughout their entire career journey. We are committed to fostering a culture of continuous improvement, where every individual can thrive, reach their fullest potential, and deliver the art of the possible.
Why work with us?
- Competitive pay
- Health, dental, vision, and life insurance
- Unlimited Paid Time Off
- 401(k) matching
Applicants must be able to work in the United States without the need for current or future visa sponsorship. No Corp to Corp.