About Us
eSimplicity is a modern digital services company that works across government, partnering with our clients to improve the health and lives of millions of Americans, ensure the security of all Americans, from soldiers and veterans to kids and the elderly, and defend national interests on the battlefield. Our engineers, designers, and strategists cut through complexity to create intuitive products and services that equip Federal agencies with solutions to transform today for a better tomorrow for all Americans.
Purpose and Scope:
We are seeking smart, experienced engineers to help deploy groundbreaking technical solutions to solve our customers’ hardest problems. We help our customers detect insider trading, improve disaster relief, fight healthcare fraud, and more. Each mission presents different challenges, from the regulatory environment to the nature of the data to the user population. Our Engineers work to accommodate all aspects of an environment to drive real technical outcomes for our customers.
You will work with Machine Learning Engineers and Data Scientists to build, scale, and maintain AI/ML training and scoring pipelines. You will ensure the pipelines track data lineage and promote explainability.
You will support assessment and research of incorporating AI-DevOps and AI-ops into our infrastructure.
You will work with architects and the CTO team to assess and research technologies, AWS services, and frameworks for our Cloud & DevSecOps pipelines.
Responsibilities:
- Implementing large-scale data ecosystems including data management, governance, and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms
- Developing end-to-end (Data/Dev/ML) Ops pipelines based on in-depth understanding of cloud platforms, AI lifecycle, and business problems to ensure analytics solutions are delivered efficiently, predictably, and sustainably
- Operationalizing and monitoring ML models using industry-standard tools and technologies
- Maintaining current data product APIs, releasing ML models on a regular cadence, and building the capability for continuous training and monitoring
- Explaining model behavior and results to both technical and non-technical audiences
- Understanding customer requirements and project KPIs
- Implementing development, testing, and automation tools, as well as IT infrastructure
- Defining and setting development, test, release, update, and support processes for DevOps operation
- Demonstrating the technical skill to review, verify, and validate the software code developed in the project
- Applying troubleshooting techniques and fixing code bugs
- Monitoring processes throughout the lifecycle for adherence, and updating or creating processes to drive improvement and minimize waste
- Encouraging and building automated processes wherever possible
- Participating in incident management and root cause analyses
- Coordinating and communicating within the team and with customers
- Selecting and deploying appropriate CI/CD tools
- Striving for continuous improvement and building continuous integration, continuous delivery, and continuous deployment pipelines (CI/CD pipelines)
Required Qualifications:
- Bachelor’s degree in Engineering, Computer Science, or a related field, or equivalent experience
- 7+ years working in software development
- Extensive experience in AWS cloud data architecture and big data technologies, including EMR, Databricks, Hive, Spark, AWS Glue, Athena, and Redshift
- Experience working in AWS
- Experience working on Linux based infrastructure
- Experience working with IaC tools such as Terraform, Ansible, AWS CloudFormation
- Experience setting up AuthN and AuthZ systems, including Active Directory, Okta, and AWS IAM Policies/Roles using attribute-based access controls
- Strong experience with Python, PySpark, R, RStudio, and SageMaker
- Experience configuring and managing databases such as Hive, MySQL, MongoDB
- Experience with Docker, ECS, and EKS
- Familiarity with data mining and with supervised and unsupervised learning methodologies, including dimensionality reduction, correlation analysis, linear regression, PCA, clustering, and random forests
- Familiarity with AI/ML libraries such as scikit-learn, SparkML, TensorFlow, PyTorch
- Excellent troubleshooting skills
- Working knowledge of various tools, open-source technologies, and cloud services
- Awareness of critical concepts in security, DevOps, and Agile principles
Desired Qualifications:
- Experience with configuring LLMs on Bedrock
- Experience with Retrieval-Augmented Generation (RAG) for LLMs in AWS Bedrock
- Experience with Amazon Aurora PostgreSQL
- Experience with Amazon Kendra
- Experience with agents and mixture-of-experts (MoE) in LLMs
- Experience with Amazon SageMaker ML Lineage Tracking
- AWS Machine Learning Specialization Certification
Working Environment:
This program supports a remote work environment operating within the Eastern time zone so we can work with and respond to our government clients. Expected hours are 9:00 AM to 5:00 PM Eastern unless otherwise directed by your manager.
Occasional travel for training and project meetings, estimated at less than 25% per year.
Benefits:
We offer a highly competitive salary, healthcare benefits, and a flexible leave policy.
Equal Employment Opportunity:
eSimplicity is an equal-opportunity employer. All qualified applicants will be considered for employment without regard to race, religion, color, national origin, gender, age, status as a protected veteran, sexual orientation, gender identity, or status as a qualified individual with a disability.