This position is based in our Tampa office located on beautiful Rocky Point. Only local candidates and those willing to relocate will be considered.
Candidates authorized to work in the US without sponsorship are encouraged to apply. No sponsorship is provided.
The Azure Architect will modernize, automate, and consolidate workloads and pursue next-generation innovation, bringing expertise in the design, development, and implementation of cloud projects. This role will model and ELT (Extract/Load/Transform) data into our cloud repository (Snowflake) for report consumption.
This position works with domain experts to curate, design, and catalogue high-quality data models that keep data accessible and reliable; builds highly scalable data processing frameworks for use across a wide range of datasets and applications; and provides data-driven insight critical to business processes, exposing data in a scalable and effective manner while understanding existing and potential datasets in both engineering and business contexts.
- Migrate data from on-premises systems to cloud technologies, primarily Snowflake and the Azure ecosystem.
- Provide expertise to discover, transform, manage, aggregate, catalog, and deliver data from disparate sources for use in reports and analytics repositories.
- Serve as a domain expert to architect, design, and build data models, semantic layers, and Data Vault and/or star schema models based on business rules.
- Apply a deep understanding of data multidimensionality, data curation, and data quality, and manage access to sensitive datasets in collaboration with the governance team.
- Provide data in a ready-to-use form so that queries and algorithms can be run against it for predictive analytics, machine learning, and data mining.
- Evaluate and acquire new internal and external data sets that contribute to business decision-making.
- Design, develop, automate, monitor, maintain, and performance-tune ELT/ETL processes that manage high-volume data transfer to and from internal and external systems, using ETL/ELT tools and techniques such as StreamSets, ADF, Talend, and SSIS.
- Recommend solutions, promote design thinking, and advocate design principles and best practices.
- Deploy CI/CD-enabled pipelines and adopt best practices using tools such as Bitbucket and GitHub.
- Demonstrate a good understanding of relational database technologies such as Oracle, SQL Server, DB2, and MySQL.
- Interact with cross-functional teams, project managers, and agile teams to estimate development efforts and ensure complete delivery of solutions that fulfill requirements.
- Other duties as assigned.
Bachelor’s degree (BA or BS) from an accredited college or university plus a minimum of four (4) years of experience in the specified field or a related field.
Company / Industry Knowledge:
Knowledge of financial institutions and financial data, preferably credit union or retail banking related, is a plus.
- Any cloud-based certification, such as Snowflake SnowPro or Data Engineering on Azure, is a plus.
- Experience with data curation tasks using any data management tools is good to have.
- 5+ years of hands-on experience building and maintaining secure cloud solutions.
- Experience with data warehousing solutions and BI reporting tools such as Power BI, Tableau, and SSRS is a plus.
- Proven experience with Snowflake or Azure technologies.
- Hands-on experience with Snowflake architecture: warehouse sizing, data structures, and RBAC.
- Hands-on experience with Snowflake SnowSQL and Snowflake performance optimization.
- Understanding of agile and DevOps concepts.
- Work experience as a data engineer or in a related field; 5+ years preferred.
- Experience designing highly scalable ELT/ETL processes with complex data transformations and data formats, including error handling and monitoring.
- Working knowledge of more than one programming language (Python, Java, C++, C#, etc.).