Data Warehouse Architect 5
Atlanta, San Francisco, Phoenix Engineering
Job Type
Full-time
Description

Who We Are


Bakkt is a trusted digital asset platform that enables consumers to buy, sell, store, send and spend digital assets. Bakkt's platform is available through the Bakkt App and to partners through the Bakkt platform.


Bakkt was founded in 2018 with a unifying vision: to connect the digital economy. We power commerce by enabling consumers, businesses and institutions to unlock value from digital assets.


Digital assets – which include cryptocurrency, loyalty and rewards points, gift cards, in-game assets, and non-fungible tokens, or NFTs, which are unique digital assets that are not interchangeable – comprise a growing $1.6 trillion marketplace. We are unlocking new ways to participate in the digital economy for consumers, businesses, and financial institutions. We accomplish this by expanding access to and improving liquidity for digital assets.


Since our inception, we have hit a number of milestones, working behind the scenes to build a platform worthy of our vision – lowering barriers of entry into cryptocurrency, launching partnerships with some of the world’s premier brands and adding value to consumers’ digital assets by making their rewards and loyalty programs more compelling.


Bakkt acts as a bridge connecting the digital ecosystem, sitting at the intersection of crypto, loyalty and payments. Our platform enables consumers to leverage their digital assets – from cryptocurrency to select loyalty points, gift cards and merchant offers – in new and exciting ways. It also enables companies and merchants to access all of these capabilities, extending engagement with their customers and creating moments of delight while broadening the appeal and daily use of their loyalty and rewards programs. Now, let's get to the real reason why you're here – how we can work together.


The Data Warehouse Architect 5 will drive the adoption of cloud data services across the enterprise, ensuring emerging business goals are met while maintaining a consistent and stable DW landscape. The Cloud DW Architect will be responsible for delivering high-quality solutions for applications using heterogeneous data sources. With hands-on experience leveraging cloud services focused on RDBMS technologies, you will be responsible for designing, building, migrating, and maintaining robust Data Lake and Data Analytics solutions.


Responsibilities

  • Act as an SME for migrating on-prem applications to cloud-based PaaS services, primarily focusing on DW and Big Data solutions and using a modern technology stack. Provide guidance in building a multi-terabyte DW strategy.
  • Architect scalable Data Analytics solutions, including technical feasibility for Big Data storage, processing, and consumption. Implement enterprise Data Lake and heterogeneous data management methods. Must be able to research, present, and articulate the pros and cons of modern data tier technologies.
  • Develop data-gathering methods for analyzing and synthesizing data. Work with Data Analysts, DBAs, and business stakeholders to define secure, performance-optimized data rendering methods and self-service BI.
  • Assist application teams in designing cloud data solutions, incorporating security principles and best practices to ensure designs are in accordance with data security, governance, and cost requirements.
  • Design data pipelines to support machine-learning solutions. Data science experience using Python, R statistical modeling, data mining, and operationalizing end-to-end cloud data analytics solutions is a plus.
  • Demonstrate a thorough understanding of one or more integration/analysis/reporting tools such as Dataprep, SSIS, SSAS, Azure Data Factory, Pentaho, and Segment, along with proficiency in migrating legacy ETL processes to cloud-based solutions.
  • Design, implement, and/or support complex application architectures (i.e., having an architectural sense for connecting data sources, data visualization, and structured and unstructured data).
  • Build a robust web analytics platform to track and analyze site visits, conversions, promo tracking, heat maps, etc. Experience with tools like Google Analytics, Heap, and Matomo is a plus.
Requirements
  • 10+ years of hands-on experience in database engineering/Business Intelligence technologies.
  • Proficient with the Google Cloud Platform analytics tool set, including BigQuery, Looker, Dataproc, Dataflow, Cloud Data Fusion, and Dataprep, as well as Azure Data Factory, Databricks, Synapse Analytics, etc.
  • Hands-on experience working with analytics tools like Looker, Google Data Studio, Power BI, Tableau, BigQuery, Sisense, Alteryx, etc.
  • Knowledge of Master Data Management (MDM) and Data Quality tools. Experience with Segment.
  • Experience with DevOps and cloud solutions for implementation (Git/VSTS/automated deployment).
  • Deep understanding of both traditional and modern data architecture and processing concepts, including various RDBMS systems (MS SQL, MySQL, Oracle, MariaDB), Big Data (Hadoop, Spark), and Business Analytics.
  • Knowledge of handling APIs as data sources using any language (Python, .NET, Azure Functions, Java).

Certifications a Plus: Azure and/or GCP Data/DW Engineer