Senior Data Engineer
Articulate (View all Jobs)
United States (Remote)
Interview process: 1. Take-home project; 2. Pair programming on a problem similar to daily work
As Articulate and its customer base have rapidly grown, the volume and number of sources of operational data have accelerated. Finding new and optimal ways of leveraging these data sets empowers us to make better decisions and gain insights into key performance indicators.
We are looking for a full-time Sr. Data Engineer with experience building data integrations using the AWS technology stack, as well as building analytics layers. This role works closely and cross-functionally with internal stakeholders to develop data models and pipelines for analysis and reporting. The ideal candidate has a passion for delighting internal customers and brings significant experience and skill in designing, building, and operating data warehousing solutions.
This position will report to our Sr. Director of Growth Insights.
What you'll do:
Create and maintain the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources, leveraging solutions like Terraform, Stitch, SnapLogic, and dbt.
Design and implement high-performance, reusable, and scalable data models for our data warehouse to ensure end-users get consistent and reliable answers to their questions.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc., leveraging internal tools such as GitHub and dbt.
Oversee Data Engineering and Analytics Engineering work, and create a roadmap for data development at Articulate.
Optimize data access and knowledge sharing with the entire org.
Actively contribute to adopting solid data engineering architecture, development practices, and new technologies.
Lead and support Analytics Engineers in building and maintaining the analytics layer of our team’s data environment to make data standardized and easily accessible.
Work closely with Product Engineering and Platform Engineering to ensure product changes integrate well with the analytics data model.
Maintain documentation around datasets, including Data Dictionary.
Data Engineering Roadmap
Communicate across the org with the Platform, Product Engineering, Sales Ops, and Marketing teams to capture customer data integration requirements, conceptualize solutions, build the required technology stack, and centralize data operations.
Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility.
Be a resource for data users in the organization. Create awareness of available data. Assist our users with technical tasks including data extracts, report creation, SQL, and general data system troubleshooting.
Oversee the GitHub repository, enforcing industry best practices for code version control.
Review and approve dbt and LookML code submitted by others prior to merging changes into production.
Be an advocate for data governance, security, privacy, quality, and retention.
What you should have:
7+ years of proven experience building, operating, and maintaining fault-tolerant and scalable data processing integrations using AWS.
Proven experience in designing and building a data warehouse from a disparate set of sources.
Excellent knowledge of SQL, data modeling, and patterns.
Experience managing a process of reporting and analytics development on a data warehouse.
Proficient in developing LookML.
Rigorous attention to detail and accuracy.
Strong ability to manage multiple projects simultaneously is a must.
BS/MS degree in Computer Science or equivalent industry experience.
Excellent communication skills (we’re a geographically distributed team, 100% virtual).
Passion for creating intelligent data pipelines that internal customers love to use.
Curiosity and a drive to understand and clarify complex topics.
Python coding skills, particularly in the areas of automation and integrations.
Past experience with data governance principles, KPI management, and supporting data operations for business intelligence.
Please mention No Whiteboard if you apply!
I'm a one-man team looking to improve tech interviews, and could use any support! 😄