Analytics Engineer

Fetch Rewards

Madison, WI

Please mention No Whiteboard if you apply!
I'm a one-man team looking to improve tech interviews, and could use any support! 😄


Interview Process

1. Short take-home project
2. 50-minute screening interview that includes discussion of the project
3. 5-hour final interview (with breaks) that involves speaking with your future manager and a non-technical product manager, a real-world coding problem, and high-level and low-level system design problems

Programming Languages Mentioned

Python, SQL


What we’re building and why we’re building it. 

Fetch is a build-first technology company creating a rewards program to power the world. Over the last 5 years we’ve grown from 0 to 1M active users and taken over the rewards game in the US with our free app. The foundation has been laid. In the next 5 years we will become a global platform that completely transforms how people connect with brands. 

It all comes down to two core beliefs. First, that people deserve to be rewarded when they create value. If a third party directly benefits from an action you take or data you provide, you should be rewarded for it. And not just the “you get to use our product!” cop-out. We’re talkin’ real, explicit value. Fetch points, perhaps. 

Second, we also believe brands need a better and more direct connection with what matters most to them: their customers. Brands need to understand what people are doing, and they need a direct line to do something about it. Not just advertise, but ACT. Sounds nice, right?

That’s why we’re building the world’s rewards platform. A closed-loop, standardized rewards layer across all consumer behavior that will lead to happier shoppers and stronger brands.


Fetch Rewards is an equal employment opportunity employer.

In this role, you can expect to:

  • Model and analyze data using SQL best practices for OLAP/OLTP query and database performance
  • Leverage dbt (Data Build Tool), Snowflake, Airflow, AWS infrastructure, CI/CD, testing, and engineering best practices to accomplish your work
  • Generate innovative approaches to datasets spanning millions of daily active users and terabytes of data
  • Translate business requirements for near-real-time actionable insights into data models and artifacts
  • Communicate findings clearly both verbally and in writing to a broad range of stakeholders
  • Administer Snowflake, Tableau, and dbt/Airflow infrastructure
  • Test, monitor, and report on data health and data quality
  • Lead the charge on data documentation and data discovery initiatives

You are a good fit if you:

  • Are proficient in SQL and understand the difference between SQL that works and SQL that performs (see the sketch after this list)
  • Have worked with data modeling and orchestration tools
  • Have experience with relational (SQL), non-relational (NoSQL), and/or object data stores (e.g., Snowflake, MongoDB, S3, HDFS, Postgres, Redis, DynamoDB)
  • Have a solid understanding of ETL vs. ELT processes, data warehouses, and business intelligence tools
  • Have prior experience clearly communicating about data with internal and external customers
  • Are highly motivated to work autonomously, with the ability to manage multiple work streams
  • Are interested in building and experimenting with different tools and tech, and in sharing your learnings with the broader organization
  • Love dogs! ...or at least tolerate them. We're a very canine-friendly workplace!
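
A minimal sketch of the "works vs. performs" distinction called out above, assuming a hypothetical orders table with a created_at timestamp (not Fetch's actual schema):

    -- Both queries return the same rows; only one lets the warehouse help.

    -- Works: wrapping the column in a function defeats partition pruning
    -- and index use, so the engine scans the whole table.
    SELECT order_id, total
    FROM orders
    WHERE DATE(created_at) = '2024-01-01';

    -- Performs: a sargable range predicate on the bare column lets the
    -- engine prune partitions (or use an index) and scan far less data.
    SELECT order_id, total
    FROM orders
    WHERE created_at >= '2024-01-01'
      AND created_at <  '2024-01-02';

On a columnar warehouse like Snowflake, the second form is what allows micro-partition pruning to kick in.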

You have an edge if you:

  • Have developed and maintained dbt or Airflow in production environments (see the dbt sketch after this list)
  • Have experience programmatically deploying cloud resources on AWS, Azure, or GCP
  • Have successfully implemented data quality, data governance, or disaster recovery initiatives
  • Are proficient in at least one imperative programming language (e.g., Python)
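
For a sense of what "dbt in production" can look like, here is a minimal incremental model. This is a hedged illustration only: the model, source, and column names below are hypothetical, not Fetch's actual pipeline.

    -- models/marts/daily_active_users.sql (hypothetical model name)
    {{ config(materialized='incremental', unique_key='activity_date') }}

    select
        date_trunc('day', event_ts) as activity_date,
        count(distinct user_id)     as daily_active_users
    from {{ ref('stg_events') }}  -- assumed upstream staging model
    {% if is_incremental() %}
    -- on incremental runs, reprocess only from the latest loaded day onward
    where event_ts >= (select max(activity_date) from {{ this }})
    {% endif %}
    group by 1

In production, a model like this would typically be scheduled via Airflow (e.g., dbt run --select daily_active_users), with dbt tests guarding data quality.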


