Backend Engineer, MLOps

GitLab

Remote, Chile

Please mention No Whiteboard if you apply!
I'm a one-man team looking to improve tech interviews, and could use any support! 😄


Interview Process

1. A series of video calls
2. A coding exercise involving work on a Merge Request that resembles a real work task

Programming Languages Mentioned

Python, Ruby, ETL, JavaScript, Golang


The GitLab DevSecOps platform empowers 100,000+ organizations to deliver software faster and more efficiently. We are one of the world’s largest all-remote companies with 2,000+ team members and values that foster a culture where people embrace the belief that everyone can contribute. Learn more about Life at GitLab.

About the work 

This group is focused on enabling GitLab users to build and run data science workloads across GitLab’s AI-powered DevSecOps platform. You will play a key role in building features that expand GitLab’s core personas to address the needs of data science teams, and will work cross-functionally with other GitLab stages to ensure our platform supports ML/AI workloads aligned with our ModelOps product direction. You will also be responsible for enhancing existing product features used by millions of users around the world.

Your role will be to build, scale, and iterate on ML features that help operationalize customers’ ML solutions (both AI and GenAI) and reduce the cost of machine learning. You will balance short- and long-term feature efforts and tackle backend machine learning challenges for enterprise features. With the team, you will build the backbone of our ML efforts, solving today’s and tomorrow’s challenges for customers on their ML journey.


What you will do

  • Play a key role in the design, implementation, and integration of product features
  • Solve backend machine learning problems of high scope and complexity
  • Test, deploy, maintain, and improve ML system features that support model registry, deployment, and inference
  • Help define and improve our internal standards for style, maintainability, and best practices in a high-scale web environment
  • Collaborate with other ML, Python, Ruby, and Golang engineers to build GenAI solutions

What you will bring 

  • Significant experience in Golang
  • Some level of Python is preferred
  • Some level of Vue.js is preferred
  • Significant experience operationalizing the lifecycle of data science workloads, including data ingestion and ETL, feature engineering, development environment setup, CI/CD integration, model versioning, testing, and production deployment
  • Deep technical understanding of, and hands-on experience with, ML/AI technologies (e.g., GPUs, model inference, training, natural language processing, large language models, embeddings, vector databases)
  • Production experience with Terraform, Kubernetes, Docker, and preferably GCP or equivalent technologies
  • Experience with inference backends and post-production monitoring of large language models (>10 GB)
  • High interest in defining experimentation and agent architectures for large-scale ML recommendation engines
  • A genuine passion for learning, as you will be solving the challenges of today, tomorrow, and many years to come

About the team

Today, data scientists piece together data, tooling, and frameworks to get bespoke data science workloads running and producing business value. Data scientists then hand off models to engineering teams to attempt to deploy them to production. These teams speak different languages, use different tools, and have completely different workflows. This makes it very difficult to deploy data science workloads to production, increasing time to value and costs. One of our primary goals for our ModelOps stage is to reduce the complexities of data science workloads and integrate these workloads to easily be managed and developed within GitLab.

How GitLab will support you


Please note that we welcome interest from candidates with varying levels of experience; many successful candidates do not meet every single requirement. Additionally, studies have shown that people from underrepresented groups are less likely to apply to a job unless they meet every single qualification. If you're excited about this role, please apply and allow our recruiters to assess your application.


Country Hiring Guidelines: GitLab hires new team members in countries around the world. All of our roles are remote; however, some roles may carry specific location-based eligibility requirements. Our Talent Acquisition team can help answer any questions about location after you start the recruiting process.

Privacy Policy: Please review our Recruitment Privacy Policy. Your privacy is important to us.

GitLab is proud to be an equal opportunity workplace and is an affirmative action employer. GitLab’s policies and practices relating to recruitment, employment, career development and advancement, promotion, and retirement are based solely on merit, regardless of race, color, religion, ancestry, sex (including pregnancy, lactation, sexual orientation, gender identity, or gender expression), national origin, age, citizenship, marital status, mental or physical disability, genetic information (including family medical history), discharge status from the military, protected veteran status (which includes disabled veterans, recently separated veterans, active duty wartime or campaign badge veterans, and Armed Forces service medal veterans), or any other basis protected by law. GitLab will not tolerate discrimination or harassment based on any of these characteristics. See also GitLab’s EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know during the recruiting process.


