
Senior Data Engineer, Genius
Job Description
Posted on: February 28, 2026
About Genius @ MediaLab
Genius is the leading destination for music, lyrics, and the stories behind the songs, and part of the MediaLab portfolio of digital brands. MediaLab is a media and technology company that acquires and grows category-defining properties, providing the scale, resources, and expertise to help them thrive. As one of MediaLab's flagship brands, Genius benefits from this shared foundation while maintaining its unique identity at the intersection of music and culture. Join an exceptionally talented team of engineers, designers, product leaders, and business builders who are shaping the future of music and media. MediaLab is headquartered in sunny Santa Monica, California, with growing Genius teams in New York and across the U.S. and Latin America.
Your Role at Genius
Genius is looking for a remote Senior Data Engineer to help build the ultimate music companion. You'll help oversee millions of pages of lyrics, annotations, and music metadata, and your work will power Genius.com, our public API, and our partner integrations, with no shortage of interesting data challenges to solve. We're looking for engineers who thrive in complex backend systems: folks who can build robust data pipelines, optimize query performance at scale, and keep critical services running smoothly. As a Senior Data Engineer for Genius, you will play a key role in strengthening our data infrastructure, helping us process, transform, and serve the music knowledge that powers everything we do. You will work across our Rails backend, PostgreSQL databases, and Python-based data pipelines to improve data reliability, throughput, and accuracy. You'll also work to improve the stability, performance, and scalability of Genius' backend services, which is no small feat for a platform serving millions of users daily!
Location Requirement: Candidates must be based in Los Angeles, California, or Seattle, Washington to be considered for this role.
What You'll Do
- Build and maintain data pipelines using Python and Airflow to ingest, transform, and enrich music metadata from internal and external sources
- Proactively identify and fix infrastructure bottlenecks to scale backend services to tens of thousands of requests per minute
- Architect database query patterns and migrations in PostgreSQL, ClickHouse, and BigQuery that scale to large tables with 1B+ rows
- Design and implement backend APIs in Ruby on Rails that serve data reliably and performantly to our frontend and partner integrations
- Take ownership over the systems you build, proactively identifying and surfacing performance, reliability, and maintainability improvements
- During your on-call rotation, be the backstop for backend quality, stability, and performance. Triage incoming issues to identify the most urgent problems, and respond to incidents as they arise to keep services online
- Work directly with stakeholders, including product owners, data analysts, and other engineers across the company, to uncover and address business needs
What We’re Searching For
- Data Pipeline Experience: Hands-on experience building and maintaining ETL/ELT pipelines using Python and workflow orchestration tools like Apache Airflow. Candidates should be comfortable with data modeling, scheduling, monitoring, and debugging complex pipeline DAGs.
- Database Expertise: Deep proficiency with PostgreSQL and relational databases. Candidates should be able to design database schemas, write complex queries, optimize performance for scale, and manage migrations on large, high-traffic tables.
- Backend Framework Proficiency: Strong experience with Ruby on Rails (or a similar "batteries-included" framework). Candidates should be comfortable working with large, established codebases, and have an understanding of architectural patterns necessary to build APIs at scale.
- Product Ownership: Brings a product-first mindset and can drive projects from ideation to launch while focusing on data quality, system reliability, and cross-functional alignment.
- Collaboration & Communication: Comfortable working cross-functionally with frontend engineers, data analysts, product, and other teams to ensure clarity, alignment, and smooth execution of projects.
- Continuous Learning: Demonstrates a passion for staying current with new technologies, frameworks, and industry best practices—proactively applying improvements to code quality and team workflows.
- Experience: At least 4 years of hands-on experience in a backend or data engineering capacity, preferably in a product-driven environment.
- Education (Preferred): Bachelor's degree in Computer Science, Engineering, or a related technical field.
Apply now
