Job Description
We are looking for an Analytics Engineer who lives at the intersection of data engineering and business intelligence: someone who can own the full data lifecycle, from raw ingestion to trusted, decision-ready outputs. You are as comfortable writing production-grade SQL and Python as you are talking to a stakeholder about what a metric actually means, and you care deeply about data quality, pipeline reliability, and the downstream impact of every model you build.
Core Competencies:
- Advanced SQL: CTEs, window functions, query optimization, complex joins. You write SQL that others can read and maintain.
- Python for Data: pandas, PySpark, or similar. Scripting for transformation, automation, and glue logic across the stack.
- Data Transformation: dbt or equivalent. Modular, tested, documented data models that are production-ready and reusable.
- Pipeline Architecture: End-to-end understanding, from source ingestion through transformation to serving. You can design and debug at every stage.
- Data Extraction & Ingestion: API connectors, ELT tools (Airbyte, Fivetran, custom), handling messy sources with confidence.
- Cloud Data Warehouses: Snowflake, BigQuery, Redshift, or Databricks. Comfortable with warehouse-specific performance tuning.
What You’ll Do:
- Design and build robust ELT pipelines from diverse data sources: APIs, databases, event streams, flat files — ensuring reliability, observability, and scalability.
- Develop and maintain dbt (or equivalent) data models with a strong focus on modularity, documentation, and testing (schema tests, data assertions, freshness checks).
- Collaborate closely with data analysts, product teams, and business stakeholders to translate requirements into clean, trusted data layers.
- Own data quality end-to-end — from source anomalies to downstream metric consistency — and implement monitoring and alerting for pipeline health.
- Build and maintain semantic/metrics layers to ensure a single source of truth across reporting and dashboards.
- Work across client engagements to assess data infrastructure maturity, identify gaps, and deliver scalable solutions tailored to business goals.
- Write Python scripts and utilities for data validation, orchestration support, and automation of repetitive data workflows.
- Participate in code reviews, uphold engineering best practices, and contribute to internal tooling and documentation.
What We’re Looking For:
- 3+ years of hands-on experience in analytics engineering, data engineering, or a closely related role.
- Expert-level SQL — you can optimize a slow query, explain an execution plan, and model complex business logic cleanly.
- Strong Python proficiency for data tasks: transformation scripts, API data extraction, lightweight automation.
- Hands-on experience with at least one orchestration tool: Airflow, Prefect, Dagster, or similar.
- Solid understanding of data modeling concepts — star/snowflake schemas, slowly changing dimensions, normalization trade-offs.
- Experience working in version-controlled environments (Git) with CI/CD for data pipelines.
- Enough analytics sensibility to evaluate whether a model or metric makes business sense — not just whether it runs.
- Strong communicator — able to translate technical decisions into plain language for non-technical stakeholders.
Bonus: What Sets You Apart?
- You have adopted LLMs or AI tools (e.g. Claude, Copilot) to automate repetitive data tasks — writing boilerplate SQL, generating dbt model stubs, auto-documenting schemas, or speeding up code review.
- Experience building or integrating AI-assisted data quality checks or anomaly detection into pipelines.
- Familiarity with reverse ETL tools (Census, Hightouch) — you understand both directions of data flow.
What We Offer:
- Competitive salary
- Food, travel, and gym allowances
- Paid time off and holidays
- Bi-annual increments and bonuses
- Opportunity for certifications
- Working with modern technologies
- Flexible working hours
How to Apply:
Interested candidates are invited to submit their resume and a cover letter detailing their relevant experience and why they are a good fit for this position. Fill out the form here!
Datum Labs is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
Apply here
Take the first step toward giving your career a massive push forward.
