Senior Data Engineer / Analytics Engineer (India-Based)

Filled
February 9, 2026

Job Description

Company: Mercor (Partnering with AI Research Lab)
Location: Remote / India-Based
Employment Type: Full-Time

About the Role

Mercor is collaborating with a cutting-edge AI research lab to hire a Senior Data/Analytics Engineer with expertise in DBT and Snowflake Cortex CLI. You will build and scale Snowflake-native data and ML pipelines, leveraging emerging AI/ML capabilities while maintaining production-grade DBT transformations.

In this high-impact role, you will collaborate with data engineering, analytics, and ML teams to prototype, operationalise, and optimise AI-driven workflows, defining best practices for Snowflake-native feature engineering and model lifecycle management.

This position offers an exciting opportunity to work within a modern, fully cloud-native data stack and influence how AI/ML workloads are built and deployed at scale.

Key Responsibilities

  • Design, build, and maintain DBT models, macros, and tests following modular data modelling and semantic-layer best practices.
  • Integrate DBT workflows with Snowflake Cortex CLI (a brief sketch follows this list) to support:
    • Feature engineering pipelines
    • Model training and inference tasks
    • Automated pipeline orchestration
    • Monitoring and evaluation of Cortex-driven ML models
  • Establish best practices for DBT–Cortex architecture and usage patterns.
  • Collaborate with data scientists and ML engineers to productionise Cortex workloads in Snowflake.
  • Build and optimise CI/CD pipelines for DBT (GitHub Actions, GitLab CI, Azure DevOps).
  • Tune Snowflake compute and queries for performance and cost efficiency.
  • Troubleshoot issues across DBT artifacts, Snowflake objects, lineage, and data quality.
  • Provide guidance on DBT project governance, structure, documentation, and testing frameworks.
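
As a rough illustration of the DBT–Cortex integration described above, a model might call a Cortex SQL function directly from version-controlled SQL. A minimal sketch, assuming a hypothetical stg_reviews staging model with review_id and review_text columns:

    -- models/marts/review_sentiment.sql (illustrative only; model and column names are assumptions)
    {{ config(materialized='table') }}

    select
        review_id,
        review_text,
        -- Cortex LLM function scores free-text sentiment inside the warehouse
        snowflake.cortex.sentiment(review_text) as sentiment_score
    from {{ ref('stg_reviews') }}

A pattern like this keeps feature engineering inside DBT's lineage, documentation, and testing framework while Cortex handles inference in-warehouse.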

Required Qualifications

  • 3+ years of experience with DBT Core or DBT Cloud, including macros, packages, testing, and deployments.
  • Strong expertise with Snowflake: warehouses, tasks, streams, materialized views, performance tuning.
  • Hands-on experience with the Snowflake Cortex CLI, or the ability to learn it quickly.
  • Advanced SQL skills and working familiarity with Python for scripting and DBT automation.
  • Experience integrating DBT with orchestration tools (Airflow, Dagster, Prefect, etc.).
  • Solid understanding of modern data engineering, ELT patterns, and version-controlled analytics development.

Nice-to-Have Skills

  • Experience operationalising ML workflows inside Snowflake.
  • Familiarity with Snowpark and Python UDFs/UDTFs.
  • Experience building semantic layers using DBT metrics.
  • Knowledge of MLOps / DataOps best practices.
  • Exposure to LLM workflows, vector search, and unstructured data pipelines.

Why Join

  • Build next-generation Snowflake AI/ML systems leveraging Cortex.
  • High-impact ownership of DBT and Snowflake architecture across production pipelines.
  • Collaborate with top-tier ML engineers, data scientists, and research teams.
  • Gain exposure to cutting-edge AI/ML integration and operationalisation within cloud-native systems.