Senior Backend Developer – Web Crawling (5–9 yrs)

Filled
February 27, 2026

Job Description

Location: Remote / Bengaluru
Experience: 5+ Years
Mode: Full-time
Positions: 2–5
Industry: IT Services / Data Engineering / AI & Automation
Education: Bachelor’s in Computer Science, IT, or related field
Notice Period: Immediate to 30 Days Preferred

🔹 Role Overview

We are looking for a Senior Backend Developer with deep expertise in Python and large-scale web crawling. The ideal candidate will design and maintain high-performance, distributed crawling systems, handle dynamic and JavaScript-heavy websites, and implement robust mechanisms for working around anti-bot defenses. The role involves close collaboration with AI and data engineering teams to build reliable, structured data pipelines for enterprise-scale projects.

🔹 Key Responsibilities

  • Design, develop, and maintain large-scale web crawling and scraping systems.
  • Build scalable backend services and crawling architectures using Python.
  • Implement and optimize crawlers built on Scrapy and Requests (a minimal spider sketch follows this list).
  • Handle dynamic and JavaScript-rendered websites using Playwright and Selenium (see the Playwright sketch below).
  • Develop and manage proxy rotation, IP management, and anti-bot bypass mechanisms (a rotation-with-retries sketch follows the list).
  • Implement authentication flows, cookie persistence, and session handling strategies (see the session sketch below).
  • Build distributed crawling systems for enterprise-scale data extraction.
  • Ensure data validation, normalization, and structured storage pipelines.
  • Monitor crawler performance, logging, retries, error handling, and system stability.
  • Collaborate with data engineering and AI teams for downstream processing and automation.
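
To make the crawling-framework expectation concrete, here is a minimal sketch of the kind of Scrapy spider the role involves. The target URL and CSS selectors are placeholders, not a real site.

```python
import scrapy


class ArticleSpider(scrapy.Spider):
    """Minimal spider: crawl a listing page, emit items, follow pagination."""

    name = "articles"
    start_urls = ["https://example.com/articles"]  # hypothetical entry point

    # Throttle politely and honour robots.txt by default.
    custom_settings = {
        "DOWNLOAD_DELAY": 1.0,
        "ROBOTSTXT_OBEY": True,
    }

    def parse(self, response):
        # One structured item per article card (selectors are assumptions).
        for card in response.css("article"):
            yield {
                "title": card.css("h2::text").get(),
                "url": response.urljoin(card.css("a::attr(href)").get(default="")),
            }
        # Follow the next page, if the site exposes one.
        next_page = response.css("a.next::attr(href)").get()
        if next_page:
            yield response.follow(next_page, callback=self.parse)
```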
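
For JavaScript-rendered pages, a headless browser is typically driven through Playwright. A minimal sketch using the sync API, with a placeholder URL:

```python
from playwright.sync_api import sync_playwright


def fetch_rendered_html(url: str) -> str:
    """Return the fully rendered DOM of a JavaScript-heavy page."""
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        # Wait until network activity settles so XHR-driven content is present.
        page.goto(url, wait_until="networkidle")
        html = page.content()
        browser.close()
    return html


if __name__ == "__main__":
    print(fetch_rendered_html("https://example.com/js-heavy-page")[:500])
```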
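
Proxy rotation, rate-limit handling, and the retry behaviour mentioned under monitoring often live in one fetch path. A sketch using plain Requests; the proxy endpoints are hypothetical and would normally come from a managed pool:

```python
import itertools
import time

import requests

# Hypothetical proxy endpoints; production pools are usually much larger.
PROXIES = itertools.cycle([
    "http://proxy-1.internal:8080",
    "http://proxy-2.internal:8080",
])


def fetch_with_rotation(url: str, max_attempts: int = 4) -> requests.Response:
    """GET a URL, rotating proxies and backing off exponentially on failure."""
    for attempt in range(max_attempts):
        proxy = next(PROXIES)
        try:
            resp = requests.get(
                url, proxies={"http": proxy, "https": proxy}, timeout=10
            )
            if resp.status_code == 429:
                # Rate limited: treat as a failure so we rotate to a new exit IP.
                raise requests.HTTPError("rate limited", response=resp)
            resp.raise_for_status()
            return resp
        except requests.RequestException:
            if attempt == max_attempts - 1:
                raise
            time.sleep(2 ** attempt)  # back off: 1s, 2s, 4s, ...
```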
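
Session handling and cookie persistence can be sketched with requests.Session. The login endpoint, form field names, and cookie file below are assumptions; many real sites also require fetching a CSRF token before the POST:

```python
import pickle
from pathlib import Path

import requests

COOKIE_FILE = Path("session_cookies.pkl")  # hypothetical location


def get_session(username: str, password: str) -> requests.Session:
    """Return an authenticated session, reusing saved cookies when possible."""
    session = requests.Session()
    if COOKIE_FILE.exists():
        # Reuse the previous session rather than re-authenticating every run.
        session.cookies.update(pickle.loads(COOKIE_FILE.read_bytes()))
        return session
    # Typical form-based login (endpoint and field names are placeholders).
    resp = session.post(
        "https://example.com/login",
        data={"username": username, "password": password},
        timeout=10,
    )
    resp.raise_for_status()
    COOKIE_FILE.write_bytes(pickle.dumps(session.cookies))
    return session
```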

🔹 Required Qualifications

  • 5+ years of backend development experience with strong Python expertise.
  • Proven experience with Scrapy, Requests, Playwright, and Selenium.
  • Strong understanding of HTTP protocols, headers, sessions, cookies, and browser behavior.
  • Experience implementing proxy rotation, IP management, and rate-limit handling.
  • Familiarity with CAPTCHA handling and anti-detection strategies.
  • Experience building large-scale or distributed crawling systems.
  • Strong knowledge of databases such as PostgreSQL or MongoDB.
  • Experience deploying and managing applications on AWS or similar cloud platforms.
  • Strong analytical, debugging, and performance optimization skills.
  • Excellent logical thinking and problem-solving ability.