Job Description
Job Details
- Company Name: Nestor Technologies
- Employment type: Full-time contract
- Salary: $40 to $65 per hour
- Location: Remote option available
- Work schedule: 5 days a week
Job Overview
We are seeking a skilled Data Developer with strong experience in ETL/ELT pipeline design and implementation to support data integration, migration, and transformation initiatives. This role requires hands-on expertise with cloud-based and on-premises data solutions, strong SQL and Python skills, and experience working with geospatial data sources.
The ideal candidate is analytical, detail-oriented, and capable of building reliable, scalable data pipelines while collaborating closely with business and technical stakeholders.
Key Responsibilities
- Design, develop, and maintain ETL/ELT pipelines that extract, transform, and load data into databases and data warehouses (a minimal Python sketch follows this list).
- Collaborate with stakeholders to understand data requirements and translate them into technical ETL solutions.
- Implement data integration solutions using cloud platforms (AWS, Azure, GCP) and on-premises tools (Apache NiFi, Talend, Informatica, or custom scripts).
- Optimize workflows for performance, scalability, and reliability based on data volume and complexity.
- Ensure data quality and integrity through validation, error handling, and monitoring mechanisms.
- Monitor ETL jobs, troubleshoot failures, and perform debugging and performance tuning.
- Maintain clear documentation for data mappings, processes, and system configurations.
- Stay current with industry trends, tools, and best practices in data engineering.
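As a concrete illustration of the pipeline work described above, here is a minimal ETL sketch in Python using pandas and SQLAlchemy: extract from a CSV source, apply basic validation during transform, and load into PostgreSQL. The file name, column names, and connection string (orders.csv, order_id, order_date, amount) are hypothetical placeholders, not details of any actual Nestor Technologies system.

```python
import pandas as pd
from sqlalchemy import create_engine


def extract(csv_path: str) -> pd.DataFrame:
    """Extract: read raw records from a CSV source (hypothetical file)."""
    return pd.read_csv(csv_path)


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Transform: clean the data and enforce simple integrity rules."""
    df = df.dropna(subset=["order_id"])          # reject rows missing the key
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    return df[df["amount"] >= 0]                 # basic validation rule


def load(df: pd.DataFrame, table: str, conn_url: str) -> None:
    """Load: append the cleaned frame to a warehouse table."""
    engine = create_engine(conn_url)
    df.to_sql(table, engine, if_exists="append", index=False)


if __name__ == "__main__":
    raw = extract("orders.csv")                  # placeholder source file
    load(transform(raw), "orders", "postgresql://user:pass@localhost:5432/warehouse")
```

Production pipelines would add logging, retries, and monitoring around each stage, but the extract/transform/load separation shown here is the core pattern.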
Skills and Qualifications
Candidates must meet all of the following requirements:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Proven experience as an ETL Developer or Data Engineer, or in a similar role.
- Strong proficiency in SQL and relational databases such as PostgreSQL, MySQL, SQL Server, or Oracle.
- Hands-on experience with ETL/ELT tools and frameworks, including:
  - Cloud-based: AWS Glue, Azure Data Factory, Google Cloud Dataflow
  - On-premises: Apache NiFi, Talend, Informatica
- Strong Python programming skills for scripting, automation, and data manipulation.
- Experience with geospatial data using tools such as PostGIS, GeoPandas, or ArcGIS (illustrated in the sketch after this list).
- Solid understanding of GIS concepts and spatial data formats (Shapefiles, GeoJSON, etc.).
- Knowledge of data warehousing concepts, dimensional modeling, and integration patterns.
- Understanding of cloud computing fundamentals (storage, compute, networking).
- Strong analytical and troubleshooting skills.
- Excellent communication and collaboration abilities.
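To illustrate the geospatial requirements above, the following is a minimal GeoPandas sketch: read GeoJSON and Shapefile sources, normalize their projections, spatially join them, and load the result into a PostGIS-enabled database. The file names, table name, and connection string are hypothetical placeholders.

```python
import geopandas as gpd
from sqlalchemy import create_engine

# Read sources (read_file handles GeoJSON, Shapefiles, and more) and
# normalize everything to a single coordinate reference system.
parcels = gpd.read_file("parcels.geojson").to_crs(epsg=4326)  # placeholder file
zones = gpd.read_file("zones.shp").to_crs(epsg=4326)          # placeholder file

# Spatial join: tag each parcel with the zone polygon it falls within.
tagged = gpd.sjoin(parcels, zones, how="left", predicate="within")

# Load the result into a PostGIS-enabled PostgreSQL database.
engine = create_engine("postgresql://user:pass@localhost:5432/gis")
tagged.to_postgis("parcels_zoned", engine, if_exists="replace")
```

Reprojecting to a common CRS before the join is the step most often missed; mismatched projections silently produce empty or incorrect join results.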