Job Description
Job Role: Data Engineer
Experience: 5+ years
Location: Pune/Trivandrum/Bangalore
Introduction
We are looking for candidates with 5+ years of experience for this role.
Role Summary
- Create and maintain optimal data pipeline architecture
- Assemble large, complex data sets that meet business requirements
- Optimize data delivery and re-design infrastructure for greater scalability
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Kafka and Azure technologies (see the ingestion sketch after this list)
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Work with internal and external stakeholders to assist with data-related technical issues and support data infrastructure needs.
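To give a flavor of the pipeline work described above, here is a minimal, illustrative ingestion loop in Python using the kafka-python client. It is a sketch only: the topic name, broker address, consumer group, and the write_to_datalake helper are assumptions for illustration, not part of this role's actual stack.

```python
# Minimal illustrative sketch of a Kafka ingestion loop (kafka-python).
# Topic, brokers, group id, and the sink are hypothetical placeholders.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "customer-events",                      # hypothetical topic
    bootstrap_servers=["localhost:9092"],   # placeholder broker
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=True,
    group_id="ingestion-demo",
)

def write_to_datalake(record: dict) -> None:
    """Placeholder sink: in practice this would land the record in the data lake."""
    print(record)

for message in consumer:
    # Each message value is already deserialized JSON; hand it to the sink.
    write_to_datalake(message.value)
```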
Responsibilities Include
- The candidate needs to work closely with the Business Intelligence team, understand their requirements through APIs, check the backend Postgres and Snowflake data warehouses, and update the facts and dimensions accordingly (see the sketch below). A strong understanding of data warehouse concepts is required.
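As an illustration of the facts-and-dimensions maintenance mentioned above, the sketch below upserts a row into a customer dimension in Postgres with psycopg2. The connection string, table, and column names are hypothetical; a Snowflake equivalent would typically use a MERGE statement through the Snowflake connector.

```python
# Illustrative upsert into a dimension table (psycopg2).
# The DSN, table, and columns are hypothetical placeholders.
import psycopg2

UPSERT_SQL = """
INSERT INTO dim_customer (customer_id, customer_name, segment)
VALUES (%s, %s, %s)
ON CONFLICT (customer_id)
DO UPDATE SET customer_name = EXCLUDED.customer_name,
              segment       = EXCLUDED.segment;
"""

def upsert_customer(customer_id: int, name: str, segment: str) -> None:
    # Connection details are placeholders for illustration only.
    with psycopg2.connect("dbname=analytics user=etl host=localhost") as conn:
        with conn.cursor() as cur:
            cur.execute(UPSERT_SQL, (customer_id, name, segment))

upsert_customer(42, "Acme Corp", "Enterprise")
```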
Primary Skills
- Databases, data warehouses, Kafka, AWS Data Lake, Postgres, Snowflake, APIs, data ingestion, Python, distributed systems, Apache Airflow
- MDM/PIM
- Informatica, Databricks/Snowflake, Ataccama, Syndigo (PIM)
Secondary Skills
- MongoDB, OpenAPI, FastAPI, business intelligence