No. of Positions: 3
Experience required: 5–15 years
Requirements and Skills:
- 5+ years of experience in Python, Apache Airflow, and SLT/BODS, with knowledge of data warehousing/Snowflake.
- Strong experience with data replication tools (SLT, HVR, Qlik, etc.)
- Strong experience with Airflow, Python, and API integrations
- Strong experience with SQL and Python, including debugging and performance optimization
- Experience with Snowflake is a strong plus.
- Hands-on experience integrating a data warehouse with third-party systems.
- Experience in designing and optimizing data models on Google Cloud using data stores such as BigQuery and Bigtable.
- Experience architecting and implementing metadata management.
- Experience architecting and implementing data governance and security for data platforms.
- Agile development skills and experience.
- Experience with CI/CD pipelines such as Concourse, Jenkins.