Qualifications and Requirements:
- 10+ years of experience in a related field, preferably building and supporting data pipelines, data lakes, and ELT solutions at scale.
- Data modeling, data ingestion, ELT/ETL, and data integration development using our cloud-based tooling, including Snowflake, AWS, Airflow, and GitHub.
- Strong data modeling experience (SQL Server, Oracle, SAP BW, SAP HANA, etc.).
- Good knowledge of data architecture, data engineering, data modeling, data warehousing, and data platforms.
- Experience with Snowflake, BigQuery, Redshift, AWS, and pipeline orchestration tools (Airflow, etc.).
- Knowledge in at least one modern programming language (Python, Java, Ruby, Scala, etc.).
- Strong SQL experience, including debugging and performance optimization.
- Knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, operations, and technical documentation.
- Excellent verbal and written communication and technical writing skills.
- Strong interpersonal skills and the ability to communicate complex technology solutions to senior leadership to gain alignment and drive progress.
- Bachelor’s degree or equivalent experience in Computer Science, Engineering, Management Information Systems (MIS), or related field.