Overview
This role involves designing and maintaining scalable, cloud-based data pipelines.
The ideal candidate has 3+ years of data engineering experience, with expertise in Snowflake and Databricks.
Remote · Mid-level · Full-time · English
Skills: Snowflake, Databricks, AWS, Azure, GCP, SQL, Python, Scala, Git, + 1 more
Locations
Remote
Requirements
Bachelor's degree required
Proficiency in Databricks
Strong SQL and Python skills
Experience with cloud platforms
Familiarity with version control
Strong communication skills
Responsibilities
Design and implement ETL/ELT pipelines
Collaborate with teams on data requirements
Optimize workflows on cloud platforms
Implement data quality checks
Support data migration initiatives
Document data models and processes
Participate in client meetings
Adhere to data governance standards
Benefits
Company-sponsored training