Overview
This role involves designing and maintaining scalable, cloud-based data pipelines for analytics.
The ideal candidate has 5+ years of data engineering experience and strong Java or Python skills.
Remote · Mid-level · Full-time · English
Java · Python · AWS · Kafka · Snowflake · Apache Spark · Terraform
Locations
Requirements
- Bachelor's degree required
- 3+ years of programming experience with Java or Python
- Experience with cloud-based data warehouse technologies
- Experience with Kafka-based data pipeline architectures
Responsibilities
- Design and maintain data pipelines
- Collaborate with teams to fulfill data requirements
- Implement monitoring and alerting mechanisms
- Automate infrastructure management tasks
- Ensure data quality and performance
Benefits
- Comprehensive benefits package
- Professional development opportunities
- Collaborative work culture