Overview
The role involves building and maintaining data solutions and ETL pipelines in a collaborative environment.
The ideal candidate has 5+ years of experience in data engineering, strong coding skills, and expertise in big data frameworks.
Tags: remote, senior, English, Python, Java, Scala, SQL, Spark, Hive, Kafka, Airflow, AWS, GCP, BigQuery + 2 more
Locations
Prague, Hlavní město Praha, Czech Republic
Dublin, Leinster, Ireland
Requirements
5+ years in data engineering
Strong coding skills in Python, Java, or Scala
Proven expertise in big data frameworks such as Spark, Hive, Kafka, and Airflow
Experience with cloud data technologies (AWS or GCP)
Responsibilities
Design and maintain ETL pipelines
Build cloud-based data platforms
Collaborate with cross-functional teams
Ensure data quality and monitoring
Participate in agile delivery cycles
Benefits
Flexible work environment
On-call rotation participation