Axiom Software Solutions Limited

Overview

This role focuses on developing and optimizing data pipelines and infrastructure.

The ideal candidate has 5+ years of experience and strong expertise in Python and Snowflake.

Hybrid | Mid-level | Full-time

Skills: Snowflake, AWS, SQL, Python, PySpark, CI/CD, Terraform, Kubernetes, GitHub

Locations

  • Pune, Maharashtra, India

Requirements

  • Bachelor's degree required
  • 5+ years of experience
  • Strong expertise in Python, PySpark, and Snowpark

Responsibilities

  • Establish technical designs
  • Optimize ETL/data pipelines
  • Manage data ingestion and processing
  • Develop and maintain APIs and CI/CD processes
  • Conduct peer reviews
  • Promote best practices
  • Create design documentation
  • Contribute to Data Engineering community
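To illustrate the kind of pipeline work listed above, here is a minimal ETL-style sketch in plain Python. All names (`clean_record`, `run_pipeline`) are hypothetical, and a real implementation for this role would use PySpark or Snowpark against Snowflake rather than in-memory lists:

```python
# Illustrative only: a minimal ingest -> clean -> filter pipeline.
# A production version would run as PySpark/Snowpark transformations.

def clean_record(record: dict) -> dict:
    """Normalize one raw record: coerce types, trim and lowercase names."""
    return {
        "id": int(record["id"]),
        "name": str(record.get("name", "")).strip().lower(),
        "amount": float(record.get("amount") or 0),
    }

def run_pipeline(raw_records: list[dict]) -> list[dict]:
    """Clean every record, then drop those with non-positive amounts."""
    cleaned = (clean_record(r) for r in raw_records)
    return [r for r in cleaned if r["amount"] > 0]

if __name__ == "__main__":
    raw = [
        {"id": "1", "name": "  Alice ", "amount": "10.5"},
        {"id": "2", "name": "Bob", "amount": None},
    ]
    print(run_pipeline(raw))
```

The same clean/filter logic maps directly onto DataFrame operations (`withColumn`, `filter`) when ported to PySpark.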