About Our Client

Our client is one of the nation's largest retailers of apparel, home, jewellery, and beauty merchandise.

Job Description

Primary Responsibilities:
- Design, build, and maintain scalable, reliable data pipelines to process large volumes of structured, semi-structured, and unstructured data from diverse sources.
- Develop ETL processes to ingest, transform, and load data into the data warehouse, ensuring data quality, integrity, and consistency.
- Collaborate with business stakeholders, report developers, and data scientists to understand data requirements and translate them into technical solutions for various business purposes.
- Optimize the performance and efficiency of data infrastructure and processes, including data storage, processing, and querying.
- Implement data governance policies and best practices to ensure compliance, security, and privacy of sensitive data.
- Troubleshoot data issues, identify root causes, and implement solutions in a timely manner.
- Communicate and collaborate effectively in a cross-functional team environment.

The Successful Applicant

Basic Qualifications:
- Experience as a Data Engineer with a strong track record of designing and implementing data solutions.
- Experience in a programming language such as Python or Java, with experience building data pipelines and workflows.
- Experience with cloud data warehousing technologies such as Snowflake and Redshift.
- Experience with distributed computing frameworks such as Cloudera, Apache Hadoop, and Spark.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud Platform.
- Experience with AWS cloud services such as S3, EC2, EMR, Glue, CloudWatch, Athena, and Lambda.
- Experience with containerization and orchestration technologies such as Docker and Kubernetes.
- Experience building CI/CD pipelines using tools such as GitLab and Bitbucket.
- Experience with data pipeline orchestration tools such as Airflow and Jenkins.
- Knowledge of database concepts, data modelling, schemas, and query languages such as SQL and Hive.
- Knowledge of data quality and monitoring techniques and tools, such as Great Expectations or similar.
- Knowledge of data governance processes, lineage, cataloging, and dictionaries using tools such as DataHub or similar.
- Knowledge of data visualization and reporting tools such as MicroStrategy, Tableau, and Power BI.
- Knowledge of streaming data processing and real-time analytics technologies such as Kafka.
- Retail experience is a plus.

What's on Offer
- Competitive compensation commensurate with role and skill set
- Medical insurance coverage worth 10 lakhs
- Social benefits including PF and gratuity
- A fast-paced, growth-oriented environment with the associated challenges and rewards
- Opportunity to grow, develop your own skills, and create your future

Job Summary
- Function: Information Technology
- Sub Sector: Networks Engineering
- Area of Specialisation: Retail
- Location: Bangalore Urban
- Job Type: Temporary
- Job Reference: JN-102024-6569189