Job Responsibilities:
* Design, implement, and maintain robust and scalable CI/CD pipelines.
* Implement and maintain infrastructure as code (IaC) for CI/CD environments.
* Automate build, test, and deployment processes using industry-standard tools and best practices.
* Develop and maintain pipeline scripts for Jenkins using Groovy and Java.
* Maintain and develop tools used in the development infrastructure.
* Evaluate software updates and ensure version changes do not impact existing functionality.
* Consult application developers on the usage of the development environment, and troubleshoot and resolve issues related to CI/CD pipelines.
* Collaborate with development, QA, and operations teams to ensure smooth and efficient software releases.
* Monitor and optimize pipeline performance and reliability.
* Develop and maintain scripts and tools for automation tasks.
* Research and evaluate new CI/CD technologies and tools.
* Contribute to the improvement of our DevOps practices and processes.
* Document CI/CD processes and configurations.
* Ensure security best practices are implemented throughout the CI/CD pipeline.

Skills Required:
* Strong know-how in Jenkins, with experience in Groovy and Java.
* In-depth understanding of DevSecOps.
* Extensive hands-on experience with Git.
* GitLab, Jenkins, Artifactory, SonarQube, and Pact are your favourite friends.
* You know Docker, Podman, and Kubernetes, preferably in conjunction with AWS.
* You are familiar with Bash scripting and have a basic knowledge of HTTP.
* Independent and proactive way of working.
* Service-oriented and responsible - your contributions to the system can have a very big impact.
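The Jenkins pipeline-scripting work described above typically centres on declarative Jenkinsfiles written in Groovy. A minimal sketch of what such a pipeline might look like (the stage names, Maven build commands, branch name, and deploy script are illustrative assumptions, not taken from the listing):

```
// Hypothetical declarative Jenkinsfile sketch.
// Build tooling (Maven), branch name, and deploy script are assumptions.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'mvn -B clean package'   // assumes a Maven-based project
            }
        }
        stage('Test') {
            steps {
                sh 'mvn -B test'            // run the unit test suite
            }
        }
        stage('Deploy') {
            when { branch 'main' }          // deploy only from the main branch
            steps {
                sh './deploy.sh'            // hypothetical deployment script
            }
        }
    }
}
```

In practice, roles like this also involve Jenkins shared libraries, where common pipeline steps are factored into reusable Groovy functions rather than duplicated across Jenkinsfiles.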
This role leads the DevOps, SRE, and Infrastructure functions within the Risk Technology team, ensuring reliability, scalability, and automation at scale. It demands deep technical expertise and strategic leadership to maintain high availability (99.99%), optimize infrastructure, and drive DevOps transformation.
We are looking for a skilled DevOps professional with expertise in Groovy Scripting to join a temporary project in Chennai. The role involves supporting technology services within the industry to enhance operational efficiency and streamline processes.
The Observability Integration Engineer is responsible for supporting product teams with OpenTelemetry application instrumentation and supporting the Application Observability Platform team in 24/7/365 OpenTelemetry infrastructure maintenance.
Responsible for leading the design, implementation, and maintenance of the infrastructure and processes required to support the software development lifecycle. Work closely with the development and operations teams to ensure that the applications are deployed and running smoothly in the production environment.
This position seeks a Python DevOps Developer to build and maintain robust cloud-based solutions using AWS. The role is based in Chennai and requires expertise in Python, DevOps practices, and cloud technologies.
We are seeking a highly motivated and experienced CI/CD Engineer to join our growing IT Platforms team. The CI/CD Engineer will play a critical role in designing, implementing, and maintaining our continuous integration and continuous delivery pipelines. You will be responsible for automating our build, test, and deployment processes, ensuring the rapid and reliable release of our software products.
Qualifications:
* Bachelor's degree in Computer Engineering, Computer Science, Data Analytics, or a related field.
* 3-4 years of experience in BI engineering or data analytics and advanced SQL programming such as PL/SQL.
* Strong hands-on experience with Microsoft Power BI (including Power Query, DAX, dataflows, etc.).
* In-depth knowledge of report design, development, and maintenance; data extraction from the DWH and cubes layer; and dimensional data modeling concepts.
* Strong hands-on experience with Python/Spark scripting and shell (Unix/Linux) scripting.
* Familiarity with data warehousing concepts and cloud data platforms (Azure, Snowflake, etc.) and their services, such as Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, and Azure SQL DW/Synapse.
* Excellent communication and stakeholder management skills with strong analytical abilities.
* Self-starter, motivated, and able to work in a fast-paced development environment.
* Proficiency in the development environment, including IDE, database server, Git, continuous integration, unit-testing tools, and defect management tools.

Responsibilities:
* Design, develop, and deploy Power BI dashboards, reports, and data models.
* Support and monitor the existing reports in maintenance, with clear documentation.
* Optimize DAX queries and Power BI performance for scalability and responsiveness.
* Automate the existing reports by removing manual steps and adhering to reporting best practices.
* Support report analysis, admin activities, usage monitoring, and licensing cost optimization.
* Write custom scripts to extract data from unstructured/semi-structured sources in the data warehouse, BI cubes, or Snowflake.
* Deploy dashboards, consolidate reports, and enhance report usability for end users.
* Work with data engineers to define and validate data from the data sources for Power BI consumption.
* Collaborate with business stakeholders to define KPIs and build self-service BI tools.
* Communicate proactively and clearly with stakeholders about report status.
* Provide Power BI training, support, and up-to-date documentation for business users, and conduct KT sessions to guide and mentor junior team members.

Preferred Skills:
* Microsoft Certified: Data Analyst Associate or equivalent Power BI certification.
* Hands-on experience with Microsoft Power BI (including Power Query, DAX, dataflows, etc.).
* Basic understanding of reporting governance and Power BI administration.
* Working knowledge of DevOps processes (CI/CD), Azure DevOps, Git/Jenkins version control, reporting and maintenance of BI reports, and SQL scripting.
* Hands-on experience extracting data from databases (Azure SQL DB, MySQL, Cosmos DB, etc.), SharePoint, and Python/Unix shell scripting.

Technologies we use: Power BI, Databricks, Python, PySpark, Scripting (PowerShell, Bash), Azure SQL DW/Synapse, Azure Tabular/Cubes, Azure DevOps, Git, Terraform, Snowflake
Your Responsibilities:
* Design and maintain frameworks for product classification and automated security requirements mapping
* Conduct TARAs (Threat Analysis and Risk Assessment) and security assessments for Festo products
* Support product teams in automating the generation of SBOMs (Software Bills of Materials)
* Develop and execute test specifications, test cases, and test plans for vulnerability testing of Festo products
* Conduct penetration testing and basic vulnerability assessment of Festo products
* Support documentation of test results and collaborate with the development teams
* Support the continuous improvement and automation of security testing
* Establish and maintain DevSecOps practices within CI/CD environments and develop automation infrastructure
* Support the provision of tools and documentation in the context of SAMM (Software Assurance Maturity Model)
* Collaborate with product compliance and development teams to implement and maintain product security measures
* Support investigation and mitigation of product-related security incidents (PSIRT)

Our Requirements:
* Education: Bachelor's degree in Engineering, Computer Science, Mechatronics, Information Science and Electronics, Cyber Security, or equivalent
* Mandatory experience:
  o Minimum 2 years of experience in product security, ideally in the industrial automation or automotive field
  o Programming knowledge in Python and JavaScript
  o Basic familiarity with different industrial protocols and PLC systems
  o Experience with CI/CD practices and DevOps
  o Basic knowledge of Linux
* Nice-to-have:
  o Understanding of the Secure Development Lifecycle and standards such as IEC 62443-3 / 62443-4
  o Additional knowledge of programming languages such as C, C++, or shell scripting
  o Experience with tools like OpenVAS, Nessus, Nmap, Wireshark, and embedded or IoT penetration testing
  o Experience in the embedded domain
  o Experience in Linux hardening
Job Description:
Job location: Bengaluru, India
Qualifications & Required Skills (Full-Time):
* Bachelor's or Master's degree in engineering/technology, computer science, information technology, or related fields.
* 10+ years of total experience in data modeling and database design; experience in the Retail domain is an added advantage.
* 8+ years of experience in data engineering development and support.
* 3+ years of experience leading a technical team of data engineers and BI engineers.
* Proficiency in data modeling tools such as Erwin, ER/Studio, or similar.
* Strong knowledge of Azure cloud infrastructure and development using SQL/Python/PySpark with ADF, Synapse, and Databricks.
* Hands-on experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Python/PySpark, Logic Apps, Key Vault, and Azure Functions.
* Strong communication, interpersonal, and collaboration skills, along with leadership capabilities.
* Ability to work effectively in a fast-paced, dynamic environment as a cloud SME.
* Act as the single point of contact for all data management queries and data decisions.
* Design and manage centralized, end-to-end data architecture solutions, such as data model designs, database development standards, and the implementation and management of data warehouses and data analytics systems.
* Conduct continuous audits of data management system performance and refine where necessary.
* Identify bottlenecks, optimize queries, and implement caching mechanisms to enhance data processing speed.
* Integrate disparate data sources, including internal databases and external application programming interfaces (APIs), enabling the organization to derive insights from a holistic view of the data.
* Ensure data privacy measures comply with regulatory standards.

Preferred:
* Azure Data Factory (ADF) or Databricks certification is a plus.
* Data Architect or Azure Cloud Solution Architect certification is a plus.

Technologies we use: Azure Data Factory, Databricks, Azure Synapse, Azure Tabular, Azure Functions, Logic Apps, Key Vault, DevOps, Python, PySpark, Scripting (PowerShell, Bash), Git, Terraform, Power BI, Snowflake
Looking for a highly experienced Senior/Lead Full Stack Developer with a strong background in leading cross-functional teams, designing scalable systems, and driving projects independently. The candidate should possess a deep understanding of databases, third-party integrations, event-based systems, GraphQL, and DevOps practices.
As a SecOps Engineer, you will be responsible for ensuring the security and compliance of our systems and infrastructure. You will work closely with our development, architecture and DevOps teams to identify and remediate vulnerabilities, implement security best practices, automate security processes and ensure compliance with corporate and industry standards.