As a Test Engineer, you will apply your foundational testing skills to support the delivery of Bupa releases. Working collaboratively within an agile squad, you will design, execute, and maintain both manual and automated tests, with an automation-first mindset and a solid understanding of Bupa's testing frameworks.
As a Senior Test Engineer at Bupa Technology, you will take ownership of quality engineering activities within a squad to deliver periodic releases. With an automation-first mindset, you will collaborate closely with cross-functional teams to design, execute, and maintain comprehensive test plans, coordinate defects, produce test completion reports, and proactively identify quality risks early in the development process.
Lead the design and execution of an enterprise-wide Knowledge Management strategy, integrating advanced tools, content governance, and analytics to enhance IT service delivery and innovation. Build and maintain knowledge infrastructure, foster a culture of knowledge sharing, ensure compliance, and drive stakeholder engagement, training, and change management to maximize knowledge asset value and business impact.
Our client is seeking an experienced Data Modeler specialising in scalable data model design for HRIS and IAM systems. The person will translate business needs into robust conceptual, logical, and physical models supporting both transactional and analytical use cases. They'll collaborate with cross-functional teams to define data architecture, ensure governance and compliance, and optimize ETL pipelines.
Qualifications
* Bachelor's or Master's degree in Computer Science or a comparable field.
* 5 to 8 years of experience in frontend development with web-based applications.
* Expertise in frontend technologies such as HTML, CSS, and JavaScript.
* Expertise in working with the Blazor framework.
* Expertise in backend technologies such as C#/.NET 8.0.
* Good knowledge of RESTful APIs and web services.
* Solid understanding of OOP and design patterns.
* Mathematical knowledge, particularly in geometry, to support development of the CAD interface.
* Experience working in an agile development team (e.g. Scrum).
* High degree of personal responsibility and analytical skills to solve complex challenges.
* Quick thinking, positive attitude, and excellent collaboration skills.
* Strong motivation and enthusiasm for professional development and visually appealing frontends.
* Goal- and results-oriented.
* Team player with the ability to work in a multicultural and multinational environment.
* Unconventional thinking, creativity, and a willingness to continuously improve.
* Flexible, adaptable, and open to change and the acquisition of new knowledge and skills.
Job Description:
* Design, develop, and maintain the web applications Fluid Draw web and fluidsim.festo.com using .NET 8.0 and the Blazor framework.
* Maintain and extend the CAD interface (drawing area).
* Implement new features and optimize existing functionality for performance and usability.
* Ensure code quality through testing, code reviews, and adherence to best practices.
* Collaborate with cross-functional teams and support the PO in gathering and analyzing requirements.
* Participate in architectural discussions and contribute to the overall design of the application.
* Troubleshoot and resolve issues in a timely manner.
Job Location: Bengaluru, India
Job Level: Experienced Professionals - 5+ years
Level of Education: BE/BTech
Job Type: Full-Time/Regular
As an MS SQL Database Administrator, you will be responsible for maintaining database servers and for optimizing and tuning databases for high performance, scalability, and reliability. You need experience with SQL Server performance tuning, indexing, query optimization, and database design. This role requires strong technical and collaboration skills to work with different teams to improve database performance.
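The tuning loop this role describes (inspect the execution plan, add an index on the filtered column, confirm the plan improves) can be sketched with Python's stdlib sqlite3 module. This is an illustrative stand-in, not the MS SQL environment itself; the table and index names are hypothetical, and in SQL Server the analogous step is comparing execution plans before and after `CREATE INDEX`.

```python
import sqlite3

# Hypothetical schema: 10,000 orders spread across 100 customers.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(10_000)],
)

query = "SELECT COUNT(*) FROM orders WHERE customer_id = ?"

# Before indexing: the planner has no choice but a full table scan.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

# Add an index on the filtered column, then re-check the plan.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

print(plan_before[0][-1])  # a SCAN of the whole table
print(plan_after[0][-1])   # a SEARCH using idx_orders_customer
```

The same before/after plan comparison is the everyday evidence a DBA uses to justify (or reject) an index.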
Lead the strategy, expansion, design, and performance management of our claims and service provider network. The role will ensure high-quality, cost-effective, and digitally integrated services across our P&C claims value chain - including repair garages, assessors, medical providers, and third-party administrators - and will oversee vendor relationships, performance analytics, contracting, and digital integration.
Embedded Software Engineer
Your Job
Professional Competencies and Requirements:
Educational Background: Master's or Bachelor's degree in Computer Science, Electrical Engineering, Automation Technology, or a related field.
Personal Competencies and Requirements:
You can expect the following with us:
Experience Level: 5 - 8 Years
Working Model: Hybrid
Work Location: Bommasandra, Bangalore
Job Type: Full-Time
Qualifications & Required Skills:
* Bachelor's or Master's degree in Engineering/Technology, Computer Science, Information Technology, or related fields.
* 10+ years of total experience in data modeling and database design; experience in the Retail domain is an added advantage.
* 8+ years of experience in data engineering development and support.
* 3+ years of experience leading a technical team of data engineers and BI engineers.
* Proficiency in data modeling tools such as Erwin, ER/Studio, or similar tools.
* Strong knowledge of Azure cloud infrastructure and development using SQL/Python/PySpark with ADF, Synapse, and Databricks.
* Hands-on experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Python/PySpark, Logic Apps, Key Vault, and Azure Functions.
* Strong communication, interpersonal, and collaboration skills, along with leadership capabilities.
* Ability to work effectively in a fast-paced, dynamic environment as a cloud SME.
Responsibilities:
* Act as the single point of contact for all data management related queries and data decisions.
* Design and manage centralized, end-to-end data architecture solutions, such as data model designs, database development standards, and the implementation and management of data warehouses and data analytics systems.
* Conduct continuous audits of data management system performance and refine where necessary.
* Identify bottlenecks, optimize queries, and implement caching mechanisms to enhance data processing speed.
* Integrate disparate data sources, including internal databases and external application programming interfaces (APIs), enabling the organization to derive insights from a holistic view of the data.
* Ensure data privacy measures comply with regulatory standards.
Preferred:
* Azure Data Factory (ADF) or Databricks certification is a plus.
* Data Architect or Azure Cloud Solution Architect certification is a plus.
Technologies we use: Azure Data Factory, Databricks, Azure Synapse, Azure Tabular, Azure Functions, Logic Apps, Key Vault, DevOps, Python, PySpark, Scripting (PowerShell, Bash), Git, Terraform, Power BI, Snowflake
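One responsibility this posting names, implementing caching mechanisms to enhance data processing speed, can be sketched with stdlib memoization. The "warehouse query" below is a hypothetical stand-in; in practice the same idea appears as result caching in Databricks/Synapse or an in-process cache in an ETL job.

```python
from functools import lru_cache

CALLS = {"count": 0}

@lru_cache(maxsize=128)
def fetch_customer_segment(customer_id: int) -> str:
    """Stand-in for an expensive warehouse lookup; results are memoized per key."""
    CALLS["count"] += 1  # counts how often the expensive path actually runs
    return "premium" if customer_id % 2 == 0 else "standard"

# A workload with repeated keys pays the expensive cost only once per distinct key.
results = [fetch_customer_segment(cid) for cid in [1, 2, 1, 2, 1, 2]]
print(results)
print(CALLS["count"])  # only the two distinct keys triggered a fetch
```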
Our client is seeking a strategic, systems-driven CHRO to lead a cultural and capability transformation across the organization. This is not a traditional HR leadership role - it requires an architect's mindset, a psychologist's understanding, and an operator's discipline. The CHRO will own the design and execution of advanced HR systems that define hiring and shape the organisation into a high performer. The CHRO's mandate: create a measurable, evolving human infrastructure aligned to character, culture, and business imperatives.
Qualifications
* Bachelor's degree in Computer Engineering, Computer Science, Data Analytics, or a related field.
* 3-4 years of experience in BI engineering or data analytics, plus advanced SQL programming such as PL/SQL.
* Strong hands-on experience with Microsoft Power BI (including Power Query, DAX, dataflows, etc.).
* In-depth knowledge of report design, development, and maintenance; data extraction from the DWH and cubes layer; and dimensional data modeling concepts.
* Strong hands-on experience with Python/Spark scripting and shell (Unix/Linux) scripting.
* Familiarity with data warehousing concepts and cloud data platforms (Azure, Snowflake, etc.) and their services, such as Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, and Azure SQL DW/Synapse.
* Excellent communication and stakeholder management skills with strong analytical abilities.
* Self-starter, motivated, and able to work in a fast-paced development environment.
* Proficiency in the development environment, including IDE, database server, Git, continuous integration, unit-testing tools, and defect management tools.
Responsibilities:
* Design, develop, and deploy Power BI dashboards, reports, and data models.
* Support and monitor the existing reports in maintenance, with clear documentation.
* Optimize DAX queries and Power BI performance for scalability and responsiveness.
* Automate existing reports by removing manual steps and adhering to reporting best practices.
* Support report analysis, admin activities, usage monitoring, and licensing cost optimization.
* Write custom scripts to extract data from unstructured/semi-structured sources in the data warehouse, BI cubes, or Snowflake.
* Deploy dashboards, consolidate reports, and enhance report usability for end users.
* Work with data engineers to define and validate data from the data sources for Power BI consumption.
* Collaborate with business stakeholders to define KPIs and build self-service BI tools.
* Communicate report status proactively and clearly with stakeholders.
* Provide Power BI training, support, and up-to-date documentation for business users, and conduct KT sessions to guide and mentor junior resources.
Preferred Skills:
* Microsoft Certified: Data Analyst Associate or an equivalent Power BI certification.
* Hands-on experience with Microsoft Power BI (including Power Query, DAX, dataflows, etc.).
* Basic understanding of reporting governance and Power BI administration.
* Working knowledge of DevOps processes (CI/CD), Azure DevOps, Git/Jenkins version control, reporting and maintenance of BI reports, and SQL scripting.
* Hands-on experience extracting data from databases (Azure SQL DB, MySQL, Cosmos DB, etc.), SharePoint, and via Python/Unix shell scripting.
Technologies we use: Power BI, Databricks, Python, PySpark, Scripting (PowerShell, Bash), Azure SQL DW/Synapse, Azure Tabular/Cubes, Azure DevOps, Git, Terraform, Snowflake
Role Summary
The Zone Network and Security Project Manager will be responsible for coordinating the design, implementation, and maintenance of secure IT infrastructure solutions. The role's main responsibility is to ensure the organization's IT systems remain resilient, compliant, and aligned with business objectives. The ideal candidate will have a strong technical background in network engineering and cybersecurity, coupled with hands-on experience in project management and cross-functional collaboration. Experience with retail and cloud infrastructure is a plus.
Key Responsibilities
Project Management
Network Engineering and Security
Cloud Infrastructure and Automation
Cybersecurity and Compliance
IT Systems Administration
Collaboration and Support
Qualifications and Skills
Technical Certifications (bonus but not required)
Technical Expertise
Preferred Experience
Demonstrated experience in deploying, configuring, and maintaining Zscaler Private Access (ZPA) and Zscaler Secure Internet Access (ZIA) in an enterprise environment, particularly within a large-scale retail context.
Location- Hyderabad
Mode- Hybrid, 3 Days WFO
Job Responsibilities:
* Design, implement, and maintain robust and scalable CI/CD pipelines.
* Implement and maintain infrastructure as code (IaC) for CI/CD environments.
* Automate build, test, and deployment processes using industry-standard tools and best practices.
* Develop and maintain pipeline scripts for Jenkins using Groovy and Java.
* Maintain and develop tools used in the development infrastructure.
* Evaluate software updates and ensure version changes do not impact existing functionality.
* Consult application developers on usage of the development environment, and troubleshoot and resolve issues related to CI/CD pipelines.
* Collaborate with development, QA, and operations teams to ensure smooth and efficient software releases.
* Monitor and optimize pipeline performance and reliability.
* Develop and maintain scripts and tools for automation tasks.
* Research and evaluate new CI/CD technologies and tools.
* Contribute to the improvement of our DevOps practices and processes.
* Document CI/CD processes and configurations.
* Ensure security best practices are implemented throughout the CI/CD pipeline.
SKILLS REQUIRED
* Strong know-how in Jenkins, with experience in Groovy and Java.
* In-depth understanding of DevSecOps.
* Extensive hands-on experience using Git.
* GitLab, Jenkins, Artifactory, SonarQube, and Pact are your favourite friends.
* You know Docker, Podman, and Kubernetes, preferably in conjunction with AWS.
* You are familiar with bash scripting and have a basic knowledge of HTTP.
* Independent and proactive way of working.
* Service-oriented and responsible - your contributions to the system can have a very big impact.
Data Scientist (IT)
Position: Data Scientist
Job type: Techno-Functional
Preferred education qualifications: Bachelor's/Master's degree in Statistics, Operations Research, Computer Science, Data Science, or a related quantitative field
Job location: India
Geography: SAPMENA
Required experience: 6-8 Years
Preferred profile/skills:
* 5+ years in developing and implementing forecasting models
* [Mandatory] Proven track record in data analysis (EDA, profiling, sampling) and data engineering (wrangling, storage, pipelines, orchestration)
* [Mandatory] Proven expertise in time series analysis, regression analysis, and other statistical modelling techniques
* [Mandatory] Experience in ML algorithms such as ARIMA, Prophet, Random Forests, and Gradient Boosting algorithms (XGBoost, LightGBM, CatBoost)
* [Mandatory] Experience in model explainability with Shapley plots and data drift detection metrics
* [Mandatory] Strong programming and analysis skills with Python and SQL, including experience with relevant forecasting packages
* [Mandatory] Prior experience in Data Science & ML Engineering on Google Cloud
* [Mandatory] Proficiency in version control systems such as GitHub
* [Mandatory] Strong organizational capabilities and the ability to work in a matrix/multidisciplinary team
* [Mandatory] Excellent communication and presentation skills, with the ability to explain complex technical concepts to non-technical audiences
* Experience in the Beauty or Retail/FMCG industry is preferred
* Experience in handling large volumes of data (>100 GB)
* Experience in delivering AI-ML projects using Agile methodologies is preferred
* Proven ability to work proactively and independently to address product requirements and design optimal solutions
Job objectives: Design, develop, implement, and maintain data science and machine learning solutions to meet enterprise goals. Collaborate with cross-functional teams to leverage statistical modeling, machine learning, and data mining techniques to improve forecast accuracy and aid strategic decision-making across the organization. Scale the proven AI-ML product across the SAPMENA region.
Job description:
* Develop a deep understanding of business/functional needs, problem statements, and objectives/success criteria
* Develop and maintain sophisticated statistical forecasting models, incorporating factors such as seasonality, promotions, media, traffic, and other economic indicators
* Collaborate with internal and external stakeholders, including business, data scientists, and product teams, to understand business and product needs and translate them into actionable data-driven solutions
* Review MVP implementations, provide recommendations, and ensure Data Science best practices and guidelines are followed
* Evaluate and compare the performance of different forecasting models, recommending optimal approaches for various business scenarios
* Analyze large and complex datasets to identify patterns, insights, and potential risks and opportunities
* Communicate forecasting results and insights to both technical and non-technical audiences through clear visualizations and presentations
* Stay up to date with the latest advancements in forecasting techniques and technologies, continuously seeking opportunities for improvement
* Contribute to the development of a robust data infrastructure for AI-ML solutions, ensuring data quality and accessibility
* Collaborate with other data scientists and engineers to build and deploy scalable AI-ML solutions
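The core loop this role describes - fit a seasonality-aware forecast, then score its accuracy - can be sketched in pure stdlib Python. This is a minimal illustration, not the posting's actual stack (production work would use Prophet, ARIMA, or gradient boosting as listed above), and the sales numbers are fabricated.

```python
def seasonal_naive_forecast(history, season_length, horizon):
    """Forecast each future step with the value observed one season earlier."""
    return [history[-season_length + (h % season_length)] for h in range(horizon)]

def mape(actual, forecast):
    """Mean absolute percentage error, a common forecast-accuracy metric."""
    return sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual) * 100

# Fabricated weekly-seasonal sales: 3 weeks of history, then forecast week 4.
history = [100, 120, 130, 110, 150, 200, 180] * 3
actual_next_week = [105, 118, 128, 112, 155, 195, 185]
forecast = seasonal_naive_forecast(history, season_length=7, horizon=7)
print(forecast)  # repeats the last observed season
print(round(mape(actual_next_week, forecast), 2))
```

The seasonal-naive model is the standard baseline: comparing ARIMA or Prophet against it (on MAPE or a similar metric) is how "evaluate and compare the performance of different forecasting models" is typically done.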
Job Requirements
Education
* Bachelor's degree required, preferably with a quantitative focus (Statistics, Business Analytics, Data Science, Math, Economics, etc.)
* Master's degree preferred (MBA/MS Computer Science/M.Tech Computer Science, etc.)
Relevant Experience
* 5 - 7 years for Data Scientist
* Relevant working experience in a data science/advanced analytics role
Behavioural Skills
* Delivery Excellence
* Business disposition
* Social intelligence
* Innovation and agility
Experience
* Functional Analytics (Supply Chain Analytics, Marketing Analytics, Customer Analytics, etc.)
* Statistical modelling using analytical tools (R, Python, KNIME, etc.)
* Knowledge of statistics and experimental design (A/B testing, hypothesis testing, causal inference)
* Practical experience building scalable ML models, feature engineering, model evaluation metrics, and statistical inference
* Practical experience deploying models using MLOps tools and practices (e.g., MLflow, DVC, Docker, etc.)
* Strong coding proficiency in Python (Pandas, Scikit-learn, PyTorch/TensorFlow, etc.)
* Big data technologies & frameworks (AWS, Azure, GCP, Hadoop, Spark, etc.)
* Enterprise reporting systems, relational (MySQL, Microsoft SQL Server, etc.) and non-relational (MongoDB, DynamoDB) database management systems, and Data Engineering tools
* Business intelligence & reporting (Power BI, Tableau, Alteryx, etc.)
* Microsoft Office applications (MS Excel, etc.)
Roles & Responsibilities
Analytics & Strategy
1. Analyse large-scale structured and unstructured data; develop deep-dive analyses and machine learning models in retail, marketing, merchandising, and other areas of the business
2. Utilize data mining, statistical, and machine learning techniques to derive business value from store, product, operations, financial, and customer transactional data
3. Apply multiple algorithms or architectures and recommend the best model with an in-depth description to evangelize data-driven business decisions
4. Utilize the cloud setup to extract processed data for statistical modelling and big data analysis, and visualization tools to represent large sets of time series/cross-sectional data
Operational Excellence
1. Follow industry standards in coding solutions and follow the programming life cycle to ensure standard practices across the project
2. Structure hypotheses, build thoughtful analyses, develop underlying data models, and bring clarity to previously undefined problems
3. Partner with Data Engineering to build, design, and maintain core data infrastructure, pipelines, and data workflows to automate dashboards and analyses
Stakeholder Engagement
1. Work collaboratively across multiple sets of stakeholders - business functions, Data Engineers, and Data Visualization experts - to deliver on project deliverables
2. Articulate complex data science models to business teams and present insights in easily understandable and innovative formats
Job Requirements
Education
* Bachelor's degree required, preferably with a quantitative focus (Statistics, Business Analytics, Data Science, Math, Economics, etc.)
* Master's degree preferred (MBA/MS Computer Science/M.Tech Computer Science, etc.)
Relevant Experience
* 3 - 4 years for Data Scientist
* Relevant working experience in a data science/advanced analytics role
Behavioural Skills
* Delivery Excellence
* Business disposition
* Social intelligence
* Innovation and agility
Knowledge
* Functional Analytics (Supply Chain Analytics, Marketing Analytics, Customer Analytics, etc.)
* Statistical modelling using analytical tools (R, Python, KNIME, etc.)
* Knowledge of statistics and experimental design (A/B testing, hypothesis testing, causal inference)
* Practical experience building scalable ML models, feature engineering, model evaluation metrics, and statistical inference
* Practical experience deploying models using MLOps tools and practices (e.g., MLflow, DVC, Docker, etc.)
* Strong coding proficiency in Python (Pandas, Scikit-learn, PyTorch/TensorFlow, etc.)
* Big data technologies & frameworks (AWS, Azure, GCP, Hadoop, Spark, etc.)
* Enterprise reporting systems, relational (MySQL, Microsoft SQL Server, etc.) and non-relational (MongoDB, DynamoDB) database management systems, and Data Engineering tools
* Business intelligence & reporting (Power BI, Tableau, Alteryx, etc.)
* Microsoft Office applications (MS Excel, etc.)
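The experimental-design skill listed above (A/B testing, hypothesis testing) boils down to a significance test on two conversion rates. A minimal stdlib sketch of a two-proportion z-test follows; the conversion counts are fabricated, and real analyses would typically use scipy or statsmodels.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z statistic for H0: equal conversion rates, using the pooled standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

def p_value_two_sided(z):
    """Two-sided p-value from the standard normal CDF, computed via math.erf."""
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Fabricated experiment: 5.0% vs 6.5% conversion, 4,000 users per arm.
z = two_proportion_z(conv_a=200, n_a=4000, conv_b=260, n_b=4000)
p = p_value_two_sided(z)
print(round(z, 3), round(p, 4))
```

With a p-value below the usual 0.05 threshold, the variant's lift would be declared statistically significant at these sample sizes.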
Location- Gurgaon
Your Responsibilities
* Design and maintain frameworks for product classification and automated security requirements mapping
* Conduct TARAs (Threat Analysis and Risk Assessment) and security assessments for Festo products
* Support product teams in automating the generation of SBOMs (Software Bills of Materials)
* Develop and execute test specifications, test cases, and test plans for vulnerability testing of Festo products
* Conduct penetration testing and basic vulnerability assessment of Festo products
* Support documentation of test results and collaborate with the development teams
* Support the continuous improvement and automation of security testing
* Establish and maintain DevSecOps practices within CI/CD environments and develop automation infrastructure
* Support the provision of tools and documentation in the context of SAMM (Software Assurance Maturity Model)
* Collaborate with product compliance and development teams to implement and maintain product security measures
* Support investigation and mitigation of product-related security incidents (PSIRT)
Our Requirements
* Education: Bachelor's degree in Engineering, Computer Science, Mechatronics, Information Science and Electronics, Cyber Security, or equivalent
* Mandatory Experience:
  o Min. 2 years of experience in product security, ideally in the Industrial Automation or automotive field
  o Programming knowledge in Python and JavaScript
  o Basic familiarity with different industrial protocols and PLC systems
  o Experience with CI/CD practices and DevOps
  o Basic knowledge of Linux
* Nice-to-Have:
  o Understanding of the Secure Development Lifecycle and standards like IEC 62443-3 / 62443-4
  o Additional knowledge in programming languages such as C, C++, or shell scripting
  o Experience with tools like OpenVAS, Nessus, Nmap, Wireshark, and embedded or IoT penetration testing
  o Experience in the embedded domain
  o Experience in Linux hardening
Position: ML Engineer
Job type: Techno-Functional
Preferred education qualifications: Bachelor's/Master's degree in Computer Science, Data Science, Machine Learning, or a related technical degree
Job location: India
Geography: SAPMENA
Required experience: 6-8 Years
Preferred profile/skills:
Job objectives: Design, develop, deploy, and maintain data science and machine learning solutions to meet enterprise goals. Collaborate with product managers, data scientists, and analysts to identify innovative, optimal machine learning solutions that leverage data to meet business goals. Contribute to the development, rollout, and onboarding of data scientists and ML use cases to an enterprise-wide MLOps framework. Scale proven ML use cases across the SAPMENA region. Be responsible for keeping ML costs optimal.
Job description:
Location- Gurgaon
External and Internal Interfaces
* Contractor(s) - BPO Partners - Global CoE Data & Analytics team
* Vendors - Digital tools (SAP Ariba, Coupa, etc.)
* Vendors - Supplier Relationship Management (SRM)
* Global Procurement GNFR
* Finance, Accounting, and Legal
* GNFR Stakeholders
* Global CoE - Data & Analytics
Job Requirements
Education
* Bachelor's degree in Supply Chain, Business Administration, Engineering, or a related field.
* MBA or relevant Master's degree preferred.
Relevant Experience
* 10+ years of total experience in procurement, with a strong focus on process optimization, tools implementation, and transformation.
* Proven success in leading or supporting large-scale S2P initiatives.
* Experience with procurement platforms such as SAP Ariba, Coupa, Oracle, and ERP systems like SAP.
Behavioural Skills
* Strong analytical and problem-solving abilities.
* Excellent communication and stakeholder engagement skills.
* Proactive, organized, and adaptable in a fast-paced, global environment.
* Willingness to work flexible hours to collaborate across time zones.
* Strong project management mindset.
Knowledge
* Strong understanding of end-to-end procurement processes, with a focus on Source-to-Pay (S2P) transformation.
* Hands-on knowledge of procurement platforms such as SAP Ariba, Coupa, or other similar tools (e.g., Oracle, Ivalua).
* Experience in integrating procurement systems with ERP platforms such as SAP or Oracle.
* Familiarity with contract lifecycle management (CLM), supplier management, e-sourcing, and invoice automation.
* Working knowledge of data governance, compliance frameworks, and internal controls in a procurement context.
* Proficiency in Power BI or other business intelligence tools (e.g., Tableau, Qlik, Spotfire) for creating dashboards and data-driven insights.
* Strong command of Microsoft Excel (pivot tables, advanced formulas, data modeling) and PowerPoint for executive reporting.
* Basic understanding of cloud platforms (e.g., Azure, AWS, or GCP) for analytics is a plus.
Roles & Responsibilities
Analytics (Data & Insights)
* Drive implementation, optimization, and adoption of procurement platforms such as SAP Ariba, Coupa, and Oracle (including CLM, e-Sourcing, and invoice automation).
* Collaborate with IT and cross-functional teams for seamless ERP integration.
* Monitor tool adoption and identify opportunities for automation and enhancement.
Operational Excellence
* Lead and execute end-to-end procurement transformation initiatives, spanning S2P and P2P workstreams.
* Design and deliver standardized, automated, and scalable procurement processes aligned with global strategy.
* Define and implement procurement transformation roadmaps and process excellence frameworks.
Stakeholder Management
* Engage and align with global stakeholders including Procurement, Finance, Legal, IT, and Business Units.
* Act as a change agent for procurement transformation, building awareness and ownership among stakeholders.
* Define KPIs and provide actionable insights to leadership through dashboards and reports.
* Provide regular updates to stakeholders to simplify and clarify complex concepts, and communicate the output of work to the business.
Data Governance & Quality Assurance
* Use data-driven analysis to identify trends, optimize procurement spend, and enable strategic sourcing.
* Build interactive dashboards (preferably in Power BI) and ensure high data integrity and governance.