The purpose of the Data Engineering Specialist role is to build and unit test code for projects and programmes on the Azure Cloud Data and Analytics Platform.
Senior Embedded Software Engineer

Your Job:
* Design, implement, and test embedded software for Smart Motion Products in the field of industrial automation.
* Collaborate within an agile team on projects related to industrial communication, cybersecurity, and closed-loop control.
* Assist in the development and enhancement of infrastructure for continuous integration and industrial Ethernet.
* Utilize and integrate reusable software components from our embedded platform.

Your technical qualification:
* A degree in computer science, software engineering, electrical engineering, or a related field.
o Familiarity with electric and pneumatic systems, along with a willingness to engage with them.
* Proficiency in object-oriented design and the C++ programming language.
o Python for scripting and automation
o Principles of version control and branching with Git
o Skills in troubleshooting and testing embedded software
o Knowledge of unit and integration testing
* 10 years of experience in developing software for embedded systems and industrial communication, particularly for "small systems" with limited resources and embedded RTOS.
o Knowledge of ARM v7/v8 Cortex-M / Cortex-A based microcontrollers and their ecosystems.
* Understanding of industrial communication protocols and stacks (e.g. EtherCAT, Profinet, Modbus, IO-Link) and controllers, such as SIEMENS and Beckhoff.
o Proficiency with modern software tools including VS Code, LLVM, Git, GitLab, CMake, and Conan.
* Familiarity with current software development processes, methods, and relevant standards is a plus.
o Awareness of architectural design principles and understanding of measures and best practices to ensure software quality.
YOUR RESPONSIBILITIES:
WHAT WE ARE LOOKING FOR
Technical:
Personal:
Job Requirements

Education
* Bachelor's degree required, preferably with a quantitative focus (Statistics, Business Analytics, Data Science, Math, Economics, etc.)
* Master's degree preferred (MBA/MS Computer Science/M.Tech Computer Science, etc.)

Relevant Experience
* 3 - 4 years for Data Scientist
* Relevant working experience in a data science/advanced analytics role

Behavioural Skills
* Delivery Excellence
* Business disposition
* Social intelligence
* Innovation and agility

Knowledge
* Functional Analytics (Supply chain analytics, Marketing Analytics, Customer Analytics, etc.)
* Statistical modelling using analytical tools (R, Python, KNIME, etc.)
* Knowledge of statistics and experimental design (A/B testing, hypothesis testing, causal inference)
* Practical experience building scalable ML models, feature engineering, model evaluation metrics, and statistical inference
* Practical experience deploying models using MLOps tools and practices (e.g., MLflow, DVC, Docker, etc.)
* Strong coding proficiency in Python (Pandas, Scikit-learn, PyTorch/TensorFlow, etc.)
* Big data technologies & frameworks (AWS, Azure, GCP, Hadoop, Spark, etc.)
* Enterprise reporting systems, relational (MySQL, Microsoft SQL Server, etc.) and non-relational (MongoDB, DynamoDB) database management systems, and data engineering tools
* Business intelligence & reporting (Power BI, Tableau, Alteryx, etc.)
* Microsoft Office applications (MS Excel, etc.)
Location: Gurgaon
This role leads capital goods procurement for new and ongoing projects, focusing on project and routine Capex, vendor development, cost optimization, and digitalization of procurement processes. It requires strong engineering knowledge, cross-functional collaboration, and strategic leadership to ensure timely, value-driven, and system-compliant procurement outcomes.
Qualifications
* Bachelor's degree in Computer Engineering, Computer Science, Data Analytics, or a related field.
* 3-4 years of experience in BI engineering or data analytics and advanced SQL programming such as PL/SQL.
* Strong hands-on experience with Microsoft Power BI (including Power Query, DAX, dataflows, etc.).
* In-depth knowledge of report design, development and maintenance, data extraction from the DWH and cubes layer, and dimensional data modeling concepts.
* Strong hands-on experience with Python/Spark scripting and shell (Unix/Linux) scripting.
* Familiarity with data warehousing concepts and cloud data platforms (Azure, Snowflake, etc.) and their services, such as Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, and Azure SQL DW/Synapse.
* Excellent communication and stakeholder management skills with strong analytical abilities.
* Self-starter, motivated, and able to work in a fast-paced development environment.
* Proficiency in the development environment, including IDE, database server, Git, continuous integration, unit-testing tools, and defect management tools.

Responsibilities:
* Design, develop, and deploy Power BI dashboards, reports, and data models.
* Support and monitor the existing reports in maintenance, with clear documentation.
* Optimize DAX queries and Power BI performance for scalability and responsiveness.
* Automate existing reports by removing manual steps and adhering to reporting best practices.
* Support report analysis, admin activities, usage monitoring, and licensing cost optimization.
* Write custom scripts to extract data from unstructured/semi-structured sources in the data warehouse, BI cubes, or Snowflake.
* Handle dashboard deployment, report consolidation, and enhancing report usability for end users.
* Work with data engineers to define and validate data from the data sources for Power BI consumption.
* Collaborate with business stakeholders to define KPIs and build self-service BI tools.
* Communicate report status proactively and clearly with stakeholders.
* Provide Power BI training, support, and up-to-date documentation for business users, and conduct KT sessions to guide and mentor junior resources.

Preferred Skills:
* Microsoft Certified: Data Analyst Associate or equivalent Power BI certification.
* Hands-on experience with Microsoft Power BI (including Power Query, DAX, dataflows, etc.).
* Basic understanding of reporting governance and Power BI administration.
* Working knowledge of DevOps processes (CI/CD), Azure DevOps, Git/Jenkins version control, reporting and maintenance of BI reports, and SQL scripting.
* Hands-on experience extracting data from databases (Azure SQL DB, MySQL, Cosmos DB, etc.) and SharePoint, and Python/Unix shell scripting.

Technologies we use: Power BI, Databricks, Python, PySpark, Scripting (PowerShell, Bash), Azure SQL DW/Synapse, Azure Tabular/Cubes, Azure DevOps, Git, Terraform, Snowflake
As a Senior Test Engineer at Bupa Technology, you will take ownership of quality engineering activities within your squad to deliver periodic releases. With an automation-first mindset, and collaborating closely with cross-functional teams, you will design, execute, and maintain comprehensive test plans, coordinate defects, produce test completion reports, and proactively identify quality risks early in the development process.
We are looking for an Export Sales Manager who is well-versed in sales strategies and has excellent negotiation skills. The ideal candidate will have a solid understanding of the Industrial/Manufacturing and Engineering industry.
As a Lead Test Engineer, you will drive the quality engineering strategy across crews/program level, ensuring the delivery of Bupa releases on time. Leveraging technical expertise and leadership capabilities, you will define and implement test strategies, automation practices, and mentor team members to uplift testing maturity. You will collaborate with cross-functional stakeholders to identify risks early and optimize testing frameworks.
The Digital Delivery Lead (AVP/DVP) will oversee the engineering and delivery of core digital platforms, including the sales website and customer app, leading a team of 15+ developers to build scalable, high-performance systems. This role requires strong technical leadership, hands-on expertise in modern web and mobile technologies, and a proven track record in agile project execution.
Job Requirements

Education
* Bachelor's degree required, preferably with a quantitative focus (Statistics, Business Analytics, Data Science, Math, Economics, etc.)
* Master's degree preferred (MBA/MS Computer Science/M.Tech Computer Science, etc.)

Relevant Experience
* 5 - 7 years for Data Scientist
* Relevant working experience in a data science/advanced analytics role

Behavioural Skills
* Delivery Excellence
* Business disposition
* Social intelligence
* Innovation and agility

Experience
* Functional Analytics (Supply chain analytics, Marketing Analytics, Customer Analytics, etc.)
* Statistical modelling using analytical tools (R, Python, KNIME, etc.)
* Knowledge of statistics and experimental design (A/B testing, hypothesis testing, causal inference)
* Practical experience building scalable ML models, feature engineering, model evaluation metrics, and statistical inference
* Practical experience deploying models using MLOps tools and practices (e.g., MLflow, DVC, Docker, etc.)
* Strong coding proficiency in Python (Pandas, Scikit-learn, PyTorch/TensorFlow, etc.)
* Big data technologies & frameworks (AWS, Azure, GCP, Hadoop, Spark, etc.)
* Enterprise reporting systems, relational (MySQL, Microsoft SQL Server, etc.) and non-relational (MongoDB, DynamoDB) database management systems, and data engineering tools
* Business intelligence & reporting (Power BI, Tableau, Alteryx, etc.)
* Microsoft Office applications (MS Excel, etc.)

Roles & Responsibilities

Analytics & Strategy
1. Analyse large-scale structured and unstructured data; develop deep-dive analyses and machine learning models in retail, marketing, merchandising, and other areas of the business
2. Utilize data mining, statistical and machine learning techniques to derive business value from store, product, operations, financial, and customer transactional data
3. Apply multiple algorithms or architectures and recommend the best model with an in-depth description to evangelize data-driven business decisions
4. Utilize the cloud setup to extract processed data for statistical modelling and big data analysis, and visualization tools to represent large sets of time series/cross-sectional data

Operational Excellence
1. Follow industry standards in coding solutions and follow the programming life cycle to ensure standard practices across the project
2. Structure hypotheses, build thoughtful analyses, develop underlying data models and bring clarity to previously undefined problems
3. Partner with Data Engineering to build, design and maintain core data infrastructure, pipelines and data workflows to automate dashboards and analyses

Stakeholder Engagement
1. Work collaboratively across multiple sets of stakeholders - business functions, data engineers, data visualization experts - to deliver on project deliverables
2. Articulate complex data science models to business teams and present the insights in easily understandable and innovative formats
Embedded Software Engineer

Your Job
Professional Competencies and Requirements:
Educational Background: Master's or Bachelor's degree in Computer Science, Electrical Engineering, Automation Technology, or a related field.

Personal Competencies and Requirements:
You can expect the following with us:
Experience Level: 5 - 8 Years
Working Model: Hybrid
Work Location: Bommasandra, Bangalore
Data Scientist (IT)

Position: Data Scientist
Job type: Techno-Functional
Preferred education qualifications: Bachelor's/Master's degree in Statistics, Operations Research, Computer Science, Data Science, or a related quantitative field
Job location: India
Geography: SAPMENA
Required experience: 6-8 Years

Preferred profile/skills:
* 5+ years in developing and implementing forecasting models
* [Mandatory] Proven track record in data analysis (EDA, profiling, sampling) and data engineering (wrangling, storage, pipelines, orchestration)
* [Mandatory] Proven expertise in time series analysis, regression analysis, and other statistical modelling techniques
* [Mandatory] Experience with ML algorithms such as ARIMA, Prophet, Random Forests, and gradient boosting algorithms (XGBoost, LightGBM, CatBoost)
* [Mandatory] Experience in model explainability with Shapley plots and data drift detection metrics
* [Mandatory] Strong programming and analysis skills with Python and SQL, including experience with relevant forecasting packages
* [Mandatory] Prior experience in Data Science and ML Engineering on Google Cloud
* [Mandatory] Proficiency in version control systems such as GitHub
* [Mandatory] Strong organizational capabilities and the ability to work in a matrix/multidisciplinary team
* [Mandatory] Excellent communication and presentation skills, with the ability to explain complex technical concepts to a non-technical audience
* Experience in the Beauty or Retail/FMCG industry is preferred
* Experience in handling large volumes of data (>100 GB)
* Experience in delivering AI-ML projects using Agile methodologies is preferred
* Proven ability to work proactively and independently to address product requirements and design optimal solutions

Job objectives: Design, develop, implement, and maintain data science and machine learning solutions to meet enterprise goals. Collaborate with cross-functional teams to leverage statistical modeling, machine learning, and data mining techniques to improve forecast accuracy and aid strategic decision-making across the organization. Scale the proven AI-ML product across the SAPMENA region.

Job description:
* Develop a deep understanding of business/functional needs, problem statements and objectives/success criteria
* Develop and maintain sophisticated statistical forecasting models, incorporating factors such as seasonality, promotions, media, traffic and other economic indicators
* Collaborate with internal and external stakeholders, including business, data scientists and the product team, to understand business and product needs and translate them into actionable data-driven solutions
* Review MVP implementations, provide recommendations and ensure Data Science best practices and guidelines are followed
* Evaluate and compare the performance of different forecasting models, recommending optimal approaches for various business scenarios
* Analyze large and complex datasets to identify patterns, insights, and potential risks and opportunities
* Communicate forecasting results and insights to both technical and non-technical audiences through clear visualizations and presentations
* Stay up to date with the latest advancements in forecasting techniques and technologies, continuously seeking opportunities for improvement
* Contribute to the development of a robust data infrastructure for AI-ML solutions, ensuring data quality and accessibility
* Collaborate with other data scientists and engineers to build and deploy scalable AI-ML solutions
Qualifications & Required Skills (Full-Time):
* Bachelor's or Master's degree in engineering/technology, computer science, information technology, or related fields.
* 10+ years of total experience in data modeling and database design; experience in the Retail domain is an added advantage.
* 8+ years of experience in data engineering development and support.
* 3+ years of experience leading a technical team of data engineers and BI engineers.
* Proficiency in data modeling tools such as Erwin, ER/Studio, or similar tools.
* Strong knowledge of Azure cloud infrastructure and development using SQL/Python/PySpark with ADF, Synapse and Databricks.
* Hands-on experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Python/PySpark, Logic Apps, Key Vault, and Azure Functions.
* Strong communication, interpersonal and collaboration skills along with leadership capabilities.
* Ability to work effectively in a fast-paced, dynamic environment as a cloud SME.
* Act as the single point of contact for all data management related queries and data decisions.
* Design and manage centralized, end-to-end data architecture solutions, such as data model designs, database development standards, and the implementation and management of data warehouses and data analytics systems.
* Conduct continuous audits of data management system performance and refine where necessary.
* Identify bottlenecks, optimize queries, and implement caching mechanisms to enhance data processing speed.
* Integrate disparate data sources, including internal databases and external application programming interfaces (APIs), enabling the organization to derive insights from a holistic view of the data.
* Ensure data privacy measures comply with regulatory standards.

Preferred
* Azure Data Factory (ADF) or Databricks certification is a plus.
* Data Architect or Azure Cloud Solution Architect certification is a plus.

Technologies we use: Azure Data Factory, Databricks, Azure Synapse, Azure Tabular, Azure Functions, Logic Apps, Key Vault, DevOps, Python, PySpark, Scripting (PowerShell, Bash), Git, Terraform, Power BI, Snowflake
We are seeking a detail-oriented Business Analyst for Requirements Engineering with expertise in technical IT systems to join our dynamic team. In this role, you will act as the bridge between business stakeholders and technical teams, ensuring that our IT systems align with organizational goals and deliver exceptional value.
The "Head of Product Management" will lead the product vision, strategy, strengthening the enterprise product suite, and scaling it to the next phase of growth. In this role, one will define the product roadmap, collaborate with cross-functional teams across design, engineering, GTM, and build a high-performing product team that delivers with velocity and customer-centricity.
Your Responsibilities
* Design and maintain frameworks for product classification and automated security requirements mapping
* Conduct TARAs (Threat Analysis and Risk Assessment) and security assessments for Festo products
* Support product teams in automating the generation of SBOMs (Software Bill of Materials)
* Develop and execute test specifications, test cases and test plans for vulnerability testing of Festo products
* Conduct penetration testing and basic vulnerability assessment of Festo products
* Support documentation of test results and collaborate with the development teams
* Support the continuous improvement and automation of security testing
* Establish and maintain DevSecOps practices within CI/CD environments and develop automation infrastructure
* Support the provision of tools and documentation in the context of SAMM (Software Assurance Maturity Model)
* Collaborate with product compliance and development teams to implement and maintain product security measures
* Support investigation and mitigation of product-related security incidents (PSIRT)

Our Requirements
* Education: Bachelor's degree in Engineering, Computer Science, Mechatronics, Information Science and Electronics, Cyber Security or equivalent
* Mandatory Experience:
o Minimum 2 years of experience in product security, ideally in the industrial automation or automotive field
o Programming knowledge in Python and JavaScript
o Basic familiarity with different industrial protocols and PLC systems
o Experience with CI/CD practices and DevOps
o Basic knowledge of Linux
* Nice-to-Have:
o Understanding of the Secure Development Lifecycle and standards like IEC 62443-3 / 62443-4
o Additional knowledge of programming languages such as C, C++ or shell scripting
o Experience with tools like OpenVAS, Nessus, Nmap, Wireshark, and embedded or IoT penetration testing
o Experience in the embedded domain
o Experience in Linux hardening
We are seeking an experienced engineering leader with over 6 years in the field and at least 3 years in a leadership role, ideally with a strong background in UAV/drone technology, embedded systems, and sensor integration. The ideal candidate is hands-on, thrives in a start-up environment, and brings proven expertise in Agile project management, hardware-software integration, and leading high-performing teams.
Position: ML Engineer
Job type: Techno-Functional
Preferred education qualifications: Bachelor's/Master's degree in Computer Science, Data Science, Machine Learning, or a related technical degree
Job location: India
Geography: SAPMENA
Required experience: 6-8 Years

Preferred profile/skills:
Job objectives: Design, develop, deploy, and maintain data science and machine learning solutions to meet enterprise goals. Collaborate with product managers, data scientists and analysts to identify innovative and optimal machine learning solutions that leverage data to meet business goals. Contribute to the development, rollout and onboarding of data scientists and ML use-cases to the enterprise-wide MLOps framework. Scale the proven ML use-cases across the SAPMENA region. Be responsible for optimal ML costs.

Job description:
Your Responsibilities
Our Requirements
Job Location: Bengaluru, India
Job Type: Full-time
Experience: 2+ years
Location: Gurgaon

External Interfaces / Internal Interfaces
* Contractor(s) - BPO Partners - Global CoE Data & Analytics team
* Vendors - Digital tools (SAP Ariba, Coupa, etc.)
* Vendors - Supplier Relation Management (SRM)
* Global Procurement GNFR
* Finance, Accounting and Legal
* GNFR Stakeholders
* Global CoE - Data & Analytics

Job Requirements

Education
* Bachelor's degree in Supply Chain, Business Administration, Engineering, or a related field.
* MBA or relevant Master's degree preferred.

Relevant Experience
* 10+ years of total experience in procurement, with a strong focus on process optimization, tools implementation, and transformation.
* Proven success in leading or supporting large-scale S2P initiatives.
* Experience with procurement platforms such as SAP Ariba, Coupa, Oracle, and ERP systems like SAP.

Behavioural Skills
* Strong analytical and problem-solving abilities.
* Excellent communication and stakeholder engagement skills.
* Proactive, organized, and adaptable in a fast-paced, global environment.
* Willingness to work flexible hours to collaborate across time zones.
* Strong project management mindset.

Knowledge
* Strong understanding of end-to-end procurement processes, with a focus on Source-to-Pay (S2P) transformation.
* Hands-on knowledge of procurement platforms such as SAP Ariba, Coupa, or other similar tools (e.g., Oracle, Ivalua).
* Experience in integrating procurement systems with ERP platforms such as SAP or Oracle.
* Familiarity with contract lifecycle management (CLM), supplier management, e-sourcing, and invoice automation.
* Working knowledge of data governance, compliance frameworks, and internal controls in a procurement context.
* Proficiency in Power BI or other business intelligence tools (e.g., Tableau, Qlik, Spotfire) for creating dashboards and data-driven insights.
* Strong command of Microsoft Excel (pivot tables, advanced formulas, data modeling) and PowerPoint for executive reporting.
* Basic understanding of cloud platforms (e.g., Azure, AWS, or GCP) for analytics is a plus.

Roles & Responsibilities

Analytics (Data & Insights)
* Drive implementation, optimization, and adoption of procurement platforms such as SAP Ariba, Coupa, and Oracle (including CLM, e-Sourcing, and invoice automation).
* Collaborate with IT and cross-functional teams for seamless ERP integration.
* Monitor tool adoption and identify opportunities for automation and enhancement.

Operational Excellence
* Lead and execute end-to-end procurement transformation initiatives, spanning S2P and P2P workstreams.
* Design and deliver standardized, automated, and scalable procurement processes aligned with global strategy.
* Define and implement procurement transformation roadmaps and process excellence frameworks.

Stakeholder Management
* Engage and align with global stakeholders including Procurement, Finance, Legal, IT, and Business Units.
* Act as a change agent for procurement transformation, building awareness and ownership among stakeholders.
* Define KPIs and provide actionable insights to leadership through dashboards and reports.
* Provide regular updates to stakeholders to simplify and clarify complex concepts, and communicate the output of work to the business.

Data Governance & Quality Assurance
* Use data-driven analysis to identify trends, optimize procurement spend, and enable strategic sourcing.
* Build interactive dashboards (preferably in Power BI) and ensure high data integrity and governance.