As an AWS & Infrastructure Specialist, you will play a critical role in leading the migration of our infrastructure, designing scalable and secure cloud solutions, and optimizing our document and EDI platform. You'll collaborate closely with software engineers, architects, and other infrastructure teams to make key architectural decisions, ensure high performance and security, and oversee the successful transition to the cloud environment.
We are seeking an experienced Sr. MLOps Specialist with deep expertise in AWS services and machine learning deployment best practices to design, build, and maintain scalable, secure, and automated ML pipelines.
The Data Engineering Specialist builds and unit-tests code for projects and programmes on the Azure Cloud Data and Analytics Platform.
The Senior Engineer will participate in or undertake the technical analysis, specification, estimation, design, development, implementation and support of software solutions, working with business users, third parties, and other IT colleagues, as appropriate.
The Software Engineer will participate in or undertake the technical analysis, specification, estimation, design, development, implementation and support of software solutions, working with business users, third parties, and other IT colleagues, as appropriate.
As a Sr. Systems Engineer, you will help develop, maintain, and secure our global infrastructure. The ideal candidate is a flexible, driven, diligent worker who can analyze the environment both as a whole and in detail to provide the business systems and services the organization needs to meet business objectives and SLAs.
The VP - Enterprise Digital Systems & Technologies is a senior enterprise applications leader (ERP/CRM/SaaS) with strong cloud, architecture, and transformation expertise, who has scaled global delivery and can work directly with CXOs to modernize and digitize core business functions.
We are embarking on a major strategic initiative to modernize our core integration capabilities, migrating from a traditional ESB to a modern, cloud-native iPaaS platform. We are seeking a talented and highly skilled Senior Integration Developer to be a core contributor to this transformation.
This position seeks a Python DevOps Developer to build and maintain robust cloud-based solutions using AWS. The role is based in Chennai and requires expertise in Python, DevOps practices, and cloud technologies.
YOUR RESPONSIBILITIES:
WHAT WE ARE LOOKING FOR
Technical:
Personal:
Qualifications:
* Bachelor's degree in Computer Engineering, Computer Science, Data Analytics, or a related field.
* 3-4 years of experience in BI engineering or data analytics, plus advanced SQL programming such as PL/SQL.
* Strong hands-on experience with Microsoft Power BI (including Power Query, DAX, dataflows, etc.).
* In-depth knowledge of report design, development, and maintenance; data extraction from the DWH and cubes layer; and dimensional data modeling concepts.
* Strong hands-on experience with Python/Spark scripting and shell (Unix/Linux) scripting.
* Familiarity with data warehousing concepts and cloud data platforms (Azure, Snowflake, etc.) and their services, such as Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, and Azure SQL DW/Synapse.
* Excellent communication and stakeholder management skills with strong analytical abilities.
* Self-starter, motivated, and able to work in a fast-paced development environment.
* Proficiency in the development environment, including IDE, database server, Git, continuous integration, unit-testing tools, and defect management tools.

Responsibilities:
* Design, develop, and deploy Power BI dashboards, reports, and data models.
* Support and monitor existing reports under maintenance, with clear documentation.
* Optimize DAX queries and Power BI performance for scalability and responsiveness.
* Automate existing reports by removing manual steps and adhering to reporting best practices.
* Support report analysis, admin activities, usage monitoring, and licensing cost optimization.
* Write custom scripts to extract data from unstructured/semi-structured sources in the data warehouse, BI cubes, or Snowflake.
* Handle dashboard deployment and report consolidation, and enhance report usability for end users.
* Work with data engineers to define and validate data from the data sources for Power BI consumption.
* Collaborate with business stakeholders to define KPIs and build self-service BI tools.
* Communicate proactively and clearly with stakeholders about report status.
* Provide Power BI training, support, and up-to-date documentation for business users, and conduct KT sessions to guide and mentor junior team members.

Preferred Skills:
* Microsoft Certified: Data Analyst Associate or equivalent Power BI certification.
* Hands-on experience with Microsoft Power BI (including Power Query, DAX, dataflows, etc.).
* Basic understanding of reporting governance and Power BI administration.
* Working knowledge of DevOps processes (CI/CD), Azure DevOps, Git/Jenkins version control, reporting and maintenance of BI reports, and SQL scripting.
* Hands-on experience extracting data from databases (Azure SQL DB, MySQL, Cosmos DB, etc.) and SharePoint, and with Python/Unix shell scripting.

Technologies we use: Power BI, Databricks, Python, PySpark, Scripting (PowerShell, Bash), Azure SQL DW/Synapse, Azure Tabular/Cubes, Azure DevOps, Git, Terraform, Snowflake
Key Responsibilities:
Required Skills and Qualifications:
Preferred Skills:
Data Scientist (IT)
Position: Data Scientist
Job type: Techno-Functional
Preferred education qualifications: Bachelor's/Master's degree in Statistics, Operations Research, Computer Science, Data Science, or a related quantitative field
Job location: India
Geography: SAPMENA
Required experience: 6-8 years

Preferred profile/skills:
* 5+ years in developing and implementing forecasting models
* [Mandatory] Proven track record in data analysis (EDA, profiling, sampling) and data engineering (wrangling, storage, pipelines, orchestration)
* [Mandatory] Proven expertise in time series analysis, regression analysis, and other statistical modelling techniques
* [Mandatory] Experience with ML algorithms such as ARIMA, Prophet, Random Forests, and gradient boosting algorithms (XGBoost, LightGBM, CatBoost)
* [Mandatory] Experience in model explainability with Shapley plots and data drift detection metrics
* [Mandatory] Strong programming and analysis skills in Python and SQL, including experience with relevant forecasting packages
* [Mandatory] Prior experience with Data Science and ML Engineering on Google Cloud
* [Mandatory] Proficiency with version control systems such as GitHub
* [Mandatory] Strong organizational capabilities and the ability to work in a matrix/multidisciplinary team
* [Mandatory] Excellent communication and presentation skills, with the ability to explain complex technical concepts to non-technical audiences
* Experience in the Beauty or Retail/FMCG industry is preferred
* Experience handling large volumes of data (>100 GB)
* Experience delivering AI/ML projects using Agile methodologies is preferred
* Proven ability to work proactively and independently to address product requirements and design optimal solutions

Job objectives: Design, develop, implement, and maintain data science and machine learning solutions to meet enterprise goals. Collaborate with cross-functional teams to leverage statistical modeling, machine learning, and data mining techniques to improve forecast accuracy and aid strategic decision-making across the organization. Scale the proven AI/ML product across the SAPMENA region.

Job description:
* Develop a deep understanding of business/functional needs, problem statements, and objectives/success criteria
* Develop and maintain sophisticated statistical forecasting models, incorporating factors such as seasonality, promotions, media, traffic, and other economic indicators
* Collaborate with internal and external stakeholders, including business, data science, and product teams, to understand business and product needs and translate them into actionable data-driven solutions
* Review MVP implementations, provide recommendations, and ensure data science best practices and guidelines are followed
* Evaluate and compare the performance of different forecasting models, recommending optimal approaches for various business scenarios
* Analyze large and complex datasets to identify patterns, insights, and potential risks and opportunities
* Communicate forecasting results and insights to both technical and non-technical audiences through clear visualizations and presentations
* Stay up to date with the latest advancements in forecasting techniques and technologies, continuously seeking opportunities for improvement
* Contribute to the development of a robust data infrastructure for AI/ML solutions, ensuring data quality and accessibility
* Collaborate with other data scientists and engineers to build and deploy scalable AI/ML solutions
Role Summary
The Zone Network and Security Project Manager will be responsible for coordinating the design, implementation, and maintenance of secure IT infrastructure solutions. The role's main responsibility is to ensure the organization's IT systems remain resilient, compliant, and aligned with business objectives. The ideal candidate will have a strong technical background in network engineering and cybersecurity, coupled with hands-on experience in project management and cross-functional collaboration. Experience with retail and cloud infrastructure is a plus.

Key Responsibilities
Project Management
Network Engineering and Security
Cloud Infrastructure and Automation
Cybersecurity and Compliance
IT Systems Administration
Collaboration and Support
Qualifications and Skills
Technical Certifications (bonus but not required)
Technical Expertise
Preferred Experience
Demonstrated experience in deploying, configuring, and maintaining Zscaler Private Access (ZPA) and Zscaler Internet Access (ZIA) in an enterprise environment, particularly within a large-scale retail context.

Location: Hyderabad
Mode: Hybrid, 3 days WFO
Qualifications & Required Skills (Full-Time):
* Bachelor's or Master's degree in engineering/technology, computer science, information technology, or related fields.
* 10+ years of total experience in data modeling and database design; experience in the Retail domain is an added advantage.
* 8+ years of experience in data engineering development and support.
* 3+ years of experience leading technical teams of data engineers and BI engineers.
* Proficiency in data modeling tools such as Erwin, ER/Studio, or similar.
* Strong knowledge of Azure cloud infrastructure and development using SQL/Python/PySpark with ADF, Synapse, and Databricks.
* Hands-on experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Python/PySpark, Logic Apps, Key Vault, and Azure Functions.
* Strong communication, interpersonal, and collaboration skills, along with leadership capabilities.
* Ability to work effectively in a fast-paced, dynamic environment as a cloud SME.

Responsibilities:
* Act as the single point of contact for all data management queries and data decisions.
* Design and manage centralized, end-to-end data architecture solutions, such as data model designs, database development standards, and the implementation and management of data warehouses and data analytics systems.
* Conduct continuous audits of data management system performance and refine where necessary.
* Identify bottlenecks, optimize queries, and implement caching mechanisms to enhance data processing speed.
* Integrate disparate data sources, including internal databases and external application programming interfaces (APIs), enabling the organization to derive insights from a holistic view of the data.
* Ensure data privacy measures comply with regulatory standards.

Preferred:
* Azure Data Factory (ADF) or Databricks certification is a plus.
* Data Architect or Azure Cloud Solution Architect certification is a plus.

Technologies we use: Azure Data Factory, Databricks, Azure Synapse, Azure Tabular, Azure Functions, Logic Apps, Key Vault, DevOps, Python, PySpark, Scripting (PowerShell, Bash), Git, Terraform, Power BI, Snowflake
…Delta Lake, and Unity Catalog, all on Azure Cloud. You'll play a critical role in ensuring data security (PII encryption), governance, and cost optimization, while mentoring a talented team of engineers.
We are seeking a highly experienced AWS Technical Architect to lead the design, implementation, and optimization of cloud infrastructure solutions on the Amazon Web Services (AWS) platform.
Hiring for a PE-Backed Portfolio Company - a cloud-first, omnichannel retail ERP suite provider headquartered in Gurgaon.
We are looking for a highly skilled Salesforce Marketing Cloud (SFMC) Specialist to manage and optimize our SFMC platform, build personalized customer journeys, and ensure seamless integration with other systems.
Position: ML Engineer
Job type: Techno-Functional
Preferred education qualifications: Bachelor's/Master's degree in Computer Science, Data Science, Machine Learning, or a related technical degree
Job location: India
Geography: SAPMENA
Required experience: 6-8 years
Preferred profile/skills:
Job objectives: Design, develop, deploy, and maintain data science and machine learning solutions to meet enterprise goals. Collaborate with product managers, data scientists, and analysts to identify innovative and optimal machine learning solutions that leverage data to meet business goals. Contribute to the development, rollout, and onboarding of data scientists and ML use cases onto the enterprise-wide MLOps framework. Scale proven ML use cases across the SAPMENA region. Be responsible for optimal ML costs.

Job description:
Qualifications
Job Description:
Job location: Bengaluru, India
Qualifications:
* Bachelor's degree in Computer Engineering, Computer Science, or a related discipline; Master's degree preferred.
* 3+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional data warehousing environment.
* 3+ years of experience setting up and operating data pipelines using Python or SQL.
* 3+ years of advanced SQL programming: PL/SQL, T-SQL.
* 3+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization.
* Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads.
* 3+ years of strong, extensive hands-on experience in Azure, preferably on data-heavy/analytics applications leveraging relational and NoSQL databases, data warehouses, and big data.
* 3+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure Functions.
* 3+ years of experience defining and enabling data quality standards for auditing and monitoring.
* Strong analytical abilities and strong intellectual curiosity.
* In-depth knowledge of relational database design, data warehousing, and dimensional data modeling concepts.
* Understanding of REST and good API design.
* Strong collaboration and teamwork skills; excellent written and verbal communication skills.
* Self-starter, motivated, and able to work in a fast-paced development environment.
* Agile experience highly desirable.
* Proficiency in the development environment, including IDE, database server, Git, continuous integration, unit-testing tools, and defect management tools.

Preferred Skills:
* Strong knowledge of data engineering concepts (data pipeline creation, data warehousing, data marts/cubes, data reconciliation and audit, data management).
* Strong working knowledge of Snowflake, including warehouse management, Snowflake SQL, and data sharing techniques.
* Experience building pipelines that source from or deliver data into Snowflake in combination with tools like ADF and Databricks.
* Working knowledge of DevOps processes (CI/CD), Git/Jenkins version control, Master Data Management (MDM), and data quality tools.
* Strong experience in ETL/ELT development, QA, and operations/support processes (RCA of production issues, code/data fix strategy, monitoring and maintenance).
* Hands-on experience with databases (Azure SQL DB, MySQL, Cosmos DB, etc.), file systems (Blob Storage), and Python/Unix shell scripting.
* ADF, Databricks, and Azure certifications are a plus.
This is a mid-senior leadership role based in Mumbai (BKC), with full-time work from office. It requires leading IT ops, SAP cloud systems, and digital projects, while collaborating across departments to deliver secure, scalable, and business-aligned tech solutions. It's ideal for someone who wants to build, partner, and lead in a high-impact environment.
Job Requirements

Education
* Bachelor's degree required, preferably with a quantitative focus (Statistics, Business Analytics, Data Science, Math, Economics, etc.)
* Master's degree preferred (MBA/MS Computer Science/M.Tech Computer Science, etc.)

Relevant Experience
* 5-7 years for Data Scientist
* Relevant working experience in a data science/advanced analytics role

Behavioural Skills
* Delivery excellence
* Business disposition
* Social intelligence
* Innovation and agility

Experience
* Functional analytics (supply chain analytics, marketing analytics, customer analytics, etc.)
* Statistical modelling using analytical tools (R, Python, KNIME, etc.)
* Knowledge of statistics and experimental design (A/B testing, hypothesis testing, causal inference)
* Practical experience building scalable ML models, feature engineering, model evaluation metrics, and statistical inference
* Practical experience deploying models using MLOps tools and practices (e.g., MLflow, DVC, Docker)
* Strong coding proficiency in Python (Pandas, Scikit-learn, PyTorch/TensorFlow, etc.)
* Big data technologies and frameworks (AWS, Azure, GCP, Hadoop, Spark, etc.)
* Enterprise reporting systems, relational (MySQL, Microsoft SQL Server, etc.) and non-relational (MongoDB, DynamoDB) database management systems, and data engineering tools
* Business intelligence and reporting (Power BI, Tableau, Alteryx, etc.)
* Microsoft Office applications (MS Excel, etc.)

Roles & Responsibilities

Analytics & Strategy
1. Analyse large-scale structured and unstructured data; develop deep-dive analyses and machine learning models in retail, marketing, merchandising, and other areas of the business
2. Utilize data mining, statistical, and machine learning techniques to derive business value from store, product, operations, financial, and customer transactional data
3. Apply multiple algorithms or architectures and recommend the best model, with an in-depth description, to evangelize data-driven business decisions
4. Utilize the cloud setup to extract processed data for statistical modelling and big data analysis, and visualization tools to represent large sets of time series/cross-sectional data

Operational Excellence
1. Follow industry standards in coding solutions and follow the programming life cycle to ensure standard practices across the project
2. Structure hypotheses, build thoughtful analyses, develop underlying data models, and bring clarity to previously undefined problems
3. Partner with Data Engineering to build, design, and maintain core data infrastructure, pipelines, and data workflows to automate dashboards and analyses

Stakeholder Engagement
1. Work collaboratively across multiple sets of stakeholders (business functions, data engineers, data visualization experts) to deliver on project deliverables
2. Articulate complex data science models to business teams and present insights in easily understandable and innovative formats
Location: Gurgaon

External Interfaces
* Contractor(s) - BPO Partners
* Vendors - Digital tools (SAP Ariba, Coupa, etc.)
* Vendors - Supplier Relationship Management (SRM)

Internal Interfaces
* Global CoE Data & Analytics team
* Global Procurement GNFR
* Finance, Accounting, and Legal
* GNFR stakeholders
* Global CoE - Data & Analytics

Job Requirements

Education
* Bachelor's degree in Supply Chain, Business Administration, Engineering, or a related field.
* MBA or relevant Master's degree preferred.

Relevant Experience
* 10+ years of total experience in procurement, with a strong focus on process optimization, tools implementation, and transformation.
* Proven success in leading or supporting large-scale S2P initiatives.
* Experience with procurement platforms such as SAP Ariba, Coupa, and Oracle, and ERP systems like SAP.

Behavioural Skills
* Strong analytical and problem-solving abilities.
* Excellent communication and stakeholder engagement skills.
* Proactive, organized, and adaptable in a fast-paced, global environment.
* Willingness to work flexible hours to collaborate across time zones.
* Strong project management mindset.

Knowledge
* Strong understanding of end-to-end procurement processes, with a focus on Source-to-Pay (S2P) transformation.
* Hands-on knowledge of procurement platforms such as SAP Ariba, Coupa, or similar tools (e.g., Oracle, Ivalua).
* Experience integrating procurement systems with ERP platforms such as SAP or Oracle.
* Familiarity with contract lifecycle management (CLM), supplier management, e-sourcing, and invoice automation.
* Working knowledge of data governance, compliance frameworks, and internal controls in a procurement context.
* Proficiency in Power BI or other business intelligence tools (e.g., Tableau, Qlik, Spotfire) for creating dashboards and data-driven insights.
* Strong command of Microsoft Excel (pivot tables, advanced formulas, data modeling) and PowerPoint for executive reporting.
* Basic understanding of cloud platforms (e.g., Azure, AWS, or GCP) for analytics is a plus.

Roles & Responsibilities

Analytics (Data & Insights)
* Drive implementation, optimization, and adoption of procurement platforms such as SAP Ariba, Coupa, and Oracle (including CLM, e-sourcing, and invoice automation).
* Collaborate with IT and cross-functional teams for seamless ERP integration.
* Monitor tool adoption and identify opportunities for automation and enhancement.

Operational Excellence
* Lead and execute end-to-end procurement transformation initiatives, spanning S2P and P2P workstreams.
* Design and deliver standardized, automated, and scalable procurement processes aligned with global strategy.
* Define and implement procurement transformation roadmaps and process excellence frameworks.

Stakeholder Management
* Engage and align with global stakeholders, including Procurement, Finance, Legal, IT, and Business Units.
* Act as a change agent for procurement transformation, building awareness and ownership among stakeholders.
* Define KPIs and provide actionable insights to leadership through dashboards and reports.
* Provide regular updates to stakeholders to simplify and clarify complex concepts, and communicate the output of work to the business.

Data Governance & Quality Assurance
* Use data-driven analysis to identify trends, optimize procurement spend, and enable strategic sourcing.
* Build interactive dashboards (preferably in Power BI) and ensure high data integrity and governance.
This role involves leading Adobe Experience Manager (AEM) and Salesforce Marketing Cloud (SFMC) solutions within the technology department for a prominent business services company in Bangalore. The ideal candidate will drive innovative digital marketing initiatives and ensure seamless integration across platforms in the facilities management industry.