The client is seeking a dynamic Digital Analytics Leader to shape its global digital future by building and driving digital analytics and AI practices, delivering insights, and enabling enterprise-wide automation. The role will leverage current and emerging technologies to transform the client's business, driving intelligent automation solutions and shaping the AI/ML, analytics, and reporting strategy.
The purpose of the Data Engineering Specialist role is to build and unit-test code for projects and programmes on the Azure Cloud Data and Analytics Platform.
- This role focuses on supporting strategic HR projects with an emphasis on compensation, people analytics and talent management.
- It involves analyzing HR data, conducting benchmarking and driving initiatives like org design, job evaluation and workforce planning to support business goals.
You will have a key role in advancing how GIA utilises available data for its risk assessment and assurance planning, aiming to integrate this into a real-time data-led ecosystem. Your responsibilities include driving innovation and increasing the productivity of the audit function through the use of data, the development of predictive models, and the application of Data Science techniques in the department.
Delta Lake, and Unity Catalog, all on Azure Cloud. You'll play a critical role in ensuring data security (PII encryption), governance, and cost optimization, while mentoring a talented team of engineers.
The Senior Demand Planner Analyst must demonstrate the ability to effectively lead their respective staff in properly planning for shifts in consumer purchasing while optimizing product inventory levels. Product forecasting will be managed in Relex (an inventory planning system), interfacing with Merchants and Vendors for their assigned area of business and ensuring proper planning for item introductions, promotions, item discontinuations, and more.
Lead the design and continuous improvement of Data Science programs, ensuring academic excellence and industry relevance. Manage and mentor faculty, deliver impactful sessions, and represent Hero Vired at conferences and external forums. In addition, establish and uphold academic rigor and delivery excellence across diverse learning formats (online, hybrid, live).
We are looking for a customer-focused & business-oriented Data Scientist who will be responsible for conducting analyses using advanced statistics and data mining techniques to enable better decision-making.
As a Data Analyst within our Data & Analytics Team, you will be a crucial partner to the business. Collaborating closely with our business teams across divisions, you will leverage data to drive strategic decision-making and optimize operational efficiency.
Job Description: We are looking for a talented and experienced Senior Data Engineer with strong expertise in Snowflake to join our team. As a Senior Data Engineer, you will play a critical role in developing and optimising data systems and data pipelines, and in ensuring seamless data transformations to support business intelligence and data-driven decisions.
This position will be responsible for all aspects of the Order Management System and its integration. Drive and manage projects, especially in a brand-integration context, covering design, build follow-up with developers, testing, cutover and hypercare. The role requires the individual to work closely with all teams (IT & Digital Teams, Key Business Users, Data & Analytics).
Qualifications
* Bachelor's degree in Computer Engineering, Computer Science, Data Analytics, or a related field.
* 3-4 years of experience in BI engineering or data analytics and advanced SQL programming such as PL/SQL.
* Strong hands-on experience with Microsoft Power BI (including Power Query, DAX, dataflows, etc.).
* In-depth knowledge of report design, development and maintenance, data extraction from the DWH and cubes layer, and dimensional data modeling concepts.
* Strong hands-on experience with Python/Spark scripting and shell (Unix/Linux) scripting.
* Familiarity with data warehousing concepts and cloud data platforms (Azure, Snowflake, etc.) and their services, such as Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, and Azure SQL DW/Synapse.
* Excellent communication and stakeholder management skills with strong analytical abilities.
* Self-starter, motivated, and able to work in a fast-paced development environment.
* Proficiency in the development environment, including IDE, database server, Git, continuous integration, unit-testing tools, and defect management tools.
Responsibilities:
* Design, develop, and deploy Power BI dashboards, reports, and data models.
* Support and monitor existing reports under maintenance, with clear documentation.
* Optimize DAX queries and Power BI performance for scalability and responsiveness.
* Automate existing reports by removing manual steps and adhering to reporting best practices.
* Support report analysis, admin activities, usage monitoring, and licensing cost optimization.
* Write custom scripts to extract data from unstructured/semi-structured sources, the data warehouse, BI cubes, or Snowflake.
* Handle dashboard deployment, report consolidation, and enhancement of report usability for end users.
* Work with data engineers to define and validate data from source systems for Power BI consumption.
* Collaborate with business stakeholders to define KPIs and build self-service BI tools.
* Communicate report status proactively and clearly with stakeholders.
* Provide Power BI training, support, and up-to-date documentation for business users, and conduct KT sessions to guide and mentor junior resources.
Preferred Skills:
* Microsoft Certified: Data Analyst Associate or equivalent Power BI certification.
* Hands-on experience with Microsoft Power BI (including Power Query, DAX, dataflows, etc.).
* Basic understanding of reporting governance and Power BI administration.
* Working knowledge of DevOps processes (CI/CD), Azure DevOps, Git/Jenkins version control, reporting and maintenance of BI reports, and SQL scripting.
* Hands-on experience extracting data from databases (Azure SQL DB, MySQL, Cosmos DB, etc.), SharePoint, and Python/Unix shell scripting.
Technologies we use: Power BI, Databricks, Python, PySpark, Scripting (PowerShell, Bash), Azure SQL DW/Synapse, Azure Tabular/Cubes, Azure DevOps, Git, Terraform, Snowflake
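To make the "custom scripts to extract data from unstructured/semi-structured sources for Power BI consumption" responsibility concrete, here is a minimal, hedged Python/pandas sketch; the JSON payload, output file name, and column names are hypothetical rather than taken from the posting.

```python
import json
import pandas as pd

# Hypothetical semi-structured export (e.g., an API or log dump); in practice
# the source could also be the DWH, a BI cube, or Snowflake.
raw = json.loads("""
[
  {"order_id": 1, "customer": {"id": "C01", "region": "EMEA"},
   "lines": [{"sku": "A1", "qty": 2}, {"sku": "B2", "qty": 1}]},
  {"order_id": 2, "customer": {"id": "C02", "region": "APAC"},
   "lines": [{"sku": "A1", "qty": 5}]}
]
""")

# Flatten the nested order lines into a tidy table that Power BI can consume
# directly (one row per order line, customer attributes repeated per row).
flat = pd.json_normalize(
    raw,
    record_path="lines",
    meta=["order_id", ["customer", "id"], ["customer", "region"]],
)
flat.columns = ["sku", "qty", "order_id", "customer_id", "customer_region"]

flat.to_csv("order_lines_for_powerbi.csv", index=False)
print(flat)
```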
You will have a key role in advancing how GIA utilises available data for its risk assessment and assurance planning, aiming to integrate this into a real-time data-led ecosystem. Your responsibilities include driving innovation and increasing the productivity of the audit function through the use of data, automation, analytics and Data Engineering techniques within the department.
Job Requirements
Education
* Bachelor's degree in business or a related field is preferred
Relevant Experience
* 2+ years of experience in inventory demand planning
* Proven track record in setting and achieving short-, medium- and long-range inventory plans
* Exceptional ability to communicate complex analytics in a clear, actionable manner to influence decision-making
* Strong interpersonal skills to effectively collaborate with internal teams and external parties
* Skilled in monitoring and measuring inventory demand planning performance against goals, with the ability to provide insights and make timely adjustments
* Results-driven mindset with the ability to balance strategic planning and tactical execution
Behavioural Skills
* Delivery Excellence
* Business Disposition
* Social Intelligence
* Innovation & Agility
* Purposeful Leadership
Technical Skills
* Strong analytical skills with proficiency in inventory management software and tools (e.g., PDI, Relex)
* Advanced proficiency in Excel and other data analysis tools; experience with visualization tools (e.g., Power BI, Tableau)
Knowledge
* Data Analytics
* Good knowledge of Power BI, MS Excel and Tableau
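As a rough illustration of measuring demand-planning performance against goals, a minimal pandas sketch follows; the items, weeks, and figures are invented, and in practice the inputs would be exported from Relex or the data warehouse.

```python
import pandas as pd

# Hypothetical weekly demand data: planned forecast vs. actual sales per item.
data = pd.DataFrame({
    "item":     ["A", "A", "B", "B"],
    "week":     [1, 2, 1, 2],
    "forecast": [120, 130, 80, 75],
    "actual":   [110, 145, 82, 60],
})

# Forecast-accuracy metrics commonly reviewed in demand-planning cycles.
data["abs_pct_error"] = (data["forecast"] - data["actual"]).abs() / data["actual"]
data["bias"] = data["forecast"] - data["actual"]

summary = data.groupby("item").agg(
    mape=("abs_pct_error", "mean"),   # mean absolute percentage error
    total_bias=("bias", "sum"),       # over- (+) or under- (-) forecasting
)
print(summary)
```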
The role of a Data Engineer involves designing, developing, and maintaining robust data pipelines and systems to support analytics and decision-making in the manufacturing industry. This position is based in Pune and requires strong technical skills combined with a passion for working with data.
Location: Gurgaon
External Interfaces
* Contractor(s) - BPO Partners
* Vendors - Digital tools (SAP Ariba, Coupa, etc.)
* Vendors - Supplier Relationship Management (SRM)
Internal Interfaces
* Global CoE Data & Analytics team
* Global Procurement GNFR
* Finance, Accounting and Legal
* GNFR Stakeholders
* Global CoE - Data & Analytics
Job Requirements
Education
* Bachelor's degree in Supply Chain, Business Administration, Engineering, or a related field.
* MBA or relevant Master's degree preferred.
Relevant Experience
* 10+ years of total experience in procurement, with a strong focus on process optimization, tools implementation, and transformation.
* Proven success in leading or supporting large-scale S2P initiatives.
* Experience with procurement platforms such as SAP Ariba, Coupa, Oracle, and ERP systems like SAP.
Behavioural Skills
* Strong analytical and problem-solving abilities.
* Excellent communication and stakeholder engagement skills.
* Proactive, organized, and adaptable in a fast-paced, global environment.
* Willingness to work flexible hours to collaborate across time zones.
* Strong project management mindset.
Knowledge
* Strong understanding of end-to-end procurement processes, with a focus on Source-to-Pay (S2P) transformation.
* Hands-on knowledge of procurement platforms such as SAP Ariba, Coupa, or other similar tools (e.g., Oracle, Ivalua).
* Experience in integrating procurement systems with ERP platforms such as SAP or Oracle.
* Familiarity with contract lifecycle management (CLM), supplier management, e-sourcing, and invoice automation.
* Working knowledge of data governance, compliance frameworks, and internal controls in a procurement context.
* Proficiency in Power BI or other business intelligence tools (e.g., Tableau, Qlik, Spotfire) for creating dashboards and data-driven insights.
* Strong command of Microsoft Excel (pivot tables, advanced formulas, data modeling) and PowerPoint for executive reporting.
* Basic understanding of cloud platforms (e.g., Azure, AWS, or GCP) for analytics is a plus.
Roles & Responsibilities
Analytics (Data & Insights)
* Drive implementation, optimization, and adoption of procurement platforms such as SAP Ariba, Coupa, and Oracle (including CLM, e-Sourcing, and invoice automation).
* Collaborate with IT and cross-functional teams for seamless ERP integration.
* Monitor tool adoption and identify opportunities for automation and enhancement.
Operational Excellence
* Lead and execute end-to-end procurement transformation initiatives, spanning S2P and P2P workstreams.
* Design and deliver standardized, automated, and scalable procurement processes aligned with the global strategy.
* Define and implement procurement transformation roadmaps and process excellence frameworks.
Stakeholder Management
* Engage and align with global stakeholders including Procurement, Finance, Legal, IT, and Business Units.
* Act as a change agent for procurement transformation, building awareness and ownership among stakeholders.
* Define KPIs and provide actionable insights to leadership through dashboards and reports.
* Provide regular updates to stakeholders to simplify and clarify complex concepts, and communicate the output of work to the business.
Data Governance & Quality Assurance
* Use data-driven analysis to identify trends, optimize procurement spend, and enable strategic sourcing.
* Build interactive dashboards (preferably in Power BI) and ensure high data integrity and governance.
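The spend-analysis and data-quality responsibilities above could look roughly like the following pandas sketch; the suppliers, categories, and amounts are hypothetical placeholders for an Ariba/Coupa export feeding a Power BI dashboard.

```python
import pandas as pd

# Hypothetical purchase-order lines, standing in for an export from an S2P
# platform such as Ariba or Coupa; all values are illustrative.
po_lines = pd.DataFrame({
    "supplier": ["Acme Corp", "Acme Corp", "Globex", None],
    "category": ["IT Hardware", "IT Hardware", "Facilities", "Facilities"],
    "po_date":  ["2024-04-03", "2024-04-17", "2024-04-20", "2024-05-02"],
    "amount":   [12500.0, 8300.0, 4100.0, -50.0],
})

# Basic data-quality check supporting the governance responsibilities above:
# flag rows with missing suppliers or non-positive amounts before publishing.
issues = po_lines[po_lines["supplier"].isna() | (po_lines["amount"] <= 0)]
print(f"{len(issues)} PO line(s) need remediation before the dashboard refresh")

# Aggregate clean spend by month, category and supplier -- the kind of summary
# that typically feeds a Power BI spend dashboard for strategic sourcing.
clean = po_lines.drop(issues.index)
spend_summary = (
    clean.assign(po_month=pd.to_datetime(clean["po_date"]).dt.to_period("M"))
         .groupby(["po_month", "category", "supplier"], as_index=False)["amount"]
         .sum()
         .sort_values("amount", ascending=False)
)
print(spend_summary)
```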
Qualifications & Required Skills:
* Full-time Bachelor's or Master's degree in engineering/technology, computer science, information technology, or related fields.
* 10+ years of total experience in data modeling and database design; experience in the Retail domain is an added advantage.
* 8+ years of experience in data engineering development and support.
* 3+ years of experience leading technical teams of data engineers and BI engineers.
* Proficiency in data modeling tools such as Erwin, ER/Studio, or similar.
* Strong knowledge of Azure cloud infrastructure and development using SQL/Python/PySpark with ADF, Synapse and Databricks.
* Hands-on experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Python/PySpark, Logic Apps, Key Vault, and Azure Functions.
* Strong communication, interpersonal and collaboration skills, along with leadership capabilities.
* Ability to work effectively in a fast-paced, dynamic environment as the cloud SME.
* Act as the single point of contact for all data management related queries and data decisions.
* Design and manage centralized, end-to-end data architecture solutions, such as data model designs, database development standards, and the implementation and management of data warehouses and data analytics systems.
* Conduct continuous audits of data management system performance and refine where necessary.
* Identify bottlenecks, optimize queries, and implement caching mechanisms to enhance data processing speed.
* Integrate disparate data sources, including internal databases and external application programming interfaces (APIs), enabling the organization to derive insights from a holistic view of the data.
* Ensure data privacy measures comply with regulatory standards.
Preferred
* Azure Data Factory (ADF) or Databricks certification is a plus.
* Data Architect or Azure Cloud Solution Architect certification is a plus.
Technologies we use: Azure Data Factory, Databricks, Azure Synapse, Azure Tabular, Azure Functions, Logic Apps, Key Vault, DevOps, Python, PySpark, Scripting (PowerShell, Bash), Git, Terraform, Power BI, Snowflake
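As a hedged sketch of the dimensional-modeling work described above (splitting a raw extract into fact and dimension tables), the following PySpark snippet uses a local Spark session and invented store/product data in place of a Databricks or Synapse workload.

```python
from pyspark.sql import SparkSession, functions as F

# A local session stands in for a Databricks/Synapse Spark cluster here.
spark = SparkSession.builder.appName("star_schema_sketch").getOrCreate()

# Hypothetical raw sales extract; in practice this would be read from
# Blob Storage or a Delta table via ADF or Databricks.
raw = spark.createDataFrame(
    [("S1", "Store A", "P100", "Widget", "2024-05-01", 3, 29.97),
     ("S1", "Store A", "P200", "Gadget", "2024-05-01", 1, 14.99),
     ("S2", "Store B", "P100", "Widget", "2024-05-02", 2, 19.98)],
    ["store_id", "store_name", "product_id", "product_name", "sale_date", "qty", "amount"],
)

# Dimension tables: one row per store / product (deduplicated attributes).
dim_store = raw.select("store_id", "store_name").dropDuplicates(["store_id"])
dim_product = raw.select("product_id", "product_name").dropDuplicates(["product_id"])

# Fact table keeps only keys and measures, per dimensional-modeling practice.
fact_sales = raw.select(
    "store_id", "product_id",
    F.to_date("sale_date").alias("sale_date"),
    "qty", "amount",
)

fact_sales.show()
```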
The Head of Collections for Secured Business is responsible for leading and managing the collections function for secured loan portfolios (such as home loans, auto loans, etc.). This role ensures effective recovery strategies, minimizes delinquency, and maintains regulatory compliance. It involves overseeing collections teams, setting performance targets, managing vendor relationships, and using data analytics to optimize recovery efforts.
This role involves leading the development and maintenance of client and custom applications, contributing to technology solutions in the commercial insurance value chain. Positioned within the Data & Analytics division, the focus is on developing technology products for strategic clients. Responsibilities include identifying, managing, and implementing digitization opportunities, and using data science to transform the insurance experience.
Job Requirements
Education
* Bachelor's degree required, preferably with a quantitative focus (Statistics, Business Analytics, Data Science, Math, Economics, etc.)
* Master's degree preferred (MBA/MS Computer Science/M.Tech Computer Science, etc.)
Relevant Experience
* 3-4 years for Data Scientist
* Relevant working experience in a data science/advanced analytics role
Behavioural Skills
* Delivery Excellence
* Business disposition
* Social intelligence
* Innovation and agility
Knowledge
* Functional analytics (supply chain analytics, marketing analytics, customer analytics, etc.)
* Statistical modelling using analytical tools (R, Python, KNIME, etc.)
* Knowledge of statistics and experimental design (A/B testing, hypothesis testing, causal inference)
* Practical experience building scalable ML models, feature engineering, model evaluation metrics, and statistical inference
* Practical experience deploying models using MLOps tools and practices (e.g., MLflow, DVC, Docker, etc.)
* Strong coding proficiency in Python (Pandas, Scikit-learn, PyTorch/TensorFlow, etc.)
* Big data technologies and frameworks (AWS, Azure, GCP, Hadoop, Spark, etc.)
* Enterprise reporting systems, relational (MySQL, Microsoft SQL Server, etc.) and non-relational (MongoDB, DynamoDB) database management systems, and data engineering tools
* Business intelligence and reporting (Power BI, Tableau, Alteryx, etc.)
* Microsoft Office applications (MS Excel, etc.)
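To illustrate the experimental-design item above (A/B testing and hypothesis testing), here is a minimal sketch using simulated conversion data and SciPy; the conversion rates and sample sizes are invented.

```python
import numpy as np
from scipy import stats

# Simulated A/B test data (hypothetical 0/1 conversions for two variants);
# real inputs would come from experiment logs in the warehouse.
rng = np.random.default_rng(42)
control = rng.binomial(1, 0.11, size=5000)     # 11% baseline conversion
treatment = rng.binomial(1, 0.125, size=5000)  # variant with a lift to detect

# Compare the two groups with a two-sample Welch t-test on the 0/1 outcomes
# (a z-test or chi-squared test would be equally standard for this design).
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)

lift = treatment.mean() - control.mean()
print(f"observed lift: {lift:.3%}, p-value: {p_value:.4f}")
if p_value < 0.05:
    print("Reject the null hypothesis: the variant's conversion rate differs.")
else:
    print("Insufficient evidence of a difference at the 5% level.")
```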
Location: Gurgaon
The Data and AI Governance Leader will play an integral role in orchestrating the governance council and ensuring compliance with governance policies in data, analytics, and AI. Their participation will be instrumental in defining a robust Data and AI Governance Framework, codifying organizational and industry policies into standards, procedures and technical controls that ensure the effective, ethical, and legal use of Data & AI technologies.
Qualifications
* Bachelor's degree in Computer Engineering, Computer Science or a related discipline; Master's degree preferred.
* 3+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional data warehousing environment.
* 3+ years of experience setting up and operating data pipelines using Python or SQL.
* 3+ years of advanced SQL programming: PL/SQL, T-SQL.
* 3+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization.
* Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads.
* 3+ years of strong, extensive hands-on experience in Azure, preferably on data-heavy/analytics applications leveraging relational and NoSQL databases, data warehouses and big data.
* 3+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure Functions.
* 3+ years of experience defining and enabling data quality standards for auditing and monitoring.
* Strong analytical abilities and strong intellectual curiosity.
* In-depth knowledge of relational database design, data warehousing and dimensional data modeling concepts.
* Understanding of REST and good API design.
* Strong collaboration and teamwork skills; excellent written and verbal communication skills.
* Self-starter, motivated, and able to work in a fast-paced development environment.
* Agile experience highly desirable.
* Proficiency in the development environment, including IDE, database server, Git, continuous integration, unit-testing tools, and defect management tools.
Preferred Skills:
* Strong knowledge of data engineering concepts (data pipeline creation, data warehousing, data marts/cubes, data reconciliation and audit, data management).
* Strong working knowledge of Snowflake, including warehouse management, Snowflake SQL, and data sharing techniques.
* Experience building pipelines that source from or deliver data into Snowflake in combination with tools like ADF and Databricks.
* Working knowledge of DevOps processes (CI/CD), Git/Jenkins version control, Master Data Management (MDM) and data quality tools.
* Strong experience in ETL/ELT development, QA and operations/support processes (RCA of production issues, code/data fix strategy, monitoring and maintenance).
* Hands-on experience with databases (Azure SQL DB, MySQL, Cosmos DB, etc.), file systems (Blob Storage), and Python/Unix shell scripting.
* ADF, Databricks and Azure certification is a plus.
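A minimal sketch of a pipeline step that delivers data into Snowflake, assuming the snowflake-connector-python package; the account, credentials, and table names are placeholders, and in production this load would typically be orchestrated through ADF or Databricks with secrets held in Key Vault.

```python
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Hypothetical connection details -- in a real pipeline these would come from
# Azure Key Vault or pipeline parameters rather than being hard-coded.
conn = snowflake.connector.connect(
    account="my_account",        # placeholder
    user="etl_service_user",     # placeholder
    password="********",         # placeholder
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

# A small transformed batch standing in for the output of an upstream
# ADF/Databricks step; column names are illustrative.
df = pd.DataFrame({
    "ORDER_ID": [1001, 1002],
    "ORDER_DATE": ["2024-06-01", "2024-06-02"],
    "AMOUNT": [250.0, 99.5],
})

# Bulk-load the DataFrame into an existing staging table, then check row counts
# as a basic data-quality/audit step before downstream consumption.
success, _, nrows, _ = write_pandas(conn, df, "STG_ORDERS")
print(f"load succeeded: {success}, rows written: {nrows}")

conn.close()
```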
Job Requirements
Education
* Bachelor's degree required, preferably with a quantitative focus (Statistics, Business Analytics, Data Science, Math, Economics, etc.)
* Master's degree preferred (MBA/MS Computer Science/M.Tech Computer Science, etc.)
Relevant Experience
* 5-7 years for Data Scientist
* Relevant working experience in a data science/advanced analytics role
Behavioural Skills
* Delivery Excellence
* Business disposition
* Social intelligence
* Innovation and agility
Experience
* Functional analytics (supply chain analytics, marketing analytics, customer analytics, etc.)
* Statistical modelling using analytical tools (R, Python, KNIME, etc.)
* Knowledge of statistics and experimental design (A/B testing, hypothesis testing, causal inference)
* Practical experience building scalable ML models, feature engineering, model evaluation metrics, and statistical inference
* Practical experience deploying models using MLOps tools and practices (e.g., MLflow, DVC, Docker, etc.)
* Strong coding proficiency in Python (Pandas, Scikit-learn, PyTorch/TensorFlow, etc.)
* Big data technologies and frameworks (AWS, Azure, GCP, Hadoop, Spark, etc.)
* Enterprise reporting systems, relational (MySQL, Microsoft SQL Server, etc.) and non-relational (MongoDB, DynamoDB) database management systems, and data engineering tools
* Business intelligence and reporting (Power BI, Tableau, Alteryx, etc.)
* Microsoft Office applications (MS Excel, etc.)
Roles & Responsibilities
Analytics & Strategy
1. Analyse large-scale structured and unstructured data; develop deep-dive analyses and machine learning models in retail, marketing, merchandising, and other areas of the business
2. Utilize data mining, statistical and machine learning techniques to derive business value from store, product, operations, financial, and customer transactional data
3. Apply multiple algorithms or architectures and recommend the best model, with an in-depth description, to evangelize data-driven business decisions
4. Utilize the cloud setup to extract processed data for statistical modelling and big data analysis, and visualization tools to represent large sets of time series/cross-sectional data
Operational Excellence
1. Follow industry standards in coding solutions and follow the programming life cycle to ensure standard practices across the project
2. Structure hypotheses, build thoughtful analyses, develop underlying data models and bring clarity to previously undefined problems
3. Partner with Data Engineering to build, design and maintain core data infrastructure, pipelines and data workflows to automate dashboards and analyses
Stakeholder Engagement
1. Work collaboratively across multiple sets of stakeholders - business functions, data engineers, data visualization experts - to deliver on project deliverables
2. Articulate complex data science models to business teams and present insights in easily understandable and innovative formats
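As a hedged example of the ML-plus-MLOps skills listed above, the sketch below trains a scikit-learn model on synthetic data and logs parameters, metrics, and the model artifact to MLflow's local tracking store; the run name and hyperparameters are illustrative, not prescribed by the posting.

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for customer transaction features; a real project would
# pull curated features from the warehouse or feature pipelines instead.
X, y = make_classification(n_samples=5000, n_features=20, random_state=7)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=7)

# Track the experiment with MLflow (local tracking store by default).
with mlflow.start_run(run_name="churn_baseline_sketch"):
    params = {"n_estimators": 200, "max_depth": 6}
    model = RandomForestClassifier(**params, random_state=7).fit(X_train, y_train)

    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])

    mlflow.log_params(params)                 # hyperparameters for reproducibility
    mlflow.log_metric("test_auc", auc)        # evaluation metric for model comparison
    mlflow.sklearn.log_model(model, "model")  # artifact that can later be served

print(f"test AUC: {auc:.3f}")
```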
This position is responsible for defining and aligning the analytics vision, managing the execution of data initiatives, leading data talent, driving analytics maturity and ensuring the business is powered by data solutions. He/She will also facilitate analytics use-case generation, lead selection and prioritization, and oversee the validation process. The position serves as an advisor to the leadership team.