The client is seeking an experienced Data Modeler specialising in scalable data model design for HRIS and IAM systems. The successful candidate will translate business needs into robust conceptual, logical, and physical models supporting both transactional and analytical use cases, and will collaborate with cross-functional teams to define data architecture, ensure governance and compliance, and optimize ETL pipelines.
This role requires a highly organized and experienced individual to lead and manage complex data-centric programs, incorporating strong change management principles and process analysis capabilities. The Data Program Owner will oversee all aspects of data initiatives, from inception to completion, ensuring projects are delivered on time, within budget, and meet business objectives.
The Data and AI Governance Leader will play an integral role in orchestrating the governance council and ensuring compliance with governance policies across data, analytics, and AI. They will be instrumental in defining a robust Data and AI Governance Framework, codifying organizational and industry policies into standards, procedures, and technical controls that ensure the effective, ethical, and legal use of Data & AI technologies.
The role of Engineering Manager requires expertise in leading technology teams and delivering high-quality software solutions. The position is based in Bangalore and demands a strong understanding of Big Data, AWS, and CI/CD pipelines.
Job Requirements

Education
* Bachelor's degree required, preferably with a quantitative focus (Statistics, Business Analytics, Data Science, Math, Economics, etc.)
* Master's degree preferred (MBA/MS Computer Science/M.Tech Computer Science, etc.)

Relevant Experience
* 5-7 years for Data Scientist
* Relevant working experience in a data science/advanced analytics role

Behavioural Skills
* Delivery Excellence
* Business disposition
* Social intelligence
* Innovation and agility

Experience
* Functional Analytics (supply chain analytics, marketing analytics, customer analytics, etc.)
* Statistical modelling using analytical tools (R, Python, KNIME, etc.)
* Knowledge of statistics and experimental design (A/B testing, hypothesis testing, causal inference)
* Practical experience building scalable ML models, feature engineering, model evaluation metrics, and statistical inference
* Practical experience deploying models using MLOps tools and practices (e.g., MLflow, DVC, Docker)
* Strong coding proficiency in Python (Pandas, Scikit-learn, PyTorch/TensorFlow, etc.)
* Big data technologies & frameworks (AWS, Azure, GCP, Hadoop, Spark, etc.)
* Enterprise reporting systems; relational (MySQL, Microsoft SQL Server, etc.) and non-relational (MongoDB, DynamoDB) database management systems; Data Engineering tools
* Business intelligence & reporting (Power BI, Tableau, Alteryx, etc.)
* Microsoft Office applications (MS Excel, etc.)

Roles & Responsibilities

Analytics & Strategy
1. Analyse large-scale structured and unstructured data; develop deep-dive analyses and machine learning models in retail, marketing, merchandising, and other areas of the business
2. Utilize data mining, statistical, and machine learning techniques to derive business value from store, product, operations, financial, and customer transactional data
3. Apply multiple algorithms or architectures and recommend the best model with an in-depth description to evangelize data-driven business decisions
4. Utilize cloud setup to extract processed data for statistical modelling and big data analysis, and visualization tools to represent large sets of time series/cross-sectional data

Operational Excellence
1. Follow industry standards in coding solutions and follow the programming life cycle to ensure standard practices across the project
2. Structure hypotheses, build thoughtful analyses, develop underlying data models, and bring clarity to previously undefined problems
3. Partner with Data Engineering to build, design, and maintain core data infrastructure, pipelines, and data workflows to automate dashboards and analyses

Stakeholder Engagement
1. Work collaboratively across multiple sets of stakeholders (business functions, Data Engineers, Data Visualization experts) to deliver on project deliverables
2. Articulate complex data science models to business teams and present insights in easily understandable and innovative formats
Job Requirements

Education
* Bachelor's degree required, preferably with a quantitative focus (Statistics, Business Analytics, Data Science, Math, Economics, etc.)
* Master's degree preferred (MBA/MS Computer Science/M.Tech Computer Science, etc.)

Relevant Experience
* 3-4 years for Data Scientist
* Relevant working experience in a data science/advanced analytics role

Behavioural Skills
* Delivery Excellence
* Business disposition
* Social intelligence
* Innovation and agility

Knowledge
* Functional Analytics (supply chain analytics, marketing analytics, customer analytics, etc.)
* Statistical modelling using analytical tools (R, Python, KNIME, etc.)
* Knowledge of statistics and experimental design (A/B testing, hypothesis testing, causal inference)
* Practical experience building scalable ML models, feature engineering, model evaluation metrics, and statistical inference
* Practical experience deploying models using MLOps tools and practices (e.g., MLflow, DVC, Docker)
* Strong coding proficiency in Python (Pandas, Scikit-learn, PyTorch/TensorFlow, etc.)
* Big data technologies & frameworks (AWS, Azure, GCP, Hadoop, Spark, etc.)
* Enterprise reporting systems; relational (MySQL, Microsoft SQL Server, etc.) and non-relational (MongoDB, DynamoDB) database management systems; Data Engineering tools
* Business intelligence & reporting (Power BI, Tableau, Alteryx, etc.)
* Microsoft Office applications (MS Excel, etc.)

Location: Gurgaon
Qualifications
* Bachelor's degree in Computer Engineering, Computer Science, or a related discipline; Master's degree preferred
* 3+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment
* 3+ years of experience setting up and operating data pipelines using Python or SQL
* 3+ years of advanced SQL programming: PL/SQL, T-SQL
* 3+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization
* Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads
* 3+ years of extensive hands-on experience in Azure, preferably on data-heavy/analytics applications leveraging relational and NoSQL databases, Data Warehouse, and Big Data
* 3+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure Functions
* 3+ years of experience defining and enabling data quality standards for auditing and monitoring
* Strong analytical abilities and strong intellectual curiosity
* In-depth knowledge of relational database design, data warehousing, and dimensional data modeling concepts
* Understanding of REST and good API design
* Strong collaboration and teamwork skills; excellent written and verbal communication skills
* Self-starter, motivated, with the ability to work in a fast-paced development environment
* Agile experience highly desirable
* Proficiency in the development environment, including IDE, database server, Git, Continuous Integration, unit-testing tools, and defect management tools

Preferred Skills:
* Strong knowledge of Data Engineering concepts (data pipeline creation, Data Warehousing, Data Marts/Cubes, Data Reconciliation and Audit, Data Management)
* Strong working knowledge of Snowflake, including warehouse management, Snowflake SQL, and data sharing techniques
* Experience building pipelines that source from or deliver data into Snowflake in combination with tools like ADF and Databricks
* Working knowledge of DevOps processes (CI/CD), Git/Jenkins version control, Master Data Management (MDM), and Data Quality tools
* Strong experience in ETL/ELT development, QA, and operations/support processes (RCA of production issues, code/data fix strategy, monitoring and maintenance)
* Hands-on experience with databases (Azure SQL DB, MySQL, Cosmos DB, etc.), file systems (Blob Storage), and Python/Unix shell scripting
* ADF, Databricks, and Azure certifications are a plus
The role of a Data Engineer involves designing, developing, and maintaining robust data pipelines and systems to support analytics and decision-making in the manufacturing industry. This position is based in Pune and requires strong technical skills combined with a passion for working with data.
As the Commercial Head, you will spearhead strategy, pricing, procurement, and contract management for a leading services group. Reporting directly to the CFO, this leadership role combines business acumen with operational excellence to enhance margins and ensure compliance. You'll be instrumental in aligning commercial performance with long-term business goals.
Head the Fuel Supply Chain function for a renewable energy company. This role is critical to ensuring secure, cost-effective, and sustainable procurement and logistics of biomass, coal, and other alternative fuels across multiple project locations. The ideal candidate will bring significant experience in fuel procurement, supplier management, and logistics optimization across large-scale industrial operations.
We are seeking a dynamic and strategic professional to lead Global Talent Acquisition and Employer Branding. This role will be pivotal in attracting, engaging, and retaining top global talent while strengthening the organisation's employer brand in key geographies and talent markets.
We are seeking a dedicated Finance professional to join a leading global shipping & logistics company based out of Chennai. This role will require you to interact with APAC stakeholders while handling aspects across finance operations, audit & assurance, risk management and financial performance.