Job Requirements

Education
* Bachelor's degree required, preferably with a quantitative focus (Statistics, Business Analytics, Data Science, Math, Economics, etc.)
* Master's degree preferred (MBA, MS/M.Tech in Computer Science, etc.)

Relevant Experience
* 3-4 years for a Data Scientist
* Relevant working experience in a data science/advanced analytics role

Behavioural Skills
* Delivery excellence
* Business disposition
* Social intelligence
* Innovation and agility

Knowledge
* Functional analytics (supply chain analytics, marketing analytics, customer analytics, etc.)
* Statistical modelling using analytical tools (R, Python, KNIME, etc.)
* Statistics and experimental design (A/B testing, hypothesis testing, causal inference)
* Practical experience building scalable ML models: feature engineering, model evaluation metrics, and statistical inference (a minimal sketch follows this posting)
* Practical experience deploying models using MLOps tools and practices (e.g., MLflow, DVC, Docker)
* Strong coding proficiency in Python (Pandas, Scikit-learn, PyTorch/TensorFlow, etc.)
* Big data technologies and frameworks (AWS, Azure, GCP, Hadoop, Spark, etc.)
* Enterprise reporting systems; relational (MySQL, Microsoft SQL Server, etc.) and non-relational (MongoDB, DynamoDB) database management systems; data engineering tools
* Business intelligence and reporting (Power BI, Tableau, Alteryx, etc.)
* Microsoft Office applications (MS Excel, etc.)
Location: Gurgaon
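To make the model-building and evaluation items in the posting above concrete, here is a minimal, purely illustrative Python sketch, assuming scikit-learn is available; the synthetic dataset and feature setup are hypothetical, not taken from the posting:

```python
# Minimal sketch: feature scaling, model fitting, and evaluation metrics
# with scikit-learn. The data is synthetic and illustrative only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(42)
X = rng.normal(size=(1_000, 5))  # five numeric features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1_000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# A pipeline bundles feature scaling with the estimator, so the same
# transformation is applied at train and inference time.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression()),
])
model.fit(X_train, y_train)

pred = model.predict(X_test)
proba = model.predict_proba(X_test)[:, 1]
print(f"accuracy: {accuracy_score(y_test, pred):.3f}")
print(f"ROC AUC:  {roc_auc_score(y_test, proba):.3f}")
```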
Job Requirements

Education
* Bachelor's degree required, preferably with a quantitative focus (Statistics, Business Analytics, Data Science, Math, Economics, etc.)
* Master's degree preferred (MBA, MS/M.Tech in Computer Science, etc.)

Relevant Experience
* 5-7 years for a Data Scientist
* Relevant working experience in a data science/advanced analytics role

Behavioural Skills
* Delivery excellence
* Business disposition
* Social intelligence
* Innovation and agility

Experience
* Functional analytics (supply chain analytics, marketing analytics, customer analytics, etc.)
* Statistical modelling using analytical tools (R, Python, KNIME, etc.)
* Statistics and experimental design (A/B testing, hypothesis testing, causal inference; see the sketch after this posting)
* Practical experience building scalable ML models: feature engineering, model evaluation metrics, and statistical inference
* Practical experience deploying models using MLOps tools and practices (e.g., MLflow, DVC, Docker)
* Strong coding proficiency in Python (Pandas, Scikit-learn, PyTorch/TensorFlow, etc.)
* Big data technologies and frameworks (AWS, Azure, GCP, Hadoop, Spark, etc.)
* Enterprise reporting systems; relational (MySQL, Microsoft SQL Server, etc.) and non-relational (MongoDB, DynamoDB) database management systems; data engineering tools
* Business intelligence and reporting (Power BI, Tableau, Alteryx, etc.)
* Microsoft Office applications (MS Excel, etc.)

Roles & Responsibilities

Analytics & Strategy
1. Analyse large-scale structured and unstructured data; develop deep-dive analyses and machine learning models in retail, marketing, merchandising, and other areas of the business
2. Utilize data mining, statistical, and machine learning techniques to derive business value from store, product, operations, financial, and customer transactional data
3. Apply multiple algorithms or architectures and recommend the best model, with an in-depth description, to evangelize data-driven business decisions
4. Utilize the cloud setup to extract processed data for statistical modelling and big data analysis, and visualization tools to represent large sets of time series/cross-sectional data

Operational Excellence
1. Follow industry standards in coding solutions and follow the programming life cycle to ensure standard practices across the project
2. Structure hypotheses, build thoughtful analyses, develop underlying data models, and bring clarity to previously undefined problems
3. Partner with Data Engineering to build, design, and maintain core data infrastructure, pipelines, and data workflows that automate dashboards and analyses

Stakeholder Engagement
1. Work collaboratively with multiple sets of stakeholders (business functions, data engineers, data visualization experts) to deliver on project commitments
2. Articulate complex data science models to business teams and present insights in easily understandable and innovative formats
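Both data scientist postings above list experimental design (A/B testing, hypothesis testing) among the requirements. As a hedged illustration only, here is a minimal two-proportion z-test for an A/B experiment in Python; the conversion counts are invented, and scipy is assumed to be available:

```python
# Minimal sketch: two-proportion z-test for an A/B experiment.
# The counts below are made up for illustration.
import math
from scipy.stats import norm

conv_a, n_a = 120, 2_400   # control: conversions / visitors
conv_b, n_b = 156, 2_500   # variant: conversions / visitors

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se
p_value = 2 * norm.sf(abs(z))                     # two-sided test

print(f"control {p_a:.3%}, variant {p_b:.3%}, z = {z:.2f}, p = {p_value:.4f}")
```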
Qualifications
* Bachelor's or master's degree in computer science or a comparable field.
* 5 to 8 years of experience in frontend development with web-based applications.
* Expertise in frontend technologies such as HTML, CSS, and JavaScript.
* Expertise in working with the Blazor framework.
* Expertise in backend technologies such as C#/.NET 8.0.
* Good knowledge of RESTful APIs and web services.
* Solid understanding of OOP and design patterns.
* Mathematical knowledge, particularly in geometry, to support development of the CAD interface.
* Experience working in an agile development team (e.g., Scrum).
* High degree of personal responsibility and the analytical skills to solve complex challenges.
* Quick thinking, a positive attitude, and excellent collaboration skills.
* Strong motivation and enthusiasm for professional development and visually appealing frontends.
* Goal- and results-oriented.
* Team player with the ability to work in a multicultural and multinational environment.
* Unconventional thinking, creativity, and a willingness to continuously improve.
* Flexible, adaptable, and open to change and to acquiring new knowledge and skills.

Job Description:
* Design, develop, and maintain the web application Fluid Draw web and fluidsim.festo.com using .NET 8.0 and the Blazor framework.
* Maintain and extend the CAD interface [drawing area].
* Implement new features and optimize existing functionality for performance and usability.
* Ensure code quality through testing, code reviews, and adherence to best practices.
* Collaborate with cross-functional teams and support the PO in gathering and analyzing requirements.
* Participate in architectural discussions and contribute to the overall design of the application.
* Troubleshoot and resolve issues in a timely manner.

Job Location: Bengaluru, India
Job Level: Experienced Professionals - 5+ years
Level of Education: BE/BTech
Job Type: Full-Time/Regular
Qualifications
* Bachelor's degree in Computer Engineering, Computer Science, or a related discipline; Master's degree preferred.
* 3+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional data warehousing environment.
* 3+ years of experience setting up and operating data pipelines using Python or SQL (a minimal sketch follows this posting).
* 3+ years of advanced SQL programming: PL/SQL, T-SQL.
* 3+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization.
* Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads.
* 3+ years of strong, extensive hands-on experience in Azure, preferably on data-heavy/analytics applications leveraging relational and NoSQL databases, data warehouses, and big data.
* 3+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure Functions.
* 3+ years of experience defining and enabling data quality standards for auditing and monitoring.
* Strong analytical abilities and strong intellectual curiosity.
* In-depth knowledge of relational database design, data warehousing, and dimensional data modeling concepts.
* Understanding of REST and good API design.
* Strong collaboration and teamwork skills; excellent written and verbal communication skills.
* Self-starter, motivated, and able to work in a fast-paced development environment.
* Agile experience highly desirable.
* Proficiency in the development environment, including the IDE, database server, Git, continuous integration, unit-testing tools, and defect management tools.

Preferred Skills:
* Strong knowledge of data engineering concepts (data pipeline creation, data warehousing, data marts/cubes, data reconciliation and audit, data management).
* Strong working knowledge of Snowflake, including warehouse management, Snowflake SQL, and data sharing techniques.
* Experience building pipelines that source from or deliver data into Snowflake, in combination with tools like ADF and Databricks.
* Working knowledge of DevOps processes (CI/CD), Git/Jenkins, Master Data Management (MDM), and data quality tools.
* Strong experience in ETL/ELT development, QA, and operations/support processes (RCA of production issues, code/data fix strategy, monitoring, and maintenance).
* Hands-on experience with databases (Azure SQL DB, MySQL, Cosmos DB, etc.), file systems (Blob Storage), and Python/Unix shell scripting.
* ADF, Databricks, and Azure certifications are a plus.
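To ground the data-pipeline items above, here is a minimal extract-transform-load sketch in Python, assuming pandas and SQLAlchemy; the file path, table name, and connection URL are hypothetical placeholders, not from the posting:

```python
# Minimal ETL sketch: extract a CSV, apply a transform, load into a SQL table.
# Paths, table names, and the connection URL are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine

SOURCE_CSV = "sales_raw.csv"            # hypothetical input file
TARGET_URL = "sqlite:///warehouse.db"   # swap in your real database URL
TARGET_TABLE = "sales_clean"

# Extract
df = pd.read_csv(SOURCE_CSV, parse_dates=["order_date"])

# Transform: drop incomplete rows, enforce types, derive a column
df = df.dropna(subset=["order_id", "amount"])
df["amount"] = df["amount"].astype(float)
df["order_month"] = df["order_date"].dt.to_period("M").astype(str)

# Load: replace the target table wholesale in this toy example
engine = create_engine(TARGET_URL)
df.to_sql(TARGET_TABLE, engine, if_exists="replace", index=False)
print(f"loaded {len(df)} rows into {TARGET_TABLE}")
```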
Qualifications & Required Skills:
* Bachelor's or master's degree in engineering/technology, computer science, information technology, or a related field.
* 10+ years of total experience in data modeling and database design; experience in the retail domain is an added advantage.
* 8+ years of experience in data engineering development and support.
* 3+ years of experience leading a technical team of data engineers and BI engineers.
* Proficiency in data modeling tools such as Erwin, ER/Studio, or similar.
* Strong knowledge of Azure cloud infrastructure and development using SQL/Python/PySpark with ADF, Synapse, and Databricks.
* Hands-on experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Python/PySpark, Logic Apps, Key Vault, and Azure Functions.
* Strong communication, interpersonal, and collaboration skills, along with leadership capabilities.
* Ability to work effectively in a fast-paced, dynamic environment as the cloud SME.
* Act as the single point of contact for all data management queries and decisions.
* Design and manage centralized, end-to-end data architecture solutions, such as data model designs, database development standards, and the implementation and management of data warehouses and data analytics systems.
* Conduct continuous audits of data management system performance and refine where necessary.
* Identify bottlenecks, optimize queries, and implement caching mechanisms to enhance data processing speed (see the PySpark sketch after this posting).
* Integrate disparate data sources, including internal databases and external application programming interfaces (APIs), so the organization can derive insights from a holistic view of the data.
* Ensure data privacy measures comply with regulatory standards.

Preferred
* Azure Data Factory (ADF) and Databricks certifications are a plus.
* Data Architect or Azure Cloud Solution Architect certification is a plus.

Technologies we use: Azure Data Factory, Databricks, Azure Synapse, Azure Tabular, Azure Functions, Logic Apps, Key Vault, DevOps, Python, PySpark, Scripting (PowerShell, Bash), Git, Terraform, Power BI, Snowflake

Job Type: Full-Time
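As a hedged illustration of the query-optimization and caching point above, a minimal PySpark sketch follows; the parquet path and column names are hypothetical placeholders:

```python
# Minimal PySpark sketch: cache a reused DataFrame before aggregating it twice.
# The input path and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("caching-sketch").getOrCreate()

orders = spark.read.parquet("/data/orders")                    # hypothetical source
orders = orders.filter(F.col("status") == "SHIPPED").cache()   # reused twice below

# The first action materializes the cache; the second reuses it instead of
# re-reading and re-filtering the source.
by_region = orders.groupBy("region").agg(F.sum("amount").alias("revenue"))
by_month = orders.groupBy("order_month").agg(F.count("*").alias("orders"))

by_region.show()
by_month.show()
spark.stop()
```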
Position: ML Engineer
Job type: Techno-Functional
Preferred education qualifications: Bachelor's/Master's degree in Computer Science, Data Science, Machine Learning, or a related technical degree
Job location: India
Geography: SAPMENA
Required experience: 6-8 years
Preferred profile/skills:
Job objectives:
* Design, develop, deploy, and maintain data science and machine learning solutions to meet enterprise goals.
* Collaborate with product managers, data scientists, and analysts to identify innovative, optimal machine learning solutions that leverage data to meet business goals.
* Contribute to the development and rollout of the enterprise-wide MLOps framework, and to onboarding data scientists and ML use-cases onto it (a minimal sketch follows below).
* Scale proven ML use-cases across the SAPMENA region.
* Be responsible for optimal ML costs.

Job description:
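Where the objectives above mention onboarding use-cases onto an MLOps framework, a minimal experiment-tracking sketch with MLflow may make this concrete. It is purely illustrative: MLflow and scikit-learn are assumed to be installed, and the experiment name, parameter, and metric are made up:

```python
# Minimal MLOps sketch: log params, metrics, and a model with MLflow.
# Experiment name, params, and data are illustrative only.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

mlflow.set_experiment("demo-usecase")   # hypothetical experiment name
with mlflow.start_run():
    model = LogisticRegression(C=1.0).fit(X_tr, y_tr)
    mlflow.log_param("C", 1.0)
    mlflow.log_metric("accuracy", accuracy_score(y_te, model.predict(X_te)))
    mlflow.sklearn.log_model(model, "model")   # store the artifact for later deployment
```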
Job Location: Bengaluru, India
Job Type: Full-time
Experience: 2+ years
We are looking for an Azure Applications Developer to be responsible for implementing solutions for and providing support to our Azure Application customers.
An excellent career opportunity to work with Michael Page, a global recruitment consulting firm, within our Technology practice. This is a 360-degree role offering the opportunity to work on both business development and delivery in the form of recruitment.
Hiring for a PE-Backed Portfolio Company - a cloud-first, omnichannel retail ERP suite provider headquartered in Gurgaon.
The purpose of the Data Engineering Specialist role is to build and unit test code for projects and programmes on the Azure Cloud Data and Analytics Platform.
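Since the role above centres on building and unit testing pipeline code, here is a minimal, purely illustrative pytest sketch; the transform function and its rule are hypothetical, not taken from the posting:

```python
# Minimal unit-test sketch for a pipeline transform, runnable with pytest.
# The transform and its rule are hypothetical examples.
import pandas as pd

def drop_negative_amounts(df: pd.DataFrame) -> pd.DataFrame:
    """Remove rows whose 'amount' is negative."""
    return df[df["amount"] >= 0].reset_index(drop=True)

def test_drop_negative_amounts():
    raw = pd.DataFrame({"amount": [10.0, -5.0, 0.0]})
    cleaned = drop_negative_amounts(raw)
    assert list(cleaned["amount"]) == [10.0, 0.0]
```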
- This role blends executive support, event management, and vendor coordination to drive smooth operations and deliver exceptional experiences.
- It offers high visibility, working closely with leadership while managing multiple priorities and representing the organization's culture.