24,453 Maiora Data Engineer jobs in India
Maiora-Data Engineer
Job Description
Role: Data Engineer
Experience: 5-7 years
Location: Bangalore
Roles and Responsibilities
• Collaborate with a dynamic team in a fast-paced environment to develop and maintain Python-based applications.
• Write clean, scalable, and well-documented code.
• Design and implement software solutions, ensuring high performance and responsiveness.
• Optimize code for maximum efficiency and maintainability.
• Collaborate with cross-functional teams to define, design, and ship new features.
• Contribute to the entire software development lifecycle, from concept to deployment.
• Troubleshoot, debug, and address software defects and issues.
• Stay updated on industry best practices and emerging technologies.
Required Skills:
• Strong proficiency in Python and PySpark.
• Experience in writing SQL queries & scripting.
• Experience in creating ETL flows and data orchestration.
• Experience working with common file formats: CSV, Excel, Parquet.
• Good to have: working experience with Databricks and Spark Server.
• Good to have: working experience with Power BI and Tableau.
• Knowledge of database systems: MySQL, PostgreSQL, Oracle DB, MSSQL.
• Familiarity with version control systems, particularly Git.
• Exposure to DevOps practices and tools.
• Exposure to cloud services, particularly AWS.
• Experience in managing Apache Airflow.
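The skills above center on Python-based ETL flows over files such as CSV and Parquet. As a minimal stdlib-only sketch of an extract-transform-load step (in practice PySpark or pandas would be used, and all field names here are hypothetical):

```python
import csv
import io

def etl(csv_text):
    """Extract rows from CSV text, transform them, and return load-ready records."""
    reader = csv.DictReader(io.StringIO(csv_text))
    out = []
    for row in reader:                # extract
        amount = float(row["amount"])
        if amount <= 0:               # transform: drop invalid rows
            continue
        out.append({"customer": row["customer"].strip().lower(),
                    "amount_cents": int(round(amount * 100))})
    return out                        # the load step would write Parquet or a DB table

raw = "customer,amount\nAlice ,12.50\nBob,-3\ncarol,0.99\n"
rows = etl(raw)
```

The same pattern scales up in PySpark, where the read/transform/write steps become DataFrame operations distributed across a cluster.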
Qualifications:
• Bachelor's degree in Computer Science or a related field.
• 4-6 years of proven work experience as a developer.
• Strong problem-solving and algorithmic thinking.
• Ability to work collaboratively in a team-oriented environment.
About Maiora:
At MAIORA, we excel in swiftly converting raw data into actionable insights, empowering faster and more informed decision-making. We specialize in handling extensive data analyses, leveraging top-notch tools and a skilled team to offer robust solutions for diverse data projects.
Our focus is on cutting-edge data tools, seamlessly integrating your environment with our proprietary platforms, developed in-house by our expert engineers. Partnering with key networks like Snowflake and Databricks, our dedicated team of professionals is available round the clock to manage your environment efficiently.
What's the impact on you? It gives you the ability to think big, act fast, delve deeper, and understand comprehensively. This results in quicker decision-making, increased value for your network, and swift navigation from start to finish. It minimizes risks, reduces stress, and preserves value without compromise.
Job No Longer Available
This position is no longer listed on WhatJobs. The employer may be reviewing applications, filled the role, or has removed the listing.
However, we have similar jobs available for you below.
Specialist, Data Architecture
Posted today
Job Description
Calling all innovators – find your future at Fiserv.
We're Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we're involved. If you want to make an impact on a global scale, come make a difference at Fiserv.
Job Title
Specialist, Data Architecture
- 4-7 years of experience with SSIS/SQL; responsible for the development of ETL and reporting solutions.
- Strong knowledge of SSIS packages, design principles, and best practices.
- Experience with requirements gathering, technical analysis, and writing technical specifications
- Candidate must have strong database fundamentals.
- Must have good knowledge of Data Warehousing & Data Modelling Concepts.
- Good communication skills are required.
- Capability to work in a distributed team environment with minimal supervision is required for this profile.
- The position doesn't require working in shifts; however, flexibility to overlap with US hours is required.
- Should have good knowledge of writing SQL commands, queries, and stored procedures.
- Good knowledge of Snowflake is preferred.
- Good knowledge of Python/PySpark is preferred.
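The SQL skills above (commands, queries, aggregation) can be illustrated with a small stdlib sqlite3 session; the table and column names are hypothetical, and in an SSIS/SQL Server context the equivalent would be written in T-SQL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north", 100.0), ("north", 50.0), ("south", 75.0)])

# An aggregate query of the kind an ETL/reporting job would run
cur = conn.execute(
    "SELECT region, SUM(amount) AS total FROM sales "
    "GROUP BY region ORDER BY total DESC")
totals = cur.fetchall()
```

In a production warehouse the same GROUP BY logic would typically live in a stored procedure or an SSIS data-flow task rather than inline strings.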
Thank you for considering employment with Fiserv. Please:
- Apply using your legal name
- Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).
Our commitment to Diversity and Inclusion:
Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.
Note to agencies:
Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.
Warning about fake job posts:
Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
Specialist, Data Architecture
Posted today
Job Description
Job Title
Specialist, Data Architecture
What does a great AI/ML Engineer do?
We are seeking a highly skilled Generative AI Specialist with 6-8 years of extensive experience in AI and machine learning, focusing on generative models and prompt engineering. The successful candidate will work collaboratively within our team to design, implement, and optimize generative AI solutions that enhance our product offerings and provide value to our clients.
What You Will Do
- Develop, implement, and optimize generative models (e.g., GANs, VAEs) for various applications.
- Conduct research and stay updated with the latest advancements in generative AI and deep learning.
- Collaborate with cross-functional teams to identify opportunities for leveraging generative AI in our products.
- Design and optimize prompts used in AI models to improve output quality and relevance.
- Analyze model performance and user data to refine prompt strategies.
- Analyze and preprocess large datasets to train and validate models effectively.
- Create and maintain documentation for algorithms, methodologies, and processes.
- Provide training and support to internal teams on generative AI technologies.
What You Will Need To Have
- Bachelor's or Master's degree in Computer Science, Data Science, AI, or a related field.
- 6-8 years of experience in AI/ML, with a strong focus on generative models and prompt engineering.
- Proficiency in programming languages such as Python or R.
- Experience with deep learning frameworks (e.g., TensorFlow, PyTorch) and libraries.
- Strong understanding of machine learning algorithms and statistical analysis.
- Excellent problem-solving skills and ability to work in a fast-paced environment.
- Strong communication skills to articulate complex ideas to non-technical stakeholders.
What Would Be Great To Have
- Experience in the financial services industry is a plus.
- Familiarity with natural language processing (NLP) or computer vision (CV) applications.
- Contributions to open source projects or published research in generative AI.
Advisor, Data Architecture
Posted today
Job Description
Job Title
Advisor, Data Architecture
What does a successful Advisor do at Fiserv?
As a member of our Data Commerce Solutions group, you will build and take ownership of the design and development of data engineering projects within Fiserv's Enterprise Data Commerce Solutions division. You will apply your depth of knowledge and expertise to all aspects of the data engineering lifecycle, as well as partner continuously with your many stakeholders daily to stay focused on common goals.
You will lead large-scale data engineering, integration, and warehousing projects; build custom integrations between cloud-based systems using APIs; and write complex, efficient queries that transform raw data sources into easily accessible models, coding across several languages such as Java, Python, and SQL. Additional responsibilities include, but are not limited to, architecting, building, and launching new data models that provide intuitive analytics to the team, building data expertise, and owning data quality for the pipelines you create.
What you will do:
- Provide strategic leadership and direction to the software development team, fostering a culture of innovation, collaboration, and continuous improvement.
- Develop and implement a robust software development strategy aligned with the company's overall objectives and long-term vision.
- Collaborate with product management to define software requirements, scope, and priorities, ensuring alignment with business goals.
- Lead and guide the software development team in creating technical design specifications, architecture, and development plans for complex software projects.
- Ensure adherence to industry best practices, coding standards, and software development methodologies to deliver high-quality and scalable software solutions.
- Monitor and analyze software development metrics and key performance indicators (KPIs) to track team productivity, efficiency, and code quality.
- Manage the software development budget and resource allocation, optimizing resource utilization and capacity planning.
- Foster a culture of learning and development within the team, providing coaching, mentoring, and professional growth opportunities to team members.
- Identify and mitigate potential risks and challenges in software development projects, developing contingency plans as needed.
- Collaborate with other stakeholders to establish and maintain effective communication channels and project status updates.
- Stay up to date with industry trends, emerging technologies, and best practices to drive continuous improvement and innovation in software development processes.
- Build and maintain strong relationships with external partners, vendors, and third-party providers to enhance software development capabilities and delivery.
What you will need to have:
- Bachelor's or master's degree in computer science, Software Engineering, or a related field. An advanced degree is preferred.
- Proven experience (7+ years) in a senior leadership role within software development or software engineering.
- Demonstrated success in delivering complex software projects and products on time and within budget.
- Extensive experience in software development methodologies, such as Agile, Scrum, or Kanban, and experience in transitioning teams to these methodologies.
- Strong technical expertise in software architecture, design patterns, and modern software development languages and frameworks.
- Excellent communication, interpersonal, and leadership skills, with the ability to influence and inspire cross-functional teams.
- Exceptional problem-solving and decision-making abilities, with a keen attention to detail and a focus on delivering high-quality products.
- Proven track record of building and managing high-performing software development teams.
- Strong business acumen and the ability to align software development initiatives with broader business objectives.
- A passion for innovation, technology, and keeping abreast of the latest developments in the software industry.
- Proficiency with solutions for processing large volumes of data, using data processing tools and Big Data platforms.
- Understanding of cluster and parallel architectures, as well as high-scale or distributed RDBMS; SQL experience.
- Hands-on experience in production rollout and infrastructure configuration.
- Demonstrable experience successfully delivering big data projects using Kafka and Spark.
- Exposure to NoSQL databases such as Cassandra, HBase, DynamoDB, and Elasticsearch.
- Experience working with PCI data; working with data scientists is a plus.
- In-depth knowledge of design principles and patterns.
- Experience with cloud platforms and services such as AWS, Azure, or Google Cloud Platform, and knowledge of deploying and managing APIs in a cloud environment.
- Knowledge of API gateway solutions and their implementation, such as Kong, Apigee, or AWS API Gateway.
What would be great to have:
- Exposure to Big Data tools and solutions is a strong plus.
- Exposure to relational modeling, dimensional modeling, and modeling of unstructured data.
- Experience in design and architecture review, and in the banking and financial domain.
Data Architecture Manager
Posted today
Job Description
Job Title: Senior Data Architect – OSS
Minimum 10+ years of hands-on experience in data architecture, data modelling, and designing large-scale data platforms.
Location: Bangalore (Onsite)
Why should you choose us?
Rakuten Symphony is a Rakuten Group company that provides global B2B services for the mobile telco industry and enables next-generation, cloud-based, international mobile services. Building on the technology Rakuten used to launch Japan’s newest mobile network, we are taking our mobile offering global. To support our ambitions to provide an innovative cloud-native telco platform for our customers, Rakuten Symphony is looking to recruit and develop top talent from around the globe. We are looking for individuals to join our team across all functional areas of our business – from sales to engineering, support functions to product development. Let’s build the future of mobile telecommunications together!
What Do We Expect From You
As a Senior Data Architect within Product Architecture, you will address the critical challenge of a fragmented data model across our various teams. This pivotal role is responsible for unifying disparate data sources - including telemetry, service graphs, and Root Cause Analysis (RCA) inputs - into a single, coherent, and consistent data model. This foundational work is a prerequisite for advancing our capabilities in Generative AI, comprehensive observability, and robust service assurance, enabling a truly data-driven approach across our product portfolio.
Responsibilities:
Unified Data Model Design & Ownership:
- Lead the architectural design and implementation of a unified data model that consolidates telemetry, service graphs, RCA, and other critical operational data.
- Define conceptual, logical, and physical data models that support diverse product needs while ensuring consistency and interoperability.
- Establish and maintain data architecture standards, principles, and best practices across the product organization.
Data Integration & Pipeline Architecture:
- Architect scalable and efficient data pipelines for ingesting, transforming, and harmonizing data from various heterogeneous sources.
- Design robust data integration patterns (e.g., streaming, batch processing) to ensure data availability and freshness for downstream consumers.
- Ensure data quality, integrity, and security throughout the data lifecycle.
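Harmonizing heterogeneous sources into a unified model, as described above, amounts to mapping each source's schema onto one shared schema. A toy Python sketch, with entirely hypothetical source and field names:

```python
def normalize(record, source):
    """Map a source-specific telemetry record onto a unified schema."""
    if source == "metrics":       # e.g. {"ts": 1700000000, "cpu": 0.93}
        return {"timestamp": record["ts"], "kind": "metric",
                "name": "cpu", "value": record["cpu"]}
    if source == "alarms":        # e.g. {"time": 1700000001, "severity": "major"}
        return {"timestamp": record["time"], "kind": "alarm",
                "name": "severity", "value": record["severity"]}
    raise ValueError(f"unknown source: {source}")

# Records from different sources land in one consistent shape
unified = [normalize({"ts": 1, "cpu": 0.5}, "metrics"),
           normalize({"time": 2, "severity": "major"}, "alarms")]
```

In production this mapping would run inside a streaming or batch pipeline (e.g., Flink or Spark), but the architectural point is the same: every downstream consumer sees one schema regardless of origin.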
Enabling AI, Observability & Assurance:
- Ensure the unified data model serves as a foundational prerequisite for GenAI applications, providing clean, contextualized data for training and inference.
- Design data structures and access patterns that enhance observability capabilities, enabling real-time monitoring, analytics, and alerting.
- Architect data solutions that power advanced service assurance functionalities, enabling accurate fault correlation, root cause analysis, and performance management.
Data Governance & Lifecycle Management:
- Collaborate with data governance teams to establish data ownership, definitions, and stewardship across product domains.
- Define data retention policies, archiving strategies, and data lifecycle management processes.
- Ensure compliance with data privacy regulations and security best practices.
Technical Leadership & Collaboration:
- Provide senior technical leadership and guidance to engineering teams on data modelling, database technologies, and data architecture best practices.
- Work closely with other architects (AIML, Observability, Workflow Management), data scientists, and product managers to understand data requirements and deliver impactful solutions.
- Champion a data-first mindset and promote data literacy across the organization.
Qualifications:
- 10+ years of hands-on experience in data architecture, data modelling, and designing large-scale data platforms.
- Proven track record of unifying fragmented data landscapes and building robust, scalable data models for complex domains.
- Expertise in designing data solutions that support real-time analytics, observability, and machine learning applications.
- Strong background in architecting data pipelines and integration strategies for high-volume, diverse data sources.
Technical Skills
- Deep expertise in relational and NoSQL databases (e.g., Cassandra, ClickHouse) and TSDBs (e.g., Cortex, VictoriaMetrics).
- Strong experience with stream processing technologies (e.g., Kafka, Flink, Spark Streaming) and batch processing (e.g., Apache Spark).
- Proficient in data modelling techniques (dimensional, relational, graph) and data schema design.
- Knowledge of data governance frameworks and tools.
- Proficient in SQL and at least one relevant programming language.
Domain Knowledge
- Familiarity with telecom operational data, including network telemetry, service topology/graphs, and alarm/event data.
Analytical and Problem-Solving Skills
- Exceptional analytical skills to identify data relationships, inconsistencies, and design optimal data structures.
- Strong problem-solving abilities for complex data integration and quality challenges.
Collaboration & Communication
- Excellent written and verbal communication skills, able to articulate complex data architectural concepts to technical and business stakeholders.
- Proven ability to collaborate effectively with cross-functional teams, driving consensus on data standards.
Educational Background
- Bachelor’s degree in Computer Science, Data Science, Information Systems, or a related technical field.
RAKUTEN SHUGI PRINCIPLES:
Our worldwide practices describe specific behaviours that make Rakuten unique and united across the world. We expect Rakuten employees to model these 5 Shugi Principles of Success.
- Always improve, always advance. Only be satisfied with complete success - Kaizen.
- Be passionately professional. Take an uncompromising approach to your work and be determined to be the best.
- Hypothesize - Practice - Validate - Shikumika. Use the Rakuten Cycle to succeed in unknown territory.
- Maximize Customer Satisfaction. The greatest satisfaction for workers in a service industry is to see their customers smile.
- Speed! Speed! Speed! Always be conscious of time. Take charge, set clear goals, and engage your team.
Data Architecture Specialist
Posted today
Job Description
Job Overview:
We are seeking a highly skilled Technical Architect with 12+ years of experience to oversee and guide the development of advanced data architectures, business intelligence solutions, and ETL workflows using Azure Databricks (ADB), Power BI, and related technologies. This is a strategic role in which you will collaborate with cross-functional teams to design robust, scalable, and efficient data systems that meet business objectives.
Key Responsibilities:
- Architecture Design & Strategy:
- Lead the design and architecture of data pipelines, ETL workflows, and BI solutions leveraging Azure Databricks (ADB) and Power BI.
- Develop high-level solutions, ensuring scalability, performance, and cost-effectiveness.
- Create data models, schemas, and architecture blueprints for various business units.
- Guide and mentor teams in implementing best practices for data processing, transformation, and storage.
- ETL Solutions:
- Design, develop, and optimize ETL workflows using Azure Databricks, Azure Data Factory, and other Azure services.
- Integrate data from multiple sources into a centralized data lake or warehouse for reporting and analytics.
- Ensure ETL processes are efficient, automated, and error-free.
- Business Intelligence & Reporting:
- Lead the implementation of Power BI solutions for reporting, dashboards, and data visualization.
- Collaborate with stakeholders to understand business requirements and deliver actionable insights through visual reports.
- Ensure that Power BI reports are optimized for performance, usability, and scalability.
- Collaboration & Leadership:
- Work closely with cross-functional teams (business analysts, data engineers, software engineers, and stakeholders) to gather requirements and deliver solutions.
- Provide technical guidance and mentorship to junior team members and foster a culture of continuous learning.
- Translate business requirements into technical specifications and deliver scalable and reliable solutions.
- Cloud & Data Technologies:
- Utilize Azure cloud services such as Azure Databricks, Azure SQL Database, Data Factory, ADLS, Blob Storage, and Azure Synapse to manage and orchestrate data workflows.
- Stay up-to-date with the latest trends in cloud computing, data architecture, and business intelligence.
- Quality Assurance & Best Practices:
- Establish coding standards, data governance practices, and security protocols for all data-related processes.
- Conduct code reviews and performance tuning, and ensure data integrity and accuracy.
- Design disaster recovery and backup strategies to ensure data availability and reliability.
Required Qualifications:
- 12+ years of experience as a Technical Architect or similar role, with a focus on Azure Databricks, Power BI, and ETL.
- Expertise in designing and implementing data architectures using Azure Databricks (ADB).
- Strong proficiency in Power BI for building scalable reports and dashboards.
- In-depth knowledge of ETL tools and processes, particularly with Azure Data Factory and other Azure-based ETL solutions.
- Proficiency in SQL and familiarity with data warehousing concepts (e.g., star schema, snowflake schema).
- Strong understanding of cloud computing and Azure services, including storage, compute, and security best practices.
- Experience with data lake architecture, data pipelines, and data governance.
- Ability to understand complex business requirements and translate them into technical solutions.
- Strong communication skills with the ability to collaborate across business and technical teams.
- Leadership and mentoring experience, guiding junior team members to achieve project goals.
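Among the warehousing concepts listed above, a star schema separates a central fact table of measures from descriptive dimension tables joined by surrogate keys. A minimal stdlib sketch with hypothetical tables:

```python
# Dimension table: surrogate key -> descriptive attributes
dim_product = {1: {"name": "widget", "category": "hardware"},
               2: {"name": "gadget", "category": "hardware"}}

# Fact table: measures plus foreign keys into the dimensions
fact_sales = [{"product_key": 1, "qty": 3, "revenue": 30.0},
              {"product_key": 2, "qty": 1, "revenue": 15.0},
              {"product_key": 1, "qty": 2, "revenue": 20.0}]

# A typical star-schema query: join fact to dimension, aggregate by attribute
revenue_by_product = {}
for row in fact_sales:
    name = dim_product[row["product_key"]]["name"]
    revenue_by_product[name] = revenue_by_product.get(name, 0.0) + row["revenue"]
```

A snowflake schema differs only in that the dimensions themselves are further normalized (e.g., `category` split into its own table); the fact table stays the same.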
Preferred Qualifications:
- Certification in Azure (e.g., Azure Solutions Architect, Azure Data Engineer).
- Experience with other BI tools or visualization platforms (e.g., Power BI, PowerApps).
- Knowledge of programming/scripting languages such as Python, Scala, or DAX.
- Familiarity with DevOps practices in data pipelines and CI/CD workflows.
- Experience with Agile methodologies and project management tools like JIRA or Azure DevOps.
Data Architecture and Modeling Analyst
Posted today
Job Description
Greetings from Teknikoz
Experience : 6-8 Years
Skills Required: Microsoft SQL Server 2019; Digital: Databricks; Data Analytics & Insights: ignio AIOps; Azure Data Factory
Job Description:
Experience as a data analyst with Databricks and SQL, and an Azure background.
Essential Skills:
- Azure Synapse Analytics
- Postgres Database Management
- Azure Kubernetes Service (AKS) Management
- Bash/Python scripting
- CI/CD pipelines (Jenkins, GitHub)
Staff Engineer - Data Architecture
Posted today
Job Description
A Career at HARMAN
As a technology leader that is rapidly on the move, HARMAN is filled with people who are focused on making life better. Innovation, inclusivity and teamwork are a part of our DNA. When you add that to the challenges we take on and solve together, you’ll discover that at HARMAN you can grow, make a difference and be proud of the work you do every day.
Introduction: A Career at HARMAN Automotive
We’re a global, multi-disciplinary team that’s putting the innovative power of technology to work and transforming tomorrow. At HARMAN Automotive, we give you the keys to fast-track your career.
About the Role
As a Data Architect in the Enterprise Architecture team, you will shape and govern the data architecture vision for our global automotive organization. Your role is pivotal in ensuring enterprise-wide alignment of data strategies, platforms, and governance to support AI adoption, business process digitization, and advanced analytics. You will define reference architectures, guide implementation teams, and collaborate with business and technology leaders to ensure scalable, secure, and value-driven data solutions.
What You Will Do
What You Need to Be Successful
Bonus Points if You Have
What Makes You Eligible
What We Offer
HARMAN is proud to be an Equal Opportunity / Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics.
Sr. Analyst, Data Architecture

Posted 14 days ago
Job Description
Job Number # - Mumbai, Maharashtra, India
**Who We Are**
Colgate-Palmolive Company is a global consumer products company operating in over 200 countries specialising in Oral Care, Personal Care, Home Care, Skin Care, and Pet Nutrition. Our products are trusted in more households than any other brand in the world, making us a household name!
Join Colgate-Palmolive, a caring, innovative growth company reimagining a healthier future for people, their pets, and our planet. Guided by our core values-Caring, Inclusive, and Courageous-we foster a culture that inspires our people to achieve common goals. Together, let's build a brighter, healthier future for all.
**Title: Sr. Analyst, Data Architecture**
**Brief introduction - Role Summary/Purpose :**
+ The candidate will collaborate with Colgate Business teams and CBS Analytics to identify and develop high-impact use cases utilizing prompt engineering.
+ This position requires a strong foundation in Artificial Intelligence, Machine Learning, and Generative AI.
+ The ideal candidate is an analytical problem solver skilled at working with large data sets, demonstrates a collaborative and customer-centric approach (proactive and responsive to business needs), and possesses strong written and verbal communication abilities. Additionally, the candidate should have a passion for continuous learning and driving innovation to unlock new business opportunities.
**Responsibilities :**
+ Design, develop and refine AI-generated text prompts for various applications
+ Collaborate with content creators, product teams and data scientists to ensure prompt alignment with company goals and user needs
+ Monitor and analyze prompt performance to identify areas for improvement
+ Optimize AI prompt generation process to enhance overall system performance
+ Stay up-to-date on the latest advancements in AI, natural language processing and machine learning
+ Provide support to teams in understanding prompt engineering best practices
**Required Qualifications :**
+ Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field
+ 1+ year of experience with prompt engineering for large language models (LLMs) such as Gemini, GPT, Llama, and Claude is preferred
+ Comprehensive understanding of Artificial Intelligence, Machine Learning, and generative AI platforms
+ Knowledge of data transformation tools: R, Python, SQL, DBT, and cloud solutions (GCP, Snowflake)
+ Working knowledge of visualization tools such as Sigma, Tableau, DOMO, and Data Studio
+ Ability to read, analyze, and visualize data
+ Effective verbal and written communication for business engagement
**Preferred Qualifications:**
+ Excellent problem-solving and analytical skills
+ Ability to collaborate effectively with cross-functional teams
+ Working knowledge of consumer packaged goods industry
+ Understanding of Colgate's processes, and tools supporting analytics (for internal candidates)
+ Willingness and ability to experiment with new tools and techniques
+ Good facilitation and project management skills
**Our Commitment to Inclusion**
Our journey begins with our people-developing strong talent with diverse backgrounds and perspectives to best serve our consumers around the world and fostering an inclusive environment where everyone feels a true sense of belonging. We are dedicated to ensuring that each individual can be their authentic self, is treated with respect, and is empowered by leadership to contribute meaningfully to our business.
**Equal Opportunity Employer**
Colgate is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, colour, religion, gender, gender identity, sexual orientation, national origin, ethnicity, age, disability, marital status, veteran status (United States positions), or any other characteristic protected by law.
Reasonable accommodation during the application process is available for persons with disabilities. Please complete this request form should you require accommodation.
Sr Manager, Data Architecture
Posted 18 days ago
Job Description
About McDonald's:
One of the world's largest employers with locations in more than 100 countries, McDonald's Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.
Position Summary:
We are seeking an experienced Data Architect to design, implement, and optimize scalable data solutions on Amazon Web Services (AWS) and / or Google Cloud Platform (GCP). The ideal candidate will lead the development of enterprise-grade data architectures that support analytics, machine learning, and business intelligence initiatives while ensuring security, performance, and cost optimization.
Who we are looking for:
Key Responsibilities:
Architecture & Design:
- Design and implement comprehensive data architectures using AWS or GCP services
- Develop data models, schemas, and integration patterns for structured and unstructured data
- Create solution blueprints, technical documentation, architectural diagrams, and best practice guidelines
- Implement data governance frameworks and ensure compliance with security standards
- Design disaster recovery and business continuity strategies for data systems
Technical Leadership:
- Lead cross-functional teams in implementing data solutions and migrations
- Provide technical guidance on cloud data services selection and optimization
- Collaborate with stakeholders to translate business requirements into technical solutions
- Drive adoption of cloud-native data technologies and modern data practices
Platform Implementation:
- Implement data pipelines using cloud-native services (AWS Glue, Google Dataflow, etc.)
- Configure and optimize data lakes and data warehouses (S3/Redshift, GCS/BigQuery)
- Set up real-time streaming data processing solutions (Kafka, Airflow, Pub/Sub)
- Implement automated data quality monitoring and validation processes
- Establish CI/CD pipelines for data infrastructure deployment
Performance & Optimization:
- Monitor and optimize data pipeline performance and cost efficiency
- Implement data partitioning, indexing, and compression strategies
- Conduct capacity planning and scaling recommendations
- Troubleshoot complex data processing issues and performance bottlenecks
- Establish monitoring, alerting, and logging for data systems
Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Data Engineering, or a related field
- 9+ years of experience in data architecture and engineering
- 5+ years of hands-on experience with AWS or GCP data services
- Experience with large-scale data processing and analytics platforms
- AWS Redshift, S3, Glue, EMR, Kinesis, Lambda
- AWS Data Pipeline, Step Functions, CloudFormation
- BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub
- GCP Cloud Functions, Cloud Composer, Deployment Manager
- IAM, VPC, and security configurations
- SQL and NoSQL databases
- Big data technologies (Spark, Hadoop, Kafka)
- Programming languages (Python, Java, SQL)
- Data modeling and ETL/ELT processes
- Infrastructure as Code (Terraform, CloudFormation)
- Container technologies (Docker, Kubernetes)
- Data warehousing concepts and dimensional modeling
- Experience with modern data architecture patterns
- Real-time and batch data processing architectures
- Data governance, lineage, and quality frameworks
- Business intelligence and visualization tools
- Machine learning pipeline integration
- Strong communication and presentation abilities
- Leadership and team collaboration skills
- Problem-solving and analytical thinking
- Customer-focused mindset with business acumen
Preferred Qualifications:
- Master's degree in a relevant field
- Cloud certifications (AWS Solutions Architect, GCP Professional Data Engineer)
- Experience with multiple cloud platforms
- Knowledge of data privacy regulations (GDPR, CCPA)
Work location: Hyderabad, India
Work pattern: Full-time role.
Work mode: Hybrid.
Additional Information:
McDonald's is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald's provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.
McDonald's Capability Center India Private Limited (McDonald's in India) is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture. At McDonald's in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald's in India does not discriminate based on race, religion, colour, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws.
Nothing in this job posting or description should be construed as an offer or guarantee of employment.