106 Big Data Technologies jobs in Bangalore
Data Engineering, Associate

Posted 1 day ago
Job Description
At BlackRock, technology has always been at the core of what we do - and today, our technologists continue to shape the future of the industry with their innovative work. We are not only curious but also collaborative and eager to embrace experimentation as a means to solve complex challenges. Here you'll find an environment that promotes working across teams, businesses, regions and specialties - and a firm committed to supporting your growth as a technologist through curated learning opportunities, tech-specific career paths, and access to experts and leaders around the world.
We are seeking a highly skilled and motivated senior-level Data Engineer to join the Private Market Data Engineering team within Aladdin Data at BlackRock and help drive our Private Market Data Engineering vision of making private markets more accessible and transparent for clients. In this role, you will work cross-functionally with Product, Data Research, Engineering, and Program Management.
Engineers looking to work in the areas of orchestration, data modeling, data pipelines, APIs, storage, distribution, distributed computation, consumption, and infrastructure are ideal candidates. The ideal candidate will have extensive experience developing data pipelines using Python, Java, the Apache Airflow orchestration platform, DBT (Data Build Tool), Great Expectations for data validation, Apache Spark, MongoDB, Elasticsearch, Snowflake, and PostgreSQL. In this role, you will be responsible for designing, developing, and maintaining robust and scalable data pipelines. You will collaborate with various stakeholders to ensure the data pipelines are efficient, reliable, and meet the needs of the business. A sketch of the kind of orchestrated pipeline this involves follows.
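The following is a minimal, hypothetical sketch of such an orchestrated ELT pipeline, assuming Airflow 2.x: an extract step, a DBT run, and a DBT test stage chained in a DAG. The DAG name, task names, and project path are illustrative assumptions, not details from this posting.
```python
# Hypothetical sketch of an orchestrated ELT pipeline; names and paths are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_positions(**context):
    # Placeholder extract step: in practice this would pull data from a source
    # system (API, database, files) and land it in a staging area.
    print("extracting private-market positions to staging")


with DAG(
    dag_id="private_markets_elt",      # illustrative DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_positions",
        python_callable=extract_positions,
    )

    # Run DBT models that transform the staged data inside the warehouse.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/private_markets",
    )

    # Validate the transformed tables; a Great Expectations checkpoint could be
    # triggered the same way in place of (or alongside) dbt tests.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/private_markets",
    )

    extract >> dbt_run >> dbt_test
```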
**Key Responsibilities**
+ Design, develop, and maintain data pipelines using Aladdin Data Enterprise Data Platform framework
+ Develop ETL/ELT data pipelines using Python, SQL and deploy them as containerized apps on a Kubernetes cluster
+ Develop API for data distribution on top of the standard data model of the Enterprise Data Platform
+ Design and develop optimized back-end services in Java / Python for APIs to handle faster data retrieval and optimized processing
+ Develop reusable back-end services for data pipeline processing in Python / Java
+ Develop data transformation using DBT (Data Build Tool) with SQL or Python
+ Ensure data quality and integrity through automated testing and validation using tools like Great Expectations (see the validation sketch after this list)
+ Implement all observability requirements in the data pipeline
+ Optimize data workflows for performance and scalability
+ Monitor and troubleshoot data pipeline issues, ensuring timely resolution
+ Document data engineering processes and best practices whenever required
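As a rough illustration of the validation bullet above, here is a minimal sketch assuming the classic (0.x-style) pandas API of Great Expectations; the column names, sample data, and expectations are invented for the example.
```python
# Hypothetical data-quality check; assumes the classic (0.x) Great Expectations
# pandas API. Column names and thresholds are illustrative only.
import great_expectations as ge
import pandas as pd

# In a real pipeline this frame would come from the staging layer.
positions = pd.DataFrame(
    {
        "fund_id": ["F001", "F002", "F003"],
        "commitment_usd": [1_000_000, 2_500_000, 750_000],
    }
)

dataset = ge.from_pandas(positions)

# Declare expectations: keys must be present and amounts must be non-negative.
dataset.expect_column_values_to_not_be_null("fund_id")
dataset.expect_column_values_to_be_between("commitment_usd", min_value=0)

results = dataset.validate()
if not results["success"]:
    raise ValueError("Data validation failed")
```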
**Required Skills and Qualifications**
+ Must have 5 to 8 years of experience in data engineering, with a focus on building data pipelines and Data Services APIs
+ Strong server-side programming skills in Python and/or Java.
+ Experience working with backend microservices and APIs using Java and/or Python
+ Experience with Apache Airflow or any other orchestration framework for data orchestration
+ Proficiency in DBT for data transformation and modeling
+ Experience with data quality validation tools like Great Expectations or any other similar tools
+ Strong SQL skills and experience with relational databases like SQL Server and PostgreSQL
+ Experience with cloud-based data warehouse platforms like Snowflake
+ Experience working with NoSQL databases like Elasticsearch and MongoDB
+ Experience working with container orchestration platforms like Kubernetes in AWS and/or Azure cloud environments
+ Experience with cloud platforms like AWS and/or Azure
+ Ability to work collaboratively in a team environment
+ Detail oriented, with a passion for learning new technologies and strong analytical and problem-solving skills
+ Experience with Financial Services applications is a plus
+ Effective communication skills, both written and verbal
+ Bachelor's or Master's degree in Computer Science, Engineering, or a related field
**Our benefits**
To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.
**Our hybrid work model**
BlackRock's hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person - aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.
**About BlackRock**
At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children's educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress.
This mission would not be possible without our smartest investment - the one we make in our employees. It's why we're dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive.
For additional information on BlackRock, please follow us on Twitter: @blackrock and LinkedIn. BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
Data Engineering Developer - Data Core

Posted 1 day ago
Job Description
**Job Description:**
**Primary Responsibilities** include but are not limited to the following:
+ Analyze, design, develop, test, and support data platform configuration and management
+ Review platform user requirements and develop technical architecture design documents to implement the needed functionality
+ Learn and understand existing and new platform capabilities to use them efficiently and effectively
+ Script and automate jobs across platforms to identify, fix, and optimize issues (see the sketch after this list)
+ Configure, optimize, and support the data platform
+ Security design and implementation
+ Oversight and governance of platform capabilities
+ Participate actively in local and global teams
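As a rough illustration of the scripting-and-automation bullet above, here is a minimal sketch using the Snowflake Python connector to flag long-running queries; the account, credentials, warehouse name, and query are placeholder assumptions, not details from this posting.
```python
# Hypothetical platform health-check script; connection details and the query
# are placeholders only.
import os

import snowflake.connector


def long_running_queries(threshold_seconds: int = 600) -> list:
    """Return queries that have been running longer than the threshold."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="PLATFORM_WH",  # illustrative warehouse name
    )
    try:
        cur = conn.cursor()
        cur.execute(
            "SELECT query_id, user_name, total_elapsed_time / 1000 AS seconds "
            "FROM table(information_schema.query_history()) "
            "WHERE execution_status = 'RUNNING' "
            "AND total_elapsed_time / 1000 > %s",
            (threshold_seconds,),
        )
        return cur.fetchall()
    finally:
        conn.close()


if __name__ == "__main__":
    for row in long_running_queries():
        print("long-running query:", row)
```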
**Requirements:**
+ Bachelor's Degree in Computer Science or Technology from an accredited university or equivalent combination of education and experience
+ 4-6 years of experience in data platforms, particularly Snowflake, HANA, Azure, AWS, or other relevant data platform technologies
+ Experience in Snowflake modeling, advanced SQL scripting, and expert-level performance tuning
+ Experience with relational databases and related concepts
+ Detail oriented and self-directed, with strong independent problem-solving and multi-tasking skills
+ Fluent in English (verbal and written communication)
+ Certification in Snowflake, Databricks, Azure, AWS, or HANA is a plus
+ Have a strong background and interest in technology
Learn more about 3M's creative solutions to the world's problems on Instagram, Facebook, and LinkedIn @3M.
Safety is a core value at 3M. All employees are expected to contribute to a strong Environmental Health and Safety (EHS) culture by following safety policies, identifying hazards, and engaging in continuous improvement.
**Please note: your application may not be considered if you do not provide your education and work history, either by: 1) uploading a resume, or 2) entering the information into the application fields directly.**
**3M Global Terms of Use and Privacy Statement**
Carefully read these Terms of Use before using this website. Your access to and use of this website and application for a job at 3M are conditioned on your acceptance and compliance with these terms.
Please access the linked document, select the country where you are applying for employment, and review it. Before submitting your application, you will be asked to confirm your agreement with these terms.
At 3M we apply science in collaborative ways to improve lives daily as our employees connect with customers all around the world. Learn more about 3M's creative solutions to global challenges on Twitter @3M or @3MNews.
3M does not discriminate in hiring or employment on the basis of race, color, sex, national origin, religion, age, disability, veteran status, or any other characteristic protected by applicable law.
Sr Director, Data Engineering

Posted 1 day ago
Job Description
We are brand builders who focus our passion and creativity to build Calvin Klein and TOMMY HILFIGER into the most desirable lifestyle brands in the world and at the same time position PVH as one of the best-performing brand groups in our sector. Guided by our values and enabled by our scale and global reach, we are driving fashion forward for good, as one team with one vision and one plan. That's the Power of Us, that's the Power of PVH+.
One of PVH's greatest strengths is our people. Our collective desire is to create a workplace environment where every individual is valued and every voice is heard, and we are committed to fostering an inclusive and diverse community of associates with a strong sense of belonging. Learn more about Inclusion & Diversity at PVH on our website.
**Position Overview**
As the **Sr Director of Data Engineering** , you will provide enterprise leadership for PVH's data engineering initiatives and play a critical role in transforming how data is ingested, integrated, and delivered to power decision-making across the company. This role leads the design, development, and governance of enterprise-scale data platforms as part of the **PVH Data Powerhouse initiative** , PVH's enterprise-wide modernization program consolidating legacy BI and data solutions into a unified, governed, and scalable data ecosystem on Microsoft Azure with defined integration points to AWS.
You will be accountable for the enterprise data architecture, semantic modeling standards, and operational performance of data platforms, ensuring solutions are designed, governed, and optimized to maximize business value. In partnership with Enterprise Architecture, Data Product Owners, and senior business leaders, you will define and enforce data engineering standards, oversee the migration and rationalization of legacy data pipelines and integrations, and enable scalable, high-quality data products across PVH's global brands (Calvin Klein, Tommy Hilfiger) and corporate functions.
This is a leadership role that requires not only deep technical expertise but also vision, executive influence, and people leadership. You will lead a team of data engineers, platform architects, and data integration specialists, set enterprise-wide standards, and champion the Data Powerhouse transformation at the highest levels of the organization.
**Key Responsibilities**
· **Visionary Leadership:** Define and execute PVH's enterprise data engineering strategy to advance the Data Powerhouse initiative, ensuring alignment with organizational objectives and business outcomes.
· **Team Leadership & Development:** Lead, mentor, and scale a high-performing team of data engineers, architects, and integration specialists, fostering a culture of innovation, technical excellence, and collaboration.
· **Enterprise Architecture & Solution Oversight:** Architect and oversee the implementation of scalable, secure, and high-performance data platforms and pipelines across the enterprise, leveraging Azure data services, Databricks, and other cloud technologies.
· **Data Product Enablement:** Collaborate with Data Product Managers, Analytics Leaders, and Business Executives to translate strategic business priorities into technical solutions that accelerate insights and innovation.
· **Data Governance & Compliance:** Establish and enforce enterprise-wide standards for data governance, security, and regulatory compliance, ensuring all data platforms and pipelines meet rigorous quality and operational standards.
· **Optimization & Performance Management:** Drive best practices in data modeling, ETL/ELT processes, and platform optimization to maximize the performance, scalability, and reliability of enterprise data assets.
· **Cross-Functional Collaboration:** Act as a trusted advisor to senior leadership, providing technical guidance, architectural review, and thought leadership to advance enterprise data capabilities.
· **Operational Excellence:** Implement enterprise-level monitoring, observability, and operational controls to ensure the integrity, availability, and reliability of data solutions.
**Qualifications & Skills**
**Required**
· 12+ years of progressive experience in data engineering, with at least 5 years in a senior leadership role.
· Proven expertise in AWS & Azure data services, including Azure Data Factory, Azure Databricks, and Azure Synapse, with hands-on experience driving enterprise adoption.
· Deep technical knowledge of SQL, Python, distributed data processing frameworks, and modern data architectures.
· Strong understanding of enterprise data governance, data cataloging, metadata management, and regulatory compliance.
· Demonstrated ability to lead and scale high-performing technical teams while influencing business stakeholders and driving transformational initiatives.
· Experience implementing CI/CD pipelines for data workflows and enterprise-grade data operations.
· Exceptional foresight, problem-solving, and executive-level communication skills.
**Preferred**
· Experience with real-time data streaming, event-driven architectures, or operational analytics.
· Familiarity with enterprise data catalog, lineage, and metadata management tools within Azure.
· Azure Data Engineer or Azure Solutions Architect certification.
**Reporting Line & Location**
· Reports to: Vice President, Data & Analytics
o **Other team members of CDA** (Commercial Data & Analytics) - part of the Senior Vice President, Commercial Data & Analytics organization
· Location(s):
o Bangalore, India (Approx 10-12 +)
· Direct Reports:
o **Data Engineers** - Located in Bangalore, India
**Why This Role**
This is a high-impact, enterprise leadership role responsible for shaping PVH's data engineering landscape. The Sr Director of Data Engineering will ensure all enterprise data assets are scalable, secure, governed, and fully aligned with the strategic objectives of the **Data Powerhouse initiative** , enabling PVH to achieve data-driven innovation, operational excellence, and measurable business outcomes across its global brands and corporate functions.
_PVH Corp. or its subsidiary ("PVH") is an equal opportunity employer and considers all applicants for employment on the basis of their individual capabilities and qualifications without regard to race, ethnicity, color, sex, gender identity or expression, age, religion, national origin, citizenship status, sexual orientation, genetic information, physical or mental disability, military status or any other characteristic protected under federal, state or local law. In addition to complying with all applicable laws, PVH is also committed to ensuring that all current and future PVH associates are compensated solely on job-related factors such as skill, ability, educational background, work quality, experience and potential._
**DIVERSITY & EQUAL OPPORTUNITY** We are committed to recruiting, training and providing career advancement to all associates regardless of gender, race, religion, age, disability, sexual orientation, nationality, or social or ethnic origin. Diversity in the workplace is encouraged. Bigotry, racism and any form of harassment or discrimination is not tolerated.
Software Developer - Data Engineering

Posted 1 day ago
Job Description
At IBM, work is more than a job - it's a calling: To build. To design. To code. To consult. To think along with clients and sell. To make markets. To invent. To collaborate. Not just to do something better, but to attempt things you've never thought possible. Are you ready to lead in this new era of technology and solve some of the world's most challenging problems? If so, let's talk.
**Your role and responsibilities**
* Advanced Programming Skills in Python, Scala, Go: Strong expertise in developing and maintaining microservices in Go (or other similar languages), with the ability to lead and mentor others in this area.
* Extensive exposure to developing Big Data applications, Data Engineering, ETL, and Data Analytics.
* Cloud Expertise: In-depth knowledge of IBM Cloud or similar cloud platforms, with a proven track record of deploying and managing cloud-native applications.
* Leadership and Collaboration: Ability to lead cross-functional teams, work closely with product owners, and drive platform enhancements while mentoring junior team members.
* Security and Compliance: Strong understanding of security best practices and compliance standards, with experience ensuring that platforms meet or exceed these requirements.
* Analytical and Problem-Solving Skills: Excellent problem-solving abilities with a proven track record of resolving complex issues in a multi-tenant environment.
**Required technical and professional expertise**
* 4-7 years of experience, primarily using Apache Spark, Kafka, and SQL, preferably in Data Engineering projects with a strong TDD approach (see the test sketch after this list).
* Advanced Programming Skills in languages like Python, Java, and Scala, with proficiency in SQL.
* Extensive exposure to developing Big Data Applications, Data Engineering, ETL tools, and Data Analytics.
* Exposure to Data Modelling, Data Quality, and Data Governance.
* Extensive exposure to creating and maintaining data pipelines: workflows that move data from various sources into data warehouses or data lakes.
* Cloud Expertise: In-depth knowledge of IBM Cloud or similar cloud platforms, with a proven track record of developing, deploying and managing cloud-native applications.
* Good to have Front-End Development experience: React, Carbon, and Node for managing and improving user-facing portals.
* Leadership and Collaboration: Ability to lead cross-functional teams, work closely with product owners, and drive platform enhancements while mentoring junior team members.
* Security and Compliance: Strong understanding of security best practices and compliance standards, with experience ensuring that platforms meet or exceed these requirements.
* Analytical and Problem-Solving Skills: Excellent problem-solving abilities with a proven track record of resolving complex issues in a multi-tenant environment.
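To illustrate the TDD approach mentioned in the first bullet of this list, here is a minimal, hypothetical pytest sketch for a small PySpark transformation; the function, column names, and expected values are invented for the example.
```python
# Hypothetical TDD-style test for a small PySpark transformation.
# Function, column, and fixture names are illustrative only.
import pytest
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F


def add_total_amount(df: DataFrame) -> DataFrame:
    """Example transformation: derive total_amount = quantity * unit_price."""
    return df.withColumn("total_amount", F.col("quantity") * F.col("unit_price"))


@pytest.fixture(scope="session")
def spark():
    # Local Spark session is enough for fast, isolated unit tests.
    return SparkSession.builder.master("local[1]").appName("tdd-example").getOrCreate()


def test_add_total_amount(spark):
    source = spark.createDataFrame(
        [("o1", 2, 10.0), ("o2", 3, 5.0)],
        ["order_id", "quantity", "unit_price"],
    )

    result = {r["order_id"]: r["total_amount"] for r in add_total_amount(source).collect()}

    assert result == {"o1": 20.0, "o2": 15.0}
```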
**Preferred technical and professional experience**
* Hands-on experience with data analysis and querying using SQL, and considerable exposure to ETL processes.
* Expertise in developing Cloud applications with High Volume Data processing.
* Experience building scalable microservice components using various API development frameworks.
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
Cloud Data Engineering - Databricks
Posted 6 days ago
Job Description
Description for Internal Candidates
Core Skills /Must Have
- Azure Databricks; Azure Data Factory; Azure SQL;
Primary / Should Have
- API integrations, SQL
Secondary / Could Have
- OneSpace
Key Responsibilities:
- Should be able to manage all ISAD flows in the PROD and UAT environments.
- Should be available 24x7 during the first 15 days of the month to support month-end loads.
- Should be able to debug any issues that occur during the loads.
- Should be responsible for completing loads within SLA.
- Should raise incidents if any issues occur.
- Should be able to coordinate with the release management team for all the release activities.
- Should work with Business teams if needed.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3-5 years of experience.
- Proven troubleshooting skills with a focus on analysis and resolution of complex issues.
- Excellent communication and interpersonal skills, with the ability to interact effectively with technical and non-technical stakeholders.
- A quick learner with a proactive approach to problem-solving and the ability to work independently as well as part of a team.
Senior Manager - Data Engineering Lead

Posted 1 day ago
Job Description
**Job Title:** Senior Manager - Data Engineering Lead
**About the Function:** Our Digital and Technology (D&T) team are innovators, delivering ground-breaking solutions that will help shape the future of our iconic brands. Technology touches every part of our business, from the sourcing of sustainable ingredients to marketing and development of our online platforms. We utilise data insights to build competitive advantage, supporting our people to deliver value faster.
Our D&T team includes some of the most talented digital professionals in the industry. Every day, we come together to push boundaries and innovate, shaping the digital solutions of tomorrow. Whatever your passion, we'll help you become the best you can be, creating career-defining work and delivering breakthrough thinking.
**About the role:** Data Management roles require skills and expertise in the secure storage and management of data in compliance with guidelines, standards and best practice. They enable efficient, high-quality data analysis by designing and implementing approaches to clean, transform and store data in the most appropriate way.
**Purpose of Position:** We are seeking a highly skilled and experienced Data Engineering Lead to spearhead our data architecture strategy, oversee robust data engineering initiatives, and manage data lake operations. The ideal candidate will be a hands-on leader with deep expertise in building and optimizing data platforms that support advanced analytics and business intelligence at scale.
**Qualification:** Bachelor's or Master's degree in Computer Science, Data Engineering, or related field.
**Required skillset:**
+ Experience in data engineering.
+ Proven experience in cloud platforms (AWS, Azure, or GCP) and data services (Glue, Synapse, BigQuery, Databricks, etc.).
+ Hands-on experience with tools like Apache Spark, Kafka, Airflow, dbt, and modern orchestration platforms.
+ Technical Skills
+ Proficient in SQL, Python/Scala/Java.
+ Strong understanding of modern data lake and cloud data warehouse concepts (e.g., Snowflake, Redshift, BigQuery).
+ Familiarity with CI/CD, Infrastructure as Code (e.g., Terraform), and DevOps for data.
**Nice to Have:**
+ Prior experience working in a regulated industry (alcohol, pharma, tobacco, etc.).
+ Exposure to demand forecasting, route-to-market analytics, or distributor performance management.
+ Knowledge of CRM, ERP, or supply chain systems (e.g., Salesforce, SAP, Oracle).
+ Familiarity with marketing attribution models and campaign performance tracking.
**Preferred Attributes:**
+ Strong analytical and problem-solving skills.
+ Excellent communication and stakeholder engagement abilities.
+ Passion for data-driven innovation and delivering business impact.
+ Certification in cloud platforms or data engineering (e.g., Google Cloud Professional Data Engineer).
+ Excellent communication and stakeholder management skills.
**Key Accountabilities:**
+ Design and implement scalable, high-performance data architecture solutions aligned with enterprise strategy.
+ Define standards and best practices for data modelling, metadata management, and data governance.
+ Collaborate with business stakeholders, data scientists, and application architects to align data infrastructure with business needs.
+ Guide the selection of technologies, including cloud-native and hybrid data architecture patterns (e.g., Lambda/Kappa architectures).
+ Lead the development, deployment, and maintenance of end-to-end data pipelines using ETL/ELT frameworks.
+ Manage ingestion from structured and unstructured data sources (APIs, files, databases, streaming sources).
+ Optimize data workflows for performance, reliability, and cost efficiency.
+ Ensure data quality, lineage, cataloging, and security through automated validation and monitoring.
+ Oversee data lake design, implementation, and daily operations (e.g., Azure Data Lake, AWS S3, GCP BigLake).
+ Implement access controls, data lifecycle management, and partitioning strategies (see the sketch after this list).
+ Monitor and manage performance, storage costs, and data availability in real time.
+ Ensure compliance with enterprise data policies and regulatory requirements (e.g., GDPR, CCPA).
+ Lead and mentor a team of data engineers and architects.
+ Establish a culture of continuous improvement, innovation, and operational excellence.
+ Work closely with IT, DevOps, and InfoSec teams to ensure secure and scalable infrastructure.
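As a rough sketch of the partitioning strategy mentioned above, the following hypothetical PySpark snippet writes a dataset to a data lake path partitioned by date and market; the paths and column names are assumptions for illustration, not details from this posting.
```python
# Hypothetical partitioned write to a data lake; paths and columns are
# illustrative and would differ in a real Azure Data Lake / S3 / BigLake setup.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("partitioned-write-example").getOrCreate()

sales = spark.read.parquet("/lake/raw/sales")  # placeholder input path

(
    sales
    .withColumn("load_date", F.to_date("order_timestamp"))
    .write
    .mode("overwrite")
    .partitionBy("load_date", "market")   # partition columns drive pruning and lifecycle rules
    .parquet("/lake/curated/sales")       # placeholder curated path
)
```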
**Flexible Working Statement:** Flexibility is key to our success. From part-time and compressed hours to different locations, our people work flexibly in ways to suit them. Talk to us about what flexibility means to you so that you're supported from day one.
**Diversity statement:** Our purpose is to celebrate life, every day, everywhere. And creating an inclusive culture, where everyone feels valued and that they can belong, is a crucial part of this.
We embrace diversity in the broadest possible sense. This means that you'll be welcomed and celebrated for who you are just by being you. You'll be part of and help build and champion an inclusive culture that celebrates people of different gender, ethnicity, ability, age, sexual orientation, social class, educational backgrounds, experiences, mindsets, and more.
Our ambition is to create the best performing, most trusted and respected consumer products companies in the world. Join us and help transform our business as we take our brands to the next level and build new ones as part of shaping the next generation of celebrations for consumers around the world.
Feel inspired? Then this may be the opportunity for you.
_If you require a reasonable adjustment, please ensure that you capture this information when you submit your application._
**Worker Type:**
Regular
**Primary Location:**
Bangalore HO
**Additional Locations:**
**Job Posting Start Date:**
With over 200 brands sold in more than 180 countries, we're the world's leading premium drinks company. Every day, over 30,000 talented people come together at Diageo to create the magic behind our much-loved brands. From iconic names to innovative newcomers - the brands we're building are rooted in culture and local communities. Our ambition is to be one of the best performing, most trusted and most respected consumer products companies in the world.
Our founders, such as Arthur Guinness, John Walker, and Charles Tanqueray, were visionary entrepreneurs whose brilliant minds helped shape the alcohol industry. And through our people, their legacy lives on. Join us, and you'll collaborate with talented people from all corners of the world. Together, you'll innovate and push boundaries, shaping a more inclusive and sustainable future that we can all be proud of.
With diversity at our core, we celebrate our people's unique passions, commitments and specialist skills. Because when varied voices, mindsets, and personalities come together, great ideas are born. In our supportive culture, your voice will be heard and you'll be empowered to be you. Just bring your ambition, curiosity and ideas, and we'll celebrate your work and help you reach your fullest potential.
**DRINKiQ**
What's your DRINKiQ? Take our quiz to understand how alcohol is made and explore the effects of drinking. You can discover everything you need to know at DRINKiQ.
Senior Manager Software Engineering-Data Engineering
Posted today
Job Description
Technology, Digital and Data
**Job Description:**
**Your Work Shapes the World at Caterpillar Inc.**
When you join Caterpillar, you're joining a global team who cares not just about the work we do - but also about each other. We are the makers, problem solvers, and future world builders who are creating stronger, more sustainable communities. We don't just talk about progress and innovation here - we make it happen, with our customers, where we work and live. Together, we are building a better world, so we can all enjoy living in it.
We are seeking a **highly skilled and visionary Software Engineering Manager** to lead a team of engineers in building Caterpillar's next-generation **Digital Manufacturing Data Platform** . This platform is central to our Future of Manufacturing initiative, designed to unify and operationalize data across design, engineering, production, and supply chain operations.
The ideal candidate will possess deep expertise in **Big Data, Data Warehousing, real-time data movement** , and **Snowflake-based architectures** . You will architect and deliver scalable, secure, and intelligent data platforms that enable advanced analytics, AI, and digital twin capabilities across global manufacturing ecosystems.
**Key Responsibilities**
**Team Leadership & Management**
+ Lead, mentor, and manage a team of data engineers and platform developers.
+ Foster a culture of technical excellence, collaboration, and continuous learning.
+ Drive Agile practices and ensure timely delivery of high-quality solutions.
**Technical Strategy & Architecture**
+ Architect and oversee the development of scalable, secure, and resilient data platforms.
+ Design and implement near real-time data movement and streaming architectures using tools like Kafka, Spark, and cloud-native services (see the sketch after this list).
+ Establish best practices in data modeling, ETL/ELT, data governance, and metadata management.
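A minimal, hypothetical sketch of the near real-time movement described in the streaming bullet above, using Spark Structured Streaming to read from a Kafka topic and land micro-batches in a data lake; the broker address, topic, and paths are placeholder assumptions, not details from this posting.
```python
# Hypothetical streaming ingestion sketch; brokers, topic, and paths are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("machine-telemetry-stream").getOrCreate()

# Read raw telemetry events from Kafka as a streaming DataFrame.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder broker
    .option("subscribe", "machine-telemetry")           # placeholder topic
    .load()
    .selectExpr("CAST(key AS STRING) AS machine_id",
                "CAST(value AS STRING) AS payload",
                "timestamp")
)

# Land micro-batches in the lake; a real pipeline would parse the payload and
# apply schema and quality checks before writing.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "/lake/raw/telemetry")                # placeholder output path
    .option("checkpointLocation", "/lake/_checkpoints/telemetry")
    .trigger(processingTime="1 minute")
    .start()
)

query.awaitTermination()
```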
**Data Engineering & Snowflake Expertise**
+ Lead the development of robust data pipelines for ingestion, transformation, and delivery using Snowflake, dbt, and cloud-native tools.
+ Optimize data storage, retrieval, and processing for performance, reliability, and cost-efficiency.
+ Implement data quality frameworks, lineage tracking, and schema evolution strategies.
**Big Data & Data Warehousing**
+ Build and maintain large-scale data lakes and data warehouses for structured and unstructured data.
+ Design scalable data architectures to support manufacturing analytics, predictive maintenance, and supply chain optimization.
**Cloud & Platform Engineering**
+ Leverage Azure and AWS services for data ingestion, transformation, and analytics.
+ Deploy software using CI/CD tools (Azure DevOps preferred, Jenkins, AWS CloudFormation).
+ Ensure platform scalability, security, and operational readiness across global deployments.
**AI & Advanced Analytics Enablement**
+ Collaborate with Data Science and AI teams to operationalize ML models and analytics workflows.
+ Promote integration of AI capabilities into data engineering pipelines (e.g., GenAI, MCP, ATA).
+ Support real-time analytics and edge AI use cases in manufacturing environments.
**Stakeholder Engagement**
+ Partner with product managers, manufacturing SMEs, and business leaders to understand requirements and deliver impactful data solutions.
+ Communicate technical concepts to non-technical audiences and influence strategic decisions.
**Must-Have Skills**
+ Proven experience in Big Data processing and Data Warehousing.
+ Expertise in building end-to-end near real-time data pipelines for OLTP & OLAP.
+ Strong architecture exposure for building robust, scalable Data Platforms.
+ Deep expertise in Snowflake, SQL, NoSQL, and distributed data systems.
+ Experience with data transformation tools (dbt, Apache Spark, Azure Data Factory).
+ Strong analytical skills and solid knowledge of computer science fundamentals.
+ Deep exposure to Azure and AWS cloud platforms.
+ Good understanding of AI concepts and latest developments (Gen AI, MCP, ATA, etc.).
**Nice-to-Have Skills**
+ Knowledge of the NVIDIA ecosystem and its applications in data and AI.
+ Experience building production-ready AI solutions and integrating with MLOps workflows.
+ Familiarity with modern data visualization and BI tools (e.g., Power BI, Tableau, Looker).
**Qualifications**
+ Bachelor's or Master's degree in Computer Science, Engineering, or related field.
+ 15+ years of experience in data engineering, with at least 5+ years in a leadership role.
+ Demonstrated success in managing engineering teams and delivering complex data solutions.
+ Excellent communication, leadership, and stakeholder management skills.
Relocation is available for this position.
**Posting Dates:**
October 22, 2025 - October 30, 2025
Caterpillar is an Equal Opportunity Employer. Qualified applicants of any age are encouraged to apply
Not ready to apply? Join our Talent Community.
Practice SME- Data Engineering AWS/Azure
Posted 20 days ago
Job Description
PDL/SME Responsibilities:
Technical Expertise:
- Expertise in building big data pipelines with AWS or Azure Data Engineering services and good hands-on experience with Spark/PySpark, Scala/Python/Java, Data Modelling and Visualization.
Delivery Assurance:
- Proposal review, solution/architecture review, and effort validation from a delivery perspective.
- Develop and implement engineering best practices, processes, and methodologies to improve efficiency, quality, and productivity.
Delivery Support:
- Lead and resolve delivery escalations.
- Provide technical guidance and support to engineering teams, assisting in problem-solving and decision-making processes.
- Drive towards higher customer satisfaction.
Talent building:
- Drive talent building for offerings and new-age skills.
- Enable collaboration and knowledge sharing within the DPE tower / CPPE top accounts.
- Evaluate and onboard new skills and frameworks to enhance development processes and capabilities.
Technical Excellence:
- Incubate new and large projects during the initial period until they reach green status.
- Provide technical leadership for solution and product development, modernization, and transformation across Modern Data Platform engineering and Data Product programs.
Qualifications and Skills:
- Bachelor's or Master's degree in Engineering. Advanced degrees are highly desirable.
- Good technical experience with Data Engineering, Data Platforms, Modern Data Platforms, and Data Product programs.
- Proven experience in a leadership role, such as Engineering Director, Engineering Manager, or similar, with a track record of successfully leading and delivering complex projects.
- Demonstrated experience in managing and mentoring engineering teams, fostering a collaborative and high-performing work environment.
- Good understanding of software development methodologies, agile practices, and project management principles.
- Excellent problem-solving and analytical skills, with the ability to make data-driven decisions.
- Good communication and interpersonal skills.
- Strategic mindset with the ability to align engineering efforts with business goals and objectives.
- Proven ability to manage budgets, allocate resources effectively, and deliver projects on time and within budget.
Trainee Intern Data Science
Posted 15 days ago
Job Description
Company Overview – WhatJobs Ltd
WhatJobs is a global job search engine and career platform operating in over 50 countries. We leverage advanced technology and AI-driven tools to connect millions of job seekers with opportunities, helping businesses and individuals achieve their goals.
Position: Data Science Trainee/Intern
Location: Commercial Street
Duration: 3 Months
Type: Internship/Traineeship (with potential for full-time opportunities)
Role Overview
We are looking for enthusiastic Data Science trainees/interns eager to explore the world of data analytics, machine learning, and business insights. You will work on real-world datasets, apply statistical and computational techniques, and contribute to data-driven decision-making at WhatJobs.
Key Responsibilities
- Collect, clean, and analyze datasets to derive meaningful insights.
- Assist in building and evaluating machine learning models.
- Work with visualization tools to present analytical results.
- Support the team in developing data pipelines and automation scripts.
- Research new tools, techniques, and best practices in data science.
Requirements
- Basic knowledge of Python and data science libraries (Pandas, NumPy, Matplotlib, Scikit-learn); see the example sketch after this list.
- Understanding of statistics, probability, and data analysis techniques.
- Familiarity with machine learning concepts.
- Knowledge of Google Data Studio and BigQuery for reporting and data management.
- Strong analytical skills and eagerness to learn.
- Good communication and teamwork abilities.
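As a small, hypothetical example of the kind of work this list describes, here is a short scikit-learn sketch that trains and evaluates a simple classifier on a bundled dataset; the choice of model and dataset is illustrative only.
```python
# Small illustrative example: train and evaluate a classifier on a toy dataset.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Load a bundled dataset and split it into training and test sets.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Fit a simple baseline model and report held-out accuracy.
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

predictions = model.predict(X_test)
print(f"Test accuracy: {accuracy_score(y_test, predictions):.2f}")
```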
What We Offer
- Hands-on experience with real-world data science projects.
- Guidance and mentorship from experienced data professionals.
- Opportunity to work with a global technology platform.
- Certificate of completion and potential for full-time role.
Manager, Engineering - Data Engineering | Big Data | People Management

Posted 1 day ago
Job Description
Manager, Engineering - Data Engineering | Big Data | People Management
**About Skyhigh Security:**
Skyhigh Security is a dynamic, fast-paced, cloud company that is a leader in the security industry. Our mission is to protect the world's data, and because of this, we live and breathe security. We value learning at our core, underpinned by openness and transparency.
Since 2011, organizations have trusted us to provide them with a complete, market-leading security platform built on a modern cloud stack. Our industry-leading suite of products radically simplifies data security through easy-to-use, cloud-based, Zero Trust solutions that are managed in a single dashboard, powered by hundreds of employees across the world. With offices in Santa Clara, Aylesbury, Paderborn, Bengaluru, Sydney, Tokyo and more, our employees are the heart and soul of our company.
Skyhigh Security is more than a company; here, when you invest your career with us, we commit to investing in you. We embrace a hybrid work model, creating the flexibility and freedom you need from your work environment to reach your potential. From our employee recognition program to our 'Blast Talks' learning series and team celebrations (we love to have fun!), we strive to be an interactive and engaging place where you can be your authentic self.
We are on social media too! Follow us on LinkedIn.
**_Role Overview:_**
We are seeking an experienced and strategic Data Engineering Manager to lead our team. The ideal candidate will have a strong background in big data architecture, cloud-native services, and team leadership. You will be responsible for building and optimizing our data platforms, ensuring data security and privacy, and integrating emerging technologies like AI/LLMs to enhance our analytics capabilities. This role requires a blend of technical expertise, leadership skills, and a forward-thinking mindset to drive our data initiatives.
**Responsibilities:**
+ Lead, mentor, and grow a team of talented data engineers. Provide technical guidance and conduct code reviews to ensure best practices.
+ Architect and develop scalable and robust data pipelines using a blend of big data frameworks (e.g., Spark, Kafka, Flink) and cloud-native services (AWS) to support security analytics use cases.
+ Drive CI/CD best practices, infrastructure automation, and performance tuning across distributed environments.
+ Evaluate and pilot the use of AI/LLM technologies in data pipelines for tasks like anomaly detection, metadata enrichment, and automation.
+ Ensure data security and privacy compliance across all data platforms and processes.
+ Evaluate and integrate LLM-based automation and AI-enhanced observability into engineering workflows.
+ Collaborate with data scientists, product managers, and business leaders to understand data needs and deliver solutions that drive business value.
+ Oversee the design and implementation of various databases, including NoSQL databases like HBase and Cassandra.
**What We're Looking For (Minimum Qualifications)**
+ 10 to 15 years of experience in big data architecture and engineering, including deep proficiency with the AWS cloud platform, with at least 3-5 years in a leadership or management role.
+ Expertise in distributed systems and frameworks such as Apache Spark, Scala, Kafka, and Flink, with experience building production-grade data pipelines.
+ Strong programming skills in Java for building scalable data applications.
+ Hands-on experience with ETL tools and orchestration systems.
+ Solid understanding of data modeling across both relational (PostgreSQL, MySQL) and NoSQL (HBase) databases, as well as performance tuning.
+ Demonstrated experience with AWS services including Lambda functions, AWS Step Functions, and CloudFormation (CF); see the sketch after this list.
+ Strong interpersonal and communication skills to effectively lead a team and collaborate with a diverse group of stakeholders.
+ Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
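As a rough illustration of the AWS Lambda experience mentioned above, here is a minimal, hypothetical Python handler that enriches an incoming pipeline event and writes it to S3 via boto3; the bucket name, environment variable, and event fields are assumptions for the sketch, not details from this posting.
```python
# Hypothetical Lambda handler sketch; bucket name and event fields are illustrative.
import json
import os
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")
BUCKET = os.environ.get("METADATA_BUCKET", "example-metadata-bucket")  # placeholder


def handler(event, context):
    """Enrich an incoming pipeline event and persist it for downstream steps."""
    enriched = {
        "received_at": datetime.now(timezone.utc).isoformat(),
        "source": event.get("source", "unknown"),
        "payload": event,
    }
    key = f"events/{enriched['received_at']}.json"

    # Store the enriched record so later pipeline stages (or Step Functions) can pick it up.
    s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(enriched).encode("utf-8"))
    return {"statusCode": 200, "body": json.dumps({"stored_key": key})}
```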
**What Will Make You Stand Out (Preferred Qualifications)**
+ Experience integrating AI/ML or LLM frameworks (e.g., LangChain, LlamaIndex) into data workflows.
+ Experience implementing CI/CD pipelines with Kubernetes, Docker, and Terraform.
+ Knowledge of modern data warehousing (e.g., BigQuery, Snowflake) and data governance principles (GDPR, HIPAA).
**_Company Benefits and Perks:_**
We believe that the best solutions are developed by teams who embrace each other's unique experiences, skills, and abilities. We work hard to create a dynamic workforce where we encourage everyone to bring their authentic selves to work every day. We offer a variety of social programs, flexible work hours and family-friendly benefits to all of our employees.
+ Retirement Plans
+ Medical, Dental and Vision Coverage
+ Paid Time Off
+ Paid Parental Leave
+ Support for Community Involvement
We're serious about our commitment to a workplace where everyone can thrive and contribute to our industry-leading products and customer support, which is why we prohibit discrimination and harassment based on race, color, religion, gender, national origin, age, disability, veteran status, marital status, pregnancy, gender expression or identity, sexual orientation or any other legally protected status.