1,405 Data Engineering jobs in India
Data Engineering
Posted today
Job Description
Job Title: Middleware Engineer
Position: Data Engineer
Experience: 5-6 years
Category: IT Infrastructure
Main location: India, Karnataka, Bangalore
Employment Type: Full Time
Qualification: Bachelor's degree (or higher) in Computer Science or a related field.
Roles and Responsibilities
Data Engineer - 5-6 years experience.
Responsibilities
===
Design, develop, and maintain data architectures, pipelines, and workflows for the collection, processing, storage, and retrieval of large volumes of structured and unstructured data from multiple sources.
Collaborate with cross-functional teams to identify and prioritize data engineering requirements and to develop and deploy data-driven solutions to address business challenges.
Build and maintain scalable, fault-tolerant, and high-performance data storage and retrieval systems (e.g., data lakes, data warehouses, databases) on cloud infrastructure such as AWS, Azure, or Google Cloud Platform.
Develop and maintain ETL workflows, data pipelines, and data transformation processes to prepare data for machine learning and AI applications.
Implement and optimize distributed computing frameworks such as Hadoop, Spark, or Flink to support high-performance and scalable processing of large data sets.
Build and maintain monitoring, alerting, and logging systems to ensure the availability, reliability, and performance of data pipelines and data platforms.
Collaborate with Data Scientists and Machine Learning Engineers to deploy models on production environments and ensure their scalability, reliability, and accuracy.
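The pipeline responsibilities above (collect, clean, transform, aggregate) can be sketched in miniature with plain Python. This is only an illustration of the pattern; the field names (`user`, `amount`) are invented for the example.

```python
from collections import defaultdict

def run_pipeline(raw_rows):
    """Toy ETL pass: clean, filter, and aggregate raw rows.

    Each row is a dict; the field names here are illustrative only.
    """
    # Transform: normalize values and drop malformed rows
    cleaned = []
    for row in raw_rows:
        try:
            cleaned.append({
                "user": row["user"].strip().lower(),
                "amount": float(row["amount"]),
            })
        except (KeyError, ValueError, AttributeError):
            continue  # a real pipeline would quarantine these records

    # Load/aggregate: total amount per user
    totals = defaultdict(float)
    for row in cleaned:
        totals[row["user"]] += row["amount"]
    return dict(totals)

raw = [
    {"user": " Alice ", "amount": "10.5"},
    {"user": "alice", "amount": "4.5"},
    {"user": "bob", "amount": "oops"},  # malformed, dropped
]
print(run_pipeline(raw))  # {'alice': 15.0}
```

At production scale the same clean/filter/aggregate steps would run in a distributed framework such as Spark rather than in-process Python.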
Requirements:
===
Bachelor's or master's degree in computer science, engineering, or a related field.
At least 5-6 years of experience in data engineering, with a strong background in machine learning, cloud computing and big data technologies.
Experience with at least one major cloud platform (AWS, Azure, GCP).
Proficiency in programming languages like Python, Java, and SQL.
Experience with distributed computing technologies such as Hadoop, Spark, and Kafka.
Familiarity with database technologies such as SQL, NoSQL, NewSQL.
Experience with data warehousing and ETL tools such as Redshift, Snowflake, or Airflow.
Strong problem-solving and analytical skills.
Excellent communication and teamwork skills.
Preferred qualification:
===
Experience with DevOps practices and tools such as Docker, Kubernetes, or Ansible, Terraform.
Experience with data visualization tools such as Tableau, Superset, Power BI, or Plotly, D3.js.
Experience with stream processing frameworks such as Kafka, Pulsar or Kinesis.
Experience with data governance, data security, and compliance.
Experience with software engineering best practices and methodologies such as Agile or Scrum.
Must Have Skills
===
Data engineer with expertise in machine learning, cloud computing, and big data technologies.
Data engineering experience on multiple clouds, preferably including GCP
Data lakes, data warehouses, databases
ETL workflows, data pipelines, data platforms
Hadoop, Spark, or Flink
Hadoop, Spark, and Kafka
SQL, NoSQL, NewSQL
Redshift, Snowflake, or Airflow
Data Engineering Consultant
Posted today
Job Description
**Primary Responsibilities:**
+ Write complex SQL queries and Python scripts
+ Design and test DataStage (ETL) jobs
+ Define architecture and design across aspects such as extensibility, scalability, security, and design patterns against a predefined checklist, and ensure that all relevant best practices are followed
+ Execute POCs to verify that the proposed design and technologies meet the requirements
+ Architect with a modern technology stack and design public cloud applications on Azure
+ Research complex functional issues logged by business teams
+ Manage and guide technical teams
+ Report project status
+ Recommend new data warehouse/data mart designs as well as improvements to the current data warehouse
+ Act as a mentor for junior team members
+ Analyze and investigate complex issues
+ Provide explanations and interpretations within area of expertise
+ Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
**Required Qualifications:**
+ B.Tech/MCA (Master of Computer Applications)/M.Tech preferred (16+ years of formal education)
+ 10+ years supporting requirements gathering for large data warehouses (DWH)
+ Hands-on experience writing complex and advanced SQL in multiple database environments
+ Hands-on experience with ETL/data modeling and with collecting business requirements
+ Hands-on experience navigating the Unix file system and profiling large data files
+ Experience with cloud computing platforms, such as Amazon Web Services (AWS) and Microsoft Azure
+ Experience with big data technologies, such as Apache Hadoop and Apache Spark
+ Knowledge of SDLC process; experience in Agile will be given more preference
+ Knowledge of Unix, testing, and automation
+ Good knowledge of the US Health Care domain
+ Proven solid communication and presentation skills
+ Proven excellent analytical, problem solving and data analysis abilities
+ Proven ability to work collaboratively on global projects with a positive team spirit, ensuring teamwork and coordination
+ Proven excellent communication and client-facing skills
+ Proven excellent analytical and SQL skills to support data validation
+ Proven ability to acquire solid troubleshooting skills and an interest in troubleshooting issues across disparate technologies and environments
+ Flexibility with working hours, as the role requires close collaboration with US counterparts
_At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone-of every race, gender, sexuality, age, location and income-deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission._
Data Engineering Analyst
Posted today
Job Description
**Primary Responsibilities:**
+ Communicate the overall status of assigned tasks, achievements, and POC results to project stakeholders at all levels to gain buy-in
+ Provide recommendations and carry out POCs to improve performance, reliability, and reusability within the constraints of budget and business dependencies
+ In base and mid-level roles, engage across teams in a capacity that ranges from assisting on in-flight initiatives to delivering technical briefings and demonstrations of new technologies across the organization
+ Design, develop, and implement big data solutions that convert raw datasets into reusable assets
**Required Qualifications:**
+ Undergraduate degree or equivalent experience
+ 2+ years combined experience in Solution Development, Project Deliveries, Product development
+ Skills:
+ Big Data / Databricks SQL & PySpark
+ Programming Languages - Python, Snowflake
+ Build / Deployment Automation - Github
+ Cloud - Azure
+ Knowledge of Scrum
Data Engineering Consultant
Posted today
Job Description
**Primary Responsibilities:**
+ Solid experience (3+ years) with SQL Server development (PL/SQL programming)
+ Solid experience (2+ years) with database administration (SQL Server)
+ Experience with SQL Server Management Studio (SSMS)
+ Experience with SQL Server Profiler and resolving deadlocks and blocking sessions
+ Experience with MS Azure platform
+ Experience analysing execution plans, followed by query tuning, optimization, and indexing strategies/statistics
+ Ability to analyse and optimize slow-running queries
+ Experience with identifying/resolving Sleeping Sessions
+ Experience with index maintenance (rebuild/reorganize)
+ Application Performance Tuning - Understanding of connection pooling, caching, and load balancing
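The index-driven tuning loop described above (read the execution plan, add an index, confirm the plan improves) can be sketched with SQLite's `EXPLAIN QUERY PLAN`; SQL Server's tooling (SSMS, Profiler, `SET SHOWPLAN`) differs, so treat this only as an illustration of the idea, with invented table and column names.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sessions(user_id INTEGER, status TEXT);
    INSERT INTO sessions VALUES (1,'active'),(2,'sleeping'),(3,'active');
""")

def plan(sql):
    # EXPLAIN QUERY PLAN reports whether the optimizer scans or uses an index
    return " ".join(row[-1] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM sessions WHERE user_id = 2"
before = plan(query)  # full table SCAN without an index
con.execute("CREATE INDEX idx_user ON sessions(user_id)")
after = plan(query)   # SEARCH ... USING INDEX after indexing
print(before)
print(after)
```

The before/after plan comparison is the core habit; on SQL Server the same loop also involves statistics updates and index maintenance (rebuild/reorganize) as the bullets note.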
**Required Qualifications:**
+ Bachelor's degree or equivalent experience
+ 7+ years of overall experience in Data & Analytics engineering
+ 5+ years of experience working with Azure, Databricks, and ADF, Data Lake
+ 5+ years of experience working with data platform or product using PySpark and Spark-SQL
+ Solid experience with CI/CD tools such as Jenkins, GitHub, GitHub Actions, Maven, etc.
+ In-depth understanding of Azure architecture & ability to come up with efficient design & solutions
+ Highly proficient in Python and SQL
+ Proven excellent communication skills
Data Engineering Analyst
Posted 2 days ago
Job Description
**Primary Responsibilities:**
+ Implementation:
+ Data Mapping and Transformation:
+ File Mapping: Complete file mapping based on layouts and requirements provided by Project Managers
+ Business Logic: Document business logic for transforming data into product specifications
+ Data Quality Checks: Run and interpret quality checks against loaded data to ensure accuracy and completeness
+ Data transformation: Author and test ETL to convert data from one format to another. This includes cleaning, filtering, aggregating, enriching, normalizing, and encoding data to make it suitable for analysis, processing or integration
+ Troubleshooting and Support:
+ Issue Resolution: Troubleshoot issues raised by project managers and cross matrix teams from root cause identification to resolution
+ Support Requests: Handle support requests and provide timely solutions to ensure client satisfaction
+ Collaboration and Communication:
+ Stakeholder Interaction: Work closely with Client, Project Managers, Product managers and other stakeholders to understand requirements and deliver solutions
+ Documentation: Contribute to technical documentation of specifications and processes
+ Communication: Effectively communicate complex concepts, both verbally and in writing, to team members and clients
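A minimal sketch of the "Data Quality Checks" step above, in plain Python with invented column names (`claim_id`, `amount`); real checks would run against the loaded warehouse tables rather than in-memory dicts.

```python
def quality_checks(rows, required_cols):
    """Summarize completeness of loaded rows: nulls per column, duplicates."""
    report = {"row_count": len(rows), "null_counts": {}, "duplicate_rows": 0}
    # Completeness: count missing/empty values per required column
    for col in required_cols:
        report["null_counts"][col] = sum(
            1 for r in rows if r.get(col) in (None, "")
        )
    # Accuracy guard: count exact duplicate rows
    seen = set()
    for r in rows:
        key = tuple(sorted(r.items()))
        if key in seen:
            report["duplicate_rows"] += 1
        seen.add(key)
    return report

loaded = [
    {"claim_id": "837-001", "amount": 120.0},
    {"claim_id": "837-001", "amount": 120.0},   # exact duplicate
    {"claim_id": "", "amount": 99.0},           # missing claim_id
]
print(quality_checks(loaded, ["claim_id", "amount"]))
```

Interpreting the resulting counts against the file mapping and layout expectations is the analyst's judgment call that the checks only support.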
**Required Qualifications:**
+ Bachelor's degree in Computer Science, Health Informatics, Information Technology, or other related fields.
+ 2+ years of experience working with healthcare data (EMR clinical and financial data: HL7 v2, CCDAs, EDI data such as 835s and 837s, claims from a variety of payers, etc.) sent as flat files, JSON, XML, etc.
+ 2+ years of experience working with Hive SQL, Postgres, or other data analysis languages
+ 1+ years of experience with Git/GitHub.
**Preferred Qualifications:**
+ 2+ years of experience managing clients or working with them on tasks like requirements gathering, impact analysis etc.
+ 2+ years of experience implementing and supporting client solutions on Azure Cloud platform
+ 2+ years of experience coding in Databricks/Python or Databricks/Scala
+ Experience with cutting edge technology (AI/ML)
+ Familiarity with Agile or experience working in Scrum teams
+ Proven ability to be highly analytical and to think outside the box
+ Proven solid written and verbal communications skills. Ability to clearly articulate ideas and concepts
Data Engineering Consultant
Posted 6 days ago
Job Description
Data engineering emphasis supports the ongoing digital transformation and modernization of internal audit's risk assessment, automation efforts, and risk monitoring activities. This position is responsible for supporting Internal Audit engagements with scalable, end-to-end ETL and analytic processes. Additionally, the role is responsible for working closely with data analytics teams to create robust scripted data solutions, develop and support business monitoring tools, and support existing data systems and analytic reports. This includes identifying and integrating data sources, assessing data quality, and developing and executing data analytic tools/languages to support enterprise analytical risk assessments. This role is integral to our strategy to enable Internal Audit with data driven insights and bring value to our business partners.
The role will challenge you to leverage your data analytics skills on a variety of initiatives in a hands-on role, as well as the opportunity to develop your skills as an auditor in a matrixed and cross-functional internal audit department.
**Primary Responsibilities:**
+ Automation and Data Modeling
+ Design, build, and maintain automated data pipelines for extracting, transforming, and loading data from diverse sources (enterprise platforms, SharePoint, NAS drives, etc.)
+ Develop robust and scalable data models to support risk surveillance analytics and reporting needs
+ Implement and maintain workflows for scheduling and monitoring ETL/ELT jobs to ensure data freshness and reliability
+ Utilize scripting and workflow automation tools to reduce manual intervention in data movement and processing
+ Integrate new data sources and automate ingestion processes to expand surveillance coverage
+ Data Management and Governance
+ Ensure data quality, completeness, and consistency across all risk surveillance datasets
+ Develop and enforce data validation, cleansing, and transformation procedures to support accurate analysis
+ Implement data security and access controls in compliance with regulatory and organizational standards
+ Maintain detailed metadata, data dictionaries, and lineage documentation for all data assets
+ Support data governance initiatives, including data cataloguing, retention policies, and audit readiness
+ Collaboration and Communication
+ Partner with Risk Surveillance partners, data analysts, and audit teams to understand requirements and deliver analytical-ready datasets
+ Collaborate with IT, data stewards, and business partners to resolve data issues and facilitate access to new data sources
+ Communicate data pipeline status, issues, and solution approaches clearly to both technical and non-technical stakeholders
+ Provide training and support for users on data tools, repositories, and best practices
+ Document data workflows, processes, and solutions for knowledge sharing and operational continuity
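The automated extract/transform/load flow with monitoring described above might look like this toy sketch; the control names and the sources they stand in for (enterprise platforms, SharePoint, NAS drives) are hypothetical.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("audit-pipeline")

def extract():
    # Stand-in for pulling audit data from enterprise sources
    return [{"control": "C1", "passed": True}, {"control": "C2", "passed": False}]

def transform(rows):
    # Keep only failed controls for risk-surveillance reporting
    return [r["control"] for r in rows if not r["passed"]]

def load(failed_controls):
    # Stand-in for writing to a reporting table; log for monitoring
    log.info("flagged controls: %s", failed_controls)
    return failed_controls

def run_pipeline():
    return load(transform(extract()))

print(run_pipeline())  # ['C2']
```

In practice each stage would be a scheduled, monitored job (e.g., in SSIS or Data Factory, which the preferred qualifications mention) rather than a direct function chain.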
**Required Qualifications:**
+ Overall 8+ years of program experience in Computer Science, Information Technology, Mathematics, Engineering, Data Analytics or related field
+ 4+ years of SQL programming
+ 4+ years programming in Python and/or R
+ 2+ years of data modeling and scaled automation experience
+ 2+ years of data visualization experience (Tableau and/or PowerBI)
+ Solid interpersonal and analytical skills while working effectively with a matrixed team
+ Solid oral and written communication skills
**Preferred Qualifications:**
+ 2+ years of experience developing scalable solutions with SSIS, Data Factory, Python, or R
+ Extensive program experience in Computer Science, Information Technology, Mathematics, Engineering, or related field
+ Internal Audit / Control experience
+ Cloud computing experience including Azure, AWS, Databricks, and/or Spark computing
+ Experience working in a Healthcare Industry and or a complex IT environment
+ Experience with conducting automation surrounding API calls
+ Working knowledge of Big Data tools, Cloud platforms, SQL Server database engineering
+ Data Science experience including regression analysis and machine learning techniques
+ Change management tool experience (e.g., Github, Jenkins, or similar)
Data Engineering Specialist
Posted 8 days ago
Job Description
+ We are seeking an experienced Data Engineering Specialist interested in challenging the status quo to ensure the seamless creation and operation of the data pipelines that are needed by Sanofi's advanced analytic, AI and ML initiatives for the betterment of our global patients and customers.
+ Sanofi has recently embarked on a vast and ambitious digital transformation program. A cornerstone of this roadmap is the acceleration of its data transformation and of the adoption of artificial intelligence (AI) and machine learning (ML) solutions, to accelerate R&D, manufacturing, and commercial performance and bring better drugs and vaccines to patients faster, improving health and saving lives.
**Main Responsibilities:**
+ Establish technical designs to meet Sanofi requirements aligned with the architectural and Data standards
+ Ownership of the entire back end of the application, including the design, implementation, test, and troubleshooting of the core application logic, databases, data ingestion and transformation, data processing and orchestration of pipelines, APIs, CI/CD integration and other processes
+ Fine-tune and optimize queries using Snowflake platform and database techniques
+ Optimize ETL/data pipelines to balance performance, functionality, and other operational requirements.
+ Assess and resolve data pipeline issues to ensure performance and timeliness of execution
+ Assist with technical solution discovery to ensure technical feasibility.
+ Assist in setting up and managing CI/CD pipelines and development of automated tests
+ Develop and manage microservices using Python
+ Conduct peer reviews for quality, consistency, and rigor of production-level solutions
+ Design application architecture for efficient concurrent user handling, ensuring optimal performance during high usage periods
+ Own all areas of the product lifecycle: design, development, test, deployment, operation, and support
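One building block of the "efficient concurrent user handling" item above is connection pooling. Here is a minimal sketch using the standard library, with SQLite standing in for Snowflake/PostgreSQL; a production service would use a mature pooling library rather than hand-rolling this.

```python
import queue
import sqlite3

class ConnectionPool:
    """Tiny fixed-size pool: bounds concurrent DB load and reuses connections."""

    def __init__(self, size=4):
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(
                sqlite3.connect(":memory:", check_same_thread=False)
            )

    def acquire(self, timeout=5):
        # Blocks until a connection is free, so bursts queue instead of failing
        return self._pool.get(timeout=timeout)

    def release(self, con):
        self._pool.put(con)

pool = ConnectionPool(size=2)
con = pool.acquire()
result = con.execute("SELECT 1 + 1").fetchone()[0]
pool.release(con)
print(result)  # 2
```

The same bounded-resource idea generalizes to API worker counts and Snowflake warehouse sizing during high-usage periods.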
**About you**
+ **Qualifications:**
+ 5+ years of relevant experience developing backend, integration, data pipelining, and infrastructure
+ Expertise in database optimization and performance improvement
+ Expertise in Python, PySpark, and Snowpark
+ Experience with data warehousing and object-relational databases (Snowflake, PostgreSQL) and writing efficient SQL queries
+ Experience in cloud-based data platforms (Snowflake, AWS)
+ Proficiency in developing robust, reliable APIs using Python and FastAPI Framework
+ Expertise in ELT and ETL, with experience working with large data sets and in performance and query optimization; IICS is a plus
+ Understanding of data structures and algorithms
+ Understanding of DBT is a plus
+ Experience in modern testing framework (SonarQube, K6 is a plus)
+ Strong collaboration skills, willingness to work with others to ensure seamless integration of the server-side and client-side
+ Knowledge of DevOps best practices and associated tools is a plus, especially in the setup, configuration, maintenance, and troubleshooting of associated tools:
+ Containers and containerization technologies (Kubernetes, Argo, Red Hat OpenShift)
+ Infrastructure as code (Terraform)
+ Monitoring and Logging (CloudWatch, Grafana)
+ CI/CD Pipelines (JFrog Artifactory)
+ Scripting and automation (Python, GitHub, Github actions)
+ Experience with JIRA & Confluence
+ Workflow orchestration (Airflow)
+ Message brokers (RabbitMQ)
+ Education: Bachelor's degree in computer science, engineering, or similar quantitative field of study
Why choose us?
+ Bring the miracles of science to life alongside a supportive, future-focused team.
+ Discover endless opportunities to grow your talent and drive your career, whether it's through a promotion or lateral move, at home or internationally.
+ Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact.
+ Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs and at least 14 weeks' gender-neutral parental leave.
+ Opportunity to work in an international environment, collaborating with diverse business teams and vendors, working in a dynamic team, and fully empowered to propose and implement innovative ideas.
**Pursue** _Progress_. **Discover** _Extraordinary_.
Progress doesn't happen without people - people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. You can be one of those people. Chasing change, embracing new ideas and exploring all the opportunities we have to offer. Let's pursue progress. And let's discover extraordinary together.
At Sanofi, we provide equal opportunities to all regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or gender identity.
Watch our ALL IN video and check out our Diversity, Equity and Inclusion actions at sanofi.com!
Languages: English is a must
Better is out there. Better medications, better outcomes, better science. But progress doesn't happen without people - people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. So, let's be those people.
Global Terms & Conditions and Data Privacy Statement
Sanofi is dedicated to supporting people through their health challenges. We are a global biopharmaceutical company focused on human health. We prevent illness with vaccines and provide innovative treatments to fight pain and ease suffering. We stand by the few who suffer from rare diseases and the millions with long-term chronic conditions.
With more than 100,000 people in 100 countries, Sanofi is transforming scientific innovation into healthcare solutions around the globe. As an organization, we change the practice of medicine, reinvent the way we work, and enable people to be their best versions in career and life. We are constantly moving and growing, making sure our people grow with us. Our working environment helps us build a dynamic and inclusive workplace operating on trust and respect and allows employees to live the life they want to live.
Data Engineering Consultant

Posted 10 days ago
Job Description
**Primary Responsibilities:**
+ Ingest data from multiple on-prem and cloud data sources using various tools & capabilities in Azure
+ Design and develop Azure Databricks processes using PySpark/Spark-SQL
+ Design and develop orchestration jobs using ADF, Databricks Workflow
+ Analyze data engineering processes under development and act as an SME to troubleshoot performance issues and suggest improvements
+ Build a test framework for Databricks notebook jobs to enable automated testing before code deployment
+ Design and build POCs to validate new ideas, tools, and architectures in Azure
+ Continuously explore new Azure services and capabilities; assess their applicability to business needs
+ Prepare case studies and technical write-ups to showcase successful implementations and lessons learned
+ Work closely with clients, business stakeholders, and internal teams to gather requirements and translate them into technical solutions using best practices and appropriate architecture
+ Contribute to full lifecycle project implementations, from design and development to deployment and monitoring
+ Ensure solutions adhere to security, compliance, and governance standards
+ Monitor and optimize data pipelines and cloud resources for cost and performance efficiency
+ Identify solutions to non-standard requests and problems
+ Mentor and support existing on-prem developers in the cloud environment
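The "test framework for Databricks notebook jobs" bullet could start from checks like these, shown here in plain Python with no Spark dependency; the output rows and column names are hypothetical, and in Databricks the same assertions would run against a DataFrame's schema and count.

```python
def expect_columns(rows, required_cols):
    """Fail fast if the notebook job's output is missing required columns."""
    missing = [c for c in required_cols if not rows or c not in rows[0]]
    if missing:
        raise AssertionError(f"missing columns: {missing}")
    return True

def expect_min_rows(rows, at_least):
    """Guard against silently empty output before promoting a deployment."""
    if len(rows) < at_least:
        raise AssertionError(f"expected >= {at_least} rows, got {len(rows)}")
    return True

# Output of a hypothetical notebook transformation under test
output = [{"id": 1, "total": 10.0}, {"id": 2, "total": 7.5}]
checks_passed = (
    expect_columns(output, ["id", "total"]) and expect_min_rows(output, 1)
)
print(checks_passed)  # True
```

Wiring checks like these into the CI/CD tools the qualifications list (Jenkins, GitHub Actions) is what gates deployment on test results.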
**Required Qualifications:**
+ Undergraduate degree or equivalent experience
+ 7+ years of overall experience in Data & Analytics engineering
+ 5+ years of experience working with Azure, Databricks, and ADF, Data Lake
+ 5+ years of experience working with data platform or product using PySpark and Spark-SQL
+ Solid experience with CI/CD tools such as Jenkins, GitHub, GitHub Actions, Maven, etc.
+ In-depth understanding of Azure architecture & ability to come up with efficient design & solutions
+ Highly proficient in Python and SQL
+ Proven excellent communication skills
+ **Key Skill:** Azure Data Engineer - Azure Databricks, Azure Data Factory, Python/PySpark, Terraform
Data Engineering manager

Posted 10 days ago
Job Description
Who We Are:
Ever wonder who brings the entertainment to your flights? Panasonic Avionics Corporation is #1 in the industry for delivering inflight products such as movies, games, WiFi, and now Bluetooth headphone connectivity!
How exciting would it be to be a part of the innovation that goes into creating technology that delights millions of people in an industry that's here to stay! With our company's history spanning over 40 years, you will have stability, career growth opportunities, and will work with the brightest minds in the industry. And we are committed to a diverse and inclusive culture that will help our organization thrive! We seek diversity in many areas such as background, culture, gender, ways of thinking, skills and more.
**The Position:**
We are seeking a proven Data Engineering Leader to drive the design, development, and deployment of scalable, secure, and high-performance data solutions. This role will lead high-performing teams, architect cloud-native data platforms, and collaborate closely with business, AI/ML, and BI teams to deliver end-to-end data products that power innovation and strategic decision-making.
The position offers the opportunity to shape data architecture strategy, establish best practices in Lakehouse and streaming solutions, and enable advanced analytics and AI/ML at scale.
**Responsibilities**
**What We're Looking For:**
+ Proven leadership in building and mentoring high-performing data engineering teams.
+ Expertise in architecting cloud-native data platforms on AWS, leveraging services such as EMR, EKS, Glue, Redshift, S3, Lambda, and SageMaker.
+ Strong background in Lakehouse architecture (Glue Catalog, Iceberg, Delta Lake) and distributed processing frameworks (Spark, Hive, Presto).
+ Experience with real-time streaming solutions (Kafka, Kinesis, Flink).
+ Proficiency in orchestrating complex data workflows with Apache Airflow.
+ Hands-on experience with GitLab CI/CD, Terraform, CloudFormation Templates, and Infrastructure-as-Code.
+ Strong understanding of MDM strategies and data governance best practices (GDPR, HIPAA, etc.).
+ Ability to design and develop middleware APIs (REST) to seamlessly integrate data pipelines with applications and analytics platforms.
+ Experience supporting AI/ML teams with feature engineering, training, and deployment pipelines using SageMaker.
+ Solid knowledge of SQL & NoSQL databases (Redshift, DynamoDB, PostgreSQL, Elasticsearch).
+ Familiarity with BI enablement and data modeling for visualization platforms like Amazon QuickSight.
+ In-depth knowledge of security best practices in AWS-based data architectures.
+ Demonstrated success in driving AI/ML initiatives from ideation to production.
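The workflow-orchestration skill listed above (Apache Airflow) comes down to running pipeline tasks in dependency order. The following is a minimal sketch of that idea using only the Python standard library; the task names (`extract`, `transform`, `load`) are illustrative and this is not Airflow's actual API:

```python
# Minimal sketch of dependency-ordered task execution, the core idea behind
# Airflow-style DAG orchestration. Uses only the standard library.
from graphlib import TopologicalSorter

def run_pipeline(tasks, deps):
    # deps maps each task to the set of tasks it depends on.
    order = list(TopologicalSorter(deps).static_order())
    results = {}
    for name in order:
        # Each task receives the results of previously run tasks.
        results[name] = tasks[name](results)
    return order, results

# Illustrative extract -> transform -> load pipeline.
tasks = {
    "extract": lambda r: [3, 1, 2],                 # pull raw records
    "transform": lambda r: sorted(r["extract"]),    # clean/sort them
    "load": lambda r: len(r["transform"]),          # "load" row count
}
deps = {"transform": {"extract"}, "load": {"transform"}}

order, results = run_pipeline(tasks, deps)
print(order)            # ['extract', 'transform', 'load']
print(results["load"])  # 3
```

A real Airflow DAG adds scheduling, retries, and distributed executors on top of exactly this topological ordering.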
**Our Principles:**
Contribution to Society | Fairness & Honesty | Cooperation & Team Spirit | Untiring Effort for Improvement | Courtesy & Humility | Adaptability | Gratitude
**What We Offer:**
At Panasonic Avionics Corporation we realize the most important aspects in leading our industry are the bright minds behind everything we do. We are proud to offer our employees a highly competitive, comprehensive and flexible benefits program.
**Qualifications**
**Educational Background:**
+ Bachelor's degree or higher in Computer Science, Data Engineering, Aerospace Engineering, or a related field.
+ Advanced degrees (Master's/PhD) in Data Science or AI/ML are a plus.
REQ-
Director, Data Engineering
Posted 11 days ago
Job Description
**Job Purpose and Impact**
The Director, Data Engineering leads a data engineering team responsible for executing the tactical and strategic plans related to the design, development, and maintenance of robust data pipelines and solutions. The role provides guidance to the team to ensure the efficient processing and availability of data for analysis and reporting.
**Key Accountabilities**
+ Establishes and maintains robust data systems that support large and complex data products, ensuring reliability and accessibility for partners.
+ Leads the development of technical products and solutions using big data and cloud based technologies, ensuring they are designed and built to be scalable, sustainable and robust.
+ Oversees and guides the design and development of data pipelines that facilitate the movement of data from various sources to internal databases.
+ Oversees the construction and optimization of data infrastructure, determining appropriate data formats to ensure data readiness for analysis.
+ Evaluates and selects appropriate data formats to optimize data usability and accessibility across the organization.
+ Liaises with partners to understand data needs and ensure alignment with organizational objectives.
+ Champions development standards and brings forward prototypes to test new data framework concepts and architecture patterns, supporting efficient data processing and analysis and promoting standard methodologies in data management.
+ Leads the creation and maintenance of automated reporting systems that provide timely insights and facilitate data driven decision making.
+ Oversees data modeling to ensure the preparation of data in databases for use in various analytics tools, and to configure and develop data pipelines that move and improve data assets.
+ Manages team members to achieve the organization's goals, by ensuring productivity, communicating performance expectations, creating goal alignment, giving and seeking feedback, providing coaching, measuring progress and holding people accountable, supporting employee development, recognizing achievement and lessons learned, and developing enabling conditions for talent to thrive in an inclusive team culture.
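The data-modeling and automated-reporting accountabilities above can be illustrated with a toy example. This is a hedged sketch using Python's built-in `sqlite3` module; the table and column names (`orders`, `region`, `amount`) are invented for illustration and do not come from the posting:

```python
# Toy illustration of modeling data in a database and generating a simple
# automated report from it. Uses an in-memory SQLite database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("apac", 120.0), ("apac", 80.0), ("emea", 50.0)],
)

# A simple recurring-report query: total revenue per region.
report = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(report)  # [('apac', 200.0), ('emea', 50.0)]
```

At production scale the same pattern runs against a warehouse such as Redshift, with the query scheduled and its output pushed to a BI tool, but the modeling-then-aggregating shape is the same.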
**Qualifications**
Minimum of 6 years of relevant work experience; the role typically reflects 10 or more years of relevant experience.
Preferred Work Experience
+ Prior experience as a data/software engineer performing data modeling and data pipeline engineering, leveraging advanced cloud technologies and diverse coding languages
+ Leading geographically distributed engineering teams across a large global organization
+ Developing and managing strategic partnerships across both digital and business facing stakeholders
+ Track record of leading architecture strategies and execution across a diverse digital and data technology landscape
+ Experience developing and leading transformation strategies spanning people, process, and technology
+ Thorough understanding of industry trends and best practices related to data engineering of robust, performant, and cost effective solutions
+ Proven record of helping drive the adoption of new technologies and methods within the functional data and analytics team, and of serving as a role model and mentor for data engineers.