223 Big Data Technologies jobs in Chennai
Data engineer - Big data technologies

Posted 5 days ago
Job Description
**Responsibilities:**
+ Perform system and application monitoring, capacity planning and systems tests to ensure products meet performance requirements
+ Evaluate technologies, develop prototypes, contribute to design issues, and implement solutions
+ Work with various internal and external teams to identify and resolve problems
+ Consult with end users and clients to identify and correct systems problems or propose solutions
+ Assist in the development of software and systems tools used by integration teams to create end user packages
+ Provide support for operating systems and in-house applications, including third party applications, as needed
+ Perform coding, analysis, testing or other appropriate functions in order to identify problems and propose solutions
+ Adhere to Citi technology standards, audit requirements and corporate compliance issues and requirements
+ Apply knowledge of engineering procedures and concepts and basic knowledge of other technical areas to day-to-day activities
+ Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.
**Qualifications:**
+ 2-4 years of relevant experience in an Engineering role
+ Experience working in Financial Services or a large complex and/or global environment
+ Project Management experience
+ Consistently demonstrates clear and concise written and verbal communication
+ Comprehensive knowledge of design metrics, analytics tools, benchmarking activities and related reporting to identify best practices
+ Demonstrated analytic/diagnostic skills
+ Ability to work in a matrix environment and partner with virtual teams
+ Ability to work independently, multi-task, and take ownership of various parts of a project or initiative
+ Ability to work under pressure and manage to tight deadlines or unexpected changes in expectations or requirements
+ Proven track record of operational process change and improvement
+ Deep understanding of retail banking products and functions (Deposits, Savings and Checking, Money Market Funds, Certificates of Deposit, Payments, Fund Transfers, etc.)
+ Deep understanding of Card products, associated processes and life cycle.
+ Understanding of Private banking and wealth management
+ Experience with Hadoop and data warehousing technologies
+ Comfortable working with SQL
+ Excellent written and verbal communication skills, with the ability to present complex financial information clearly and concisely to both technical and non-technical audiences.
+ Ability to work effectively both independently and as part of a team. Strong collaboration and relationship-building skills are crucial for success in this role.
+ A self-starter who takes initiative and is driven to achieve results
+ Possesses a strong analytical mindset and meticulous attention to detail to ensure accuracy and completeness in all tasks
+ Able to adapt quickly to changing priorities and work effectively in a dynamic environment.
+ Snowflake experience
+ Experience in data lineage identification and data quality (DQ) rules implementation (an illustrative check is sketched after this list)
+ Exposure to Risk and Finance regulatory reports
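For illustration only, a minimal Python sketch of the kind of data-quality (DQ) rule checks referenced in the qualifications above; the table shape and column names (account_id, balance) are hypothetical and not part of this posting.

```python
# Hypothetical DQ rule checks over a small account extract (illustrative only).
import pandas as pd

def run_dq_checks(df: pd.DataFrame) -> dict:
    """Return pass/fail results for a few illustrative DQ rules."""
    return {
        "no_null_account_id": df["account_id"].notna().all(),
        "unique_account_id": df["account_id"].is_unique,
        "non_negative_balance": (df["balance"] >= 0).all(),
    }

if __name__ == "__main__":
    sample = pd.DataFrame(
        {"account_id": [101, 102, 103], "balance": [2500.0, 0.0, 130.5]}
    )
    print(run_dq_checks(sample))  # all three rules pass for this sample
```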
**Education:**
+ Bachelor's degree/University degree or equivalent experience
---
**Job Family Group:**
Technology
---
**Job Family:**
Systems & Engineering
---
**Time Type:**
Full time
---
**Most Relevant Skills**
Please see the requirements listed above.
---
**Other Relevant Skills**
For complementary skills, please see above and/or contact the recruiter.
---
_Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law._
_If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi._
_View Citi's EEO Policy Statement and the Know Your Rights poster._
Citi is an equal opportunity and affirmative action employer.
Minority/Female/Veteran/Individuals with Disabilities/Sexual Orientation/Gender Identity.
Data engineer - Big data technologies
Posted today
Job Description
The Engineering Analyst 2 is an intermediate level position responsible for a variety of engineering activities including the design, acquisition and development of hardware, software and network infrastructure in coordination with the Technology team. The overall objective of this role is to ensure quality standards are being met within existing and planned frameworks.
Responsibilities:
Qualifications:
Able to adapt quickly to changing priorities and work effectively in a dynamic environment.
Education:
---
Job Family Group:
Technology
---
Job Family:
Systems & Engineering
---
Time Type:
Full time
---
Most Relevant Skills
Please see the requirements listed above.
---
Other Relevant Skills
For complementary skills, please see above and/or contact the recruiter.
---
Data Engineering
Posted today
Job Description
Our Connect Technology teams are working on our new Connect platform, a unified, global, open data ecosystem powered by Microsoft Azure. Our clients around the world rely on Connect data and insights to innovate and grow.
As a senior Data Engineer, you'll be part of a team of smart, highly skilled technologists who are passionate about learning and supporting cutting-edge technologies such as Python, PySpark, Oracle PL/SQL, SQL, Hive, Databricks, and Airflow. These technologies are deployed using DevOps pipelines leveraging Azure, Kubernetes, GitHub Actions, and GitHub.
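For context, a minimal PySpark sketch of the kind of batch pipeline this stack supports; the Hive table, columns, and output path below are hypothetical placeholders rather than part of the actual platform.

```python
# Illustrative PySpark job: aggregate a (hypothetical) Hive table and write Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily-sales-rollup")
    .enableHiveSupport()  # read tables registered in the Hive metastore
    .getOrCreate()
)

daily = (
    spark.table("sales_raw")  # hypothetical source table
    .groupBy("store_id", "sale_date")
    .agg(F.sum("amount").alias("total_amount"))
)

daily.write.mode("overwrite").parquet("/data/out/daily_sales")  # hypothetical path
spark.stop()
```

A job like this would typically be scheduled as an Airflow task and shipped through the GitHub Actions/Kubernetes pipeline mentioned above.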
WHAT YOU'LL DO:
- Develop, troubleshoot, debug, and enhance applications, writing code with Python and SQL as the core development languages.
- Develop new back-end (BE) functionality, working closely with the front-end (FE) team.
- Contribute to the expansion of NRPS scope.
Must have
- 6-10 years of applicable software engineering experience
- Strong experience in Python
- Strong experience in Oracle PL/SQL
- Strong fundamentals and experience in Hive and Airflow
- Solid SQL knowledge
Good to have
- Experience in Scala and Databricks
- Experience in Linux and KSH
- Experience with DevOps technologies such as GitHub, GitHub Actions, and Docker
- Experience in the retail domain
- Excellent English communication skills, with the ability to effectively interface across cross-functional technology teams and the business
- Minimum B.S. degree in Computer Science, Computer Engineering or related field
Please share your profile to
Learning Support Specialist (AI, ML, Data Science, Data Engineering)
Posted today
Job Description
About Emeritus:
Emeritus is committed to teaching the skills of the future by making high-quality education accessible and affordable to individuals, companies, and governments around the world. It does this by collaborating with more than 50 top-tier universities across the United States, Europe, Latin America, Southeast Asia, India and China.
Emeritus’ short courses, degree programs, professional certificates, and senior executive programs help individuals learn new skills and transform their lives, companies and organizations. Its unique model of state-of-the-art technology, curriculum innovation, and hands-on instruction from senior faculty, mentors and coaches has educated more than 250,000 individuals across 80+ countries.
Founded in 2015, Emeritus, part of Eruditus Group, has more than 2,000 employees globally and offices in Mumbai, New Delhi, Shanghai, Singapore, Palo Alto, Mexico City, New York, Boston, London, and Dubai. Following its $650 million Series E funding round in August 2021, the Company is valued at $3.2 billion, and is backed by Accel, SoftBank Vision Fund 2, the Chan Zuckerberg Initiative, Leeds Illuminate, Prosus Ventures, Sequoia Capital India, and Bertelsmann.
About the Role:
The Learning Support Specialist serves as both a subject matter expert and mentor, playing a pivotal role in the learning experience. You will guide learners through their educational journey in programs focused on one or more areas including machine learning, artificial intelligence, data engineering, data science, and data analytics, supporting learners from beginners to career-advancing professionals.
Day-to-day, you will:
- Respond to learner questions with clear, actionable guidance
- Provide constructive feedback on assignments and projects
- Break down complex technical concepts into digestible explanations
- Mentor learners at varying experience levels, ensuring each feels supported and motivated
- Collaborate with internal teams to identify course improvements
- Help resolve delivery challenges and escalations
What we’re looking for: A professional who combines deep technical expertise with strong interpersonal skills and genuine passion for education. The ideal candidate seamlessly blends technical knowledge with empathy and exceptional communication abilities.
This is a full-time, remote position in a dynamic edtech environment where learner success is a top priority.
Skills and Qualifications:
- We’re looking for candidates with ANY ONE of these backgrounds:
- Professional experience: 2+ years in data engineering, data science, or data analytics OR
- Academic background: PhD (or pursuing PhD) in computer science with a focus on data-related specialties OR
- Teaching experience: Teaching, tutoring, or teaching assistant experience in data, mathematics, or ML/AI OR
- Support experience: Learning support in data-related technical bootcamps or higher education
- Strong background in mathematics (statistics, calculus, linear algebra).
- Strong academic or professional grounding in machine learning and artificial intelligence.
- Proficiency in Python and libraries such as NumPy, Pandas (JavaScript experience is a plus).
- Familiarity with GitHub and version control workflows.
- Experience with at least one data visualization tool (e.g., Tableau, Power BI).
- Comfort with cloud platforms (Azure preferred) is optional but advantageous.
- Strong written and verbal communication skills for working with a diverse learner base.
- Experience with learning management systems (e.g., Canvas) is a plus.
Preferred Qualifications:
- Familiarity with Slack, Teams, or similar collaboration tools.
- Experience with support/service software or ticketing systems.
- Exposure to bug tracking and feedback tools.
Emeritus provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
Senior Data Engineering

Posted 5 days ago
Job Description
**The Role:**
**We are looking for a candidate to join our team who will be involved in the ongoing development of our Enterprise Data Warehouse (EDW). The Media and Marketing Data Engineer role will include participating in the loading and extraction of data, including external sources through APIs, storage buckets (S3, Blob Storage), and marketing-specific data integrations. The ideal candidate will be involved in all stages of the project lifecycle, from initial planning through to deployment in production. A key focus of the role will be data analysis, data modeling, and ensuring these aspects are successfully implemented in the production environment.**
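**For illustration only, a minimal Python sketch of the API-to-storage-bucket landing step described above; the endpoint URL, bucket name, and key layout are hypothetical assumptions, not actual Logitech systems.**

```python
# Illustrative landing step: pull one day of data from an API and store it in S3.
import json

import boto3
import requests

API_URL = "https://example.com/marketing/spend"  # hypothetical endpoint
BUCKET = "edw-landing"                           # hypothetical S3 bucket

def land_api_extract(run_date: str) -> str:
    """Fetch a daily extract from the API and land it as JSON in S3."""
    payload = requests.get(API_URL, params={"date": run_date}, timeout=30).json()
    key = f"marketing/spend/{run_date}.json"
    boto3.client("s3").put_object(
        Bucket=BUCKET, Key=key, Body=json.dumps(payload).encode("utf-8")
    )
    return key

if __name__ == "__main__":
    print(land_api_extract("2024-01-31"))
```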
**Your Contribution:**
**Be Yourself. Be Open. Stay Hungry and Humble. Collaborate. Challenge. Decide and just Do. These are the behaviors you'll need for success at Logitech. In this role you will:**
+ **Design, develop, document, and test ETL solutions using industry-standard tools.**
+ **Design physical and reporting data models for seamless cross-functional and cross-system data reporting.**
+ **Enhance point-of-sale datasets with additional data points to provide stakeholders with useful insights.**
+ **Ensure data integrity by rigorously validating and reconciling data obtained from third-party providers.**
+ **Collaborate with data providers and internal teams to address customer data discrepancies and enhance data quality.**
+ **Work closely across our D&I teams to deliver datasets optimized for consumption in reporting and visualization tools like Tableau**
+ **Collaborate with data architects, analysts, and business stakeholders to gather requirements and translate them into data solutions.**
+ **Participate in the design discussion with enterprise architects and recommend design improvements**
+ **Develop and maintain conceptual, logical, and physical data models with their corresponding metadata.**
+ **Work closely with cross-functional teams to integrate data solutions.**
+ **Create and maintain clear documentation for data processes, data models, and pipelines.**
+ **Integrate Snowflake with various data sources and third-party tools.**
+ **Manage code versioning and deployment of Snowflake objects using CI/CD practices (a minimal deployment sketch follows this list).**
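**A minimal sketch, assuming the snowflake-connector-python package, environment-variable credentials, and a hypothetical migrations/ folder of versioned DDL scripts, of how a CI/CD job might deploy Snowflake objects.**

```python
# Illustrative CI/CD deployment step: apply versioned Snowflake DDL scripts in order.
import os
from pathlib import Path

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ETL_WH",   # hypothetical warehouse
    database="EDW",       # hypothetical database
    schema="PUBLIC",
)

try:
    # e.g. V001__create_dim_date.sql, V002__create_fact_sales.sql, ...
    for script in sorted(Path("migrations").glob("V*.sql")):
        conn.execute_string(script.read_text())  # runs multi-statement scripts
        print(f"applied {script.name}")
finally:
    conn.close()
```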
**Key Qualifications:**
**For consideration, you must bring the following** **minimum** **skills and behaviors to our team:**
+ **A total of 6 to 10 years of experience in ETL design, development, and populating data warehouses. This includes experience with heterogeneous OLTP sources such as Oracle R12 ERP systems and other cloud technologies.**
+ **At least 3 years of hands-on experience with Pentaho Data Integration or similar ETL tools.**
+ **Practical experience working with cloud-based Data Warehouses such as Snowflake and Redshift.**
+ **Significant hands-on experience with Snowflake utilities, including SnowSQL, SnowPipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, Snowflake AI/ML and stored procedures.**
+ **Experience working on API-based integrations and marketing data.**
+ **Design and develop complex data pipelines and ETL workflows in Snowflake using advanced SQL, UDFs, UDTFs, and stored procedures (JavaScript/SQL).**
+ **Comprehensive expertise in databases, data acquisition, ETL strategies, and the tools and technologies within Pentaho DI and Snowflake.**
+ **Experience in working with complex SQL functions and transformation of data on large data sets.**
+ **Demonstrated experience in designing complex ETL processes for extracting data from various sources, including XML files, JSON, RDBMS, and flat files.**
+ **Exposure to standard support ticket management tools.**
+ **A strong understanding of Business Intelligence and Data warehousing concepts and methodologies.**
+ **Extensive experience in data analysis and root cause analysis, along with proven problem-solving and analytical thinking capabilities.**
+ **A solid understanding of software engineering principles and proficiency in working with Unix/Linux/Windows operating systems, version control, and office software.**
+ **A deep understanding of data warehousing principles and cloud architecture, including SQL optimization techniques for building efficient and scalable data systems.**
+ **Familiarity with Snowflake's unique features, such as its multi-cluster architecture and shareable data capabilities.**
+ **Excellent skills in writing and optimizing SQL queries to ensure high performance and data accuracy across all systems.**
+ **The ability to troubleshoot and resolve data quality issues promptly, maintaining data integrity and reliability.**
+ **Strong communication skills are essential for effective collaboration with both technical and non-technical teams to ensure a clear understanding of data engineering requirements.**
**In addition, preferred skills and behaviors include:**
+ **Exposure to an Oracle ERP environment**
+ **Basic understanding of reporting tools like OBIEE and Tableau**
+ **Exposure to marketing data platforms like Adverity, Fivetran, etc.**
+ **Exposure to Customer data platform**
**Education:**
+ **BS/BTech/MCA/MS in Computer Science, Information Systems, or a related technical field, or equivalent industry expertise.**
+ **Fluency in English.**
**Logitech is the sweet spot for people who are passionate about products, making a mark, and having fun doing it. As a company, we're small and flexible enough for every person to take initiative and make things happen. But we're big enough in our portfolio, and reach, for those actions to have a global impact. That's a pretty sweet spot to be in and we're always striving to keep it that way.**
**"** **_All qualified applicants will receive consideration for employment_** **_without regard to race, sex, color, religion, sexual orientation, gender identity, national origin, protected veteran status, or on the basis of disability."_**
Across Logitech we empower collaboration and foster play. We help teams collaborate/learn from anywhere, without compromising on productivity or continuity so it should be no surprise that most of our jobs are open to work from home from most locations. Our hybrid work model allows some employees to work remotely while others work on-premises. Within this structure, you may have teams or departments split between working remotely and working in-house.
Logitech is an amazing place to work because it is full of authentic people who are inclusive by nature as well as by design. Being a global company, we value our diversity and celebrate all our differences. Don't meet every single requirement? Not a problem. If you feel you are the right candidate for the opportunity, we strongly recommend that you apply. We want to meet you!
We offer comprehensive and competitive benefits packages and working environments that are designed to be flexible and help you to care for yourself and your loved ones, now and in the future. We believe that good health means more than getting medical care when you need it. Logitech supports a culture that encourages individuals to achieve good physical, financial, emotional, intellectual and social wellbeing so we all can create, achieve and enjoy more and support our families. We can't wait to tell you more about them being that there are too many to list here and they vary based on location.
All qualified applicants will receive consideration for employment without regard to race, sex, age, color, religion, sexual orientation, gender identity, national origin, protected veteran status, or on the basis of disability.
If you require an accommodation to complete any part of the application process, are limited in the ability or are unable to access or use this online application process, and need an alternative method for applying, you may contact us toll free at for assistance and we will get back to you as soon as possible.
Manager, Data Engineering

Posted 5 days ago
Job Description
+ Lead and mentor high-performing teams of local and remote engineers
+ Prioritize team workload, allocate tasks effectively, and ensure team members have the resources to succeed
+ Provide technical expertise and guidance to the team
+ Evaluate and mentor the team on adherence to coding standards, best practices, and architectural guidelines
+ Oversee the design, development, maintenance, scalability, reliability, and performance of the Industrial Systems' data platform pipelines and architecture.
+ Contribute to the long-term strategic direction of the Data Platform with a focus on enterprise use
+ Enforce and ensure data quality, data governance, and security standards
+ Identify and consolidate common tasks across teams to improve efficiency and reduce redundancy
+ Communicate decisions effectively and transparently to internal and external customers
+ Stay updated on industry trends and emerging technologies to inform technical decisions
+ Lead the implementation of various business customers' requests and logic into the data assets with optimized design and code development
Qualifications Required:
+ Bachelor's degree in computer science, Information Technology, Information Systems, or Data Analytics
+ 5+ years' experience in Industrial Systems data (e.g. Engineering / Manufacturing / Supply Chain / Finance / Purchasing / HR)
+ 5+ years of progressive responsibilities in a complex data environment
+ 3+ years' experience leading a software/data engineering team
+ 3+ years of experience in Big Data Environments or expertise with Big Data tools
+ Expertise in Google Cloud Platform
+ Monitor and optimize cost and compute for processes in GCP technologies (e.g., BigQuery, Dataflow, Cloud Run, DataProc).
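For illustration, a minimal Python sketch of the BigQuery cost-monitoring task named in the last bullet; the region qualifier and the assumed on-demand rate are placeholders rather than project specifics.

```python
# Illustrative cost check: bytes billed per user over the last day in BigQuery.
from google.cloud import bigquery

SQL = """
SELECT
  user_email,
  SUM(COALESCE(total_bytes_billed, 0)) / POW(1024, 4) AS tib_billed
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT   -- adjust region as needed
WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
  AND job_type = 'QUERY'
GROUP BY user_email
ORDER BY tib_billed DESC
"""

client = bigquery.Client()  # uses application-default credentials
for row in client.query(SQL).result():
    est_cost = row.tib_billed * 6.25  # assumed on-demand rate in USD per TiB
    print(f"{row.user_email}: {row.tib_billed:.2f} TiB (~${est_cost:.2f})")
```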
**Requisition ID:** 48940
Associate - Data Engineering
Posted today
Job Description
Associate - Data Engineering
Roles & Responsibilities:
- We are looking for a Data Engineer who will be primarily responsible for building and maintaining ETL/ELT pipelines.
- You are expected to build and manage data warehouse solutions and data models, create ETL processes, and implement data quality checks.
- Support data enablement across applications and build data quality enhancements.
- Responsible for performing exploratory data analysis (EDA), troubleshooting data-related issues, and integrating various data sets (a small EDA sketch follows this list).
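As referenced in the last responsibility above, a small generic sketch of an exploratory data analysis (EDA) pass with pandas; the input file and the claim_id column are hypothetical placeholders.

```python
# Illustrative EDA pass over a (hypothetical) extract.
import pandas as pd

df = pd.read_csv("claims_sample.csv")  # hypothetical input data set

print(df.shape)                        # rows x columns
print(df.dtypes)                       # column types
print(df.isna().sum())                 # missing values per column
print(df.describe(include="all").T)    # summary statistics for every column

# Quick duplicate check on a hypothetical business key.
print("duplicate claim_id rows:", df.duplicated(subset=["claim_id"]).sum())
```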
Required Technical Skills
- Extensive experience in Python, PySpark, and SQL.
- Strong experience in data warehousing, ETL, data modelling, ETL pipelines, and the Snowflake database.
- Must be proficient in Databricks, Dataiku, Alteryx.
- Good experience in creating and operationalizing ARDs.
- Hands-on experience with cloud services such as Azure and AWS (S3, Glue, Lambda, CloudWatch, Athena).
- Sound knowledge of end-to-end data management, DataOps, data quality, and data governance.
- Experience in developing agents to automate data processes and improve quality and governance.
- Exposure to data lineage and metadata for building AI-ready foundational data.
- Know-how of Agentic AI and other automation technologies will be a plus.
- Knowledge of SFDC, Waterfall/ Agile methodology.
Qualifications
- Bachelor's or Master's degree in Engineering, MCA, or equivalent.
- 1-3 years of relevant industry experience as a Data Engineer.
- Experience in the pharma domain, working with data sets such as IQVIA, Veeva, and Symphony (claims, CRM, sales, etc.).
- High motivation, a good work ethic, the ability to work independently, self-organization, and personal initiative.
- Ability to work collaboratively and provide support to the team.
- Excellent written and verbal communication skills.
- Strong analytical and problem-solving skills.
Location
- Preferably Hyderabad/ Chennai, India
About Us
Chryselys is a US-based pharma analytics and business consulting company that delivers data-driven insights, leveraging AI-powered, cloud-native platforms to achieve high-impact transformations.
Chryselys was founded in the heart of US Silicon Valley in November 2019 with the vision of delivering high-value business consulting, solutions, and services to clients in the healthcare and life sciences space. We are trusted partners for organizations that seek to achieve high-impact transformations and reach their higher-purpose mission.
Chryselys India supports our global clients to achieve high-impact transformations and reach their higher-purpose mission.
Please visit our website for more details.
Data Engineering Manager
Posted today
Job Description
Short Description
We are seeking an experienced candidate to lead multiple teams responsible for the development and maintenance of Ford's Industrial System Data. The ideal candidate will have a strong technical background in data and/or software engineering, along with proven leadership skills. This role requires the ability to design landing and curation solutions, prioritize team tasks, make timely decisions, and guide the teams to deliver high-quality results. The leader must be knowledgeable in data governance, customer consent, and security standards.
Description
Responsibilities:
- Lead and mentor high-performing teams of local and remote engineers
- Prioritize team workload, allocate tasks effectively, and ensure team members have the resources to succeed
- Provide technical expertise and guidance to the team
- Evaluate and mentor the team on adherence to coding standards, best practices, and architectural guidelines
- Oversee the design, development, maintenance, scalability, reliability, and performance of the Industrial Systems' data platform pipelines and architecture.
- Contribute to the long-term strategic direction of the Data Platform with a focus on enterprise use
- Enforce and ensure data quality, data governance, and security standards
- Identify and consolidate common tasks across teams to improve efficiency and reduce redundancy
- Communicate decisions effectively and transparently to internal and external customers
- Stay updated on industry trends and emerging technologies to inform technical decisions
- Lead the implementation of various business customers' requests and logic into the data assets with optimized design and code development
Qualifications Required:
- Bachelor's degree in computer science, Information Technology, Information Systems, or Data Analytics
- 5+ years' experience in Industrial Systems data (e.g. Engineering / Manufacturing / Supply Chain / Finance / Purchasing / HR)
- 5+ years of progressive responsibilities in a complex data environment
- 3+ years' experience leading a software/data engineering team
- 3+ years of experience in Big Data Environments or expertise with Big Data tools
- Expertise in Google Cloud Platform
- Monitor and optimize cost and compute for processes in GCP technologies (e.g., BigQuery, Dataflow, Cloud Run, DataProc).
Manager, Data Engineering
Posted today
Job Description
Responsibilities:
- Lead and mentor high-performing teams of local and remote engineers
- Prioritize team workload, allocate tasks effectively, and ensure team members have the resources to succeed
- Provide technical expertise and guidance to the team
- Evaluate and mentor the team on adherence to coding standards, best practices, and architectural guidelines
- Oversee the design, development, maintenance, scalability, reliability, and performance of the Industrial Systems' data platform pipelines and architecture.
- Contribute to the long-term strategic direction of the Data Platform with a focus on enterprise use
- Enforce and ensure data quality, data governance, and security standards
- Identify and consolidate common tasks across teams to improve efficiency and reduce redundancy
- Communicate decisions effectively and transparently to internal and external customers
- Stay updated on industry trends and emerging technologies to inform technical decisions
- Lead the implementation of various business customers' requests and logic into the data assets with optimized design and code development
Qualifications Required:
- Bachelor's degree in computer science, Information Technology, Information Systems, or Data Analytics
- 5+ years' experience in Industrial Systems data (e.g. Engineering / Manufacturing / Supply Chain / Finance / Purchasing / HR)
- 5+ years of progressive responsibilities in a complex data environment
- 3+ years' experience leading a software/data engineering team
- 3+ years of experience in Big Data Environments or expertise with Big Data tools
- Expertise in Google Cloud Platform
- Monitor and optimize cost and compute for processes in GCP technologies (e.g., BigQuery, Dataflow, Cloud Run, DataProc).