6,273 Big Data Technologies jobs in India

Data engineer - Big data technologies

Chennai, Tamil Nadu Citigroup

Posted 5 days ago


Job Description

The Engineering Analyst 2 is an intermediate level position responsible for a variety of engineering activities including the design, acquisition and development of hardware, software and network infrastructure in coordination with the Technology team. The overall objective of this role is to ensure quality standards are being met within existing and planned frameworks.
**Responsibilities:**
+ Perform system and application monitoring, capacity planning and systems tests to ensure products meet performance requirements
+ Evaluate technologies, develop prototypes, contribute to design issues, and implement solutions
+ Work with various internal and external teams to identify and resolve problems
+ Consult with end users and clients to identify and correct systems problems or propose solutions
+ Assist in the development of software and systems tools used by integration teams to create end user packages
+ Provide support for operating systems and in-house applications, including third party applications, as needed
+ Perform coding, analysis, testing or other appropriate functions in order to identify problems and propose solutions
+ Adhere to Citi technology standards, audit requirements and corporate compliance issues and requirements
+ Apply knowledge of engineering procedures and concepts and basic knowledge of other technical areas to day to day activities
+ Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.
**Qualifications:**
+ 2-4 years of relevant experience in an Engineering role
+ Experience working in Financial Services or a large complex and/or global environment
+ Project Management experience
+ Consistently demonstrates clear and concise written and verbal communication
+ Comprehensive knowledge of design metrics, analytics tools, benchmarking activities and related reporting to identify best practices
+ Demonstrated analytic/diagnostic skills
+ Ability to work in a matrix environment and partner with virtual teams
+ Ability to work independently, multi-task, and take ownership of various parts of a project or initiative
+ Ability to work under pressure and manage to tight deadlines or unexpected changes in expectations or requirements
+ Proven track record of operational process change and improvement
+ Deep understanding of retail banking products and functions (Deposits, Savings and Checking accounts, Money Market Funds, Certificates of Deposit, Payments, Fund Transfers, etc.)
+ Deep understanding of Card products, associated processes and life cycle.
+ Understanding of Private banking and wealth management
+ Experience in Hadoop technologies, Data warehousing technologies
+ Comfortable with SQL
+ Excellent written and verbal communication skills, with the ability to present complex financial information clearly and concisely to both technical and non-technical audiences.
+ Ability to work effectively both independently and as part of a team. Strong collaboration and relationship-building skills are crucial for success in this role.
+ A self-starter who takes initiative and is driven to achieve results
+ Possesses a strong analytical mindset and meticulous attention to detail to ensure accuracy and completeness in all tasks
+ Able to adapt quickly to changing priorities and work effectively in a dynamic environment.
+ Snowflake experience
+ Experience in data lineage identification and data quality (DQ) rules implementation (a minimal sketch follows this list)
+ Risk and Finance Regulatory reports exposure
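The data lineage and DQ items above describe checks that are typically automated inside the pipeline. The snippet below is a minimal, hypothetical PySpark sketch of two such DQ rules; it is not Citi tooling, and the table path and the `account_id`, `product_type` and `balance` columns are illustrative assumptions.

```python
# Hypothetical data quality (DQ) rule sketch in PySpark.
# Table path and column names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-rules-sketch").getOrCreate()

accounts = spark.read.parquet("/data/retail/accounts")  # assumed dataset

# Rule 1: the primary key must be non-null and unique.
null_keys = accounts.filter(F.col("account_id").isNull()).count()
dupe_keys = (accounts.groupBy("account_id").count()
                     .filter(F.col("count") > 1).count())

# Rule 2: deposit balances must be non-negative.
bad_balances = accounts.filter(
    (F.col("product_type") == "DEPOSIT") & (F.col("balance") < 0)
).count()

results = {
    "null_account_id": null_keys,
    "duplicate_account_id": dupe_keys,
    "negative_deposit_balance": bad_balances,
}
print("DQ summary:", results)
failed = {rule: n for rule, n in results.items() if n > 0}
if failed:
    raise ValueError(f"DQ rules failed: {failed}")
```

In practice such rules tend to be configuration-driven, and their results are logged for lineage and audit purposes rather than only raised as exceptions.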
**Education:**
+ Bachelor's degree/University degree or equivalent experience
---
**Job Family Group:**
Technology
---
**Job Family:**
Systems & Engineering
---
**Time Type:**
Full time
---
**Most Relevant Skills**
Please see the requirements listed above.
---
**Other Relevant Skills**
For complementary skills, please see above and/or contact the recruiter.
---
_Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law._
_If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi._
_View Citi's EEO Policy Statement and the Know Your Rights poster._
Citi is an equal opportunity and affirmative action employer.
Minority/Female/Veteran/Individuals with Disabilities/Sexual Orientation/Gender Identity.

Data engineer - Big data technologies

Chennai, Tamil Nadu 12542 Citicorp Services India Private Limited

Posted today


Job Description

The Engineering Analyst 2 is an intermediate level position responsible for a variety of engineering activities including the design, acquisition and development of hardware, software and network infrastructure in coordination with the Technology team. The overall objective of this role is to ensure quality standards are being met within existing and planned frameworks.

Responsibilities:

  • Perform system and application monitoring, capacity planning and systems tests to ensure products meet performance requirements
  • Evaluate technologies, develop prototypes, contribute to design issues, and implement solutions
  • Work with various internal and external teams to identify and resolve problems
  • Consult with end users and clients to identify and correct systems problems or propose solutions
  • Assist in the development of software and systems tools used by integration teams to create end user packages
  • Provide support for operating systems and in-house applications, including third party applications, as needed
  • Perform coding, analysis, testing or other appropriate functions in order to identify problems and propose solutions
  • Adhere to Citi technology standards, audit requirements and corporate compliance issues and requirements
  • Apply knowledge of engineering procedures and concepts and basic knowledge of other technical areas to day to day activities
  • Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.
Qualifications:

  • 2-4 years of relevant experience in an Engineering role
  • Experience working in Financial Services or a large complex and/or global environment
  • Project Management experience
  • Consistently demonstrates clear and concise written and verbal communication
  • Comprehensive knowledge of design metrics, analytics tools, benchmarking activities and related reporting to identify best practices
  • Demonstrated analytic/diagnostic skills
  • Ability to work in a matrix environment and partner with virtual teams
  • Ability to work independently, multi-task, and take ownership of various parts of a project or initiative
  • Ability to work under pressure and manage to tight deadlines or unexpected changes in expectations or requirements
  • Proven track record of operational process change and improvement
  • Deep understanding of retail banking products and functions (Deposits, Savings and Checking accounts, Money Market Funds, Certificates of Deposit, Payments, Fund Transfers, etc.)
  • Deep understanding of Card products, associated processes and life cycle.
  • Understanding of Private banking and wealth management
  • Experience in Hadoop technologies, Data warehousing technologies
  • Comfortable with SQL
  • Excellent written and verbal communication skills, with the ability to present complex financial information clearly and concisely to both technical and non-technical audiences.
  • Ability to work effectively both independently and as part of a team. Strong collaboration and relationship-building skills are crucial for success in this role.
  • A self-starter who takes initiative and is driven to achieve results
  • Possesses a strong analytical mindset and meticulous attention to detail to ensure accuracy and completeness in all tasks
  • Able to adapt quickly to changing priorities and work effectively in a dynamic environment.

  • Snowflake experience
  • Experience in Data lineage identification, DQ rules implementation
  • Risk and Finance Regulatory reports exposure
Education:

  • Bachelor’s degree/University degree or equivalent experience
    ---

    Job Family Group:

    Technology

    ---

    Job Family:

    Systems & Engineering

    ---

    Time Type:

    Full time

    ---

    Most Relevant Skills

    Please see the requirements listed above.

    ---

    Other Relevant Skills

    For complementary skills, please see above and/or contact the recruiter.

    ---


    Senior Technical Architect Big Data Technologies

    Mumbai, Maharashtra ₹1500000 - ₹2000000 Bitkraft Technologies

    Posted today


    Job Description

    Summary

    - Bitkraft Technologies LLP is looking for a Senior Technical Architect to join our software engineering team.
    - You will be working across the stack on cutting-edge web development projects for our custom services business.
    - As a Senior Technical Architect, you will play a pivotal role in designing, developing, and implementing cutting-edge data processing solutions.
    - The ideal candidate will have a deep understanding of distributed systems, big data technologies, and real-time data processing frameworks.
    - If you love solving problems, are a team player and want to work in a fast-paced environment with core technical and business challenges, we would like to meet you.
    Essential Skills

    - Deep understanding of big data technologies, including Hadoop, Spark, Kafka, and Flink.
    - Proven experience in designing and implementing scalable, high-performance data processing solutions.
    - Strong knowledge of real-time data processing concepts and frameworks.
    - Familiarity with GraphQL and its use in API design.
    - Familiarity with cloud platforms (AWS, Azure, GCP) and container orchestration (Kubernetes).
    Other Essential Skills / Requirements

    - Great attention to detail
    - Strong work ethic and commitment to meeting deadlines and supporting team members in meeting their goals
    - Flexibility to work across time zones with overseas customers if required
    - Ability to work independently and as part of a team.
    Desirable Skills

    - Experience with machine learning frameworks and tools.
    - Knowledge of data governance and compliance standards.
    - Familiarity with CI/CD practices and DevOps methodologies.
    Key Responsibilities

    - Provide technical leadership and guidance to development teams, ensuring adherence to best practices and architectural standards.
    - Design and architect scalable, high-performance data processing solutions using technologies like Kafka, Spark, stream processing, batch processing, Apache Flink, Hadoop, and GraphQL.
    - Develop and optimize data pipelines to efficiently extract, transform, and load (ETL) data from various sources into target systems.
    - Implement real-time data processing solutions using technologies like Kafka and Flink to enable rapid insights and decision-making (see the sketch after this list).
    - Design and implement batch processing workflows using Hadoop or other big data frameworks to handle large datasets.
    - Create GraphQL APIs to expose data services, ensuring efficient and flexible data access.
    - Research and evaluate new technologies and tools to stay abreast of industry trends and identify opportunities for improvement.
    - Monitor and optimize the performance of data processing systems to ensure maximum efficiency and scalability.
    - Collaborate with cross-functional teams (e.g., data scientists, data engineers, product managers) to deliver high-quality data solutions.
    - Work closely with product management, data science, and operations teams to understand requirements and deliver data solutions that align with business goals.
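    As a rough illustration of the real-time responsibility above, the following is a minimal, hypothetical sketch of a streaming pipeline that reads events from Kafka with Spark Structured Streaming. The broker address, topic name, event schema and sink paths are assumptions, not project specifics.

```python
# Minimal Spark Structured Streaming sketch: Kafka -> parquet sink.
# Requires the spark-sql-kafka connector; broker, topic, schema and
# paths below are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("amount", DoubleType()),
])

raw = (spark.readStream
            .format("kafka")
            .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker
            .option("subscribe", "events")                        # assumed topic
            .load())

events = (raw.select(F.from_json(F.col("value").cast("string"),
                                 event_schema).alias("e"))
             .select("e.*"))

query = (events.writeStream
               .format("parquet")
               .option("path", "/data/streams/events")            # assumed sink
               .option("checkpointLocation", "/data/streams/_checkpoints")
               .outputMode("append")
               .start())
query.awaitTermination()
```

    A Flink job would follow the same consume-transform-sink shape; the choice between the two usually comes down to latency requirements and the team's operational experience.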
    Experience: 5 to 10 years
    About Bitkraft Technologies LLP

    Bitkraft Technologies LLP is an award-winning software engineering consultancy focused on Enterprise Software Solutions, Mobile Apps Development, ML/AI Solution Engineering, Extended Reality, Managed Cloud Services and Technology Skill-sourcing, with an extraordinary track record.

    We are driven by technology and push the limits of what can be done to realise the business needs of our customers. Our team is committed to delivering products of the highest standards, and we take pride in creating robust, user-driven solutions that meet business needs.

    Bitkraft has clients across 10+ countries, including the US, UK, UAE, Oman, Australia and India, to name a few.


    Director Data Science & Data Engineering

    Bengaluru, Karnataka eBay

    Posted today


    Job Description

    At eBay, we're more than a global ecommerce leader — we’re changing the way the world shops and sells. Our platform empowers millions of buyers and sellers in more than 190 markets around the world. We’re committed to pushing boundaries and leaving our mark as we reinvent the future of ecommerce for enthusiasts.

    Our customers are our compass, authenticity thrives, bold ideas are welcome, and everyone can bring their unique selves to work — every day. We're in this together, sustaining the future of our customers, our company, and our planet.

    Join a team of passionate thinkers, innovators, and dreamers — and help us connect people and build communities to create economic opportunity for all.

    Director – Data Science & Data Engineering
    Shape the Future of AI-Driven eCommerce Discovery

    About the Role
    We're reimagining how people discover products in eCommerce—and we're looking for a visionary leader who blends technical depth with product intuition. If you're passionate about structured data, large language models, and building high-impact data products, this role is tailor-made for you.

    As Director of Data Science & Data Engineering, you’ll lead a talented team of data scientists, analysts, and engineers working at the cutting edge of AI/ML, product analytics, and taxonomy design. Your mission? Drive innovation in product discovery through smarter data, scalable infrastructure, and breakthrough AI-powered solutions.

    You’ll join the Product Knowledge org and play a key role in designing the backbone of next-gen search, recommendations, and generative AI experiences.

    This is a high-impact, high-agency role—perfect for a hands-on leader who thrives in fast-paced, collaborative environments.

    What You’ll Work On

    Lead and inspire a cross-functional team to:

  • Transform Product Data into Insights
    Conduct deep-dive SQL and Python analyses to uncover opportunities in taxonomy, ontology, and catalog structure that enhance discovery and user experience.

  • Harness the Power of Generative AI
    Use prompt engineering and LLMs to create innovative tools for classification, taxonomy validation, and data enrichment.

  • Build & Evaluate AI/ML Models
    Design frameworks to evaluate product knowledge models, semantic embeddings, and ML-based categorization systems (see the sketch after this list).

  • Drive Data-Informed Strategy
    Translate complex findings into clear, actionable insights for Product and Engineering teams. Influence roadmap decisions on entity resolution, catalog optimization, and knowledge graph development.

  • Partner Across Functions
    Collaborate closely with Applied Research, Engineering, and Product teams to build and deploy high-impact data and AI solutions at scale.

  • Experiment & Innovate Fast
    Prototype quickly, validate hypotheses, and iterate on structured data and AI-driven solutions that push boundaries.
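
    As a toy illustration of the model-evaluation item above (not eBay's actual framework), the sketch below scores a hypothetical product-categorization model against labelled data with scikit-learn; the file name, column names and leaf-category format are assumptions.

```python
# Hypothetical evaluation sketch for an ML-based product categorizer.
# Input file, column names and category format are illustrative assumptions.
import pandas as pd
from sklearn.metrics import accuracy_score, classification_report

# Labelled sample: true leaf category vs. the category predicted by the model.
df = pd.read_csv("category_predictions_sample.csv")
y_true = df["true_category"]
y_pred = df["predicted_category"]

print(f"Accuracy: {accuracy_score(y_true, y_pred):.3f}")
# Per-category precision/recall/F1 highlights where the taxonomy needs work.
print(classification_report(y_true, y_pred, zero_division=0))

# Simple custom metric: share of items misrouted across top-level branches,
# assuming the leaf id encodes its branch as the prefix before ">".
top_level = lambda c: str(c).split(">")[0]
cross_branch_error = (y_true.map(top_level) != y_pred.map(top_level)).mean()
print(f"Cross-branch error rate: {cross_branch_error:.3%}")
```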

    What You Bring

  • 12+ years of experience in data science or analytics roles, including 5+ years leading teams

  • Proven track record building data products, knowledge graphs, and scalable data pipelines

  • Deep understanding of eCommerce search, recommendation systems, and product analytics

  • Hands-on experience with LLMs, prompt engineering, and RAG techniques (preferred)

  • Strong communication skills and ability to influence cross-functional stakeholders

  • Experience evaluating ML models with custom metrics and robust frameworks

  • Startup mindset—comfortable with ambiguity, bias for action, and fast iteration

    Why Join Us

  • Be at the forefront of AI-powered product discovery in eCommerce

  • Own high-impact initiatives in a startup-style culture with real autonomy

  • Work alongside world-class talent across AI, Product, and Engineering

  • Build solutions that scale—serving millions of users and shaping the future of shopping

    Ready to lead the next wave of AI + Data innovation in commerce? Let’s build the future together.

    Please see eBay's privacy notice for information regarding how eBay handles your personal data collected when you use the eBay Careers website or apply for a job with eBay.

    eBay is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, sexual orientation, gender identity, veteran status, and disability, or other legally protected status. If you have a need that requires accommodation, please contact us at . We will make every effort to respond to your request for accommodation as soon as possible. View our to learn more about eBay's commitment to ensuring digital accessibility for people with disabilities.


    Data Engineering

    Mumbai, Maharashtra Godrej Capital

    Posted 2 days ago


    Job Description

    Godrej Capital is a subsidiary of Godrej Industries and is the holding company for Godrej Housing finance & Godrej Finance. With a digital-first approach and a keen focus on customer-centric product innovation, Godrej Capital offers Home Loans, Loan Against Property, Property Loans, Business Loans and is positioned to diversify into other customer segments and launch new products. The company is focused on building a long-term, sustainable retail financial services business in India, anchored by Godrej Group’s 125+year legacy of trust and excellence. Godrej Capital has a special focus on learning and capability development across its employee base and is committed to diversity, equity, and inclusion as a guiding principle.


    The organization has been consistently recognized as a Great Place to Work™ receiving certifications in 2022 and 2023. As it stands, Godrej Capital holds a spot among India's Top 50 Best Workplaces in BFSI 2023 and is also recognized as one of India’s Great Mid-Size Workplaces 2023. Beyond that, it has also had the honor of being named the Best Organization for Women by The Economic Times in both 2022 and 2023, and the Best Workplaces for Women by Great Place to Work in 2022 and in 2023.


    Function

    Information Technology

    Job Purpose

    • The role incumbent will be responsible for managing, expanding and optimizing our data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The incumbent will support our team of data analysts and scientists on data initiatives and will ensure optimal and timely data delivery. The candidate must be self-driven and comfortable supporting the data needs of multiple teams, systems and products, and will play a major role as we build a superior, scalable architecture that lets us leverage data to the fullest extent.


    Role

    • Create and maintain optimal data pipeline architecture.
    • Assemble large, complex data sets that meet functional / non-functional business requirements.
    • Build and maintain the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources (a minimal sketch follows this list)
    • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability.
    • Working knowledge of message queuing, stream processing, and big data tools (optional)
    • Perform sanity testing, issue reporting and tracking.
    • Assist teams in UAT testing and resolve issues as per criticality.
    • Handle audit and compliance activities for data platform.
    • Track and manage system availability and maintenance tasks.
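
    The pipeline items above reduce to extract-transform-load steps. The following is a minimal, hypothetical PySpark sketch of such a batch pipeline; the source and target paths, column names and cleaning rules are assumptions, not Godrej Capital specifics.

```python
# Minimal batch ETL sketch in PySpark: extract, transform, load.
# Source/target paths and column names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("loan-etl-sketch").getOrCreate()

# Extract: raw loan applications landed as CSV in the data lake.
raw = spark.read.option("header", True).csv("s3://raw-zone/loan_applications/")

# Transform: de-duplicate, type and filter.
clean = (raw.dropDuplicates(["application_id"])
            .withColumn("loan_amount", F.col("loan_amount").cast("double"))
            .withColumn("application_date", F.to_date("application_date"))
            .filter(F.col("loan_amount") > 0))

# Load: partitioned parquet in the curated zone for analysts and data scientists.
(clean.write.mode("overwrite")
      .partitionBy("application_date")
      .parquet("s3://curated-zone/loan_applications/"))
```

    In an AWS Glue or workflow-orchestrated setup, the same three steps would typically become separate, scheduled and monitored tasks rather than one script.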

    Qualification & experience

    • Years of experience: 3-5 years of experience
    • Qualification – Engineering / Certified Data Engineer

    Essential skills

    • Experience with data pipeline and workflow management tools.
    • Knowledge of AWS cloud services, data lakes, Glue, Python, PySpark, Kafka, APIs, Change Data Capture, streaming data and data modelling will be a key advantage.
    • Experience with relational SQL and NoSQL databases.
    • Exposure to lending systems and domain
    • Machine Learning skills

    Ideal candidate (in terms of current role/ organization/ industry)

    • An individual inclined to learn and explore new technologies and to make the best use of the resources at hand.
    • Able to influence and work in a collaborative manner

    Data Engineering

    Mumbai, Maharashtra ₹1500000 - ₹2800000 Godrej Capital

    Posted today


    Job Description

    Godrej Capital is a subsidiary of Godrej Industries and is the holding company for Godrej Housing finance & Godrej Finance. With a digital-first approach and a keen focus on customer-centric product innovation, Godrej Capital offers Home Loans, Loan Against Property, Property Loans, Business Loans and is positioned to diversify into other customer segments and launch new products. The company is focused on building a long-term, sustainable retail financial services business in India, anchored by Godrej Group's 125+year legacy of trust and excellence. Godrej Capital has a special focus on learning and capability development across its employee base and is committed to diversity, equity, and inclusion as a guiding principle.

    The organization has been consistently recognized as a Great Place to Work receiving certifications in 2022 and 2023. As it stands, Godrej Capital holds a spot among India's Top 50 Best Workplaces in BFSI 2023 and is also recognized as one of India's Great Mid-Size Workplaces 2023. Beyond that, it has also had the honor of being named the Best Organization for Women by The Economic Times in both 2022 and 2023, and the Best Workplaces for Women by Great Place to Work in 2022 and in 2023.

    Function

    Information Technology

    Job Purpose

    • The role incumbent will be responsible for managing, expanding and optimizing our data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The incumbent will support our team of data analysts and scientists on data initiatives and will ensure optimal and timely data delivery. The candidate must be self-driven and comfortable supporting the data needs of multiple teams, systems and products, and will play a major role as we build a superior, scalable architecture that lets us leverage data to the fullest extent.

    Role

    • Create and maintain optimal data pipeline architecture.
    • Assemble large, complex data sets that meet functional / non-functional business requirements.
    • Build and maintain the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
    • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability.
    • Working knowledge of message queuing, stream processing, and big data tools (optional)
    • Perform sanity testing, issue reporting and tracking.
    • Assist teams in UAT testing and resolve issues as per criticality.
    • Handle audit and compliance activities for data platform.
    • Track and manage system availability and maintenance tasks.

    Qualification & experience

    • Years of experience: 3-5 years of experience
    • Qualification – Engineering / Certified Data Engineer

    Essential skills

    • Experience with data pipeline and workflow management tools.
    • Knowledge of AWS cloud services, data lakes, Glue, Python, PySpark, Kafka, APIs, Change Data Capture, streaming data and data modelling will be a key advantage.
    • Experience with relational SQL and NoSQL databases.
    • Exposure to lending systems and domain
    • Machine Learning skills

    Ideal candidate (in terms of current role/ organization/ industry)

    • An individual inclined to learn and explore new technologies and to make the best use of the resources at hand.
    • Able to influence and work in a collaborative manner

    Data Engineering

    Bengaluru, Karnataka ₹1500000 - ₹2500000 PwC India

    Posted today

    Job Viewed

    Tap Again To Close

    Job Description

    Job Title: Data Engineering Senior Associate – Microsoft Fabric, Azure (Databricks & ADF), PySpark

    Experience: 4–10 Years

    Location: PAN India

    Job Summary:

    We are looking for a skilled and experienced Data Engineer with 4-10 years of experience in building scalable data solutions on the Microsoft Azure ecosystem. The ideal candidate must have strong hands-on experience with Microsoft Fabric and Azure Databricks, along with strong PySpark, Python and SQL expertise. Familiarity with data lake and data warehouse concepts and end-to-end data pipelines is essential.

    Key Responsibilities:

    · Requirement gathering and analysis

    · Design and implement data pipelines using Microsoft Fabric & Databricks

    · Extract, transform, and load (ETL) data from various sources into Azure Data Lake Storage

    · Implement data security and governance measures

    · Monitor and optimize data pipelines for performance and efficiency

    · Troubleshoot and resolve data engineering issues

    · Provide optimized solutions for any problem related to data engineering

    · Ability to work with a variety of sources, such as relational databases, APIs, file systems, real-time streams, CDC, etc.

    · Strong knowledge of Databricks and Delta tables (a minimal sketch follows this list)
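
    As a rough illustration of the Databricks/Delta item above, here is a minimal, hypothetical PySpark sketch that upserts an incremental batch into a Delta table using the delta-spark MERGE API; the storage paths, key column and schema are assumptions.

```python
# Hypothetical Delta Lake upsert sketch (PySpark + delta-spark).
# Storage paths, key column and schema are illustrative assumptions.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = (SparkSession.builder.appName("delta-upsert-sketch")
         .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
         .config("spark.sql.catalog.spark_catalog",
                 "org.apache.spark.sql.delta.catalog.DeltaCatalog")
         .getOrCreate())

# Incremental batch extracted from a source system (e.g. landed by ADF or CDC).
updates = spark.read.parquet(
    "abfss://landing@storageacct.dfs.core.windows.net/customers/")

target_path = "abfss://curated@storageacct.dfs.core.windows.net/customers_delta/"
target = DeltaTable.forPath(spark, target_path)

# MERGE: update rows whose customer_id already exists, insert the rest.
(target.alias("t")
       .merge(updates.alias("s"), "t.customer_id = s.customer_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```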

    Required Skills:

    · 4–10 years of experience in Data Engineering or related roles.

    · Hands-on experience in Microsoft Fabric

    · Hands-on experience in Azure Databricks

    · Proficiency in PySpark for data processing and scripting.

    · Strong command over Python & SQL – writing complex queries, performance tuning, etc.

    · Experience working with Azure Data Lake Storage and Data Warehouse concepts (e.g., dimensional modeling, star/snowflake schemas).

    · Hands-on experience in performance tuning & optimization on Databricks & MS Fabric.

    · Ensure alignment with overall system architecture and data flow.

    · Understanding of CI/CD practices in a data engineering context.

    · Excellent problem-solving and communication skills.

    · Exposure to BI tools like Power BI, Tableau, or Looker.

    Good to Have:

    · Experience in Azure DevOps.

    · Familiarity with data security and compliance in the cloud.

    · Experience with different databases like Synapse, SQL DB, Snowflake etc.


    Data Engineering

    Pune, Maharashtra ₹800000 - ₹2500000 ZF

    Posted today

    Job Viewed

    Tap Again To Close

    Job Description

    Become our next FutureStarter

    Are you ready to make an impact? ZF is looking for talented individuals to join our team. As a FutureStarter, you'll have the opportunity to shape the future of mobility. Join us and be part of something extraordinary

    Data Engineering & Analytical Consultant

    Country/Region: IN

    Location:
    Pune, MH, IN,

    Req ID 81572 | Pune, India, ZF India Pvt. Ltd.

    Job Description

    About the team

    ZF Aftermarket is a EUR 3 billion powerhouse with a combined team of about 8,000 employees. We have a total of 120 locations in 40 countries worldwide – including 90 logistics centers – and more than 650 service partners with a strong presence in both automotive and industrial markets. This makes us the second largest aftermarket organization worldwide.

    What you can look forward to as Data Engineer & Analytics Specialist

    • KPI Reporting & Business Alignment: Develop and maintain KPI reports in close collaboration with Senior Management and SMEs, ensuring they drive strategic decision-making, supported by clear storytelling and actionable dashboards
    • Business Needs Translation: Investigate and translate business requirements, ensuring that technical outputs align with business goals and are usable by stakeholders
    • Stakeholder Collaboration: Build strong relationships with business teams to understand their challenges and bridge the gap between business and technical teams
    • Data Visualization & Communication: Create interactive dashboards and reports using Power BI, ensuring they are accessible and valuable to both subject matter experts and business users
    • Vision for Business Outputs: Define the end-user experience for data products, ensuring clarity, usability, and strategic alignment
    • Data Transformation & Integration: Translate business requirements into data requirements; process, clean, and integrate data from various sources using Python, SQL and Power BI (a minimal sketch follows this list)
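
    As a small illustration of the data transformation and KPI items above, the sketch below prepares a monthly KPI table from raw order data with pandas before it would be surfaced in Power BI; the file name, columns and KPI definition are assumptions, not ZF specifics.

```python
# Hypothetical KPI preparation sketch: raw orders -> monthly revenue KPI table.
# File name, columns and KPI definition are illustrative assumptions.
import pandas as pd

orders = pd.read_csv("aftermarket_orders.csv", parse_dates=["order_date"])

# Clean: drop cancelled orders and rows missing the revenue figure.
orders = orders[(orders["status"] != "CANCELLED") & orders["net_revenue"].notna()]

# KPI: monthly net revenue and distinct order count per product line.
kpi = (orders
       .assign(month=orders["order_date"].dt.to_period("M").dt.to_timestamp())
       .groupby(["month", "product_line"], as_index=False)
       .agg(net_revenue=("net_revenue", "sum"),
            orders=("order_id", "nunique")))

# Export a tidy table that a Power BI dataset can pick up directly.
kpi.to_csv("kpi_monthly_revenue.csv", index=False)
print(kpi.head())
```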

    Your profile as Data Engineer & Analytics Specialist

    • Strong domain knowledge in B2B portfolio management
    • Total experience of 6-10 years in Data engineering/ Advanced analytics roles.
    • Proven experience in building and setting up KPI reporting with Power BI & SQL (Python/PySpark highly valued), supported by strong visualizations
    • Significant experience setting up customer engagement platforms and driving digital transformation initiatives. Ability to manage programs and targets with rigor and strategic oversight
    • Strong leadership, communication, and problem-solving skills. Experience working in a remote, matrix-based organization and leading distributed project teams
    • Entrepreneurial mindset with resilience to overcome roadblocks and drive change

    Why you should choose ZF in India

    • Innovative Environment: ZF is at the forefront of technological advancements, offering a dynamic and innovative work environment that encourages creativity and growth.
    • Diverse and Inclusive Culture: ZF fosters a diverse and inclusive workplace where all employees are valued and respected, promoting a culture of collaboration and mutual support.
    • Career Development: ZF is committed to the professional growth of its employees, offering extensive training programs, career development opportunities, and a clear path for advancement.
    • Global Presence: As a part of a global leader in driveline and chassis technology, ZF provides opportunities to work on international projects and collaborate with teams worldwide.
    • Sustainability Focus: ZF is dedicated to sustainability and environmental responsibility, actively working towards creating eco-friendly solutions and reducing its carbon footprint.
    • Employee Well-being: ZF prioritizes the well-being of its employees, providing comprehensive health and wellness programs, flexible work arrangements, and a supportive work-life balance.

    Be part of our ZF team as Data Engineering & Analytical Consultant and apply now

    Contact

    Veerabrahmam Darukumalli

    What does DEI (Diversity, Equity, Inclusion) mean for ZF as a company?

    At ZF, we continuously strive to build and maintain a culture where inclusiveness is lived and diversity is valued. We actively seek ways to remove barriers so that all our employees can rise to their full potential. We aim to embed this vision in our legacy through how we operate and build our products as we shape the future of mobility.

    Find out how we work at ZF:

    Job Segment: R&D Engineer, Analytics, Database, SQL, User Experience, Engineering, Management, Technology


    Data Engineering

    ₹104000 - ₹130878 LTIMindtree

    Posted today

    Job Viewed

    Tap Again To Close

    Job Description

    LTIMindtree is hiring for Celonis – Mumbai/Chennai locations.

    Experience: 8 to 12 years

    Notice period: immediate to 30 days

    Location: Mumbai/Chennai

    Mandatory skills:

    • Celonis process mining tool, SQL, PQL, ETL; basic knowledge of data engineering and analytics
    • Nice-to-have technical skills: experience using Celonis with SAP

    If interested, share the following details along with your CV by email:

    Total Experience in Celonis-

    Relevant SQL and ETL -

    Current CTC-

    Expected CTC-

    Holding offers if any-

    Current Location-

    Preferred Location-

    Notice period-

    Skills-

    Current Company-

    Date of Birth-

    Pan no-

    Share your passport size photo-

    Availability for interview -

    Job Description-

    Knowledge of SQL, PQL and ETL

    Experience with the Celonis process mining tool

    Basic knowledge of data engineering and analytics

    Should have exposure to international clients

    Basic understanding of business processes


    Data Engineering

    ₹1500000 - ₹2500000 Cygnus Professionals

    Posted today

    Job Viewed

    Tap Again To Close

    Job Description

    Role & responsibilities

    Experienced Data Warehouse Developer/DBA with expertise in building data pipelines, curating data, modeling data for analytics, and providing subject matter expertise. Experience implementing end-to-end data warehouse/data lake solutions, including ETL development, data profiling, data curation, data modeling and deployment, with expertise in Postgres, Redshift, Redis, C# and Python.

    • Expert in database creation, governance and ETL development.
    • Expert in Postgres, Redshift, Redis, C# and Python
    • Proven track record of data warehouse/data lake data development including building data pipelines and data modeling.
    • Strong experience in SQL, data curation, ETL tooling and pipeline development.
    • Hands-on experience with RDBMS/DWs and writing ETL pipelines (a minimal sketch follows this list).
    • Technical skills and experience using relational databases (e.g. Oracle InForm, Oracle DMW, MS SQL Server or MS Access)
    • Domain knowledge in the pharmaceutical industry and GxP preferred.
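
    As a small illustration of the RDBMS/ETL item above, the sketch below loads a cleaned extract into a Postgres staging table via pandas and SQLAlchemy; the connection string, file name, and table and column names are assumptions.

```python
# Hypothetical ETL-to-Postgres sketch: clean a CSV extract, load a staging table.
# Connection string, file and table/column names are illustrative assumptions.
import pandas as pd
from sqlalchemy import create_engine

# Extract: a daily export from a source system.
df = pd.read_csv("clinical_sites_extract.csv")

# Transform: light cleaning before load.
df = df.drop_duplicates(subset=["site_id"])
df["last_updated"] = pd.to_datetime(df["last_updated"], errors="coerce")

# Load: append into a Postgres staging table (credentials are placeholders).
engine = create_engine("postgresql+psycopg2://etl_user:password@dbhost:5432/dwh")
df.to_sql("stg_clinical_sites", engine, schema="staging",
          if_exists="append", index=False)
print(f"Loaded {len(df)} rows into staging.stg_clinical_sites")
```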