2,020 Data Engineering jobs in India

Data Engineering

Chennai, Tamil Nadu | EXL

Posted today

Job Description

Responsibilities:

  • Work with stakeholders to understand the data requirements to design, develop, and maintain complex ETL processes.
  • Create the data integration and data diagram documentation.
  • Lead the data validation, UAT and regression test for new data asset creation.
  • Create and maintain data models, including schema design and optimization.
  • Create and manage data pipelines that automate the flow of data, ensuring data quality and consistency.

Qualifications and Skills:

  • Strong knowledge of Python and PySpark
  • Ability to write PySpark scripts to develop data workflows.
  • Strong knowledge of SQL, Hadoop, Hive, Azure, Databricks and Greenplum
  • Ability to write SQL to query metadata and tables from different data management systems such as Oracle, Hive, Databricks and Greenplum.
  • Familiarity with big data technologies like Hadoop, Spark, and distributed computing frameworks.
  • Ability to use Hue to run Hive SQL queries and schedule Apache Oozie jobs to automate data workflows.
  • Good working experience communicating with stakeholders and collaborating effectively with the business team for data testing.
  • Strong problem-solving and troubleshooting skills.
  • Ability to establish comprehensive data quality test cases and procedures, and to implement automated data validation processes.
  • Degree in Data Science, Statistics, Computer Science or other related fields or an equivalent combination of education and experience.
  • 3-7 years of experience as a Data Engineer.
  • Proficiency in programming languages commonly used in data engineering, such as Python, PySpark and SQL.
  • Experience in Azure cloud computing platform, such as developing ETL processes using Azure Data Factory, big data processing and analytics with Azure Databricks.
  • Strong communication, problem solving and analytical skills with the ability to do time management and multi-tasking with attention to detail and accuracy.
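The bullets above ask for automated data validation processes. The idea can be sketched with the standard library, using sqlite3 as a stand-in for Hive/Databricks/Greenplum; the `orders` table and its columns are invented for illustration:

```python
import sqlite3

# Stand-in warehouse table; in this role it would live in Hive/Databricks/Greenplum.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, region TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, 120.0, "south"), (2, None, "north"), (3, 75.5, "south")])

def validation_report(conn, table, not_null_cols):
    """Simple automated data-quality checks: row count and per-column null counts."""
    report = {"row_count": conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]}
    for col in not_null_cols:
        nulls = conn.execute(
            f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL").fetchone()[0]
        report[f"null_{col}"] = nulls
    return report

report = validation_report(conn, "orders", ["id", "amount"])
print(report)  # {'row_count': 3, 'null_id': 0, 'null_amount': 1}
```

In a real pipeline the same checks would be issued through Spark SQL or a JDBC/ODBC connection and the report routed to an alerting channel rather than printed.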

Data Engineering

Mumbai, Maharashtra | ₹1500000 - ₹2800000 per year | Godrej Capital

Posted today

Job Description

Godrej Capital is a subsidiary of Godrej Industries and is the holding company for Godrej Housing Finance & Godrej Finance. With a digital-first approach and a keen focus on customer-centric product innovation, Godrej Capital offers Home Loans, Loan Against Property, Property Loans and Business Loans, and is positioned to diversify into other customer segments and launch new products. The company is focused on building a long-term, sustainable retail financial services business in India, anchored by the Godrej Group's 125+ year legacy of trust and excellence. Godrej Capital has a special focus on learning and capability development across its employee base and is committed to diversity, equity, and inclusion as a guiding principle.

The organization has been consistently recognized as a Great Place to Work receiving certifications in 2022 and 2023. As it stands, Godrej Capital holds a spot among India's Top 50 Best Workplaces in BFSI 2023 and is also recognized as one of India's Great Mid-Size Workplaces 2023. Beyond that, it has also had the honor of being named the Best Organization for Women by The Economic Times in both 2022 and 2023, and the Best Workplaces for Women by Great Place to Work in 2022 and in 2023.

Function

Information Technology

Job Purpose

  • The incumbent will be responsible for managing, expanding and optimizing our data pipeline architecture, and for optimizing data flow and collection for cross-functional teams. The incumbent will support our team of data analysts and scientists on data initiatives and will ensure optimal and timely data delivery. The candidate must be self-driven and comfortable supporting the data needs of multiple teams, systems and products, and will play a major role as we build a superior, scalable architecture that lets us leverage data to the fullest extent.

Role

  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Build and maintain the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability.
  • Working knowledge of message queuing, stream processing and scalable big data stores (optional)
  • Perform sanity testing, issue reporting and tracking.
  • Assist teams in UAT testing and resolve issues according to criticality.
  • Handle audit and compliance activities for data platform.
  • Track and manage system availability and maintenance tasks.

Qualification & experience

  • Years of experience: 3-5 years
  • Qualification – Engineering / Certified Data Engineer

Essential skills

  • Experience with data pipeline and workflow management tools.
  • Knowledge of AWS cloud services, Data-Lake, Glue / Python/ PySpark/ Kafka/ API/ Change Data Capture, Streaming data, data modelling will be a key advantage.
  • Experience with relational SQL and NoSQL databases.
  • Exposure to lending systems and domain
  • Machine Learning skills
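Change Data Capture, one of the skills listed above, boils down to turning successive table states into insert/update/delete events. A naive pure-Python snapshot-diff sketch (production CDC would instead be log-based or use tools like Kafka connectors; the row shapes here are invented):

```python
def capture_changes(old_rows, new_rows, key="id"):
    """Naive change-data-capture: diff two table snapshots into change events."""
    old = {r[key]: r for r in old_rows}
    new = {r[key]: r for r in new_rows}
    events = []
    for k, row in new.items():
        if k not in old:
            events.append(("insert", row))      # new primary key
        elif row != old[k]:
            events.append(("update", row))      # same key, changed values
    for k, row in old.items():
        if k not in new:
            events.append(("delete", row))      # key vanished from the source
    return events

old = [{"id": 1, "status": "open"}, {"id": 2, "status": "open"}]
new = [{"id": 1, "status": "closed"}, {"id": 3, "status": "open"}]
print(capture_changes(old, new))
```

Snapshot diffing is expensive on large tables, which is why real systems prefer reading the database's transaction log; the sketch only shows the event model.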

Ideal candidate (in terms of current role/ organization/ industry)

  • An individual inclined to learn and explore new technologies and make the best of the resources at hand.
  • Able to influence and work in a collaborative manner

Data Engineering

Hyderabad, Andhra Pradesh | ₹1500000 - ₹2500000 per year | Optum

Posted today

Job Description

Description -

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:

  • Follow the development practices, policies, and reporting expectations in the Optum Team
  • Be able to implement manual and automation testing frameworks
  • Participate in the daily standups, project meetings and retrospectives
  • Clearly articulate questions and requirements for the work being assigned
  • The ability to produce solutions to problems independently, asking for help when needed
  • Work in collaboration with the Optum Team leads to ensure development and testing milestones are agreed upon and achieved
  • Raise blockers early to ensure proper communication so these issues can be addressed by the Optum Team
  • Follow all Optum Regulatory/Compliance requirements
  • Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Qualifications - External

Required Qualifications:

  • BA/BS in Computer Science or a related field
  • 5+ years of experience in Data Engineering and Java technologies in the candidate's most recent position
  • Experience working in an Agile development environment
  • Experience building ETL pipelines
  • Experience in Databricks, SQL, Synapse, Scala, Python and Java
  • Experience using a source control repository, preferably Git
  • High-level knowledge of development fundamentals and core language concepts with Java/Scala
  • Proven problem-solving skills; able to analyze logs and troubleshoot issues
  • Proven willingness to learn new technologies and eagerness to think outside the box
  • Good understanding of OOP principles
  • Good understanding of secure coding best practices and general knowledge of security vulnerability remediation
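The ETL and automated-testing expectations above can be illustrated with a tiny transform step written as a pure function, so its behavior is easy to assert in a test. The claim-record fields here are hypothetical, not Optum's actual schema:

```python
def transform(records):
    """Normalize raw claim-like records: strip/uppercase IDs, cast amounts,
    and skip rows that fail validation."""
    cleaned = []
    for r in records:
        try:
            cleaned.append({"member": r["member"].strip().upper(),
                            "amount": float(r["amount"])})
        except (KeyError, ValueError):
            continue  # a real pipeline would route these to a reject/audit stream
    return cleaned

raw = [{"member": " ab12 ", "amount": "100.5"},
       {"member": "cd34", "amount": "oops"}]
print(transform(raw))  # [{'member': 'AB12', 'amount': 100.5}]
```

Keeping the transform free of I/O is what makes the "automation testing" requirement cheap to satisfy: the same function runs unchanged in a unit test and inside the pipeline.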

Preferred Qualifications:

  • Azure App services, Azure Functions experience
  • Rest API development experience
  • CI/CD experience

Data Engineering

Mumbai, Maharashtra | ₹600000 - ₹1800000 per year | Morgan Stanley

Posted today

Job Description

Data Engineer (Director, Software Engineering)

Profile Description
Division: IST | Location: Mumbai

We're seeking someone to join our team as a Data Engineer in Institutional Securities Technology, where you will play a key role in helping transform how Morgan Stanley operates.

Institutional Securities Technology
Institutional Securities Technology (IST) develops and oversees the overall technology strategy and bespoke technology solutions to drive and enable the institutional businesses and enterprise-wide functions. IST's 'clients' include Fixed Income, Equities, Commodities, Investment Banking, Research, Prime Brokerage and Global Capital Markets.

Advisory & Sales Distribution
Advisory is responsible for end-to-end process life cycle support for Investment Banking, Global Capital Markets and Research, with a focus on efficiency, scale, and time to market. Sales Distribution develops technology for the firm's IED and FID Sales and Distribution business units, creating application synergies across the businesses' various platforms.

Software Engineering
This is a Director-level position that develops and maintains software solutions that support business needs.

Morgan Stanley is an industry leader in financial services, known for mobilizing capital to help governments, corporations, institutions, and individuals around the world achieve their financial goals.

At Morgan Stanley India, we support the Firm's global businesses, with a critical presence across Institutional Securities, Wealth Management and Investment Management, as well as in the Firm's infrastructure functions of Technology, Operations, Finance, Risk Management, Legal and Corporate & Enterprise Services. Morgan Stanley has been rooted in India since 1993, with campuses in both Mumbai and Bengaluru. We empower our multi-faceted and talented teams to advance their careers and make a global impact on the business. For those who show passion and grit in their work, there's ample opportunity to move across the businesses.

Interested in joining a team that's eager to create, innovate and make an impact on the world? Read on…

What You'll Do In The Role
Strong hands-on experience with Snowflake, keeping up to date with its latest features.

Strong hands-on experience with the Python programming language.

Strong hands-on experience writing complex SQL.

Strong understanding of how distributed platforms work.

Ability to develop, maintain and distribute code in a modularized fashion.

Process-oriented, focused on standardization, streamlining and implementation of best practices in the delivery approach.

Basic understanding of Unix shell scripting and the Unix OS platform.

Excellent problem-solving and analytical skills.

Excellent verbal and written communication skills.

Ability to multi-task and function efficiently in a fast-paced environment.

Self-starter with flexibility and adaptability in a dynamic work environment.

Good understanding of Agile methodology.

5+ years of experience in a Data Engineer role.

5+ years of experience in designing and building real-time data pipelines and analytical solutions using the big data and cloud technology ecosystem (Azure, Snowflake, Databricks, etc.).

Expertise with Data Lake/Big Data project implementation in the cloud (preferably Azure + Snowflake).
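"Writing complex SQL" in analytics work very often means window functions. A small sketch using sqlite3 in place of Snowflake (the `trades` table and desk names are invented): per-desk running totals with `SUM(...) OVER (PARTITION BY ...)`.

```python
import sqlite3  # window functions need SQLite >= 3.25, bundled with modern Python

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (desk TEXT, trade_day TEXT, notional REAL)")
conn.executemany("INSERT INTO trades VALUES (?, ?, ?)", [
    ("rates", "2024-01-01", 5.0), ("rates", "2024-01-02", 7.0),
    ("fx",    "2024-01-01", 3.0), ("fx",    "2024-01-02", 4.0)])

# Running total per desk: the window is partitioned by desk, ordered by day.
rows = conn.execute("""
    SELECT desk, trade_day,
           SUM(notional) OVER (PARTITION BY desk ORDER BY trade_day) AS running_notional
    FROM trades
    ORDER BY desk, trade_day
""").fetchall()
print(rows)
# [('fx', '2024-01-01', 3.0), ('fx', '2024-01-02', 7.0),
#  ('rates', '2024-01-01', 5.0), ('rates', '2024-01-02', 12.0)]
```

The same query shape runs on Snowflake unchanged; only the connection layer differs.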

What You'll Bring To The Role
Experience in Data Reporting Tools (preferably Business Objects)

Experience in Visualization Tools (preferably Power BI)

Financial Domain experience will be a plus


What You Can Expect From Morgan Stanley
We are committed to maintaining the first-class service and high standard of excellence that have defined Morgan Stanley for over 89 years. Our values - putting clients first, doing the right thing, leading with exceptional ideas, committing to diversity and inclusion, and giving back - aren't just beliefs, they guide the decisions we make every day to do what's best for our clients, communities and more than 80,000 employees in 1,200 offices across 42 countries. At Morgan Stanley, you'll find an opportunity to work alongside the best and the brightest, in an environment where you are supported and empowered. Our teams are relentless collaborators and creative thinkers, fueled by their diverse backgrounds and experiences. We are proud to support our employees and their families at every point along their work-life journey, offering some of the most attractive and comprehensive employee benefits and perks in the industry. There's also ample opportunity to move about the business for those who show passion and grit in their work.

To learn more about our offices across the globe, please copy and paste into your browser.

Morgan Stanley is an equal opportunities employer. We work to provide a supportive and inclusive environment where all individuals can maximize their full potential. Our skilled and creative workforce is comprised of individuals drawn from a broad cross section of the global communities in which we operate and who reflect a variety of backgrounds, talents, perspectives, and experiences. Our strong commitment to a culture of inclusion is evident through our constant focus on recruiting, developing, and advancing individuals based on their skills and talents.


Data Engineering

Karnataka | ₹104000 - ₹1308780 per year | ofi

Posted today

Job Description

About Us
We are a global leader in food & beverage ingredients.
Pioneers at heart, we operate at the forefront of consumer trends to provide food & beverage manufacturers with products and ingredients that will delight their consumers. Making a positive impact on people and planet is all part of the delight. With a deep-rooted presence in the countries where our ingredients are grown, we are closer to farmers, enabling better quality, and more reliable, traceable and transparent supply. Supplying products and ingredients at scale is just the start. We add value through our unique, complementary portfolio of natural, delicious and nutritious products. With our fresh thinking, we help our customers unleash the sensory and functional attributes of cocoa, coffee, dairy, nuts and spices so they can create naturally good food & beverage products that meet consumer expectations. And whoever we're with, whatever we're doing, we always make it real.

About The Role
At ofi, we are on a mission to leverage data to drive innovation and business transformation. We are looking for a Data & ML Engineering Lead to spearhead our engineering team and contribute to our mission of delivering data-driven solutions. As a Data & ML Engineering Lead, you will be responsible for managing the engineering team and overseeing the design and implementation of our data infrastructure. You will work closely with data scientists, analysts, and other stakeholders to ensure the seamless flow and integrity of data across the organization.

Job Description
Key Responsibilities:

  • Data Engineering:
  • ETL: Design ETL processes and workflows that can provide sustainable access to an evolving data platform.
  • Tooling: Use technologies like Python, SQL, container technologies such as Docker and Kubernetes, cloud solutions such as Azure to acquire, ingest and transform big datasets.
  • Infrastructure: Manage data infrastructure including Snowflake data warehouse in a way that data consumers have efficient access (dependency management, data integrity, database optimization).
  • Governance: Lead rollout of data governance & security systems.
  • Data assets: Participate in data collection, collation, structuring and cleaning. Maintain data quality through statistical control.
  • Tool development: Develop tools that support access, integration, modelling and visualizing of data.
  • Software development: Ensure code is maintainable, scalable and debuggable.
  • Machine Learning:
  • Front-end integration: Enable model output consumption by the organization by designing and orchestrating the production pipeline and front-end integration (e.g., with Salesforce).
  • Maintenance: Ensure production tasks execute free of interruption and on schedule.
  • Software development: Ensure that data science code is maintainable, scalable and debuggable.
  • Tool development: Automate repeatable routines present in ML tasks (offering templates for ML solution deployment) and drive performance improvement in the production environment.
  • Performance optimization: Find run-time performance improvements and decide which ML technologies will be used in the production environment.
  • Platform Ownership:
  • Platform ownership: End-to-end platform ownership, including stakeholder management.
  • Architecture strategy: Implement data and ML architecture based on business goals.
  • Project management: Manage resourcing and timelines for projects related to data engineering and model operationalization life cycles.
  • Individual skills & mindset
  • Problem solving: Fierce curiosity, strong analytical skills and strong sense of ownership
  • Collaboration: Build a sense of trust and rapport that creates a comfortable & effective workplace and an ability to work as part of an agile team (product owner, developers, etc.)
  • People leading: Coach data and ML engineers and analytics COE members
  • Team-player: Contribute to knowledge development (e.g., tools and code base)
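The "statistical control" item under data assets above can be sketched as a simple control-limit check: flag batch values that fall outside mean ± k·sigma. Stdlib only, with invented toy numbers:

```python
from statistics import mean, stdev

def out_of_control(values, k=3.0):
    """Flag points outside mean +/- k*sigma -- a basic statistical control
    check for incoming data batches (daily volumes, ingested row counts, ...)."""
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if abs(v - mu) > k * sigma]

daily_volumes = [100, 102, 98, 101, 99, 100, 250]  # the last point is suspect
print(out_of_control(daily_volumes, k=2.0))  # [250]
```

In practice the control limits would be fitted on a trailing window of historical batches rather than on the batch being checked, so a single outlier cannot inflate its own threshold; the sketch keeps it to one list for brevity.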

Qualifications

  • Bachelor's or master's degree in Computer Science, Data Analytics, Data Science, Information Technology, etc.
  • 8+ years of proven experience in data engineering, with 2+ years as a Data Engineering Lead.
  • Proficiency in data pipeline tools and technologies, particularly within the Snowflake ecosystem.
  • Extensive experience with SQL databases and multiple programming languages.
  • Experience in working with data quality tools such as Informatica.

Preferred Skills:

  • Knowledge of SAP ERP (S4HANA and ECC) including its integration with Snowflake.
  • Functional understanding of customer experience, manufacturing, supply chain, and financial concepts

ofi is an equal opportunity/affirmative action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, nationality, disability, protected veteran status, sexual orientation, gender identity, gender expression, genetic information, or any other characteristic protected by law.
Applicants are requested to complete all required steps in the application process including providing a resume/CV in order to be considered for open roles.


Data Engineering

Bengaluru, Karnataka | ₹1500000 - ₹2500000 per year | PwC India

Posted today

Job Description

Job Title: Data Engineering Senior Associate – Microsoft Fabric, Azure (Databricks & ADF), PySpark

Experience: 4–10 Years

Location: PAN India

Job Summary:

We are looking for a skilled and experienced Data Engineer with 4–10 years of experience in building scalable data solutions on the Microsoft Azure ecosystem. The ideal candidate must have strong hands-on experience with Microsoft Fabric and Azure Databricks, along with strong PySpark, Python and SQL expertise. Familiarity with Data Lake and Data Warehouse concepts and with end-to-end data pipelines is essential.

Key Responsibilities:

· Requirement gathering and analysis

· Design and implement data pipelines using Microsoft Fabric & Databricks

· Extract, transform, and load (ETL) data from various sources into Azure Data Lake Storage

· Implement data security and governance measures

· Monitor and optimize data pipelines for performance and efficiency

· Troubleshoot and resolve data engineering issues

· Provide optimized solution for any problem related to data engineering

· Ability to work with a variety of sources such as relational databases, APIs, file systems, real-time streams, CDC, etc.

· Strong knowledge of Databricks and Delta tables

Required Skills:

· 4–10 years of experience in Data Engineering or related roles.

· Hands-on experience in Microsoft Fabric

· Hands-on experience in Azure Databricks

· Proficiency in PySpark for data processing and scripting.

· Strong command over Python & SQL – writing complex queries, performance tuning, etc.

· Experience working with Azure Data Lake Storage and Data Warehouse concepts (e.g., dimensional modeling, star/snowflake schemas).

· Hands-on experience in performance tuning & optimization on Databricks & MS Fabric.

· Ensure alignment with overall system architecture and data flow.

· Understanding of CI/CD practices in a data engineering context.

· Excellent problem-solving and communication skills.

· Exposure to BI tools like Power BI, Tableau, or Looker.
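The dimensional-modeling requirement above (star/snowflake schemas) can be sketched with sqlite3 standing in for a warehouse engine; the fact and dimension tables are invented. The star-schema pattern is a central fact table of measures joined to small dimension tables of attributes:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- dimension: one row per product, descriptive attributes only
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    -- fact: one row per sales event, measures plus foreign keys
    CREATE TABLE fact_sales (product_id INTEGER, qty INTEGER, revenue REAL);
    INSERT INTO dim_product VALUES (1, 'snacks'), (2, 'drinks');
    INSERT INTO fact_sales VALUES (1, 10, 50.0), (2, 4, 12.0), (1, 2, 9.0);
""")

# The canonical star-schema query: join fact to dimension, group by attribute.
rows = conn.execute("""
    SELECT d.category, SUM(f.revenue) AS total_revenue
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(rows)  # [('drinks', 12.0), ('snacks', 59.0)]
```

A snowflake schema differs only in that the dimensions themselves are normalized into further tables; the fact-to-dimension join pattern is the same.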

Good to Have:

· Experienced in Azure DevOps.

· Familiarity with data security and compliance in the cloud.

· Experience with different databases like Synapse, SQL DB, Snowflake etc.


Data Engineering

₹1500000 - ₹2500000 per year | LTIMindtree

Posted today

Job Description

LTIMindtree is hiring for Celonis | Mumbai/Chennai location.

Experience: 8 to 12 years

Notice period: immediate to 30 days

Location: Mumbai/Chennai

Mandatory skills-

  • Celonis process mining tool; SQL, PQL, ETL; basic knowledge of data engineering and analytics
  • Nice-to-have technical skills: experience of Celonis with SAP

If interested, share these details along with your CV by email:

Total Experience in Celonis-

Relevant SQL and ETL -

Current CTC-

Expected CTC-

Holding offers if any-

Current Location-

Preferred Location-

Notice period-

Skills-

Current Company-

Date of Birth-

Pan no-

Share your passport size photo-

Availability for interview -

Job Description-

Knowledge of SQL, PQL and ETL

Experience with the Celonis process mining tool

Basic knowledge of data engineering and analytics

Should have exposure to International Clients

Basic understanding of Business Processes
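At its core, process mining reconstructs per-case activity sequences ("variants") from an event log and counts how often each occurs. A stdlib sketch of that idea, independent of Celonis and PQL, with an invented toy log:

```python
from collections import Counter

# Toy event log: (case_id, activity, timestamp) -- the raw material of process mining.
event_log = [
    ("c1", "create", 1), ("c1", "approve", 2), ("c1", "pay", 3),
    ("c2", "create", 1), ("c2", "pay", 2),
    ("c3", "create", 1), ("c3", "approve", 2), ("c3", "pay", 3),
]

def variant_counts(log):
    """Group events by case, order them by timestamp, and count each
    distinct activity sequence (process variant)."""
    cases = {}
    for case_id, activity, ts in sorted(log, key=lambda e: (e[0], e[2])):
        cases.setdefault(case_id, []).append(activity)
    return Counter(tuple(seq) for seq in cases.values())

print(variant_counts(event_log))
```

Here two cases follow create → approve → pay and one skips approval; spotting such deviating variants (e.g. payments without approval) is exactly what tools like Celonis surface at scale.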

Data Engineering

Pune, Maharashtra | ₹800000 - ₹2500000 per year | ZF

Posted today

Job Description

Become our next FutureStarter

Are you ready to make an impact? ZF is looking for talented individuals to join our team. As a FutureStarter, you'll have the opportunity to shape the future of mobility. Join us and be part of something extraordinary.

Data Engineering & Analytical Consultant

Country/Region: IN

Location: Pune, MH, IN

Req ID 81572 | Pune, India, ZF India Pvt. Ltd.

Job Description

About the team

ZF Aftermarket is a €3 billion powerhouse with a combined team of about 8,000 employees. We have a total of 120 locations in 40 countries worldwide – including 90 logistics centers – and more than 650 service partners with a strong presence in both automotive and industrial markets. This makes us the second largest aftermarket organization worldwide.

What you can look forward to as Data Engineer & Analytics Specialist

  • KPI Reporting & Business Alignment: Develop and maintain KPI reports in close collaboration with Senior Management and SMEs, ensuring they drive strategic decision-making supported by clear storytelling and actionable dashboards
  • Business Needs Translation: Investigate and translate business requirements, ensuring that technical outputs align with business goals and are usable by stakeholders
  • Stakeholder Collaboration: Build strong relationships with business teams to understand their challenges and bridge the gap between business and technical teams
  • Data Visualization & Communication: Create interactive dashboards and reports using Power BI, ensuring they are accessible and valuable to both subject matter experts and business users
  • Vision for Business Outputs: Define the end-user experience for data products, ensuring clarity, usability, and strategic alignment
  • Data Transformation & Integration: Translate business requirements into data requirements; process, clean and integrate data from various sources to meet those requirements using Python, SQL and Power BI
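The KPI-reporting responsibilities above reduce, at the data layer, to aggregating measures by dimension before visualization in Power BI. A stdlib sketch with invented order data and two illustrative KPIs (revenue and on-time rate per region):

```python
from collections import defaultdict

# Toy transactional extract; a real pipeline would pull this via SQL into Power BI.
orders = [
    {"region": "EMEA", "revenue": 120.0, "on_time": True},
    {"region": "EMEA", "revenue": 80.0,  "on_time": False},
    {"region": "APAC", "revenue": 200.0, "on_time": True},
]

def kpi_by_region(rows):
    """Aggregate two illustrative KPIs per region: total revenue and on-time rate."""
    acc = defaultdict(lambda: {"revenue": 0.0, "orders": 0, "on_time": 0})
    for r in rows:
        a = acc[r["region"]]
        a["revenue"] += r["revenue"]
        a["orders"] += 1
        a["on_time"] += r["on_time"]  # bool counts as 0/1
    return {reg: {"revenue": a["revenue"],
                  "on_time_rate": a["on_time"] / a["orders"]}
            for reg, a in acc.items()}

print(kpi_by_region(orders))
```

The useful design point for dashboards is that KPI definitions live in one tested function (or one SQL view), so every report that consumes them agrees on the numbers.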

Your profile as Data Engineer & Analytics Specialist

  • Strong domain knowledge in B2B portfolio management
  • Total experience of 6-10 years in data engineering / advanced analytics roles.
  • Proven experience in building and setting up KPI reporting with Power BI & SQL (Python/PySpark highly valued), supported by strong visualizations
  • Significant experience setting up customer engagement platforms and driving digital transformation initiatives. Ability to manage programs and targets with rigor and strategic oversight
  • Strong leadership, communication, and problem-solving skills. Experience working in a remote, matrix-based organization and leading distributed project teams
  • Entrepreneurial mindset with resilience to overcome roadblocks and drive change

Why you should choose ZF in India

  • Innovative Environment: ZF is at the forefront of technological advancements, offering a dynamic and innovative work environment that encourages creativity and growth.
  • Diverse and Inclusive Culture: ZF fosters a diverse and inclusive workplace where all employees are valued and respected, promoting a culture of collaboration and mutual support.
  • Career Development: ZF is committed to the professional growth of its employees, offering extensive training programs, career development opportunities, and a clear path for advancement.
  • Global Presence: As a part of a global leader in driveline and chassis technology, ZF provides opportunities to work on international projects and collaborate with teams worldwide.
  • Sustainability Focus: ZF is dedicated to sustainability and environmental responsibility, actively working towards creating eco-friendly solutions and reducing its carbon footprint.
  • Employee Well-being: ZF prioritizes the well-being of its employees, providing comprehensive health and wellness programs, flexible work arrangements, and a supportive work-life balance.

Be part of our ZF team as Data Engineering & Analytical Consultant and apply now.

Contact

Veerabrahmam Darukumalli

What does DEI (Diversity, Equity, Inclusion) mean for ZF as a company?

At ZF, we continuously strive to build and maintain a culture where inclusiveness is lived and diversity is valued. We actively seek ways to remove barriers so that all our employees can rise to their full potential. We aim to embed this vision in our legacy through how we operate and build our products as we shape the future of mobility.

Find out how we work at ZF:

Job Segment: R&D Engineer, Analytics, Database, SQL, User Experience, Engineering, Management, Technology


Data Engineering

Chennai, Tamil Nadu | ₹1500000 - ₹2500000 per year | NielsenIQ India

Posted today

Job Description

Our Connect Technology teams are working on our new Connect platform, a unified, global, open data ecosystem powered by Microsoft Azure. Our clients around the world rely on Connect data and insights to innovate and grow.

As a senior Data Engineer, you'll be part of a team of smart, highly skilled technologists who are passionate about learning and supporting cutting-edge technologies such as Python, PySpark, Oracle PL/SQL, SQL, Hive, Databricks and Airflow. These technologies are deployed using DevOps pipelines leveraging Azure, Kubernetes, GitHub Actions and GitHub.

WHAT YOU'LL DO:

  • Develop, troubleshoot, debug and make application enhancements, creating code with Python and SQL as the core development languages.
  • Develop new back-end functionalities, working closely with the front-end team.
  • Contribute to the expansion of NRPS scope

Must have

  • 6-10 years of applicable software engineering experience
  • Must have strong experience in Python
  • Must have strong experience in Oracle PL/SQL
  • Strong fundamentals with experience in Hive and Airflow
  • Must have SQL knowledge.
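Airflow, required above, schedules tasks as a dependency DAG. The underlying idea (not the Airflow API itself) can be sketched as a naive depth-first runner that executes each task once, after its upstreams; there is no cycle detection, retries, or scheduling, which is what Airflow adds on top:

```python
def run_dag(tasks, deps):
    """tasks: name -> callable; deps: name -> list of upstream task names.
    Runs every task exactly once, upstreams first; returns execution order."""
    done, order = set(), []

    def run(name):
        if name in done:
            return
        for up in deps.get(name, []):
            run(up)          # recurse into upstream dependencies first
        tasks[name]()
        done.add(name)
        order.append(name)

    for name in tasks:
        run(name)
    return order

log = []
tasks = {"load": lambda: log.append("load"),
         "extract": lambda: log.append("extract"),
         "transform": lambda: log.append("transform")}
deps = {"transform": ["extract"], "load": ["transform"]}
print(run_dag(tasks, deps))  # ['extract', 'transform', 'load']
```

Even though `load` is declared first, its transitive dependencies run before it, which is exactly the contract an Airflow DAG expresses with `extract >> transform >> load`.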

Good to have

  • Good to have experience in Scala and Databricks.
  • Good to have experience in Linux and KSH
  • Good to have experience with DevOps technologies such as GitHub, GitHub Actions and Docker.
  • Good to have experience in the Retail Domain.
  • Excellent English communication skills, with the ability to effectively interface across cross-functional technology teams and the business
  • Minimum B.S. degree in Computer Science, Computer Engineering or related field

Please share your profile to


Data Engineering

₹1500000 - ₹2500000 per year | Cygnus Professionals

Posted today

Job Description

Role & responsibilities

Experienced Data Warehouse Developer/DBA with expertise in building data pipelines, curating data, modeling data for analytics and providing subject matter expertise; experienced in implementing end-to-end data warehouse/data lake solutions, including ETL development, data profiling, data curation, data modeling and deployment; with expertise in Postgres, Reddit, Redis, C# and Python.

  • Expert in database creation, governance and ETL development.
  • Expert in Postgres, Reddit, Redis, C# and Python
  • Proven track record of data warehouse/data lake data development including building data pipelines and data modeling.
  • Strong experience in SQL, data curation, ETL tooling and pipeline development.
  • Hands-on experience with RDBMS/DWs and writing ETL pipelines.
  • Technical skills and experience using relational databases (e.g. Oracle InForm, Oracle DMW, MS SQL Server or MS Access)
  • Domain knowledge in the pharmaceutical industry and GxP preferred.
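The data profiling called for above is typically a first pass of row and missing-value counts per column, done before modeling a warehouse table. A stdlib sketch over an invented CSV extract (the clinical-flavored column names are hypothetical):

```python
import csv
import io

# Lightweight profiling over a CSV extract -- the kind of first pass done
# before modeling a table, here with invented subject/dose columns.
raw = io.StringIO("subject,dose\nS01,10\nS02,\nS03,25\n")

def profile(fh):
    """Return per-column row and missing-value counts for a CSV file handle."""
    rows = list(csv.DictReader(fh))
    cols = rows[0].keys()
    return {c: {"rows": len(rows),
                "missing": sum(1 for r in rows if not r[c])}
            for c in cols}

result = profile(raw)
print(result)  # {'subject': {'rows': 3, 'missing': 0}, 'dose': {'rows': 3, 'missing': 1}}
```

Real profiling tools like the Informatica products mentioned above add type inference, value distributions, and cross-column rules, but the row/null summary is the common starting point.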