782 Database Developers jobs in Mumbai
Data Engineer

Posted 9 days ago
Job Description
The Data Engineer will help build and maintain the cloud Data Lake platform leveraging Databricks. Candidates will be expected to contribute to all stages of the data lifecycle, including data ingestion, data modeling, data profiling, data quality, data transformation, data movement, and data curation.
* Architect data systems that are resilient to disruptions and failures
* Ensure high uptime for all data services
* Bring modern technologies and practices into the system to improve reliability and support rapid scaling of the business's data needs
* Scale up our data infrastructure to meet business needs
* Develop production data pipeline patterns
* Provide subject matter expertise and hands-on delivery of data acquisition, curation and consumption pipelines on Azure.
* Stay current with emerging, state-of-the-art computing and cloud-based solutions and technologies.
* Build effective relationships with internal stakeholders
* Familiarity with the technology stack available in the industry for metadata management: Data Governance, Data Quality, MDM, Lineage, Data Catalog, etc.
* Hands-on experience implementing analytics solutions leveraging Python, Spark SQL, Databricks Lakehouse Architecture, Kubernetes, Docker
* All other duties as assigned
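The data profiling and data quality duties above can be illustrated with a minimal sketch. This is plain Python with hypothetical column names and records; in the role described, the same checks would typically run at scale on Spark/Databricks:

```python
# Minimal data-profiling sketch: null rate and distinct count per column.
# Records and column names below are hypothetical examples.

def profile(records, columns):
    """Return {column: {"null_rate": float, "distinct": int}} for a list of dicts."""
    report = {}
    total = len(records)
    for col in columns:
        values = [r.get(col) for r in records]
        nulls = sum(1 for v in values if v is None)
        distinct = len({v for v in values if v is not None})
        report[col] = {
            "null_rate": nulls / total if total else 0.0,
            "distinct": distinct,
        }
    return report

rows = [
    {"customer_id": 1, "city": "Mumbai"},
    {"customer_id": 2, "city": None},
    {"customer_id": 3, "city": "Pune"},
]
print(profile(rows, ["customer_id", "city"]))
```

A profile like this is a common first step in spotting quality gaps before data moves into curated layers.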
**Qualifications**
* Bachelor's degree in Computer Science, Information Technology, Management Information Systems (MIS), Data Science or related field. Applicable years of experience may be substituted for the degree requirement.
* Up to 8 years of experience in software engineering
* Experience with large and complex data projects, preferred
* Experience with large-scale data warehousing architecture and data modeling, preferred
* Worked with Cloud-based architecture such as Azure Cloud, preferred
* Experience working with big data technologies e.g. Snowflake, Redshift, Synapse, Postgres, Airflow, Kafka, Spark, DBT, preferred
* Experience implementing pub/sub and streaming use cases, preferred
* Experience in design reviews, preferred
* Experience influencing a team's technical and business strategy by making insightful contributions to team priorities and approaches, preferred
* Working knowledge of relational databases, preferred
* Expert in SQL and high-level languages such as Python, Java or Scala, preferred
* Demonstrate the ability to analyze large data sets to identify gaps and inconsistencies in ETL pipeline and provide solutions for pipeline reliability and data quality, preferred
* Experience in infrastructure as code / CICD development environment, preferred
* Proven ability to build, manage and foster a team-oriented environment
* Excellent communication (written and oral) and interpersonal skills
* Excellent organizational, multi-tasking, and time-management skills
**Job** Engineering
**Primary Location** India-Maharashtra-Mumbai
**Schedule:** Full-time
**Travel:** No
**Req ID:** 244483
**Job Hire Type** Experienced
Data Engineer

Posted 9 days ago
Job Description
**Are You Ready to Make It Happen at Mondelez International?**
**Join our Mission to Lead the Future of Snacking. Make It With Pride.**
You will provide technical contributions to the data science process. In this role, you are the internally recognized expert in data, building infrastructure and data pipelines/retrieval mechanisms to support our data needs.
**How you will contribute**
You will:
+ Operationalize and automate activities for efficiency and timely production of data visuals
+ Assist in providing accessibility, retrievability, security and protection of data in an ethical manner
+ Search for ways to get new data sources and assess their accuracy
+ Build and maintain the transports/data pipelines and retrieve applicable data sets for specific use cases
+ Understand data and metadata to support consistency of information retrieval, combination, analysis, pattern recognition and interpretation
+ Validate information from multiple sources.
+ Assess issues that might prevent the organization from making maximum use of its information assets
**What you will bring**
A desire to drive your future and accelerate your career and the following experience and knowledge:
+ Extensive experience in data engineering in a large, complex business with multiple systems such as SAP, internal and external data, etc. and experience setting up, testing and maintaining new systems
+ Experience of a wide variety of languages and tools (e.g. script languages) to retrieve, merge and combine data
+ Ability to simplify complex problems and communicate to a broad audience
**In This Role**
As a Senior Data Engineer, you will have the opportunity to design and build scalable, secure, and cost-effective cloud-based data solutions. You will develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes, ensuring data quality and validation processes to maintain data accuracy and integrity. You will ensure efficient data storage and retrieval for optimal performance, and collaborate closely with data teams, product owners, and other stakeholders to stay updated with the latest cloud technologies and best practices.
**Role & Responsibilities:**
+ **Design and Build:** Develop and implement scalable, secure, and cost-effective cloud-based data solutions.
+ **Manage Data Pipelines:** Develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes.
+ **Ensure Data Quality:** Implement data quality and validation processes to ensure data accuracy and integrity.
+ **Optimize Data Storage:** Ensure efficient data storage and retrieval for optimal performance.
+ **Collaborate and Innovate:** Work closely with data teams, product owners, and stay updated with the latest cloud technologies and best practices.
**Technical Requirements:**
+ **Programming:** Python, PySpark, Go/Java
+ **Database:** SQL, PL/SQL
+ **ETL & Integration:** DBT, Databricks + DLT, AecorSoft, Talend, Informatica/Pentaho/Ab-Initio, Fivetran.
+ **Data Warehousing:** SCD, Schema Types, Data Mart.
+ **Visualization:** Databricks Notebook, PowerBI (Optional), Tableau (Optional), Looker.
+ **GCP Cloud Services:** BigQuery, GCS, Cloud Functions, Pub/Sub, Dataflow, Dataproc, Dataplex.
+ **AWS Cloud Services:** S3, Redshift, Lambda, Glue, CloudWatch, EMR, SNS, Kinesis.
+ **Azure Cloud Services:** Azure Datalake Gen2, Azure Databricks, Azure Synapse Analytics, Azure Data Factory, Azure Stream Analytics.
+ **Supporting Technologies:** Graph Database/Neo4j, Erwin, Collibra, Ataccama DQ, Kafka, Airflow.
**Soft Skills:**
+ **Problem-Solving:** The ability to identify and solve complex data-related challenges.
+ **Communication:** Effective communication skills to collaborate with Product Owners, analysts, and stakeholders.
+ **Analytical Thinking:** The capacity to analyze data and draw meaningful insights.
+ **Attention to Detail:** Meticulousness in data preparation and pipeline development.
+ **Adaptability:** The ability to stay updated with emerging technologies and trends in the data engineering field.
Within-country relocation support is available, and for candidates voluntarily moving internationally, some minimal support is offered through our Volunteer International Transfer Policy.
**Business Unit Summary**
**At Mondelez International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about.**
**We have a rich portfolio of strong brands globally and locally, including many household names such as _Oreo_, _belVita_ and _LU_ biscuits; _Cadbury Dairy Milk_, _Milka_ and _Toblerone_ chocolate; _Sour Patch Kids_ candy and _Trident_ gum. We are proud to hold the top position globally in biscuits, chocolate and candy and the second top position in gum.**
**Our 80,000 makers and bakers are located in more than 80 countries and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen - and happen fast.**
Mondelez International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.
**Job Type**
Regular
Data Science
Analytics & Data Science
Join us and Make It An Opportunity!
Mondelez Global LLC is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability, protected Veteran status, sexual orientation, gender identity, gender expression, genetic information, or any other characteristic protected by law. Applicants who require accommodation to participate in the job application process may contact for assistance.
Data Engineer

Posted 9 days ago
Job Description
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward - always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.
**The Role**
As a Data Engineer at Kyndryl, you'll be at the forefront of the data revolution, crafting and shaping data platforms that power our organization's success. This role is not just about code and databases; it's about transforming raw data into actionable insights that drive strategic decisions and innovation.
As a technical professional, you will design, build, and manage the infrastructure and systems that enable organizations to collect, process, store, and analyze large volumes of data. You will be the architect and builder of the data pipelines, ensuring that data is accessible, reliable, and optimized for various uses, including analytics, machine learning, and business intelligence.
In this role, you'll be engineering the backbone of our data infrastructure, ensuring the availability of pristine, refined data sets. With a well-defined methodology, critical thinking, and a rich blend of domain expertise, consulting finesse, and software engineering prowess, you'll be the mastermind of data transformation.
Your journey begins by understanding project objectives and requirements from a business perspective, converting this knowledge into a data puzzle. You'll be delving into the depths of information to uncover quality issues and initial insights, setting the stage for data excellence. But it doesn't stop there. You'll be the architect of data pipelines, using your expertise to cleanse, normalize, and transform raw data into the final dataset-a true data alchemist.
Armed with a keen eye for detail, you'll scrutinize data solutions, ensuring they align with business and technical requirements. Your work isn't just a means to an end; it's the foundation upon which data-driven decisions are made - and your lifecycle management expertise will ensure our data remains fresh and impactful.
**Key Responsibilities:**
+ **Designing and Building Data Pipelines:** Creating robust, scalable, and efficient ETL (Extract, Transform, Load) or ELT (Extract, Load, Transform) pipelines to move data from various sources into data warehouses, data lakes, or other storage systems. Ingest structured, unstructured, streaming, and real-time data.
+ **Data Architecture:** Designing and implementing data models, schemas, and database structures that support business requirements and data analysis needs.
+ **Data Storage and Management:** Selecting and managing appropriate data storage solutions (e.g., relational databases, NoSQL databases, data lakes like HDFS or S3, data warehouses like Snowflake, BigQuery, Redshift).
+ **Data Integration:** Connecting disparate data sources, ensuring data consistency and quality across different systems.
+ **Performance Optimization:** Optimizing data processing systems for speed, efficiency, and scalability, often dealing with large datasets (Big Data).
+ **Data Governance and Security:** Implementing measures for data quality, security, privacy, and compliance with regulations.
+ **Collaboration:** Working closely with Data Scientists, Data Analysts, Business Intelligence Developers, and other stakeholders to understand their data needs and provide them with clean, reliable data.
+ **Automation:** Automating data processes and workflows to reduce manual effort and improve reliability.
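The extract-transform-load flow at the heart of these responsibilities can be sketched minimally. This is a plain-Python illustration with hypothetical fields and an in-memory dict standing in for the real sources and warehouse targets the listing names:

```python
# Minimal ETL sketch: extract from a source, transform, load into a target.
# Source records, field names, and the in-memory "warehouse" are hypothetical.

def extract():
    # In practice this would read from a database, API, or object store.
    return [
        {"order_id": "A1", "amount": "120.50", "country": "in"},
        {"order_id": "A2", "amount": "99.00", "country": "IN"},
    ]

def transform(rows):
    # Cleanse and normalize: cast types, standardize country codes.
    return [
        {
            "order_id": r["order_id"],
            "amount": float(r["amount"]),
            "country": r["country"].upper(),
        }
        for r in rows
    ]

def load(rows, warehouse):
    # Idempotent load keyed on order_id, so re-runs don't duplicate rows.
    for r in rows:
        warehouse[r["order_id"]] = r
    return warehouse

warehouse = {}
load(transform(extract()), warehouse)
print(len(warehouse))  # 2 rows loaded
```

Keying the load on a business identifier makes the pipeline safe to re-run, one of the reliability properties the automation bullet points at.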
So, if you're a technical enthusiast with a passion for data, we invite you to join us in the exhilarating world of data engineering at Kyndryl. Let's transform data into a compelling story of innovation and growth.
Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.
**Who You Are**
You're good at what you do and possess the required experience to prove it. However, equally as important - you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused - someone who prioritizes customer success in their work. And finally, you're open and borderless - naturally inclusive in how you work with others.
**Required Technical and Professional Expertise**
+ 4 - 6 years of experience as a Data Engineer.
+ Programming Languages: Strong proficiency in languages like Python, Java, Scala
+ Database Management: Expertise in SQL and experience with various database systems (e.g., PostgreSQL, MySQL, SQL Server, Oracle).
+ Big Data Technologies: Experience with frameworks and tools like Apache Spark, NiFi, Kafka, Flink, or similar distributed processing technologies.
+ Cloud Platforms: Proficiency with cloud data services from providers like Microsoft Azure (Azure Data Lake, Azure Synapse Analytics), Microsoft Fabric, Cloudera, etc.
+ Data Warehousing: Understanding of data warehousing concepts, dimensional modelling, and schema design.
+ ETL/ELT Tools: Experience with data integration tools and platforms.
+ Version Control: Familiarity with Git and collaborative development workflows.
**Preferred Technical and Professional Experience**
+ Degree in a scientific discipline, such as Computer Science, Software Engineering, or Information Technology
**Being You**
Diversity is a whole lot more than what we look like or where we come from, it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: Our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you - and everyone next to you - the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.
**What You Can Expect**
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter - wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you, we want you to succeed so that together, we will all succeed.
**Get Referred!**
If you know someone that works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
Kyndryl is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, pregnancy, disability, age, veteran status, or other characteristics. Kyndryl is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
Data Engineer

Posted 9 days ago
Job Description
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward - always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.
**The Role**
Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl? As a Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role? Ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips.
As a Data Engineer at Kyndryl, you'll be at the forefront of the data revolution, crafting and shaping data platforms that power our organization's success. This role is not just about code and databases; it's about transforming raw data into actionable insights that drive strategic decisions and innovation.
An ELK (Elastic, Logstash & Kibana) Data Engineer is responsible for developing, implementing, and maintaining ELK stack-based solutions within an organization. The engineer plays a crucial role in building efficient and effective log processing, indexing, and visualization for monitoring, troubleshooting, and analysis purposes.
In this role, you'll be engineering the backbone of our data infrastructure, ensuring the availability of pristine, refined data sets. With a well-defined methodology, critical thinking, and a rich blend of domain expertise, consulting finesse, and software engineering prowess, you'll be the mastermind of data transformation.
Your journey begins by understanding project objectives and requirements from a business perspective, converting this knowledge into a data puzzle. You'll be delving into the depths of information to uncover quality issues and initial insights, setting the stage for data excellence. But it doesn't stop there. You'll be the architect of data pipelines, using your expertise to cleanse, normalize, and transform raw data into the final dataset-a true data alchemist.
Armed with a keen eye for detail, you'll scrutinize data solutions, ensuring they align with business and technical requirements. Your work isn't just a means to an end; it's the foundation upon which data-driven decisions are made - and your lifecycle management expertise will ensure our data remains fresh and impactful.
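The log processing described for the ELK role can be sketched minimally: parsing a raw log line into structured fields before indexing, roughly what a Logstash grok filter does. The log format and field names below are hypothetical examples:

```python
import re

# Parse an Apache-style access-log line into structured fields, roughly what
# a Logstash grok filter produces before documents are indexed into
# Elasticsearch. The log format and field names are hypothetical.

LOG_PATTERN = re.compile(
    r'(?P<client>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<bytes>\d+)'
)

def parse_line(line):
    m = LOG_PATTERN.match(line)
    if m is None:
        return None  # a real pipeline would route this to a dead-letter index
    doc = m.groupdict()
    doc["status"] = int(doc["status"])   # cast so Kibana can aggregate numerically
    doc["bytes"] = int(doc["bytes"])
    return doc

line = '10.0.0.7 - - [12/May/2024:10:15:32 +0000] "GET /health HTTP/1.1" 200 512'
print(parse_line(line))
```

Casting numeric fields at parse time is what later enables status-code histograms and byte-volume charts in Kibana dashboards.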
So, if you're a technical enthusiast with a passion for data, we invite you to join us in the exhilarating world of data engineering at Kyndryl. Let's transform data into a compelling story of innovation and growth.
Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.
**Who You Are**
You're good at what you do and possess the required experience to prove it. However, equally as important - you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused - someone who prioritizes customer success in their work. And finally, you're open and borderless - naturally inclusive in how you work with others.
Required Skills and Experience
+ BS or MS degree in Computer Science or a related technical field
+ 10+ years overall IT Industry Experience.
+ 5+ years of Python or Java development experience
+ 5+ years of SQL experience (No-SQL experience is a plus)
+ 4+ years of experience with schema design and dimensional data modelling
+ 3+ years of experience with Elastic, Logstash and Kibana
+ Ability to manage and communicate data warehouse plans to internal clients.
+ Experience designing, building, and maintaining data processing systems.
Preferred Skills and Experience
- Experience working with Machine Learning models is a plus.
- Knowledge of cloud platforms (e.g., AWS, Azure, GCP) and containerization technologies (e.g., Docker, Kubernetes) is a plus.
- Elastic Certification is preferable.
**Being You**
Diversity is a whole lot more than what we look like or where we come from, it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: Our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you - and everyone next to you - the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.
**What You Can Expect**
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter - wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you, we want you to succeed so that together, we will all succeed.
**Get Referred!**
If you know someone that works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
Kyndryl is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, pregnancy, disability, age, veteran status, or other characteristics. Kyndryl is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
Data Engineer
Posted today
Job Description
A UK-based hedge fund is looking for a Data Engineer for their Data team to architect and build the tools used to create new data sources and monitor key production systems. This role offers hands-on exposure to the core datasets, and will partner closely with other technology teams, quants, and traders, building systems that guarantee the integrity of our data and avoid trading losses.
Responsibilities:
- Data Quality Tools: Your key deliverable is to develop and maintain a platform which ensures the quality of our data pipelines and alerts stakeholders to critical production issues.
- Library Maintenance: Maintain and improve shared libraries which are used by developers across the data function and facilitate the rapid onboarding of new datasets.
- Cross-team Integrations: Design, build, and document our integrations with other teams' processes, which alert them to relevant issues with our data pipelines.
- Observability & Monitoring: Build out the observability stack (logs, metrics, tracing) to monitor our data assurance system, as well as other tools for the management of our data-pipeline lifecycle.
- Platform Support: Support users in best-practice usage of our tooling, migrate code to the latest standards, and investigate, debug, and resolve issues with our platform.
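The data-quality and alerting platform described above can be sketched in miniature, in the spirit of frameworks like Soda or Great Expectations: named checks run against a batch, and failures become alerts for stakeholders. The check names, thresholds, and batch shape here are hypothetical:

```python
# Minimal data-quality check sketch: run named checks over a batch of records
# and collect alerts for any that fail. Names and thresholds are hypothetical.

def check_not_null(rows, column):
    bad = [r for r in rows if r.get(column) is None]
    return ("not_null:" + column, len(bad) == 0, len(bad))

def check_row_count(rows, minimum):
    return ("row_count>=%d" % minimum, len(rows) >= minimum, len(rows))

def run_checks(rows, checks):
    """Each check returns (name, passed, detail); failures become alerts."""
    alerts = []
    for check in checks:
        name, ok, detail = check(rows)
        if not ok:
            alerts.append({"check": name, "detail": detail})
    return alerts

batch = [{"price": 101.2}, {"price": None}]
alerts = run_checks(batch, [
    lambda r: check_not_null(r, "price"),
    lambda r: check_row_count(r, 1),
])
print(alerts)
```

In a production setting the alert list would be pushed to the downstream teams' channels, which is exactly the cross-team integration the responsibilities describe.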
Experience:
- 5+ years of hands-on experience in data engineering or data-adjacent engineering roles
- Strong software engineering fundamentals and proficiency in Python
- Experience with workflow orchestration frameworks (e.g. Argo, Celery, Airflow)
- Experience designing and implementing RESTful APIs and internal service integrations
- Experience with observability tools (e.g. Prometheus, Grafana, CloudWatch)
- Familiarity with Kubernetes, container orchestration, and cloud infrastructure (AWS preferred)
- Exposure to the front-end, particularly with regards to custom monitoring dashboards, is a plus
- Knowledge of data-quality frameworks (Soda/Great Expectations) is a plus
- Knowledge of ETL/ELT pipelines, and metadata management is a plus
Data Engineer
Posted today
Job Description
About Us:
Celebal Technologies is a leading solutions and services company in the fields of Data Science, Big Data, Enterprise Cloud & Automation. We are at the forefront of leveraging cutting-edge technologies to drive innovation and enhance our business processes. As part of our commitment to staying ahead in the industry, we are seeking a talented and experienced Data & AI Engineer with strong Azure cloud competencies to join our dynamic team.
Job Summary:
We are looking for a highly skilled Azure Data Engineer with a strong background in real-time and batch data ingestion and big data processing, particularly using Kafka and Databricks. The ideal candidate will have a deep understanding of streaming architectures, Medallion data models, and performance optimization techniques in cloud environments. This role requires hands-on technical expertise, including live coding during the interview process.
Key Responsibilities
• Design and implement streaming data pipelines integrating Kafka with Databricks using Structured Streaming.
• Architect and maintain Medallion Architecture with well-defined Bronze, Silver, and Gold layers.
• Implement efficient ingestion using Databricks Autoloader for high-throughput data loads.
• Work with large volumes of structured and unstructured data, ensuring high availability and performance.
• Apply performance tuning techniques such as partitioning, caching, and cluster resource optimization.
• Collaborate with cross-functional teams (data scientists, analysts, business users) to build robust data solutions.
• Establish best practices for code versioning, deployment automation, and data governance.
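The Medallion Architecture named in these responsibilities can be sketched in plain Python: Bronze lands raw events untouched, Silver cleanses and de-duplicates, Gold aggregates for consumption. In the role itself these would be Delta tables populated by Spark Structured Streaming and Autoloader; the event shape here is hypothetical:

```python
# Medallion-architecture sketch: Bronze (raw), Silver (clean), Gold (aggregated).
# Event records below are hypothetical stand-ins for streamed Kafka messages.

raw_events = [
    {"id": "e1", "sku": "chair", "qty": "2"},
    {"id": "e1", "sku": "chair", "qty": "2"},  # duplicate delivery
    {"id": "e2", "sku": "desk", "qty": "1"},
    {"id": "e3", "sku": None, "qty": "5"},     # bad record
]

# Bronze layer: land raw data as-is for replayability and audit.
bronze = list(raw_events)

# Silver layer: drop invalid rows, cast types, de-duplicate on event id.
seen, silver = set(), []
for e in bronze:
    if e["sku"] is None or e["id"] in seen:
        continue
    seen.add(e["id"])
    silver.append({"id": e["id"], "sku": e["sku"], "qty": int(e["qty"])})

# Gold layer: aggregate to a consumption-ready mart (units per SKU).
gold = {}
for e in silver:
    gold[e["sku"]] = gold.get(e["sku"], 0) + e["qty"]

print(gold)  # {'chair': 2, 'desk': 1}
```

Keeping the raw Bronze copy is what makes the Silver and Gold layers safely recomputable when cleansing rules change.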
Required Technical Skills:
• Strong expertise in Azure Databricks and Spark Structured Streaming
• Processing modes (append, update, complete)
• Output modes (append, complete, update)
• Checkpointing and state management
• Experience with Kafka integration for real-time data pipelines
• Deep understanding of Medallion Architecture
• Proficiency with Databricks Autoloader and schema evolution
• Deep understanding of Unity Catalog and Foreign catalog
• Strong knowledge of Spark SQL, Delta Lake, and DataFrames
• Expertise in performance tuning (query optimization, cluster configuration, caching strategies)
• Proven data management strategies
• Strong knowledge of governance and access management
• Strong grasp of data modelling, data warehousing concepts, and Databricks as a platform
• Solid understanding of Window functions
Proven experience in:
• Merge/Upsert logic
• Implementing SCD Type 1 and Type 2
• Handling CDC (Change Data Capture) scenarios
• Industry expertise in at least one of Retail, Telecom, or Energy
• Real-time use case execution
• Data modelling
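The merge/upsert and SCD Type 2 experience asked for above can be illustrated with a small sketch: on a changed attribute, close out the current dimension row and insert a new current version, preserving history. On Databricks this is typically a Delta Lake `MERGE`; the table layout and dates here are hypothetical:

```python
# SCD Type 2 merge sketch in plain Python. A changed attribute closes the
# current version and appends a new one; unchanged rows are left alone.
# Key, attribute, and date values are hypothetical.

def scd2_merge(dim, updates, as_of):
    """dim rows: {key, city, valid_from, valid_to, is_current}."""
    current = {r["key"]: r for r in dim if r["is_current"]}
    for u in updates:
        row = current.get(u["key"])
        if row is not None and row["city"] == u["city"]:
            continue  # no change: nothing to do
        if row is not None:
            row["valid_to"] = as_of   # close out the old version
            row["is_current"] = False
        dim.append({"key": u["key"], "city": u["city"],
                    "valid_from": as_of, "valid_to": None, "is_current": True})
    return dim

dim = [{"key": 1, "city": "Mumbai", "valid_from": "2023-01-01",
        "valid_to": None, "is_current": True}]
scd2_merge(dim, [{"key": 1, "city": "Pune"}, {"key": 2, "city": "Delhi"}],
           "2024-06-01")
print([(r["key"], r["city"], r["is_current"]) for r in dim])
```

SCD Type 1 would instead overwrite the attribute in place, trading history for simplicity; Type 2, as here, keeps every version queryable by validity interval.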
Data Engineer
Posted 6 days ago
Job Description
Responsibilities
This is a CONTRACT TO HIRE on-site role for a Data Engineer at Quilytics in Mumbai. The contract will be for 6 months with an opportunity to convert to a full-time role. As a Data Engineer, you will be responsible for data integration, data modeling, ETL (Extract Transform Load), data warehousing, data analytics, and ensuring data integrity and quality. You will be expected to understand the fundamentals of data flow and orchestration and to design and implement secure pipelines and data warehouses. Maintaining data integrity and quality is of utmost importance.
You will collaborate with the team to design, develop, and maintain data pipelines and data platforms using cloud ecosystems like GCP, Azure, Snowflake etc. You will be responsible for creating and managing the end-to-end data pipeline using custom scripts in Python or R, or third-party tools like Dataflow, Airflow, AWS Glue, Fivetran, Alteryx etc. The data pipelines built will be used for managing various operations from data acquisition and data storage to data transformation and visualization. You will also work closely with cross-functional teams to identify data-driven solutions to business problems and help clients make data-driven decisions.
You will also be expected to help build dashboards or custom reports in Google Sheets or Excel. Basic to mid-level proficiency in creating and editing dashboards on at least one tool is a must.
Qualifications
- 2+ years of experience in using Python to perform Data Engineering, Data Modeling, Data Warehousing, Data Analytics and ETL (Extract Transform Load)
- Familiarity with GUI based ETL tools like Azure data factory, AWS Glue, Fivetran, Talend, Pentaho etc. for data integration and other data operations.
- Strong programming skills in SQL, and/ or R. Python. This is a must-have skill.
- Experience in designing and implementing data pipelines and data platforms in cloud and on-premise systems
- Basic to mid-level proficiency in data visualization on any of the industry-accepted tools like Power BI, Looker Studio or Tableau is a plus.
- Understanding of data integration and data governance principles
- Knowledge of cloud platforms such as Snowflake, AWS or Azure
- Excellent analytical and problem-solving skills and good communication and interpersonal skills
- Bachelor's or Master's degree in Data Science, Computer Science, or a related field
Data Engineer
Posted 6 days ago
Job Description
About Us: As India's fastest-growing D2C brand, we are at the forefront of innovation and transformation in the market. We’re a well-funded, rapidly growing (we recently launched our 100th store) omnichannel D2C brand with a passionate and innovative team.
Job Summary: We are seeking a Data Engineer to help us design, build and maintain our BigQuery data warehouse by performing ETL operations and creating unified data models. You will work across various data sources to create a cohesive data infrastructure that supports our omnichannel D2C strategy.
Why Join Us: Experience the exciting world of India's billion-dollar D2C market. As a well-funded, rapidly growing omnichannel D2C brand, we are committed to changing the way India sleeps and sits. You'll have the opportunity to work with a passionate and innovative team and make a real impact on our success.
Key Responsibilities:
ETL Operations: Design, implement, and manage ETL processes to extract, transform, and load data from various sources into BigQuery.
Data Warehousing: Build and maintain a robust data warehouse in BigQuery, ensuring data integrity, security, and performance.
Data Modeling: Create and manage flat, unified data models using SQL and DBT to support business analytics and reporting needs.
Performance Optimization: Optimize ETL processes and data models to ensure timely data delivery for reporting and analytics.
Collaboration: Work closely with data analysts, product managers, and other stakeholders to understand data requirements and deliver actionable insights.
Documentation: Maintain comprehensive documentation of data workflows, ETL processes, and data models for reference and onboarding.
Troubleshooting: Monitor and troubleshoot data pipeline issues, ensuring timely resolution to minimize disruption to business operations.
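The ETL responsibilities above follow a common extract/transform/load shape. A minimal sketch, using Python's built-in sqlite3 as a local stand-in for BigQuery (the table and column names here are illustrative assumptions, not from the listing):

```python
import sqlite3

def run_etl(conn, source_rows):
    """Extract raw order rows, transform them, and load them into a
    reporting table. sqlite3 stands in for a cloud warehouse here."""
    # Transform: normalise channel names and compute line totals.
    cleaned = [
        (r["order_id"], r["channel"].strip().lower(), r["qty"] * r["unit_price"])
        for r in source_rows
        if r["qty"] > 0  # drop cancelled / zero-quantity lines
    ]
    # Load: create the target model if needed, then insert.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS fact_orders "
        "(order_id TEXT, channel TEXT, total REAL)"
    )
    conn.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", cleaned)
    conn.commit()
    return len(cleaned)
```

In production the transform step would typically live in dbt models on top of BigQuery, with an orchestrator such as Airflow triggering the loads, but the extract/transform/load separation is the same.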
Skills and Qualifications:
- Proficiency in SQL and experience with BigQuery
- Minimum 2 years of experience in data engineering or a similar role
- Experience with data pipeline and ETL tools (e.g., Apache Airflow, Talend, AWS Glue)
- Familiarity with cloud platforms (e.g., AWS, Google Cloud, Azure) and their data services
- Experience with data warehousing solutions (e.g., Amazon Redshift, Google BigQuery, Snowflake)
- Knowledge of data modeling, data architecture, and data governance best practices.
- Excellent problem-solving skills and attention to detail
- Knowledge of DBT (Data Build Tool) for data transformation
- Self-motivated, proactive, and highly accountable
- Excellent communication skills to effectively convey technical concepts and solutions
Bonus: prior experience in the E-commerce or D2C space
Data Engineer
Posted 6 days ago
Job Description
Data Engineer
We are hiring a Data Engineer for Metro Brands Ltd.
Workstyle: Work from Office
Work location: Kurla, Mumbai
Summary: Responsible for building and maintaining the data pipelines that enable our data analytics and machine learning workflows.
Key Responsibilities :
- Develop, test, and maintain scalable data pipelines for batch and real-time data processing.
- Implement data extraction, transformation, and loading (ETL) processes.
- Work with AWS Glue, S3, Athena, Lambda, Airflow, and other data processing frameworks.
- Optimize data workflows and ensure data quality and consistency.
- Collaborate with data scientists and analysts to understand data needs and requirements.
Required Skills and Qualifications :
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- 3+ years of experience in data engineering.
- Proficiency in SQL and experience with relational databases.
- Experience with dbt (data build tool) and cloud data warehouses like Snowflake.
- Experience with AWS services like S3, Glue, and Redshift.
- Strong programming skills in Python with Spark.
- Familiarity with workflow orchestration tools like Apache Airflow.
- Experience with cloud data warehousing solutions like Snowflake.
Data Engineer
Posted 6 days ago
Job Description
Position: Data Engineer
Location - Mumbai
Total Experience: 8+ years
Domain/Vertical: Insurance and/or Finance (preferred)
Educational Qualifications: B.Sc./M.Sc./BCA/MCA in IT or Computer Science
Relevant Experience
• 7+ years of experience in Data Engineering.
• Strong programming skills in both Python and SQL, along with Power BI.
• Experience with infrastructure as code, ideally Terraform and Terragrunt.
• Experience with Git, GitLab, and Docker for building and deploying.
Technical Mandatory Skills:
• Python
• SQL , Power BI
• AWS
• Experience with infrastructure as code, ideally Terraform and Terragrunt.
• Version Control – Git
• Build & Deployment – GitLab or similar using Docker
• Good database design skills and an understanding of various data modelling techniques and approaches.
• An understanding of & exposure to different database technologies.
• Ability to engineer metadata-driven approaches for sustainability and scalability.
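The metadata-driven requirement above usually means pipelines configured from table definitions rather than hand-coded per source. A hedged sketch, where the table names, columns, and `incremental_key` convention are all hypothetical examples:

```python
# Hypothetical metadata catalog: each entry describes one source table,
# and a single generic function builds the extract query for any entry.
PIPELINE_METADATA = [
    {"source": "policies", "target": "stg_policies",
     "columns": ["policy_id", "premium"], "incremental_key": "updated_at"},
    {"source": "claims", "target": "stg_claims",
     "columns": ["claim_id", "amount"], "incremental_key": None},
]

def build_extract_sql(meta, watermark=None):
    """Generate the extract statement for one metadata entry.

    A full-refresh table (no incremental_key) gets a plain SELECT; an
    incremental table is filtered past the last-loaded watermark.
    """
    sql = f"SELECT {', '.join(meta['columns'])} FROM {meta['source']}"
    if meta["incremental_key"] and watermark is not None:
        sql += f" WHERE {meta['incremental_key']} > '{watermark}'"
    return sql
```

Adding a new source then means adding one metadata entry instead of writing a new pipeline, which is what makes the approach sustainable and scalable.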
Please contact: Saanvi Gandhi, 77108 44668 (WhatsApp)