1248 Database Developers jobs in Mumbai
Data Engineer

Posted 10 days ago
Job Description
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.
We are currently seeking a Data Engineer to join our team in Mumbai, Mahārāshtra (IN-MH), India (IN).
Key Responsibilities:
· Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack.
· Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure.
· Demonstrate proficiency in coding skills, utilizing languages such as Python, Java, and Scala to efficiently move solutions into production while prioritizing performance, security, scalability, and robust data integrations.
· Collaborate seamlessly across diverse technical stacks, including Databricks, Snowflake, and Azure.
· Develop and deliver detailed presentations to effectively communicate complex technical concepts.
· Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, logical system views, etc.
· Adhere to Agile practices throughout the solution development process.
· Design, build, and deploy databases and data stores to support organizational requirements.
Basic Qualifications:
7+ years' experience with Azure, Snowflake, Databricks, and Python
**About NTT DATA**
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.
Whenever possible, we hire locally to NTT DATA offices or client sites. This ensures we can provide timely and effective support tailored to each client's needs. While many positions offer remote or hybrid work options, these arrangements are subject to change based on client requirements. For employees near an NTT DATA office or client site, in-office attendance may be required for meetings or events, depending on business needs. At NTT DATA, we are committed to staying flexible and meeting the evolving needs of both our clients and employees. NTT DATA recruiters will never ask for payment or banking information and will only use @nttdata.com and @talent.nttdataservices.com email addresses. If you are requested to provide payment or disclose banking information, please submit a contact us form.
**_NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status._**
Data Engineer

Posted 10 days ago
Job Description
**Are You Ready to Make It Happen at Mondelēz International?**
**Join our Mission to Lead the Future of Snacking. Make It With Pride.**
You will provide technical contributions to the data science process. In this role, you are the internally recognized expert in data, building infrastructure and data pipelines/retrieval mechanisms to support our data needs.
**How you will contribute**
You will:
+ Operationalize and automate activities for efficiency and timely production of data visuals
+ Assist in providing accessibility, retrievability, security and protection of data in an ethical manner
+ Search for ways to get new data sources and assess their accuracy
+ Build and maintain the transports/data pipelines and retrieve applicable data sets for specific use cases
+ Understand data and metadata to support consistency of information retrieval, combination, analysis, pattern recognition and interpretation
+ Validate information from multiple sources
+ Assess issues that might prevent the organization from making maximum use of its information assets
**What you will bring**
A desire to drive your future and accelerate your career and the following experience and knowledge:
+ Extensive experience in data engineering in a large, complex business with multiple systems (such as SAP and internal and external data sources), plus experience setting up, testing and maintaining new systems
+ Experience of a wide variety of languages and tools (e.g. script languages) to retrieve, merge and combine data
+ Ability to simplify complex problems and communicate to a broad audience
**In This Role**
As a DaaS Data Engineer, you will have the opportunity to design and build scalable, secure, and cost-effective cloud-based data solutions. You will develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes, ensuring data quality and validation processes to maintain data accuracy and integrity. You will ensure efficient data storage and retrieval for optimal performance, and collaborate closely with data teams, product owners, and other stakeholders to stay updated with the latest cloud technologies and best practices.
**Role & Responsibilities:**
+ **Design and Build:** Develop and implement scalable, secure, and cost-effective cloud-based data solutions.
+ **Manage Data Pipelines:** Develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes.
+ **Ensure Data Quality:** Implement data quality and validation processes to ensure data accuracy and integrity.
+ **Optimize Data Storage:** Ensure efficient data storage and retrieval for optimal performance.
+ **Collaborate and Innovate:** Work closely with data teams, product owners, and stay updated with the latest cloud technologies and best practices.
**Technical Requirements:**
+ **Programming:** Python
+ **Database:** SQL, PL/SQL, PostgreSQL, BigQuery, stored procedures/routines.
+ **ETL & Integration:** AecorSoft, Talend, dbt, Databricks (optional), Fivetran.
+ **Data Warehousing:** SCD, schema types, data marts.
+ **Visualization:** Power BI (optional), Tableau (optional), Looker.
+ **GCP Cloud Services:** BigQuery, GCS.
+ **Supply Chain:** IMS and shipment functional knowledge is good to have.
+ **Supporting Technologies:** Erwin, Collibra, Data Governance, Airflow.
**Soft Skills:**
+ **Problem-Solving:** The ability to identify and solve complex data-related challenges.
+ **Communication:** Effective communication skills to collaborate with Product Owners, analysts, and stakeholders.
+ **Analytical Thinking:** The capacity to analyze data and draw meaningful insights.
+ **Attention to Detail:** Meticulousness in data preparation and pipeline development.
+ **Adaptability:** The ability to stay updated with emerging technologies and trends in the data engineering field.
Within-country relocation support is available; for candidates voluntarily moving internationally, some minimal support is offered through our Volunteer International Transfer Policy.
**Business Unit Summary**
**At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about.**
**We have a rich portfolio of strong brands globally and locally including many household names such as** **_Oreo_** **,** **_belVita_** **and** **_LU_** **biscuits;** **_Cadbury Dairy Milk_** **,** **_Milka_** **and** **_Toblerone_** **chocolate;** **_Sour Patch Kids_** **candy and** **_Trident_** **gum. We are proud to hold the top position globally in biscuits, chocolate and candy and the second top position in gum.**
**Our 80,000 makers and bakers are located in more** **than 80 countries** **and we sell our products in** **over 150 countries** **around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen-and happen fast.**
Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.
**Job Type**
Regular
Data Science
Analytics & Data Science
Join us and Make It An Opportunity!
Mondelez Global LLC is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability, protected Veteran status, sexual orientation, gender identity, gender expression, genetic information, or any other characteristic protected by law. Applicants who require accommodation to participate in the job application process may contact us for assistance.
Data Engineer
Posted 1 day ago
Job Description
As a Data Engineer, you will own the end-to-end lifecycle of our data infrastructure. You will design and implement robust, scalable data pipelines and architect modern data solutions using a best-in-class technology stack. Your work will transform raw, messy data into clean, reliable, and actionable data products that power decision-making across the business. You'll collaborate cross-functionally with product managers, data analysts, data scientists, and software engineers to understand data needs and deliver high-performance data solutions. Your impact will be measured by how effectively data is delivered, modeled, and leveraged to drive business outcomes.
Key Responsibilities:
● Architect & Build: Design, implement, and manage a cloud-based data platform using a modern ELT (Extract, Load, Transform) approach.
● Data Ingestion: Develop and maintain robust data ingestion pipelines from a variety of sources, including operational databases (MongoDB, RDS), real-time IoT streams, and third-party APIs, using services like AWS Kinesis/Lambda or Azure Event Hubs/Functions.
● Data Lake Management: Build and manage a scalable and cost-effective data lake on AWS S3 or Azure Data Lake Storage (ADLS Gen2), using open table formats like Apache Iceberg or Delta Lake.
● Data Transformation: Develop, test, and maintain complex data transformation models using dbt. Champion a software engineering mindset by applying principles of version control (Git), CI/CD, and automated testing to all data logic.
● Orchestration: Implement and manage data pipeline orchestration using modern tools like Dagster, Apache Airflow, or Azure Data Factory.
● Data Quality & Governance: Establish and enforce data quality standards. Implement automated testing and monitoring to ensure the reliability and integrity of all data assets.
● Performance & Cost Optimization: Continuously monitor and optimize the performance and cost of the data platform, ensuring our serverless query engines and storage layers are used efficiently.
● Collaboration: Work closely with data analysts and business stakeholders to understand their needs, model data effectively, and deliver datasets that power our BI tools (Metabase, Power BI).
Required Skills & Experience (Must-Haves):
● 3+ years of professional experience in a data engineering role.
● Expert-level proficiency in SQL and the ability to write complex, highly performant queries.
● Proficiency with Python-based data-cleaning packages and tools; experience with Python is a must.
● Hands-on experience building data solutions on a major cloud provider (AWS or Azure), utilizing core services like AWS S3/Glue/Athena or Azure ADLS/Data Factory/Synapse.
● Proven experience building and maintaining data pipelines in Python.
● Experience with NoSQL databases like MongoDB, including an understanding of its data modeling, aggregation framework, and query patterns.
● Deep understanding of data warehousing concepts, including dimensional modeling, star/snowflake schemas, and data modeling best practices.
● Hands-on experience with modern data transformation tools, specifically dbt.
● Familiarity with data orchestration tools like Apache Airflow, Dagster, or Prefect.
● Proficiency with Git and experience working with CI/CD pipelines for data projects.
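The SQL proficiency listed above often comes down to patterns like window-function deduplication of re-ingested records. A minimal sketch, using Python's built-in sqlite3 as a stand-in for a warehouse engine (the table and column names are illustrative, not from the posting):

```python
import sqlite3

# In-memory database standing in for a warehouse; schema is illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INT, customer TEXT, loaded_at TEXT);
    INSERT INTO raw_orders VALUES
        (1, 'acme',   '2024-01-01'),
        (1, 'acme',   '2024-01-02'),  -- later reload of the same order
        (2, 'globex', '2024-01-01');
""")

# Keep only the most recently loaded row per order_id -- a common
# dedup step when a pipeline re-ingests source data.
rows = conn.execute("""
    SELECT order_id, customer, loaded_at FROM (
        SELECT *, ROW_NUMBER() OVER (
            PARTITION BY order_id ORDER BY loaded_at DESC
        ) AS rn
        FROM raw_orders
    ) WHERE rn = 1
    ORDER BY order_id
""").fetchall()
print(rows)  # one row per order_id, latest load wins
```

The same `ROW_NUMBER() OVER (PARTITION BY ...)` pattern carries over to Snowflake, BigQuery, and Postgres with minimal changes.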
Preferred Skills & Experience (Nice-to-Haves):
● Experience with real-time data streaming technologies, specifically AWS Kinesis or Azure Event Hubs.
● Experience with data cataloging and governance tools (e.g., OpenMetadata, DataHub, Microsoft Purview).
● Knowledge of infrastructure-as-code tools like Terraform or CloudFormation.
● Experience with containerization technologies (Docker, Kubernetes).
Data Engineer
Posted 1 day ago
Job Description
About the Company
We are one of the leading niche IT services companies working with Fortune 500 GCCs in India. In the last 13 years, more than 2,000 experienced professionals have chosen to work with us. With operations in India, the Netherlands, and the USA, we have a global footprint that can provide suitable opportunities and growth paths.
Position:
Snowflake Data Engineer (Snowflake Developer)
Data Engineer with 3–4 years of experience in Snowflake and SQL
About the Role
Key Responsibilities:
- Implement data integration solutions using Snowflake Development, including data ingestion from various sources.
- Write efficient SQL queries to analyze large datasets and improve performance.
- Monitor and troubleshoot Snowflake performance issues, offering solutions and enhancements.
Qualifications:
- 3–6 years of experience in SQL and data engineering.
- Hands-on experience with Snowflake Development.
Required Skills
- Mandatory: experience with Snowflake.
- Proficiency in SQL.
- Familiarity with Python and Azure Data Factory.
Preferred Skills
- Experience in data integration solutions.
- Ability to analyze large datasets.
Notice period: immediate to 15 days only.
Interested? Kindly find contact details below:
Thanks,
Dattathreya N
Technical Recruiter
Contact:
Mail:
linkedin.com/company/andor-tech/
#1/2, Kalyanamantapa Rd, Jakkasandra, Koramangala, Bengaluru
India | USA | Netherlands | UAE
Data Engineer
Posted 1 day ago
Job Description
Who is this for
If solving business challenges drives you, this is the place to be. Fornax is a team of cross-functional individuals who solve critical business challenges using core concepts of analytics and critical thinking.
We are seeking a skilled Analytics Engineer who has worked in the Retail/D2C domain. The ideal candidate will possess a strong blend of functional and technical expertise, particularly in Google Analytics, Google Ads, Facebook Ads, and Amazon Ads, along with a good understanding of the entire D2C/e-commerce marketing value chain.
The Analytics Engineer will play a critical role in designing, developing, and maintaining our data infrastructure. This role involves working closely with data scientists, analysts, and business stakeholders to ensure data integrity, build robust data pipelines, and deliver insightful analytics solutions. The ideal candidate has a strong background in data engineering, analytics, and a keen eye for detail.
Key Responsibilities:
Stakeholder Management & Collaboration (10%):
- Work with data scientists, analysts, and stakeholders to understand data needs.
- Analyze and interpret data to identify trends, opportunities, and areas for improvement.
- Analyze business needs of stakeholders and customers.
- Gather customer requirements via workshop questionnaires, surveys, site visits, workflow storyboards, use cases, and scenario mappings.
- Translate business requirements into functional requirements.
- Create extensive project scope documentation to keep project and client teams on the same page.
- Collaborate with cross-functional teams to integrate analytics insights into the client’s operational processes.
Data Modeling (50%):
- Develop and maintain data models to support analytics and reporting.
- Design dimensional models and star schemas to effectively organize retail data including sales, inventory, customer behavior, and product performance metrics
- Collaborate with business stakeholders to translate analytical requirements into efficient data structures and ensure models align with reporting needs
- Document data lineage, business rules, and model specifications to ensure knowledge transfer and maintain data governance standards
Data Quality Management & Governance (20%):
- Develop and implement comprehensive data quality frameworks and monitoring systems to ensure accuracy, completeness, and consistency of retail and e-commerce data
- Lead root cause analysis of data quality issues and implement preventive measures to minimize future occurrences
- Implement data cleansing and enrichment processes to improve the overall quality of historical and incoming data
- Provide training and support to team members on data quality best practices and validation procedures
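An automated quality check of the kind described in this section can be sketched in a few lines of Python; the field names and rules here are illustrative assumptions, not part of the posting:

```python
# Minimal batch-level data-quality check: validate completeness and
# value ranges before records move downstream. Field names are illustrative.
def check_quality(records):
    """Return a list of human-readable issues found in the batch."""
    issues = []
    for i, rec in enumerate(records):
        # Completeness: every record needs an identifier.
        if rec.get("order_id") is None:
            issues.append(f"row {i}: missing order_id")
        # Validity: amounts must be non-negative numbers.
        if not isinstance(rec.get("amount"), (int, float)) or rec["amount"] < 0:
            issues.append(f"row {i}: invalid amount {rec.get('amount')!r}")
    return issues

batch = [
    {"order_id": 1, "amount": 19.99},
    {"order_id": None, "amount": 5.0},
    {"order_id": 3, "amount": -2},
]
print(check_quality(batch))
```

In practice checks like these would run inside the pipeline (e.g. as dbt tests or orchestrator tasks) so bad batches are flagged before they reach reporting.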
Project and Team Management (20%):
- Lead end-to-end analytics projects from initiation to delivery, ensuring adherence to timelines, budgets, and quality standards
- Coordinate cross-functional project teams including data engineers, analysts, and business stakeholders to achieve project objectives
- Develop detailed project plans, resource allocation strategies, and risk mitigation plans for analytics initiatives
- Mentor junior team members and provide technical guidance on analytics engineering best practices and methodologies
- Facilitate project status meetings, manage deliverable timelines, and communicate progress updates to senior leadership and clients
- Establish and maintain project documentation standards, including technical specifications, testing protocols, and deployment procedures
- Identify and resolve project bottlenecks, resource constraints, and technical challenges to ensure successful project completion
- Drive continuous improvement initiatives within the team by implementing agile methodologies and optimizing workflow processes
Key Qualifications
- Education: Bachelor’s degree in Computer Science, Data Science, Engineering, or related field.
- Experience: 2+ years of experience in analytics.
Technical Skills:
- Proficiency in SQL and database technologies
- Core expertise with dbt (data build tool).
- Experience with data pipeline tools (e.g., Apache Airflow).
- Familiarity with cloud platforms (e.g., AWS, Google Cloud).
- Knowledge of Python or R.
- Experience with data visualization tools (e.g., Tableau, Power BI).
Data Engineer
Posted 1 day ago
Job Description
Your potential, unleashed.
India’s impact on the global economy has increased at an exponential rate and Deloitte presents an opportunity to unleash and realise your potential amongst cutting edge leaders, and organisations shaping the future of the region, and indeed, the world beyond.
At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.
Job Summary
We are seeking a skilled and detail-oriented Data Engineer with deep expertise in Azure, SQL Server, and Databricks to design, build, and manage scalable data pipelines and enterprise data solutions. This role will be critical in supporting our analytics, reporting, and data science initiatives by delivering high-quality, reliable, and performant data systems.
Key Responsibilities
- Design, develop, and manage ETL/ELT pipelines using Azure Data Factory (ADF) and Databricks for batch and real-time data processing.
- Integrate data from various structured and unstructured sources including SQL Server, Azure SQL Database, Azure Data Lake Storage (ADLS), and external APIs.
- Build and maintain data models, data marts, and data warehouses using SQL Server and Azure Synapse Analytics.
- Write efficient and optimized SQL queries, stored procedures, views, and triggers in SQL Server.
- Use Databricks (Spark with Python/Scala) to process large datasets for transformation and analytics.
- Ensure data quality, integrity, security, and compliance across the pipeline using data validation, monitoring, and auditing techniques.
- Collaborate with data analysts, data scientists, and business stakeholders to define and deliver data solutions.
- Implement CI/CD pipelines using Azure DevOps.
- Monitor and optimize the performance of data pipelines and queries across Azure and SQL Server environments.
Required Skills & Qualifications:
- Bachelor’s degree in Computer Science, Engineering, Information Systems, or related field.
- 5+ years of experience in data engineering or related roles.
- Proven experience with:
- Azure Data Services: Data Factory, ADLS, Azure SQL, Azure Synapse
- Databricks (Azure implementation), including experience with Spark (Python or Scala).
- Microsoft SQL Server: Writing advanced SQL, stored procedures, and performance tuning.
- Strong understanding of data warehousing concepts, ETL/ELT best practices, and data modelling (star/snowflake schema).
- Familiarity with data governance, RBAC, and data security practices in Azure.
- Experience with CI/CD tools like Azure DevOps and version control with Git.
- Excellent problem-solving skills and the ability to work collaboratively in a fast-paced environment.
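The star-schema modelling mentioned in the qualifications above can be illustrated with a minimal fact/dimension pair, again using Python's built-in sqlite3 as a stand-in engine (table and column names are illustrative, not from the posting):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension: one row per product, holding descriptive attributes.
    CREATE TABLE dim_product (
        product_key INT PRIMARY KEY, name TEXT, category TEXT
    );
    -- Fact: one row per sale, referencing the dimension by surrogate key.
    CREATE TABLE fact_sales (product_key INT, qty INT, amount REAL);
    INSERT INTO dim_product VALUES
        (1, 'widget-a', 'widgets'), (2, 'gadget-b', 'gadgets');
    INSERT INTO fact_sales VALUES (1, 2, 4.0), (1, 1, 2.0), (2, 3, 9.0);
""")

# Typical star-schema query: aggregate the fact table, slice by a
# dimension attribute via a single join on the surrogate key.
rows = conn.execute("""
    SELECT d.category, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product d USING (product_key)
    GROUP BY d.category
    ORDER BY d.category
""").fetchall()
print(rows)
```

A snowflake schema would further normalize the dimension (e.g. splitting category into its own table) at the cost of extra joins.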
How you’ll grow
Connect for impact
Our exceptional team of professionals across the globe are solving some of the world's most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.
Empower to lead
You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.
Inclusion for all
At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters.
Drive your career
At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one size fits all career path, and global, cross-business mobility and up / re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.
Everyone’s welcome… entrust your happiness to us
Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here’s a glimpse of things that are in store for you.
Interview tips
We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable. To help you with your interview, we suggest that you do your research, know some background about the organisation and the business area you’re applying to. Check out recruiting tips from Deloitte professionals.
Data Engineer
Posted 1 day ago
Job Description
We’re Hiring!
Data Engineer
At Envu, we partner with our customers to design world-class, forward-thinking innovations that protect and enhance the health of environments around the world. We offer dedicated services in: Professional Pest Management, Forestry, Ornamentals, Golf, Industrial Vegetation Management, Lawn & Landscape, Mosquito Management, and Range & Pasture.
Envu brings together a broad range of perspectives to look beyond chemistry and dare to explore new paths forward. Guided by our inclusive culture, we embrace change and flexibility, tackling our customers’ toughest challenges proactively, passionately and with an entrepreneurial spirit.
We pursue our ambitions collaboratively because we know that a unified and empowered team is an unstoppable force, allowing us to achieve our vision of healthy environments for everyone, everywhere. Join Us.
Envu is proud to be Great Place to Work Certified in the US, France, and India. (June 2025 - June 2026)
FUNCTION: Global Innovation
LOCATION: Thane, India
TYPE: Permanent
Role Seniority : Managerial
GET TO KNOW YOUR AREA:
- We are seeking a Data Engineer with strong analytical and technical expertise to support our data-driven transformation. You will be responsible for building and maintaining data infrastructure and pipelines, enabling robust analytics and machine learning initiatives across multiple departments
- This role requires hands-on experience with data engineering tools and platforms, a solid foundation in Python and SQL, and ideally, some domain knowledge in chemical processes or related industries. You will work closely with stakeholders in R&D and regulatory teams to enable data accessibility, insight generation, and technical innovation
YOUR MISSION WILL BE TO:
- Assemble large, complex datasets that meet functional and non-functional business requirements
- Design and implement internal process improvements, including infrastructure re-architecture for scalability, optimized data delivery, and automation of manual workflows
- Build infrastructure for efficient data extraction, transformation, and loading (ETL) using AWS and SQL technologies
- Develop and maintain automated data pipelines across sources like SQL Server, LIMS, Veeva, and unstructured text files
- Create analytical tools and dashboards (e.g., Power BI reports, RAG systems) to drive key insights on operational efficiency and customer acquisition
- Prototype solutions for innovation teams using integrated datasets, including ML and generative AI applications
- Support stakeholders with data infrastructure needs and resolve data-related technical issues
- Lead technical projects and coordinate with external vendors/consultants to deliver robust data solutions
- Act as a liaison between internal subject matter experts and technical partners to ensure effective knowledge transfer and implementation
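The pipeline work described above (extracting from sources such as SQL Server, LIMS, or flat files, transforming records, and loading them for analytics) can be sketched minimally in Python. This is an illustrative sketch only: the table names, column names, and unit-normalization rule below are hypothetical, and a real deployment would target the platforms named in the listing rather than SQLite.

```python
import sqlite3

# Minimal ETL sketch: extract rows from a raw table, normalize them,
# and load the result into a curated table consumed by dashboards.
# All table/column names here are hypothetical examples.

def run_etl(conn: sqlite3.Connection) -> int:
    # Extract: pull raw sample records (a stand-in for SQL Server or LIMS)
    rows = conn.execute("SELECT sample_id, result, unit FROM raw_results").fetchall()

    # Transform: convert grams to milligrams and drop incomplete records
    cleaned = []
    for sample_id, result, unit in rows:
        if result is None:
            continue
        value_mg = result * 1000 if unit == "g" else result
        cleaned.append((sample_id, value_mg))

    # Load: write into the curated analytics table
    conn.executemany("INSERT INTO curated_results VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_results (sample_id TEXT, result REAL, unit TEXT)")
conn.execute("CREATE TABLE curated_results (sample_id TEXT, value_mg REAL)")
conn.executemany(
    "INSERT INTO raw_results VALUES (?, ?, ?)",
    [("S1", 0.5, "g"), ("S2", 250.0, "mg"), ("S3", None, "mg")],
)
loaded = run_etl(conn)
print(loaded)  # 2
```

In practice the same extract/transform/load shape carries over to the AWS and SQL tooling the role mentions; only the connectors and scale change.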
ARE YOU READY FOR THE ROLE?
Main requirements:
- Bachelor's degree in a STEM field; a background or experience in the life-sciences/chemical industry is strongly preferred
- Python: Proficient in data handling and processing libraries (e.g., pandas, NumPy); experience packaging code for reuse
- Data Collection: Experience using APIs or web scraping for data acquisition
- Cloud Computing: Experience with Azure preferred; AWS or other cluster computing environments acceptable
- SQL: Ability to write and optimize complex SQL queries
- Working knowledge of Power BI, Tableau, or a similar visualization tool
- Passionate about data and eager to learn and discover
- Willing to work in a highly ambiguous environment, with constantly evolving business needs, and not deterred by complexity
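The data-collection requirement above (using APIs for data acquisition) often reduces to paging through a JSON endpoint until no pages remain. The sketch below parses canned responses instead of making live HTTP calls; the payload shape (`items`/`next` keys) and page structure are assumptions for illustration, not a real API.

```python
import json

# Paging sketch: `fetch` stands in for a real HTTP GET (e.g. via urllib
# or requests). The endpoint, payload keys, and data are hypothetical.

PAGES = {
    0: {"items": [{"id": 1}, {"id": 2}], "next": 1},
    1: {"items": [{"id": 3}], "next": None},
}

def fetch(page: int) -> str:
    # A real implementation would perform an HTTP request here
    return json.dumps(PAGES[page])

def collect_all(start: int = 0) -> list:
    """Follow `next` pointers until the API signals the last page."""
    records, page = [], start
    while page is not None:
        data = json.loads(fetch(page))
        records.extend(data["items"])
        page = data["next"]
    return records

print(len(collect_all()))  # 3
```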
Environmental Science U.S. LLC is an Equal Opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.
By applying for this position, you agree that your personal data are going to be processed and recorded by Envu for recruitment purposes only. For candidates who are not selected for this position, personal data will be kept for a period of two years and then permanently deleted.
We will be in touch soon to let you know the next steps!
Data Engineer
Posted 1 day ago
Job Description
We are seeking a skilled and motivated Data Engineer to join our team. The ideal candidate will be responsible for designing, implementing, and maintaining our data infrastructure to support our B2B intelligence platform.
Responsibilities
- Design, build, and maintain scalable data pipelines for collecting, processing, and storing large volumes of business data
- Develop ETL processes to integrate data from various sources, including web scraping, APIs, and third-party data providers
- Implement data quality checks and monitoring systems to ensure data accuracy and integrity
- Optimize data storage and retrieval processes for high performance and scalability
- Collaborate with data scientists to implement machine learning models in production environments
- Work with the backend team to design and implement APIs for data access
- Implement data security and privacy measures to protect sensitive information
- Stay up to date with the latest big data technologies and best practices
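The data quality checks mentioned above can start as simply as validating each record against a set of rules before it enters the warehouse. This is a minimal sketch under assumed field names and rules; a production system would typically externalize the rules and report violations to a monitoring pipeline.

```python
# Minimal data-quality gate: collect rule violations per record before
# loading. The schema (company_id, name, country, employee_count) and
# the rules themselves are illustrative assumptions, not a real spec.

REQUIRED_FIELDS = {"company_id", "name", "country"}

def validate(record: dict) -> list:
    """Return a list of quality-rule violations for one record."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append("missing fields: %s" % sorted(missing))
    if "employee_count" in record and record["employee_count"] < 0:
        errors.append("employee_count must be non-negative")
    return errors

good = {"company_id": "C1", "name": "Acme", "country": "IN", "employee_count": 40}
bad = {"company_id": "C2", "employee_count": -5}

print(validate(good))  # []
print(len(validate(bad)))  # 2
```

Records that pass (an empty violation list) proceed to loading; the rest are quarantined for review, which is one common way to keep bad rows out of downstream dashboards and models.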
Requirements
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
- 4+ years of experience in data engineering roles
- Strong programming skills in Python, Scala, and/or Java
- Expertise in SQL and experience with NoSQL databases (e.g., MongoDB, Cassandra)
- Proficiency with big data technologies such as Apache Spark, Hadoop, and Kafka
- Experience with cloud platforms (AWS, GCP, or Azure) and their data services
- Familiarity with data warehousing concepts and ETL processes
- Experience with data warehousing solutions (e.g., Amazon Redshift, Snowflake)
- Knowledge of data modelling, data architecture, and data pipeline design
- Experience with version control systems (e.g., Git) and CI/CD practices
- Excellent problem-solving skills and attention to detail
Preferred Qualifications
- Experience in the B2B data or sales intelligence industry
- Familiarity with web scraping techniques and tools
- Knowledge of data privacy regulations (e.g., GDPR, CCPA)
- Experience with real-time data processing and streaming architectures