24,153 Data Engineer jobs in India

Neo4j Engineer

Srikakulam, Andhra Pradesh Bluetick Consultants Inc.

Job Description

Role: Neo4j Engineer

Overall IT Experience: 7+ years

Relevant experience: (Graph Databases: 4+ years, Neo4j: 2+ years)

Location: Remote


Company Description

Bluetick Consultants is a technology-driven firm that supports hiring remote developers, building technology products, and enabling end-to-end digital transformation. With previous experience in top technology companies such as Amazon, Microsoft, and Craftsvilla, we understand the needs of our clients and provide customized solutions. Our team has expertise in emerging technologies, backend and frontend development, cloud development, and mobile technologies. We prioritize staying up-to-date with the latest technological advances to create a long-term impact and grow together with our clients.


Key Responsibilities 

• Graph Database Architecture: Design and implement Neo4j graph database schemas optimized for fund administration data relationships and AI-powered queries 

• Knowledge Graph Development: Build comprehensive knowledge graphs connecting entities like funds, investors, companies, transactions, legal documents, and market data 

• Graph-AI Integration: Integrate Neo4j with AI/ML pipelines, particularly for enhanced RAG (Retrieval-Augmented Generation) systems and semantic search capabilities 

• Complex Relationship Modeling: Model intricate relationships between Limited Partners, General Partners, fund structures, investment flows, and regulatory requirements 

• Query Optimization: Develop high-performance Cypher queries for real-time analytics, relationship discovery, and pattern recognition 

• Data Pipeline Integration: Build ETL processes to populate and maintain graph databases from various data sources including FundPanel.io, legal documents, and external market data, using domain-specific ontologies 

• Graph Analytics: Implement graph algorithms for fraud detection, risk assessment, relationship scoring, and investment opportunity identification 

• Performance Tuning: Optimize graph database performance for concurrent users and complex analytical queries 

• Documentation & Standards: Establish graph modelling standards, query optimization guidelines, and comprehensive technical documentation 
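The schema-design and Cypher work described above can be sketched in miniature. The labels, properties, and relationship types below (`Fund`, `Investor`, `COMMITTED_TO`) are illustrative assumptions, not FundPanel.io's actual model:

```python
# A minimal sketch of a fund-administration graph model. Labels and
# properties here are hypothetical; the real schema would be designed
# against the platform's actual fund/investor data.

# Parameterized upsert for one investor commitment (idempotent via MERGE).
FUND_SCHEMA_CYPHER = """
MERGE (f:Fund {fund_id: $fund_id})
  ON CREATE SET f.name = $fund_name
MERGE (i:Investor {investor_id: $investor_id})
MERGE (i)-[:COMMITTED_TO {amount: $amount, currency: $currency}]->(f)
"""

# Relationship discovery: pairs of investors exposed to the same fund.
CO_INVESTOR_QUERY = """
MATCH (a:Investor)-[:COMMITTED_TO]->(f:Fund)<-[:COMMITTED_TO]-(b:Investor)
WHERE a.investor_id < b.investor_id
RETURN a.investor_id AS lp_a, b.investor_id AS lp_b, f.name AS fund
"""

def load_commitment(session, **params):
    """Run the upsert inside an open session from the official Neo4j Python driver."""
    session.run(FUND_SCHEMA_CYPHER, **params)
```

Keeping writes idempotent with MERGE (rather than CREATE) is what makes repeated ETL runs safe against duplicate nodes and relationships.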


Key Use Cases You'll Enable 

• Semantic Search Enhancement: Create knowledge graphs that improve AI search accuracy by understanding entity relationships and context 

• Investment Network Analysis: Map complex relationships between investors, funds, portfolio companies, and market segments 

• Compliance Graph Modelling: Model regulatory relationships and fund terms to support automated auditing and compliance validation 

• Customer Relationship Intelligence: Build relationship graphs for customer relations monitoring and expansion opportunity identification 

• Predictive Modelling Support: Provide graph-based features for investment prediction and risk assessment models 

• Document Relationship Mapping: Connect legal documents, contracts, and agreements through entity and relationship extraction 


Required Qualifications 

• Bachelor's degree in Computer Science, Data Engineering, or related field 

• 7+ years of overall IT Experience 

• 4+ years of experience with graph databases, with 2+ years specifically in Neo4j 

• Strong background in data modelling, particularly for complex relationship structures 

• Experience with financial services data and regulatory requirements preferred 

• Proven experience integrating graph databases with AI/ML systems 

• Understanding of knowledge graph concepts and semantic technologies 

• Experience with high-volume, production-scale graph database implementations 


Technology Skills 

• Graph Databases: Neo4j (primary), Cypher query language, APOC procedures, Neo4j Graph Data Science library 

• Programming: Python, Java, or Scala for graph data processing and integration 

• AI Integration: Experience with graph-enhanced RAG systems, vector embeddings in graph context, GraphRAG implementations 

• Data Processing: ETL pipelines, data transformation, real-time data streaming (Kafka, Apache Spark) 

• Cloud Platforms: Neo4j Aura, Azure integration, containerized deployments 

• APIs: Neo4j drivers, REST APIs, GraphQL integration 

• Analytics: Graph algorithms (PageRank, community detection, shortest path, centrality measures) 

• Monitoring: Neo4j monitoring tools, performance profiling, query optimization 

• Integration: Elasticsearch integration, vector database connections, multi-modal data handling 
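As a rough illustration of the graph algorithms listed above, here is a toy PageRank over a plain-Python adjacency dict. The investor/fund network is hypothetical, and a production system would run this through the Neo4j Graph Data Science library rather than hand-rolled code:

```python
def pagerank(graph, damping=0.85, iterations=50):
    """Toy PageRank over an adjacency dict {node: [out-neighbours]}."""
    nodes = list(graph)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for n, outs in graph.items():
            if not outs:  # dangling node: spread its rank evenly
                for m in nodes:
                    new[m] += damping * rank[n] / len(nodes)
            else:
                for m in outs:
                    new[m] += damping * rank[n] / len(outs)
        rank = new
    return rank

# Hypothetical network: edges point from investors to the funds they back.
scores = pagerank({
    "lp_a": ["fund_1"],
    "lp_b": ["fund_1", "fund_2"],
    "fund_1": [],
    "fund_2": [],
})
# fund_1, backed by both LPs, ends up with the highest relationship score.
```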


Specific Technical Requirements 

• Knowledge Graph Construction: Entity resolution, relationship extraction, ontology modelling 

• Cypher Expertise: Advanced Cypher queries, stored procedures, custom functions 

• Scalability: Clustering, sharding, horizontal scaling strategies 

• Security: Graph-level security, role-based access control, data encryption 

• Version Control: Graph schema versioning, migration strategies  

• Backup & Recovery: Graph database backup strategies, disaster recovery planning 


Industry Context Understanding 

• Fund Administration: Understanding of fund structures, capital calls, distributions, and investor relationships 

• Financial Compliance: Knowledge of regulatory requirements and audit trails in financial services 

• Investment Workflows: Understanding of due diligence processes, portfolio management, and investor reporting 

• Legal Document Structures: Familiarity with LPA documents, subscription agreements, and fund formation documents 


Collaboration Requirements 

• AI/ML Team: Work closely with GenAI engineers to optimize graph-based AI applications 

• Data Architecture Team: Collaborate on overall data architecture and integration strategies 

• Backend Developers: Integrate graph databases with application APIs and microservices 

• DevOps Team: Ensure proper deployment, monitoring, and maintenance of graph database infrastructure 

• Business Stakeholders: Translate business requirements into effective graph models and queries 


Performance Expectations 

• Query Performance: Ensure sub-second response times for standard relationship queries 

• Scalability: Support 100k+ users with concurrent access to graph data 

• Accuracy: Maintain data consistency and relationship integrity across complex fund structures 

• Availability: Ensure 99.9% uptime for critical graph database services 

• Integration Efficiency: Seamless integration with existing FundPanel.io systems and new AI services 


This role offers the opportunity to work at the intersection of advanced graph technology and artificial intelligence, creating innovative solutions that will transform how fund administrators understand and leverage their data relationships.

This advertiser has chosen not to accept applicants from your region.

Job No Longer Available

This position is no longer listed on WhatJobs. The employer may be reviewing applications, may have filled the role, or may have removed the listing.

However, we have similar jobs available for you below.

Senior Data Engineer / Data Engineer

Kochi, Kerala Invokhr

Posted today

Job Description

Looking for immediate joiners or candidates with up to a 15-day notice period. This is a work-from-home opportunity.

Position: Senior Data Engineer / Data Engineer

Desired Experience: 3-8 years

Salary: Best-in-industry

You will act as a key member of the Data consulting team, working directly with the partners and senior stakeholders of the clients, designing and implementing big data and analytics solutions. Communication and organisation skills are key for this position, along with a problem-solution attitude.

What is in it for you:

Opportunity to work with a world-class team of business consultants and engineers solving some of the most complex business problems by applying data and analytics techniques

Fast-track career growth in a highly entrepreneurial work environment

Best-in-industry remuneration package

Essential Technical Skills:

Technical expertise with emerging Big Data technologies such as Python, Spark, Hadoop, Clojure, Git, SQL and Databricks, and visualization tools such as Tableau and Power BI

Experience with cloud, container and microservice infrastructures

Experience working with divergent data sets that meet the requirements of the Data Science and Data Analytics teams

Hands-on experience with data modelling, query techniques and complexity analysis

Desirable Skills:

Experience/knowledge of working in an agile environment and experience with agile methodologies such as Scrum

Experience of working with development teams and product owners to understand their requirements

Certifications in any of the above areas will be preferred.

Your duties will include:

Develop data solutions within Big Data Azure and/or other cloud environments

Working with divergent data sets that meet the requirements of the Data Science and Data Analytics teams

Build and design data architectures using Azure Data Factory, Databricks, Data Lake, Synapse

Liaising with the CTO, Product Owners and other operations teams to deliver engineering roadmaps showing key items such as upgrades, technical refreshes and new versions

Perform data mapping activities to describe source data, target data and the high-level or detailed transformations that need to occur

Assist the Data Analyst team in developing KPIs and reporting in tools such as Power BI and Tableau

Data integration, transformation and modelling

Maintaining all relevant documentation and knowledge bases

Research and suggest new database products, services and protocols
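The data-mapping duty above (describing source data, target data, and the transformations between them) often produces an artifact like the following. The column names and transforms are purely illustrative:

```python
# A hedged sketch of a source-to-target mapping spec. Each entry pairs a
# source column with its target column and the transform to apply;
# all names here are hypothetical examples, not a real client schema.
MAPPING = {
    "cust_nm": ("customer_name", str.strip),
    "ord_amt": ("order_amount", float),
    "ord_dt":  ("order_date", lambda s: s[:10]),  # keep the ISO date part
}

def apply_mapping(row, mapping=MAPPING):
    """Transform one source record into the target schema."""
    return {target: fn(row[src]) for src, (target, fn) in mapping.items()}

record = apply_mapping({
    "cust_nm": " Acme Ltd ",
    "ord_amt": "120.50",
    "ord_dt": "2024-05-01T09:30:00",
})
```

The same mapping table can drive a PySpark or Azure Data Factory job, keeping the transformation logic documented in one place.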

Essential Personal Traits:

You should be able to work independently and communicate effectively with remote teams.

Timely communication/escalation of issues/dependencies to higher management.

Curiosity to learn and apply emerging technologies to solve business problems


Data Engineer- Lead Data Engineer

Bengaluru, Karnataka Paytm

Posted today

Job Description

Role Overview



We are seeking an experienced Lead Data Engineer to join our Data Engineering team at Paytm, India's leading digital payments and financial services platform. This is a critical role responsible for designing, building, and maintaining large-scale, real-time data streams that process billions of transactions and user interactions daily. Data accuracy and stream reliability are essential to our operations, as data quality issues can result in financial losses and impact customer trust.

As a Lead Data Engineer at Paytm, you will be responsible for building robust data systems that support India's largest digital payments ecosystem. You'll architect and implement reliable, real-time data streaming solutions where precision and data correctness are fundamental requirements. Your work will directly support millions of users across merchant payments, peer-to-peer transfers, bill payments, and financial services, where data accuracy is crucial for maintaining customer confidence and operational excellence.


This role requires expertise in designing fault-tolerant, scalable data architectures that maintain high uptime standards while processing peak transaction loads during festivals and high-traffic events. We place the highest priority on data quality and system reliability, as our customers depend on accurate, timely information for their financial decisions. You'll collaborate with cross-functional teams including data scientists, product managers, and risk engineers to deliver data solutions that enable real-time fraud detection, personalized recommendations, credit scoring, and regulatory compliance reporting.


Key technical challenges include maintaining data consistency across distributed systems with demanding performance requirements, implementing comprehensive data quality frameworks with real-time validation, optimizing query performance on large datasets, and ensuring complete data lineage and governance across multiple business domains. At Paytm, reliable data streams are fundamental to our operations and our commitment to protecting customers' financial security and maintaining India's digital payments infrastructure.


Key Responsibilities


Data Stream Architecture & Development

• Design and implement reliable, scalable data streams handling high-volume transaction data with strong data integrity controls

• Build real-time processing systems using modern data engineering frameworks (Java/Python stack) with excellent performance characteristics

• Develop robust data ingestion systems from multiple sources with built-in redundancy and monitoring capabilities

• Implement comprehensive data quality frameworks, ensuring the 4 C's (Completeness, Consistency, Conformity, and Correctness) so data is reliable enough to support sound business decisions

• Design automated data validation, profiling, and quality monitoring systems with proactive alerting capabilities

Infrastructure & Platform Management

• Manage and optimize distributed data processing platforms with high availability requirements to ensure consistent service delivery

• Design data lake and data warehouse architectures with appropriate partitioning and indexing strategies for optimal query performance

• Implement CI/CD processes for data engineering workflows with comprehensive testing and reliable deployment procedures

• Ensure high availability and disaster recovery for critical data systems to maintain business continuity

Performance & Optimization

• Monitor and optimize streaming performance with a focus on latency reduction and operational efficiency

• Implement efficient data storage strategies including compression, partitioning, and lifecycle management with cost considerations

• Troubleshoot and resolve complex data streaming issues in production environments with effective response protocols

• Conduct proactive capacity planning and performance tuning to support business growth and data volume increases

Collaboration & Leadership

• Work closely with data scientists, analysts, and product teams to understand important data requirements and service level expectations

• Mentor junior data engineers with emphasis on data quality best practices and a customer-focused approach

• Participate in architectural reviews and help establish data engineering standards that prioritize reliability and accuracy

• Document technical designs, processes, and operational procedures with a focus on maintainability and knowledge sharing
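A minimal sketch of the 4 C's checks named in the responsibilities above, using illustrative field names and rules rather than Paytm's actual schema:

```python
# Hedged sketch: record-level checks for Completeness, Consistency,
# Conformity, and Correctness. Field names, currency domain, and rules
# are illustrative assumptions only.

REQUIRED = {"txn_id", "amount", "currency", "timestamp"}

def check_record(rec, seen_ids):
    """Return a list of quality issues found in one transaction record."""
    issues = []
    if not REQUIRED <= rec.keys():                      # Completeness
        issues.append("missing_fields")
    if rec.get("txn_id") in seen_ids:                   # Consistency (no duplicates)
        issues.append("duplicate_txn")
    if rec.get("currency") not in {"INR", "USD"}:       # Conformity (allowed domain)
        issues.append("bad_currency")
    amount = rec.get("amount")
    if not isinstance(amount, (int, float)) or amount <= 0:  # Correctness
        issues.append("bad_amount")
    seen_ids.add(rec.get("txn_id"))
    return issues

seen = set()
ok = check_record({"txn_id": "t1", "amount": 250.0, "currency": "INR",
                   "timestamp": "2024-05-01T10:00:00"}, seen)
bad = check_record({"txn_id": "t1", "amount": -5, "currency": "XXX"}, seen)
```

In a streaming deployment, checks like these would run per micro-batch, with failing records routed to a quarantine topic and counters feeding the alerting system.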


Required Qualifications


Experience & Education

• Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field

• 7+ years (Senior) of hands-on data engineering experience

• Proven experience with large-scale data processing systems (preferably in the fintech/payments domain)

• Experience building and maintaining production data streams processing TB/PB-scale data with strong performance and reliability standards


Technical Skills & Requirements

• Programming Languages: Expert-level proficiency in both Python and Java; experience with Scala preferred

• Big Data Technologies: Apache Spark (PySpark, Spark SQL, Spark with Java), Apache Kafka, Apache Airflow

• Cloud Platforms: AWS (EMR, Glue, Redshift, S3, Lambda) or equivalent Azure/GCP services

• Databases: Strong SQL skills; experience with both relational (PostgreSQL, MySQL) and NoSQL databases (MongoDB, Cassandra, Redis)

• Data Quality Management: Deep understanding of the 4 C's framework: Completeness, Consistency, Conformity, and Correctness

• Data Governance: Experience with data lineage tracking, metadata management, and data cataloging

• Data Formats & Protocols: Parquet, Avro, JSON, REST APIs, GraphQL

• Containerization & DevOps: Docker, Kubernetes, Git, GitLab/GitHub with CI/CD pipeline experience

• Monitoring & Observability: Experience with Prometheus, Grafana, or similar monitoring tools

• Data Modeling: Dimensional modeling, data vault, or similar methodologies

• Streaming Technologies: Apache Flink, Kinesis, or Pulsar experience is a plus

• Infrastructure as Code: Terraform, CloudFormation (preferred)

• Java-specific: Spring Boot, Maven/Gradle, JUnit for building robust data services


Preferred Qualifications


Domain Expertise

• Previous experience in the fintech, payments, or banking industry with a solid understanding of regulatory compliance and financial data requirements

• Understanding of financial data standards, PCI DSS compliance, and data privacy regulations where compliance is essential for business operations

• Experience with real-time fraud detection or risk management systems where data accuracy is crucial for customer protection


Advanced Technical Skills (Preferred)


• Experience building automated data quality frameworks covering all 4 C's dimensions

• Knowledge of machine learning pipeline orchestration (MLflow, Kubeflow)

• Familiarity with data mesh or federated data architecture patterns

• Experience with change data capture (CDC) tools and techniques


Leadership & Soft Skills

• Strong problem-solving abilities with experience debugging complex distributed systems in production environments

• Excellent communication skills with the ability to explain technical concepts to diverse stakeholders while highlighting business value

• Experience mentoring team members and leading technical initiatives with a focus on building a quality-oriented culture

• Proven track record of delivering projects successfully in dynamic, fast-paced financial technology environments


What We Offer


• Opportunity to work with cutting-edge technology at scale

• Competitive salary and equity compensation

• Comprehensive health and wellness benefits

• Professional development opportunities and conference attendance

• Flexible working arrangements

• Chance to impact millions of users across India's digital payments ecosystem


Application Process


Interested candidates should submit:

Updated resume highlighting relevant data engineering experience with emphasis on real-time systems and data quality

Portfolio or GitHub profile showcasing data engineering projects, particularly those involving high-throughput streaming systems

Cover letter explaining interest in fintech/payments domain and understanding of data criticality in financial services

References from previous technical managers or senior colleagues who can attest to your data quality standards








Senior Data Engineer / Data Engineer

Gurugram, Haryana Invokhr

Posted today

Job Description

Desired Experience: 3-8 years

Salary: Best-in-industry

Location: Gurgaon (5 days onsite)


Overview:

You will act as a key member of the Data consulting team, working directly with the partners and senior stakeholders of the clients, designing and implementing big data and analytics solutions. Communication and organisation skills are key for this position, along with a problem-solution attitude.

What is in it for you:

Opportunity to work with a world class team of business consultants and engineers solving some of the most complex business problems by applying data and analytics techniques

Fast track career growth in a highly entrepreneurial work environment

Best-in-industry remuneration package

Essential Technical Skills:

Technical expertise with emerging Big Data technologies such as Python, Spark, Hadoop, Clojure, Git, SQL and Databricks, and visualization tools such as Tableau and Power BI

Experience with cloud, container and microservice infrastructures

Experience working with divergent data sets that meet the requirements of the Data Science and Data Analytics teams

Hands-on experience with data modelling, query techniques and complexity analysis

Desirable Skills:

Experience/Knowledge of working in an agile environment and experience with agile methodologies such as Scrum

Experience of working with development teams and product owners to understand their requirements

Certifications in any of the above areas will be preferred.

Your duties will include:

Develop data solutions within a Big Data Azure and/or other cloud environments

Working with divergent data sets that meet the requirements of the Data Science and Data Analytics teams

Build and design data architectures using Azure Data Factory, Databricks, Data Lake, Synapse

Liaising with CTO, Product Owners and other Operations teams to deliver engineering roadmaps showing key items such as upgrades, technical refreshes and new versions

Perform data mapping activities to describe source data, target data and the high-level or detailed transformations that need to occur

Assist the Data Analyst team in developing KPIs and reporting in tools such as Power BI and Tableau

Data Integration, Transformation, Modelling

Maintaining all relevant documentation and knowledge bases

Research and suggest new database products, services and protocols

Essential Personal Traits:

You should be able to work independently and communicate effectively with remote teams.

Timely communication/escalation of issues/dependencies to higher management.

Curiosity to learn and apply emerging technologies to solve business problems


** Interested candidates, please send their resume to - and **



Data Engineer

Bangalore, Karnataka NTT America, Inc.

Posted today

Job Description

**Req ID:** 335681
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.
We are currently seeking a Data Engineer to join our team in Bangalore, Karnātaka (IN-KA), India (IN).
**Key Responsibilities:**
- Develop data pipeline to integrate data movement tasks from multiple API data sources.
- Ensure data integrity, consistency, and normalization.
- Gather requirements from stakeholders to align with business needs.
- Collaborate with business analysts, data architects, and engineers to design solutions.
- Support ETL (Extract, Transform, Load) processes for data migration and integration.
- Ensure adherence to industry standards, security policies, and data governance frameworks.
- Keep up with industry trends in data modeling, big data, and AI/ML.
- Recommend improvements to data architecture for scalability and efficiency.
- Work with compliance teams to align data models with regulations (GDPR, HIPAA, etc.).
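The first responsibility above (integrating data movement from multiple API sources while keeping it consistent and normalized) can be sketched as follows. The source names, fields, and `fetch` callables are hypothetical stand-ins for real API calls made with a library such as `requests`:

```python
# Hedged sketch: merge records from several API sources into one
# normalised shape, de-duplicating across sources. All schemas here
# are illustrative assumptions.

def normalise(source, raw):
    """Map a source-specific payload onto a shared schema."""
    return {"source": source, "id": str(raw["id"]), "email": raw["email"].lower()}

def build_pipeline(sources):
    """sources: {name: callable returning a list of raw records}."""
    merged, seen = [], set()
    for name, fetch in sources.items():
        for raw in fetch():
            rec = normalise(name, raw)
            key = rec["email"]  # de-duplicate across sources on a stable key
            if key not in seen:
                seen.add(key)
                merged.append(rec)
    return merged

rows = build_pipeline({
    "crm": lambda: [{"id": 1, "email": "A@X.COM"}],
    "erp": lambda: [{"id": "9", "email": "a@x.com"}, {"id": "10", "email": "b@x.com"}],
})
```

Injecting the fetchers as callables keeps the merge logic testable without network access, which is also how the real API clients would be mocked in unit tests.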
**Basic Qualifications:**
- 8+ years of experience in professional services or a related field
- 3+ years experience working with databases such as Oracle, SQL Server and Azure cloud data platform.
- 3+ years of experience working with SQL tools.
- 2+ years of experience working with Azure Data Factory, Python
- 2+ years of experience working with API data integration tasks
**Preferred Qualifications:**
- Proven work experience in Spark/PySpark development
- Knowledge of database structure systems
- Excellent analytical and problem-solving skills
- Understanding of agile methodologies
- Undergraduate or Graduate degree preferred
- Ability to travel at least 25%.
**About NTT DATA**
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com. Whenever possible, we hire locally to NTT DATA offices or client sites. This ensures we can provide timely and effective support tailored to each client's needs. While many positions offer remote or hybrid work options, these arrangements are subject to change based on client requirements. For employees near an NTT DATA office or client site, in-office attendance may be required for meetings or events, depending on business needs. At NTT DATA, we are committed to staying flexible and meeting the evolving needs of both our clients and employees. NTT DATA recruiters will never ask for payment or banking information and will only use @nttdata.com and @talent.nttdataservices.com email addresses. If you are requested to provide payment or disclose banking information, please submit a contact-us form.
**_NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here._**

Data Engineer

Bangalore, Karnataka NTT DATA North America

Posted today


Job Description

**Req ID:** 335681
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.
We are currently seeking a Data Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).
**Key Responsibilities:**
- Develop data pipelines to integrate data movement tasks from multiple API data sources.
- Ensure data integrity, consistency, and normalization.
- Gather requirements from stakeholders to align with business needs.
- Collaborate with business analysts, data architects, and engineers to design solutions.
- Support ETL (Extract, Transform, Load) processes for data migration and integration.
- Ensure adherence to industry standards, security policies, and data governance frameworks.
- Keep up with industry trends in data modeling, big data, and AI/ML.
- Recommend improvements to data architecture for scalability and efficiency.
- Work with compliance teams to align data models with regulations (GDPR, HIPAA, etc.).
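The pipeline responsibilities above can be sketched minimally as an API-to-database load with normalization and an idempotent upsert. All names here (the `payments` table, its fields, the inlined payload) are hypothetical illustrations, and `sqlite3` stands in for whatever warehouse the role actually targets:

```python
import json
import sqlite3

def normalize(record: dict) -> dict:
    # Enforce types and trim strings so downstream joins stay consistent.
    return {
        "id": int(record["id"]),
        "name": str(record.get("name", "")).strip(),
        "amount": round(float(record.get("amount", 0.0)), 2),
    }

def load(records: list, conn: sqlite3.Connection) -> int:
    # Idempotent upsert: re-running the pipeline never duplicates rows.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments "
        "(id INTEGER PRIMARY KEY, name TEXT, amount REAL)"
    )
    rows = [normalize(r) for r in records]
    conn.executemany(
        "INSERT INTO payments (id, name, amount) VALUES (:id, :name, :amount) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name, amount = excluded.amount",
        rows,
    )
    conn.commit()
    return len(rows)

# In production the payload would come from an authenticated API call;
# it is inlined here to keep the sketch self-contained.
payload = json.loads('[{"id": 1, "name": " Acme ", "amount": "10.5"},'
                     ' {"id": 1, "name": "Acme", "amount": "11"}]')
conn = sqlite3.connect(":memory:")
processed = load(payload, conn)
```

The upsert keyed on `id` is what makes the load safe to re-run, which is the usual baseline for "data integrity and consistency" in API ingestion work.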
**Basic Qualifications:**
- 8+ years of experience in professional services or a related field
- 3+ years of experience working with databases such as Oracle, SQL Server and the Azure cloud data platform.
- 3+ years of experience working with SQL tools.
- 2+ years of experience working with Azure Data Factory, Python
- 2+ years of experience working with API data integration tasks
**Preferred Qualifications:**
- Proven work experience in Spark/PySpark development
- Knowledge of database structure systems
- Excellent analytical and problem-solving skills
- Understanding of agile methodologies
- Undergraduate or Graduate degree preferred
- Ability to travel at least 25%.

Data Engineer

Mumbai, Maharashtra Mondelez International

Posted today


Job Description

**Job Description**
**Are You Ready to Make It Happen at Mondelēz International?**
**Join our Mission to Lead the Future of Snacking. Make It With Pride.**
You will provide technical contributions to the data science process. In this role, you are the internally recognized expert in data, building infrastructure and data pipelines/retrieval mechanisms to support our data needs.
**How you will contribute**
You will:
+ Operationalize and automate activities for efficiency and timely production of data visuals
+ Assist in providing accessibility, retrievability, security and protection of data in an ethical manner
+ Search for ways to get new data sources and assess their accuracy
+ Build and maintain the transports/data pipelines and retrieve applicable data sets for specific use cases
+ Understand data and metadata to support consistency of information retrieval, combination, analysis, pattern recognition and interpretation
+ Validate information from multiple sources.
+ Assess issues that might prevent the organization from making maximum use of its information assets
**What you will bring**
A desire to drive your future and accelerate your career and the following experience and knowledge:
+ Extensive experience in data engineering in a large, complex business with multiple systems such as SAP, internal and external data, etc. and experience setting up, testing and maintaining new systems
+ Experience of a wide variety of languages and tools (e.g. script languages) to retrieve, merge and combine data
+ Ability to simplify complex problems and communicate to a broad audience
Are You Ready to Make It Happen at Mondelēz International?
Join our Mission to Lead the Future of Snacking. Make It with Pride
**In This Role**
As a DaaS Data Engineer, you will have the opportunity to design and build scalable, secure, and cost-effective cloud-based data solutions. You will develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes, ensuring data quality and validation processes to maintain data accuracy and integrity. You will ensure efficient data storage and retrieval for optimal performance, and collaborate closely with data teams, product owners, and other stakeholders to stay updated with the latest cloud technologies and best practices.
**Role & Responsibilities:**
+ **Design and Build:** Develop and implement scalable, secure, and cost-effective cloud-based data solutions.
+ **Manage Data Pipelines:** Develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes.
+ **Ensure Data Quality:** Implement data quality and validation processes to ensure data accuracy and integrity.
+ **Optimize** **Data Storage:** Ensure efficient data storage and retrieval for optimal performance.
+ **Collaborate and Innovate:** Work closely with data teams, product owners, and stay updated with the latest cloud technologies and best practices.
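The "Ensure Data Quality" responsibility above can be illustrated with a small validation sketch that separates clean rows from audit-ready rejections. The column names (`order_id`, `qty`) are illustrative assumptions, not any real schema:

```python
def validate(rows, required=("order_id", "qty"), non_negative=("qty",)):
    # Split rows into clean records and rejections with recorded reasons,
    # so rejected data can be audited rather than silently dropped.
    good, rejected = [], []
    for i, row in enumerate(rows):
        missing = [k for k in required if row.get(k) is None]
        negative = [k for k in non_negative
                    if isinstance(row.get(k), (int, float)) and row[k] < 0]
        if missing or negative:
            rejected.append({"row": i, "missing": missing, "negative": negative})
        else:
            good.append(row)
    return good, rejected

rows = [
    {"order_id": 1, "qty": 3},     # clean
    {"order_id": None, "qty": 2},  # incomplete
    {"order_id": 2, "qty": -1},    # out of range
]
good, rejected = validate(rows)
```

Keeping the rejection reasons alongside the row index is what turns a filter into a data-quality process: the same structure feeds both the load and the quality report.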
**Technical Requirements:**
+ **Programming:** Python
+ **Database:** SQL, PL/SQL, PostgreSQL, BigQuery, stored procedures/routines.
+ **ETL & Integration:** AecorSoft, Talend, DBT, Databricks (optional), Fivetran.
+ **Data Warehousing:** SCD, schema types, data marts.
+ **Visualization:** Power BI (optional), Tableau (optional), Looker.
+ **GCP Cloud Services:** BigQuery, GCS.
+ **Supply Chain:** IMS + shipment functional knowledge good to have.
+ **Supporting Technologies:** Erwin, Collibra, Data Governance, Airflow.
**Soft Skills:**
+ **Problem-Solving:** The ability to identify and solve complex data-related challenges.
+ **Communication:** Effective communication skills to collaborate with Product Owners, analysts, and stakeholders.
+ **Analytical Thinking:** The capacity to analyze data and draw meaningful insights.
+ **Attention to Detail:** Meticulousness in data preparation and pipeline development.
+ **Adaptability:** The ability to stay updated with emerging technologies and trends in the data engineering field.
Within Country Relocation support available and for candidates voluntarily moving internationally some minimal support is offered through our Volunteer International Transfer Policy
**Business Unit Summary**
**At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about.**
**We have a rich portfolio of strong brands globally and locally, including many household names such as _Oreo_, _belVita_ and _LU_ biscuits; _Cadbury Dairy Milk_, _Milka_ and _Toblerone_ chocolate; _Sour Patch Kids_ candy and _Trident_ gum. We are proud to hold the top position globally in biscuits, chocolate and candy, and the second top position in gum.**
**Our 80,000 makers and bakers are located in more than 80 countries and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen-and happen fast.**
Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.
**Job Type**
Regular
Data Science
Analytics & Data Science
At Mondelēz International, our purpose is to empower people to snack right through offering the right snack, for the right moment, made the right way. That means delivering a broader range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about.
We have a rich portfolio of strong brands, both global and local, including many household names such as Oreo, belVita and LU biscuits; Cadbury Dairy Milk, Milka and Toblerone chocolate; Sour Patch Kids candy and Trident gum. We are proud to hold the number 1 position globally in biscuits, chocolate and candy, as well as the No. 2 position in gum.
Our 80,000 Makers and Bakers are located in our operations in more than 80 countries and are working to sell our products in over 150 countries around the world. They are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen, and happen fast.
Join us and Make It An Opportunity!
Mondelez Global LLC is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability, protected Veteran status, sexual orientation, gender identity, gender expression, genetic information, or any other characteristic protected by law. Applicants who require accommodation to participate in the job application process may contact for assistance.

Data Engineer

Mohali, Punjab Copeland

Posted today


Job Description

**About Us**
We are a global climate technologies company engineered for sustainability. We create sustainable and efficient residential, commercial and industrial spaces through HVACR technologies. We protect temperature-sensitive goods throughout the cold chain. And we bring comfort to people globally. Best-in-class engineering, design and manufacturing combined with category-leading brands in compression, controls, software and monitoring solutions result in next-generation climate technology that is built for the needs of the world ahead. 
Whether you are a professional looking for a career change, an undergraduate student exploring your first opportunity, or a recent graduate with an advanced degree, we have opportunities that will allow you to innovate, be challenged and make an impact. Join our team and start your journey today!
**Software Development**
+ Develops code and solutions that transfer/transform data across various systems
+ Maintains deep technical knowledge of various tools in the data warehouse, data hub, and analytical tools.
+ Ensures data is transformed and stored in efficient methods for retrieval and use.
+ Maintains data systems to ensure optimal performance
+ Develops a deep understanding of underlying business systems involved with analytical systems.
+ Follows standard software development lifecycle, code control, code standards and process standards.
+ Maintains and develops technical knowledge by self-training on current toolsets and computing environments, participates in educational opportunities, maintains professional networks, and participates in professional organizations related to their tech skills
**Systems Analysis**
+ Works with key stakeholders to understand business needs and capture functional and technical requirements.
+ Offers ideas that simplify the design and complexity of solutions delivered.
+ Effectively communicates any expectations required of stakeholders or other resources during solution delivery.
+ Develops and executes test plans to ensure successful rollout of solutions, including accuracy and quality of data.
**Service Management**
+ Effectively communicates to leaders and stakeholders any obstacles that occur during solution delivery.
+ Defines and manages promised delivery dates.
+ Proactively researches, analyzes, and predicts operational issues, informing leadership where appropriate.
+ Offers viable options to solve unexpected/unknown issues that occur during solution development and delivery.
**EDUCATION / JOB-RELATED TECHNICAL SKILLS:**
+ Bachelor's Degree in Computer Science/Information Technology or equivalent
+ Ability to effectively communicate with others at all levels of the Company, both verbally and in writing. Demonstrates a courteous, tactful, and professional approach with employees and others.
+ Ability to work in a large, global corporate structure
+ Advanced English level; an advanced level of an additional language is a plus
+ Strong sense of ethics and adherence to the company's core values
+ Willingness to travel both domestically and internationally to support global implementations.
+ Demonstrated ability to clearly isolate and define problems, effectively evaluate alternative solutions, and make decisions in a timely manner.
+ Good decision-making ability, ability to operate in ambiguous situations, and high analytical ability to judge the pros and cons of approaches against objectives.
+ Candidate must have a minimum of (3) years of experience in a Data Engineer role, including the following tools/technologies:
+ Experience with relational (SQL) databases.
+ Experience with data warehouses like Oracle, SQL & Snowflake.
+ Technical expertise in data modeling, data mining and segmentation techniques.
+ Experience with building new and troubleshooting existing data pipelines using tools like Pentaho Data Integration.
+ Experience with batch and real time data ingestion and processing frameworks.
+ Experience with languages like Python, Java, etc.
+ Knowledge of additional cloud-based analytics solutions, along with Kafka, Spark and Scala is a plus.
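The batch-ingestion experience listed above typically comes down to incremental, high-watermark extraction: pull only rows newer than the last processed timestamp and persist the new watermark for the next run. A minimal sketch, with an assumed `src` table and ISO-date `updated_at` column, and `sqlite3` standing in for the source database:

```python
import sqlite3

def incremental_extract(conn, watermark):
    # Pull only rows newer than the last processed timestamp, and return
    # the new watermark to persist for the next batch run.
    rows = conn.execute(
        "SELECT id, updated_at FROM src WHERE updated_at > ? ORDER BY updated_at",
        (watermark,),
    ).fetchall()
    new_watermark = rows[-1][1] if rows else watermark
    return rows, new_watermark

# Stand-in source: three rows, one already processed in a previous batch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, updated_at TEXT)")
conn.executemany(
    "INSERT INTO src VALUES (?, ?)",
    [(1, "2024-01-01"), (2, "2024-01-02"), (3, "2024-01-03")],
)
rows, watermark = incremental_extract(conn, "2024-01-01")
```

ISO-8601 date strings compare correctly as text, which is why the watermark can be carried around as a plain string here; real pipelines would store it in a control table.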
**BEHAVIOR / SOFT SKILLS:**
+ Professional skills, written technical concepts
+ Leads problem solving teams
+ Able to resolve conflict efficiently
+ Works with multiple cross-functional projects
+ Drives process mapping sessions
**KORN FERRY COMPETENCIES:**
+ Customer Focus
+ Builds Networks
+ Instills Trust
+ Tech Savvy
+ Interpersonal Savvy
+ Demonstrates Self-Awareness
+ Action Oriented
+ Collaborates
+ Nimble Learner
**Our Commitment to Our People**
Across the globe, we are united by a singular Purpose: Sustainability is no small ambition. That's why everything we do is geared toward a sustainable future-for our generation and all those to come. Through groundbreaking innovations, HVACR technology and cold chain solutions, we are reducing carbon emissions and improving energy efficiency in spaces of all sizes, from residential to commercial to industrial.
Our employees are our greatest strength. We believe that our culture of passion, openness, and collaboration empowers us to work toward the same goal - to make the world a better place. We invest in the end-to-end development of our people, beginning at onboarding and through senior leadership, so they can thrive personally and professionally.
Flexible and competitive benefits plans offer the right options to meet your individual/family needs. We provide employees with flexible time off plans, including paid parental leave (maternal and paternal), vacation and holiday leave. 
Together, we have the opportunity - and the power - to continue to revolutionize the technology behind air conditioning, heating and refrigeration, and cultivate a better future. Learn more about us and how you can join our team!
**Our Commitment to Diversity, Equity & Inclusion**
At Copeland, we believe having a diverse, equitable and inclusive environment is critical to our success. We are committed to creating a culture where every employee feels welcomed, heard, respected, and valued for their experiences, ideas, perspectives and expertise. Ultimately, our diverse and inclusive culture is the key to driving industry-leading innovation, better serving our customers and making a positive impact in the communities where we live. 
**Equal Opportunity Employer**
Copeland is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to sex, race, color, religion, national origin, age, marital status, political affiliation, sexual orientation, gender identity, genetic information, disability or protected veteran status. We are committed to providing a workplace free of any discrimination or harassment.
With $5B of global revenue, Copeland is a leading provider of compression products, electronics, software, and solutions across many applications within Heating, Ventilation, Air Conditioning, and Refrigeration (HVACR), where macro and regulatory trends toward environmental sustainability lead to changes in HVACR technology. Other products include other heating applications, food service and retail, transportation, and healthcare/life sciences. This business also has a solution portfolio that manages, monitors, and controls refrigeration units in commercial settings, as well as software solutions that measure and monitor temperature conditions of refrigerated goods in transit, where there is a greater emphasis on energy management and sustainability solutions globally.

Data Engineer

Pune, Maharashtra Copeland

Posted today


Job Description

**About Us**
We are a global climate technologies company engineered for sustainability. We create sustainable and efficient residential, commercial and industrial spaces through HVACR technologies. We protect temperature-sensitive goods throughout the cold chain. And we bring comfort to people globally. Best-in-class engineering, design and manufacturing combined with category-leading brands in compression, controls, software and monitoring solutions result in next-generation climate technology that is built for the needs of the world ahead. 
Whether you are a professional looking for a career change, an undergraduate student exploring your first opportunity, or a recent graduate with an advanced degree, we have opportunities that will allow you to innovate, be challenged and make an impact. Join our team and start your journey today!
**Software Development**
+ Develops code and solutions that transfer/transform data across various systems
+ Maintains deep technical knowledge of various tools in the data warehouse, data hub, and analytical tools.
+ Ensures data is transformed and stored in efficient methods for retrieval and use.
+ Maintains data systems to ensure optimal performance
+ Develops a deep understanding of underlying business systems involved with analytical systems.
+ Follows standard software development lifecycle, code control, code standards and process standards.
+ Maintains and develops technical knowledge by self-training on current toolsets and computing environments, participates in educational opportunities, maintains professional networks, and participates in professional organizations related to their tech skills
**Systems Analysis**
+ Works with key stakeholders to understand business needs and capture functional and technical requirements.
+ Offers ideas that simplify the design and complexity of solutions delivered.
+ Effectively communicates any expectations required of stakeholders or other resources during solution delivery.
+ Develops and executes test plans to ensure successful rollout of solutions, including accuracy and quality of data.
**Service Management**
+ Effectively communicates to leaders and stakeholders any obstacles that occur during solution delivery.
+ Defines and manages promised delivery dates.
+ Proactively researches, analyzes, and predicts operational issues, informing leadership where appropriate.
+ Offers viable options to solve unexpected/unknown issues that occur during solution development and delivery.
**EDUCATION / JOB-RELATED TECHNICAL SKILLS:**
+ Bachelor's Degree in Computer Science/Information Technology or equivalent
+ Ability to effectively communicate with others at all levels of the Company, both verbally and in writing. Demonstrates a courteous, tactful, and professional approach with employees and others.
+ Ability to work in a large, global corporate structure
+ Advanced English level; an advanced level of an additional language is a plus
+ Strong sense of ethics and adherence to the company's core values
+ Willingness to travel both domestically and internationally to support global implementations.
+ Demonstrated ability to clearly isolate and define problems, effectively evaluate alternative solutions, and make decisions in a timely manner.
+ Good decision-making ability, ability to operate in ambiguous situations, and high analytical ability to judge the pros and cons of approaches against objectives.
+ Candidate must have a minimum of (3) years of experience in a Data Engineer role, including the following tools/technologies:
+ Experience with relational (SQL) databases.
+ Experience with data warehouses like Oracle, SQL & Snowflake.
+ Technical expertise in data modeling, data mining and segmentation techniques.
+ Experience with building new and troubleshooting existing data pipelines using tools like Pentaho Data Integration.
+ Experience with batch and real time data ingestion and processing frameworks.
+ Experience with languages like Python, Java, etc.
+ Knowledge of additional cloud-based analytics solutions, along with Kafka, Spark and Scala is a plus.
**BEHAVIOR / SOFT SKILLS:**
+ Professional skills, written technical concepts
+ Leads problem solving teams
+ Able to resolve conflict efficiently
+ Works with multiple cross-functional projects
+ Drives process mapping sessions
**KORN FERRY COMPETENCIES:**
+ Customer Focus
+ Builds Networks
+ Instills Trust
+ Tech Savvy
+ Interpersonal Savvy
+ Demonstrates Self-Awareness
+ Action Oriented
+ Collaborates
+ Nimble Learner
**Our Commitment to Our People**
Across the globe, we are united by a singular Purpose: Sustainability is no small ambition. That's why everything we do is geared toward a sustainable future-for our generation and all those to come. Through groundbreaking innovations, HVACR technology and cold chain solutions, we are reducing carbon emissions and improving energy efficiency in spaces of all sizes, from residential to commercial to industrial.
Our employees are our greatest strength. We believe that our culture of passion, openness, and collaboration empowers us to work toward the same goal - to make the world a better place. We invest in the end-to-end development of our people, beginning at onboarding and through senior leadership, so they can thrive personally and professionally.
Flexible and competitive benefits plans offer the right options to meet your individual/family needs. We provide employees with flexible time off plans, including paid parental leave (maternal and paternal), vacation and holiday leave. 
Together, we have the opportunity - and the power - to continue to revolutionize the technology behind air conditioning, heating and refrigeration, and cultivate a better future. Learn more about us and how you can join our team!
**Our Commitment to Diversity, Equity & Inclusion**
At Copeland, we believe having a diverse, equitable and inclusive environment is critical to our success. We are committed to creating a culture where every employee feels welcomed, heard, respected, and valued for their experiences, ideas, perspectives and expertise. Ultimately, our diverse and inclusive culture is the key to driving industry-leading innovation, better serving our customers and making a positive impact in the communities where we live. 
**Equal Opportunity Employer**
Copeland is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to sex, race, color, religion, national origin, age, marital status, political affiliation, sexual orientation, gender identity, genetic information, disability or protected veteran status. We are committed to providing a workplace free of any discrimination or harassment.
With $5B of global revenue, Copeland is a leading provider of compression products, electronics, software, and solutions across many applications within Heating, Ventilation, Air Conditioning, and Refrigeration (HVACR), where macro and regulatory trends toward environmental sustainability lead to changes in HVACR technology. Other products include other heating applications, food service and retail, transportation, and healthcare/life sciences. This business also has a solution portfolio that manages, monitors, and controls refrigeration units in commercial settings, as well as software solutions that measure and monitor temperature conditions of refrigerated goods in transit, where there is a greater emphasis on energy management and sustainability solutions globally.

Data Engineer

Bengaluru, Karnataka Autodesk

Posted today


Job Description

**Job Requisition ID #**
25WD90048
**Position Overview**
We are looking for an exceptional data engineer to transform, optimize, test, and maintain architectures for enterprise analytics databases, data pipelines, and processing systems, and to optimize data flow and collection for cross-functional teams. The mission of the team is to empower decision makers and the broader data community through trusted data assets and scalable self-serve analytics. Your work will focus on engineering new pipelines, maintaining and enhancing existing pipelines with new features, and creating frameworks to ensure accurate, timely data delivery to stakeholders; you will also support ad-hoc reporting requirements that facilitate data-driven, actionable insights and decision-making at Autodesk.
**Responsibilities**
- Maintain and develop data pipelines required for the extraction, transformation, cleaning, pre-processing, aggregation and loading of data from a wide variety of data sources into Snowflake or Hive warehouses using Python, SQL, DBT, and other data technologies
- Design, implement, test and maintain data pipelines and new features based on stakeholders' requirements
- Develop and maintain scalable, available, quality-assured analytical building blocks/datasets in close coordination with data analysts
- Optimize and maintain workflows and scripts on existing data warehouses and ETL processes
- Design, develop and maintain components of data processing frameworks
- Basic knowledge of using LLMs and AI tools like Copilot for development purposes
- Build and maintain data quality and durability tracking mechanisms in order to provide visibility into and address inevitable disruptions in data ingestion, processing, and storage 
- Translate deeply technical designs into business appropriate representations as well as analyze business needs and requirements ensuring implementation of data services directly correlates to the strategy and growth of the business
- Focus on automation use cases, CI/CD approaches and self-service modules relevant for data domains
- Address questions and concerns from downstream data consumers through appropriate channels
- Create data tools for analytics and BI teams that assist them in building and optimizing our product into an innovative industry leader 
- Stay up to date with data engineering best practices, patterns, evaluate and analyze new technologies, capabilities, open-source software in context of our data strategy to ensure we are adapting, extending, or replacing our own core technologies to stay ahead of the industry
- Contribute to Analytics engineering process
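The data quality and durability tracking mentioned in the responsibilities above can be sketched as a simple freshness check that flags tables whose last successful load exceeds an allowed lag. Table names and the lag threshold are illustrative assumptions:

```python
from datetime import datetime, timedelta, timezone

def stale_tables(last_loaded, max_lag, now=None):
    # Flag tables whose most recent successful load exceeds the allowed lag,
    # surfacing ingestion disruptions before downstream consumers notice.
    now = now or datetime.now(timezone.utc)
    return sorted(t for t, loaded in last_loaded.items() if now - loaded > max_lag)

# A fixed "now" keeps the example deterministic; in practice this would
# run on a schedule and read load timestamps from pipeline metadata.
now = datetime(2024, 6, 1, 12, 0, tzinfo=timezone.utc)
last_loaded = {
    "orders": now - timedelta(hours=2),     # fresh
    "invoices": now - timedelta(hours=30),  # stale
}
alerts = stale_tables(last_loaded, max_lag=timedelta(hours=24), now=now)
```

In a real deployment the alert list would feed a notification channel or dashboard; the point is that freshness is tracked as data, not discovered by a stakeholder.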
**Minimum Qualifications**
- Bachelor's degree in computer science, information systems, or a related discipline 
- 3+ years in the Data Engineer role
- Built processes supporting data transformation, data structures, metadata, dependency, data quality, and workload management 
- Working experience with Snowflake; hands-on experience with Snowflake utilities, SnowSQL, and Snowpipe. Must have worked on Snowflake cost-optimization scenarios
- Solid overall programming skills; able to write modular, maintainable code, preferably in Python and SQL
- Experience with workflow management solutions like Airflow
- Experience with data transformation tools like DBT
- Basic understanding of how to use LLMs for data engineering purposes and tools like GitHub Copilot
- Experience working with Git
- Experience working with big data stack environment, like, Hive, Spark and Presto
- Strong analytical, problem solving and interpersonal skills
- Familiar with Scrum
- Ready to work flexible European hours
**Preferred Qualifications**
- Snowflake 
- DBT 
- Fivetran 
- Airflow
- CI/CD (Jenkins) 
- Basic understanding of Power BI
- AWS environment, for example S3, Lambda, Glue, CloudWatch
- Basic understanding of Salesforce 
- Experience working with remote teams spread across multiple time-zones
- Strong organizational skills and attention to detail
- A hunger to learn and the ability to operate in a self-guided manner
#LI-MR2
**Learn More**
**About Autodesk**
Welcome to Autodesk! Amazing things are created every day with our software - from the greenest buildings and cleanest cars to the smartest factories and biggest hit movies. We help innovators turn their ideas into reality, transforming not only how things are made, but what can be made.
We take great pride in our culture here at Autodesk - it's at the core of everything we do. Our culture guides the way we work and treat each other, informs how we connect with customers and partners, and defines how we show up in the world.
When you're an Autodesker, you can do meaningful work that helps build a better world designed and made for all. Ready to shape the world and your future? Join us!
**Salary transparency**
Salary is one part of Autodesk's competitive compensation package. Offers are based on the candidate's experience and geographic location. In addition to base salaries, our compensation package may include annual cash bonuses, commissions for sales roles, stock grants, and a comprehensive benefits package.
**Diversity & Belonging**
We take pride in cultivating a culture of belonging where everyone can thrive. Learn more here:
**Are you an existing contractor or consultant with Autodesk?**
Please search for open jobs and apply internally (not on this external site).