270 Data Engineering jobs in Delhi
Associate Architect - Data Engineering
Posted 11 days ago
Job Description
About the Role:
We are seeking an experienced Data Architect to lead the transformation of enterprise data solutions, with a strong focus on migrating Alteryx workflows to Azure Databricks. The ideal candidate will have deep expertise in the Microsoft Azure ecosystem, including Azure Data Factory, Databricks, Synapse Analytics, and Microsoft Fabric, as well as a strong background in data architecture, governance, and distributed computing. This role requires both strategic thinking and hands-on architectural leadership to ensure scalable, secure, and high-performance data solutions.
Key Responsibilities:
- Define the overall migration strategy for transforming Alteryx workflows into scalable, cloud-native data solutions on Azure Databricks.
- Architect end-to-end data frameworks leveraging Databricks, Delta Lake, Azure Data Lake, and Synapse.
- Establish best practices, standards, and governance frameworks for pipeline design, orchestration, and data lifecycle management.
- Guide engineering teams in re-engineering Alteryx workflows into distributed Spark-based architectures (see the sketch after this list).
- Collaborate with business stakeholders to ensure solutions align with analytics, reporting, and advanced AI/ML initiatives.
- Oversee data quality, lineage, and security compliance across the data ecosystem.
- Drive CI/CD adoption, automation, and DevOps practices for Azure Databricks and related services.
- Provide architectural leadership, design reviews, and mentorship to engineering and analytics teams.
- Optimize solutions for performance, scalability, and cost-efficiency within Azure.
- Participate in enterprise architecture forums and influence data strategy across the organization.
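As a hedged illustration of the Alteryx re-engineering work above, here is a minimal PySpark sketch that re-expresses a typical Alteryx join-and-summarize workflow as a Delta Lake pipeline, assuming a Databricks or other Delta-enabled Spark environment. All paths, table names, and columns are hypothetical.

```python
# Hypothetical re-engineering of an Alteryx "Join + Summarize" workflow
# into PySpark with Delta Lake. Paths and schemas are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("alteryx_migration_sketch").getOrCreate()

# Alteryx Input tools -> DataFrame reads from the lake (assumed bronze tables)
orders = spark.read.format("delta").load("/mnt/datalake/bronze/orders")
customers = spark.read.format("delta").load("/mnt/datalake/bronze/customers")

# Alteryx Join + Filter tools -> join/where; Summarize tool -> groupBy/agg
daily_revenue = (
    orders.join(customers, "customer_id", "inner")
          .where(F.col("order_status") == "COMPLETE")
          .groupBy("region", "order_date")
          .agg(
              F.sum("order_total").alias("revenue"),
              F.countDistinct("customer_id").alias("active_customers"),
          )
)

# Alteryx Output tool -> partitioned Delta write for downstream consumers
(daily_revenue.write.format("delta")
              .mode("overwrite")
              .partitionBy("order_date")
              .save("/mnt/datalake/silver/daily_revenue"))
```

The same pattern extends to most Alteryx canvases: each tool maps onto a DataFrame transformation, with distributed execution and Delta storage replacing the single-node, in-memory run.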
Required Skills and Qualifications:
- 10+ years of experience in data architecture, engineering, or solution design.
- Proven expertise in Alteryx workflows and their modernization into Azure Databricks (Spark, PySpark, SQL, Delta Lake).
- Deep knowledge of the Microsoft Azure data ecosystem:
  - Azure Data Factory (ADF)
  - Azure Synapse Analytics
  - Microsoft Fabric
  - Azure Databricks
- Strong background in data governance, lineage, security, and compliance frameworks.
- Demonstrated experience in architecting data lakes, data warehouses, and analytics platforms.
- Proficiency in Python, SQL, and Apache Spark for prototyping and design validation.
- Excellent leadership, communication, and stakeholder management skills.
Preferred Qualifications:
- Microsoft Azure certifications (e.g., Azure Solutions Architect Expert, Azure Data Engineer Associate).
- Experience leading large-scale migration programs or modernization initiatives.
- Familiarity with enterprise architecture frameworks (TOGAF, Zachman).
- Exposure to machine learning enablement on Azure Databricks.
- Strong understanding of Agile delivery and working in multi-disciplinary teams.
Analytics & Insights Manager (Data Engineering)
Posted 8 days ago
Job Description
**The Position**
A healthier future. It's what drives us to innovate. To continuously advance science and ensure everyone has access to the healthcare they need today and for generations to come. Creating a world where we all have more time with the people we love. That's what makes us Roche.
Healthcare is evolving, and Global Procurement (GP) is responding by continuously striving for the highest possible performance, taking innovative and strategic approaches to business and supplier partnerships. Global Procurement proactively manages the entire supplier ecosystem, making a vital contribution to improving health outcomes, reducing costs for patients and global healthcare systems, and ensuring that Roche continues doing now what patients need next.
**The Opportunity:**
This role sits within the Enablement Chapter where we drive operational and financial effectiveness in Global Procurement by advancing talent growth and development, delivering actionable insights, fostering high engagement, and ensuring robust performance management. Our team is dedicated to enabling better outcomes and providing comprehensive support to GP leadership and chapters.
As an Analytics & Insights Manager (Data Engineering) in A&I Data Solutions, you will bring structured thinking, facilitation, execution, and focus to procurement enabling and functional capabilities such as analytics, operations, governance, and strategic projects. Using your specialized knowledge of data engineering and general procurement, you will proactively identify and drive strategies and approaches that positively impact capability and functional goals. This role supports data engineering and analytics efforts in Global Procurement by maintaining data pipelines and helping to improve data accessibility and accuracy.
You will collaborate with internal procurement, finance, and other relevant colleagues to understand needs and gather feedback to develop, enhance, or deploy functional enabling services and solutions that increase procurement's effectiveness and efficiency.
You will work closely with other team members to align on requirements, develop, validate, and deploy services, solutions, and frameworks to the broader procurement function.
As an Analytics & Insights Manager (Data Engineering), you will play a variety of roles according to your experience, knowledge, and general business requirements, including but not limited to:
**Responsibilities include:**
+ Managing the transition of procurement data sources between Snowflake databases while ensuring data integrity.
+ Facilitating the integration of diverse procurement data systems and managing data pipelines in Snowflake to streamline data availability and accessibility.
+ Developing and optimizing sophisticated SQL queries for data manipulation, transformation, and reporting tasks (see the sketch after this list).
+ Managing and maintaining complex data mappings to ensure accuracy and reliability.
+ Collaborating seamlessly with key stakeholders across the procurement function to gather data requirements.
+ Addressing data-related issues with advanced troubleshooting techniques.
+ Leveraging GitLab and other orchestration tools for version control and collaboration with key stakeholders, ensuring best practices in code management and CI/CD automation.
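As a hedged illustration of the SQL development and Snowflake pipeline work in this list, the sketch below runs a mapping-aware reporting query through the snowflake-connector-python package. The account, credentials, and table names are placeholders, not actual Roche systems.

```python
# Minimal sketch: a mapping-aware reporting query against Snowflake.
# Connection parameters and table names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",       # placeholder account identifier
    user="svc_procurement",     # placeholder service user
    password="***",             # in practice, pull from a secrets manager
    warehouse="PROCUREMENT_WH",
    database="PROCUREMENT_DB",
    schema="ANALYTICS",
)

try:
    cur = conn.cursor()
    # Aggregate spend by the normalized supplier from the mapping table.
    cur.execute("""
        SELECT m.normalized_supplier,
               SUM(s.amount_usd) AS total_spend
        FROM raw_spend s
        JOIN supplier_mapping m
          ON s.supplier_id = m.supplier_id
        GROUP BY m.normalized_supplier
        ORDER BY total_spend DESC
    """)
    for supplier, spend in cur.fetchall():
        print(supplier, spend)
finally:
    conn.close()
```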
**Who you are:**
+ You hold a university degree in Computer Science, Information Systems, or related disciplines.
+ You have 2-3 years of work experience, ideally in data engineering.
+ You have procurement analytics experience (preferred).
+ You have hands-on experience with Snowflake environments.
+ You are proficient in ETL/ELT technologies, DataOps, and tools such as Talend, dbt, and GitLab.
+ You have expertise in SQL and preferably Python for database querying and data transformation.
+ You have knowledge of cloud-based data solutions and infrastructure.
+ You have an understanding of data mapping and data quality management (preferred).
+ You have experience with Git for version control and GitLab for CI/CD automation (not required but advantageous).
+ You have experience with workflow automation tools such as Automate Now or Airflow (preferred).
+ You demonstrate curiosity, active listening and a willingness to experiment and test new ideas when appropriate, with the focus very much on continuous learning and improvement.
+ You are open-minded and inclusive, generously sharing ideas and knowledge, while being receptive to ideas and feedback from others.
+ You are fluent in English to a Business level.
Join our team and enable the strong capability expertise needed to meet the evolving needs of our customers.
**Who we are**
A healthier future drives us to innovate. Together, more than 100'000 employees across the globe are dedicated to advancing science, ensuring everyone has access to healthcare today and for generations to come. Our efforts result in more than 26 million people treated with our medicines and over 30 billion tests conducted using our Diagnostics products. We empower each other to explore new possibilities, foster creativity, and keep our ambitions high, so we can deliver life-changing healthcare solutions that make a global impact.
Let's build a healthier future, together.
**Roche is an Equal Opportunity Employer.**
Senior Manager - Data Engineering Lead
Posted 12 days ago
Job Description
Job Title: Senior Manager - Data Engineering Lead
Qualification: Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
Required skillset:
- Experience in data engineering.
- Proven experience in cloud platforms (AWS, Azure, or GCP) and their data services (Glue, Synapse, BigQuery, Databricks, etc.).
- Hands-on experience with tools like Apache Spark, Kafka, Airflow, dbt, and modern orchestration platforms.
Technical Skills:
- Proficient in SQL and Python/Scala/Java.
- Strong understanding of modern data warehouse and lakehouse concepts (e.g., Snowflake, Redshift, BigQuery).
- Familiarity with CI/CD, Infrastructure as Code (e.g., Terraform), and DevOps for data.
Nice to Have:
- Prior experience working in a regulated industry (alcohol, pharma, tobacco, etc.).
- Exposure to demand forecasting, route-to-market analytics, or distributor performance management.
- Knowledge of CRM, ERP, or supply chain systems (e.g., Salesforce, SAP, Oracle).
- Familiarity with marketing attribution models and campaign performance tracking.
Preferred Attributes:
- Strong analytical and problem-solving skills.
- Excellent communication and stakeholder engagement abilities.
- Passion for data-driven innovation and delivering business impact.
- Certification in cloud platforms or data engineering (e.g., Google Cloud Professional Data Engineer).
Key Accountabilities:
- Design and implement scalable, high-performance data architecture solutions aligned with enterprise strategy.
- Define standards and best practices for data modelling, metadata management, and data governance.
- Collaborate with business stakeholders, data scientists, and application architects to align data infrastructure with business needs.
- Guide the selection of technologies, including cloud-native and hybrid data architecture patterns (e.g., Lambda/Kappa architectures).
- Lead the development, deployment, and maintenance of end-to-end data pipelines using ETL/ELT frameworks (see the orchestration sketch after this section).
- Manage ingestion from structured and unstructured data sources (APIs, files, databases, streaming sources).
- Optimize data workflows for performance, reliability, and cost efficiency.
- Ensure data quality, lineage, cataloging, and security through automated validation and monitoring.
- Oversee data lake design, implementation, and daily operations (e.g., Azure Data Lake, AWS S3, GCP BigLake).
- Implement access controls, data lifecycle management, and partitioning strategies.
- Monitor and manage performance, storage costs, and data availability in real time.
- Ensure compliance with enterprise data policies and regulatory requirements (e.g., GDPR, CCPA).
- Lead and mentor a team of data engineers and architects.
- Establish a culture of continuous improvement, innovation, and operational excellence.
- Work closely with IT, DevOps, and InfoSec teams to ensure secure and scalable infrastructure.
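As a hedged sketch of the pipeline-orchestration accountability referenced above, the example below lays out a daily ELT job as an Airflow DAG. The task bodies are stubs and the DAG name is hypothetical; the `schedule` argument assumes Airflow 2.4+ (earlier versions use `schedule_interval`).

```python
# Minimal sketch of a daily ELT pipeline expressed as an Airflow DAG.
# Task logic and names are illustrative, not a production design.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Pull raw data from source systems (APIs, files, databases, streams).
    ...

def load():
    # Land the raw data in the lake/warehouse (e.g., S3, ADLS, BigQuery).
    ...

def transform():
    # Run in-warehouse transformations (e.g., dbt models or SQL).
    ...

with DAG(
    dag_id="elt_pipeline_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)

    # Dependencies: extract feeds load, which feeds transform.
    t_extract >> t_load >> t_transform
```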
Flexible Working Statement: Flexibility is key to our success. From part-time and compressed hours to different locations, our people work flexibly in ways to suit them. Talk to us about what flexibility means to you so that you’re supported from day one.
Diversity statement: Our purpose is to celebrate life, every day, everywhere. And creating an inclusive culture, where everyone feels valued and that they can belong, is a crucial part of this.
We embrace diversity in the broadest possible sense. This means that you’ll be welcomed and celebrated for who you are just by being you. You’ll be part of and help build and champion an inclusive culture that celebrates people of different gender, ethnicity, ability, age, sexual orientation, social class, educational backgrounds, experiences, mindsets, and more.
Our ambition is to create the best performing, most trusted and respected consumer products companies in the world. Join us and help transform our business as we take our brands to the next level and build new ones as part of shaping the next generation of celebrations for consumers around the world.
Sr Analytics & Insights Manager (Data Engineering)
Posted 8 days ago
Job Description
**The Position**
A healthier future. It's what drives us to innovate. To continuously advance science and ensure everyone has access to the healthcare they need today and for generations to come. Creating a world where we all have more time with the people we love. That's what makes us Roche.
Healthcare is evolving, and Global Procurement (GP) is responding by continuously striving for the highest possible performance, taking innovative and strategic approaches to business and supplier partnerships. Global Procurement proactively manages the entire supplier ecosystem, making a vital contribution to improving health outcomes, reducing costs for patients and global healthcare systems, and ensuring that Roche continues doing now what patients need next.
**The Opportunity:**
This role sits within the Enablement Chapter where we drive operational and financial effectiveness in Global Procurement by advancing talent growth and development, delivering actionable insights, fostering high engagement, and ensuring robust performance management. Our team is dedicated to enabling better outcomes and providing comprehensive support to GP leadership and chapters.
As a Senior Analytics & Insights Manager (Data Engineering) in A&I Data Solutions, you will bring structured thinking, facilitation, execution, and focus to procurement enabling and functional capabilities such as analytics, operations, governance, and strategic projects. The role will lead data engineering efforts in Global Procurement, streamlining data systems and analytics to boost operational efficiency. Using your specialized knowledge and in-depth expertise in data engineering and general procurement, you will proactively identify and drive strategies and approaches that positively impact capability and functional goals.
You will collaborate with internal procurement, finance, and other relevant colleagues to align on needs and opportunity identification to develop, enhance, or deploy functional enabling services and solutions that increase procurement's effectiveness and efficiency.
You will work closely with other team members, either as a peer coach, project or workstream lead, or team lead to embed best practices and deploy services, solutions, and frameworks to the broader procurement function.
As a Senior Analytics & Insights Manager (Data Engineering), you will play a variety of roles according to your experience, knowledge, and general business/team requirements, including but not limited to:
**Responsibilities include:**
+ Managing the transition of procurement data sources between databases (e.g. Snowflake) while ensuring data integrity.
+ Facilitating the integration of diverse procurement data systems and managing data pipelines in Snowflake to streamline data availability and accessibility.
+ Developing and optimizing sophisticated SQL queries for data manipulation, transformation, and reporting tasks.
+ Managing and maintaining complex data mappings to ensure accuracy and reliability (see the validation sketch after this list).
+ Collaborating seamlessly with key stakeholders across the procurement function to gather data requirements.
+ Addressing data-related issues with advanced troubleshooting techniques.
+ Leveraging GitLab and other orchestration tools for version control and collaboration with key stakeholders, ensuring best practices in code management and CI/CD automation.
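As a hedged illustration of the data-mapping responsibility referenced above, the sketch below is a small pandas check that every source key resolves through a mapping table before it feeds reporting. The file names and columns are hypothetical.

```python
# Hypothetical data-mapping quality check: verify every supplier_id in a
# spend extract resolves through the supplier mapping table.
import pandas as pd

spend = pd.read_csv("spend_extract.csv")        # placeholder extract
mapping = pd.read_csv("supplier_mapping.csv")   # placeholder mapping table

# Left-join spend onto the mapping; unmatched rows indicate mapping gaps.
joined = spend.merge(mapping, on="supplier_id", how="left", indicator=True)
unmapped = joined[joined["_merge"] == "left_only"]

if not unmapped.empty:
    # Surface the gaps for remediation rather than silently dropping rows.
    print(f"{len(unmapped)} spend rows lack a supplier mapping:")
    print(unmapped["supplier_id"].drop_duplicates().head(20))
else:
    print("All spend rows resolve through the supplier mapping.")
```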
**Who you are:**
+ You hold a university degree in Computer Science, Information Systems, or related disciplines.
+ You have 5-7 years of work experience with at least 3 years of experience in data engineering.
+ You have procurement analytics experience (preferred).
+ You have hands-on experience with Snowflake environments.
+ You are proficient in ETL/ELT technologies, DataOps, and tools such as Talend, dbt, and GitLab.
+ You have expertise in SQL and preferably Python for database querying and data transformation.
+ You have knowledge of cloud-based data solutions and infrastructure.
+ You have an understanding of data mapping and data quality management (preferred).
+ You have experience with Git for version control and GitLab for CI/CD automation (not required but advantageous).
+ You are experienced with workflow automation tools such as Automate Now or Airflow (preferred).
+ You demonstrate curiosity, active listening, and a willingness to experiment and test new ideas when appropriate, with the focus very much on continuous learning and improvement.
+ You are open-minded and inclusive, generously sharing ideas and knowledge, while being receptive to ideas and feedback from others.
+ You are fluent in English to a Business level.
Join our team and enable the strong capability expertise needed to meet the evolving needs of our customers.
**Who we are**
A healthier future drives us to innovate. Together, more than 100'000 employees across the globe are dedicated to advancing science, ensuring everyone has access to healthcare today and for generations to come. Our efforts result in more than 26 million people treated with our medicines and over 30 billion tests conducted using our Diagnostics products. We empower each other to explore new possibilities, foster creativity, and keep our ambitions high, so we can deliver life-changing healthcare solutions that make a global impact.
Let's build a healthier future, together.
**Roche is an Equal Opportunity Employer.**
Senior Full Stack SDE with Data Engineering for Analytics
Posted 22 days ago
Job Description
Summary
Truckmentum is seeking a Senior Full Stack Software Development Engineer (SDE) with deep data engineering experience to help us build cutting-edge software and data infrastructure for our AI-driven Trucking Science-as-a-Service platform. We’re creating breakthrough data science to transform trucking — and we’re looking for engineers who share our obsession with solving complex, real-world problems with software, data, and intelligent systems.
You’ll be part of a team responsible for the development of dynamic web applications, scalable data pipelines, and high-performance backend services that drive better decision-making across the $4 trillion global trucking industry. This is a hands-on role focused on building solutions by combining Python-based full stack development with scalable, modern data engineering.
About Truckmentum
Just about every sector of the global economy depends on trucking. In the US alone, trucks move 70%+ of all freight by weight (90%+ by value) and account for $40 billion in annual spending (globally $4+ trillion per year). Despite this, almost all key decisions in trucking are made manually by people with limited decision support. This results in significant waste and lost opportunities. We view this as a great opportunity.
Truckmentum is a self-funded seed stage venture. We are now validating our key data science breakthroughs with customer data and our MVP product launch to confirm product-market fit. We will raise $4-6 million in funding this year to scale our Data Science-as-a-Service platform and bring our vision to market at scale.
Our Vision and Approach to Technology
The back of our business cards reads “Moneyball for Trucking”, which means quantifying hard-to-quantify hidden insights, and then using those insights to make much better business decisions. If you don’t want “Moneyball for Trucking” on the back of your business card, then Truckmentum isn’t a good fit.
Great technology begins with customer obsession. We are obsessed with trucking companies' needs, opportunities, and processes, and with building our solutions into the rhythm of their businesses. We prioritize rapid development and iteration on large-scale, complex data science problems, backed by actionable, dynamic data visualizations. We believe in an Agile, lean approach to software engineering, backed by a structured CI/CD approach, professional engineering practices, clean architecture, clean code, and testing.
Our technology stack includes AWS Cloud, MySQL, Snowflake, Python, SQLAlchemy, Pandas, Streamlit, and AG Grid to accelerate development of web visualizations and interfaces.
About the Role
As a Senior Full Stack SDE with Data Engineering for Analytics, you will be responsible for designing and building the software systems, user interfaces, and data infrastructure that power Truckmentum’s analytics, data science, and decision support platform. This is a true full stack role — you’ll work across frontend, backend, and data layers using Python, Streamlit, Snowflake, and modern DevOps practices. You’ll help architect and implement a clean, extensible system that supports complex machine learning models, large-scale data processing, and intuitive business-facing applications.
You will report to the CEO (Will Payson), a transportation science expert with 25 years in trucking, who has delivered $1B+ in annual savings for FedEx and Amazon. You will also work closely with the CMO/Head of Product, Tim Liu, who has 20+ years of experience in building and commercializing customer-focused digital platforms including in logistics.
Responsibilities and Goals
- Design and build full stack applications using Python, Streamlit, and modern web frameworks to power internal tools, analytics dashboards, and customer-facing products (see the sketch after this list).
- Develop scalable data pipelines to ingest, clean, transform, and serve data from diverse sources into Snowflake and other cloud-native databases.
- Implement low-latency, high-availability backend services to support data science, decision intelligence, and interactive visualizations.
- Integrate front-end components with backend systems and ensure seamless interaction between UI, APIs, and data layers.
- Collaborate with data scientists / ML engineers to deploy models, support experimentation, and enable rapid iteration on analytics use cases.
- Define and evolve our data strategy and architecture, including schemas, governance, versioning, and access patterns across business units and use cases.
- Implement DevOps best practices, including testing, CI/CD automation, and observability, to improve reliability and reduce technical debt.
- Ensure data integrity and privacy through validation, error handling, and secure design.
- Contribute to product planning and roadmaps by working with cross-functional teams to estimate scope, propose solutions, and deliver value iteratively.
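As a hedged illustration of the first responsibility above, the sketch below wires a small Streamlit page to a SQLAlchemy query, in the spirit of the stack this posting names (Python, Streamlit, Snowflake, Pandas, SQLAlchemy). The connection URL, table, and columns are placeholders rather than Truckmentum's actual schema, and the Snowflake URL assumes the snowflake-sqlalchemy dialect is installed.

```python
# Minimal sketch of a Streamlit analytics page backed by SQLAlchemy/Pandas.
# Connection URL and schema are placeholders, not a real deployment.
import pandas as pd
import streamlit as st
from sqlalchemy import create_engine

@st.cache_data(ttl=600)  # cache query results for 10 minutes
def load_lane_metrics() -> pd.DataFrame:
    engine = create_engine("snowflake://user:pass@account/db/schema")  # placeholder
    return pd.read_sql("SELECT lane, week, cost_per_mile FROM lane_metrics", engine)

st.title("Lane Cost Dashboard")

df = load_lane_metrics()
lane = st.selectbox("Lane", sorted(df["lane"].unique()))

filtered = df[df["lane"] == lane]
st.line_chart(filtered, x="week", y="cost_per_mile")  # x/y args need a recent Streamlit
st.dataframe(filtered)
```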
Required Qualifications
- 5+ years of professional software development experience, with a proven track record of building enterprise-grade, production-ready software applications for businesses or consumers, working in an integrated development team using Agile and Git / GitHub.
- Required technology experience with the following technologies in a business context:
- Python as primary programming language (5+ years’ experience)
- Pandas, NumPy, SQL
- AWS and/or GCP cloud configuration / deployment
- Git / GitHub
- Snowflake, Redshift, and/or BigQuery
- Docker
- Airflow, Prefect or other DAG orchestration technology
- Front end engineering (e.g., HTML/CSS, JavaScript, and component-based frameworks)
- Hands-on experience with modern front-end technologies — HTML/CSS, JavaScript, and component-based frameworks (e.g., Streamlit, React, or similar).
- Experience designing and managing scalable data pipelines, data processing jobs, and ETL/ELT processes
- Experience in defining Data Architecture and Data Engineering Architecture, including robust pipelines, and building and using cloud services (AWS and/or GCP)
- Experience building and maintaining well-structured APIs and microservices in a cloud environment.
- Working knowledge of, and experience applying, data validation, privacy, and governance
- Comfort working in a fast-paced, startup environment with evolving priorities and an Agile mindset.
- Strong communication and collaboration skills — able to explain technical tradeoffs to both technical and non-technical stakeholders.
Desirable Experience (i.e., great but not required)
- Desired technology experience with the following technologies in a business context:
- Snowflake
- Streamlit
- Folium, Plotly, AG Grid
- Kubernetes
- JavaScript, CSS
- Flask, FastAPI, and SQLAlchemy
- Exposure to machine learning workflows and collaboration with data scientists or MLOps teams.
- Experience building or scaling analytics tools, business intelligence systems, or SaaS data products.
- Familiarity with geospatial data and visualization libraries (e.g., Folium, Plotly, AG Grid).
- Knowledge of CI/CD tools (e.g., GitHub Actions, Docker, Terraform) and modern DevOps practices.
- Contributions to early-stage product development — especially at high-growth startups.
- Passion for transportation and logistics, and for applying technology to operational systems.
Why Join Truckmentum
At Truckmentum, we’re not just building software — we’re rewriting the rules for one of the largest and most essential industries in the world. If you’re excited by real-world impact, data-driven decision making, and being part of a company where you’ll see your work shape the product and the business, this is your kind of team.
Some of the factors that make this a great opportunity include:
- Massive market opportunity: Trucking is a $4T+ global industry with strong customer interest in our solution.
- Real business impact: Our tech has already shown a 5% operating margin gain at pilot customers.
- Builder’s culture: You’ll help define architecture, shape best practices, and influence our direction.
- Tight feedback loop: We work directly with real customers and iterate fast.
- Tech stack you’ll love: Python, Streamlit, Snowflake, Pandas, AWS — clean, modern, focused.
- Mission-driven team: We’re obsessed with bringing "Moneyball for Trucks" to life — combining science, strategy, and empathy to make the complex simple, and the invisible visible.
We value intelligence, curiosity, humility, clean code, measurable impact, clear thinking, hard work and a focus on delivering results. If that sounds like your kind of team, we’d love to meet you.
- PS. If you read this far, we assume you are focused and detail-oriented. If you think this job sounds interesting, please fill in a free personality profile on and email a link to the outcome to to move your application to the top of the pile.