865 Freelance Talend Middleware Pipeline Expert SAP Integration Kinaxis jobs in India
Freelance – Talend Middleware Pipeline Expert (SAP Integration & Kinaxis)
Job Description
Company Description
ThreatXIntel is a startup cyber security company dedicated to protecting businesses and organizations from cyber threats. We offer a range of services, including cloud security, web and mobile security testing, cloud security assessment, DevSecOps, and more. Our mission is to provide exceptional cyber security services that give our clients peace of mind, so they can focus on growing their business.
Role Description
We are seeking an experienced Talend Middleware Pipeline Expert to lead an enterprise-wide data migration initiative, consolidating disparate SAP system data into a unified platform using Talend Data Integration. The environment is partially set up with Kinaxis Rapid Response and Talend but requires full deployment, optimization, and roll-out to additional sites.
This role is ideal for someone who can make an immediate technical impact by designing, building, and optimizing ETL processes for near real-time integration between SAP and Kinaxis, ensuring high performance, data quality, and compliance.
Key Responsibilities
- Design, develop, and maintain Talend ETL jobs between SAP and Kinaxis environments.
- Integrate SAP data via IDocs, BAPIs, RFCs, or HANA queries (an illustrative sketch follows this list).
- Collaborate with Kinaxis teams to manage data loads and interfaces.
- Automate ETL processes to enable near real-time processing in Rapid Response.
- Troubleshoot, optimize, and document data workflows for performance and quality.
- Work in SAP ECC or S/4HANA environments with AWS or Azure hosting.
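By way of illustration only (not part of the listing), the sketch below shows one way SAP data can be pulled over RFC from Python using the pyrfc library, the kind of extraction a Talend job or a custom pre-load step might perform. The connection parameters, credentials, and table name are placeholders.

```python
# Illustrative only: read a few rows from an SAP table over RFC with pyrfc.
# All connection details and the table name are placeholders.
from pyrfc import Connection

conn = Connection(
    ashost="sap-app-host.example.com",  # placeholder SAP application server
    sysnr="00",
    client="100",
    user="RFC_USER",
    passwd="********",
)

# RFC_READ_TABLE is a standard function module for simple table reads.
result = conn.call(
    "RFC_READ_TABLE",
    QUERY_TABLE="MARA",   # material master, used here only as an example
    DELIMITER="|",
    ROWCOUNT=10,
)

fields = [f["FIELDNAME"] for f in result["FIELDS"]]
for row in result["DATA"]:
    values = row["WA"].split("|")
    print(dict(zip(fields, values)))

conn.close()
```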
Required Skills & Experience
- 5+ years of hands-on Talend Middleware Pipeline development for SAP integrations.
- Proficiency in Talend Studio, Talend Cloud, and Talend Data Integration.
- Experience working with Kinaxis Rapid Response and Talend together.
- Strong SQL and API (REST/SOAP) development skills.
- SAP integration knowledge (IDoc, BAPI, RFC, HANA queries).
- Familiarity with AWS or Azure cloud environments.
Preferred
- Kinaxis Certified Consultant.
- DoD project support experience.
Job No Longer Available
This position is no longer listed on WhatJobs. The employer may be reviewing applications, may have filled the role, or may have removed the listing.
However, we have similar jobs available for you below.
Cyber Data Pipeline Engineer
Posted 1 day ago
Job Description
Experience - 7-14 years
Overview
We are seeking a skilled and motivated Data Pipeline Engineer to join our team. In this role, you will manage and maintain critical data pipeline platforms that collect, transform, and transmit cyber events data to downstream platforms, such as ElasticSearch and Splunk. You will be responsible for ensuring the reliability, scalability, and performance of the pipeline infrastructure while building complex integrations with cloud and on-premises cyber systems.
Our key stakeholders are cyber teams including security response, investigations and insider threat.
Role Profile
A successful applicant will contribute to several important initiatives including:
- Collaborate with Cyber teams to identify, onboard, and integrate new data sources into the platform.
- Design and implement data mapping, transformation, and routing processes to meet analytics and monitoring requirements.
- Develop automation tools that integrate with in-house-developed configuration management frameworks and APIs.
- Monitor the health and performance of the data pipeline infrastructure.
- Act as a top-level escalation point for complex troubleshooting, working with other infrastructure teams to resolve issues.
- Create and maintain detailed documentation for pipeline architecture, processes, and integrations.
Required Skills
- Hands-on experience deploying and managing large-scale dataflow products like Cribl, Logstash or Apache NiFi
- Hands-on experience integrating data pipelines with cloud platforms (e.g., AWS, Azure, Google Cloud) and on-premises systems.
- Hands-on experience in developing and validating field extraction using regular expressions (see the sketch after this list).
- A solid understanding of Operating Systems and Networking concepts: Linux/Unix system administration, HTTP and encryption.
- Good understanding of software version control, deployment & build tools using DevOps SDLC practices (Git, Jenkins, Jira)
- Strong analytical and troubleshooting skills
- Excellent verbal & written communication skills
- Appreciation of Agile methodologies, specifically Kanban
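As a hedged illustration of the regex field-extraction skill above (not taken from the listing), here is a minimal Python sketch that extracts fields from a made-up firewall log line and validates the match; the log format and field names are assumptions.

```python
# Illustrative only: extract fields from a hypothetical firewall log line
# with a named-group regular expression, then validate the match.
import re

LOG_PATTERN = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z)\s+"
    r"action=(?P<action>\w+)\s+"
    r"src=(?P<src_ip>\d{1,3}(?:\.\d{1,3}){3})\s+"
    r"dst=(?P<dst_ip>\d{1,3}(?:\.\d{1,3}){3})\s+"
    r"dport=(?P<dst_port>\d+)"
)

sample = "2024-05-01T12:30:45Z action=deny src=10.1.2.3 dst=203.0.113.7 dport=443"

match = LOG_PATTERN.search(sample)
if match:
    print(match.groupdict())  # {'timestamp': ..., 'action': 'deny', ...}
else:
    print("no match: line does not conform to the expected format")
```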
Desired Skills
- Enterprise experience with a distributed event streaming platform like Apache Kafka, AWS Kinesis, Google Pub/Sub, MQ
- Infrastructure automation and integration experience, ideally using Python and Ansible
- Familiarity with cybersecurity concepts, event types, and monitoring requirements.
- Experience in parsing and normalizing data in Elasticsearch using the Elastic Common Schema (ECS); a brief sketch follows.
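Likewise a small, hedged sketch (not from the listing) of normalizing an already-parsed event onto Elastic Common Schema field names before indexing; the raw field names are hypothetical, while the ECS keys are standard.

```python
# Illustrative only: map a parsed event (hypothetical source field names)
# onto Elastic Common Schema (ECS) keys prior to indexing.
raw_event = {
    "timestamp": "2024-05-01T12:30:45Z",
    "action": "deny",
    "src_ip": "10.1.2.3",
    "dst_ip": "203.0.113.7",
    "dst_port": "443",
}

ecs_event = {
    "@timestamp": raw_event["timestamp"],
    "event": {"category": ["network"], "action": raw_event["action"]},
    "source": {"ip": raw_event["src_ip"]},
    "destination": {
        "ip": raw_event["dst_ip"],
        "port": int(raw_event["dst_port"]),
    },
}

print(ecs_event)
```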
Senior - Dev-Data-Pipeline
Posted today
Job Description
KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms and are conversant with local laws, regulations, markets and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada.
KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.
Context
KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.
We are creating a strategic solution architecture horizontal team to own, translate and drive this vision into various verticals, business or technology capability block owners and strategic projects.
Job Description
Role Objective:
The Senior ETL Developer will design, develop, and optimize Talend data pipelines, ensuring the seamless integration of data from multiple sources to provide actionable insights for informed decision-making across the organization. The role requires a sound understanding of databases for storing structured and unstructured data with optimized modelling techniques, along with good exposure to the data catalog and data quality modules of a leading product (preferably Talend).
Location- Mumbai
Years of Experience - 3-5 yrs
Roles & Responsibilities:
Business Understanding: Collaborate with business analysts and stakeholders to understand business needs and translate them into ETL solutions.
Arch/Design Documentation: Develop comprehensive architecture and design documentation for data landscape.
Dev Testing & Solution: Implement and oversee development testing to ensure the reliability and performance of the solution. Provide solutions to identified issues and continuously improve application performance.
Understanding Coding Standards, Compliance & Infosecurity: Adhere to coding standards and ensure compliance with information security protocols and best practices.
Non-functional Requirements: Address non-functional requirements such as performance, scalability, security, and maintainability in the design and development of Talend-based ETL solutions.
Technical Skills:
Core Tool exposure - Talend Data Integration, Talend Data Catalog, Talend Data Quality, relational databases (PostgreSQL, SQL Server, etc.)
Core Concepts - ETL, data load strategy, data modelling, data governance and management, query optimization and performance enhancement
Cloud exposure - Experience working with at least one cloud service provider (AWS, Azure, GCP, OCI, etc.)
SQL Skills - Extensive knowledge and hands-on experience with SQL, query tuning, optimization, and best practices (an illustrative sketch follows)
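Purely as a hedged illustration of the query-tuning skill above (not from the listing), a minimal Python/psycopg2 sketch that compares a PostgreSQL query plan before and after adding an index; the table, columns, and connection details are placeholders.

```python
# Illustrative only: inspect a PostgreSQL query plan with EXPLAIN ANALYZE,
# add a supporting index, and re-check the plan. All names are placeholders.
import psycopg2

conn = psycopg2.connect(
    dbname="analytics", user="etl_user", password="********", host="localhost"
)
conn.autocommit = True
cur = conn.cursor()

query = "SELECT order_id, amount FROM sales WHERE customer_id = %s"

def show_plan():
    cur.execute("EXPLAIN ANALYZE " + query, (42,))
    for (line,) in cur.fetchall():
        print(line)

show_plan()  # without an index this typically shows a sequential scan

# A targeted index on the filter column is a common first optimization.
cur.execute("CREATE INDEX IF NOT EXISTS idx_sales_customer ON sales (customer_id)")

show_plan()  # selective filters should now use an index scan

cur.close()
conn.close()
```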
Soft Skills-
Very good communication and presentation skills
Must be able to articulate ideas clearly and convince key stakeholders
Should be able to guide and upskill team members
Good to Have:
Programming Language: Knowledge and hands-on experience with languages like Python and R.
Relevant certifications related to the role
Equal employment opportunity information
KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you.
Bachelors
Executive - Dev-Data-Pipeline
Posted today
Job Description
KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms and are conversant with local laws, regulations, markets and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada.
KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.
Context
KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.
We are creating a strategic solution architecture horizontal team to own, translate and drive this vision into various verticals, business or technology capability block owners and strategic projects.
Job Description
Role Objective:
The Senior ETL Developer will design, develop, and optimize Talend data pipelines, ensuring the seamless integration of data from multiple sources to provide actionable insights for informed decision-making across the organization. The role requires a sound understanding of databases for storing structured and unstructured data with optimized modelling techniques, along with good exposure to the data catalog and data quality modules of a leading product (preferably Talend).
Location- Mumbai
Years of Experience - 3-5 yrs
Roles & Responsibilities:
Business Understanding: Collaborate with business analysts and stakeholders to understand business needs and translate them into ETL solutions.
Arch/Design Documentation: Develop comprehensive architecture and design documentation for data landscape.
Dev Testing & Solution: Implement and oversee development testing to ensure the reliability and performance of the solution. Provide solutions to identified issues and continuously improve application performance.
Understanding Coding Standards, Compliance & Infosecurity: Adhere to coding standards and ensure compliance with information security protocols and best practices.
Non-functional Requirements: Address non-functional requirements such as performance, scalability, security, and maintainability in the design and development of Talend-based ETL solutions.
Technical Skills:
Core Tool exposure - Talend Data Integration, Talend Data Catalog, Talend Data Quality, relational databases (PostgreSQL, SQL Server, etc.)
Core Concepts - ETL, data load strategy, data modelling, data governance and management, query optimization and performance enhancement
Cloud exposure - Experience working with at least one cloud service provider (AWS, Azure, GCP, OCI, etc.)
SQL Skills - Extensive knowledge and hands-on experience with SQL, query tuning, optimization, and best practices
Soft Skills-
Very good communication and presentation skills
Must be able to articulate ideas clearly and convince key stakeholders
Should be able to guide and upskill team members
Good to Have:
Programming Language: Knowledge and hands-on experience with languages like Python and R.
Relevant certifications related to the role
Equal employment opportunity information
KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you.
Bachelors
Data Pipeline Support Engineer
Posted today
Job Description
Data Pipeline Support Engineer
Location: India based, remote work.
Time: 10:30 AM – 6:30 PM IST (aligned to US Central Time)
Rate: $20 - $25/h
About Data Meaning
Data Meaning is a front-runner in Business Intelligence and Data Analytics consulting, renowned for our high-quality consulting services throughout the US and LATAM.
Our expertise lies in delivering tailored solutions in Business Intelligence, Data Warehousing, and Project Management.
We are dedicated to bringing premier services to our diverse clientele.
We have a global team of 95+ consultants, all working remotely, embodying a collaborative, inclusive, and innovative-driven work culture.
Position Summary: We are seeking a proactive Data Pipeline Support Engineer based in India to work the 12:00–8:00 AM Central Time (US) shift. The ideal candidate must have strong hands-on experience with Azure Data Factory, Alteryx (including reruns and macros), and dbt.
This role requires someone detail-oriented, capable of independently managing early-morning support for critical workflows, and comfortable collaborating across time zones in a fast-paced data operations environment.
Responsibilities:
- Monitor and support data jobs in Alteryx, dbt, Snowflake, and ADF during the 12:00–8:00 AM CT shift.
- Perform first-pass remediation (e.g., reruns, credential resets, basic troubleshooting).
- Escalate unresolved or complex issues to nearshore Tier 2/3 support teams.
- Log all incidents and resolutions in ticketing and audit systems (ServiceNow).
- Collaborate with CT-based teams for smooth cross-timezone handoffs.
- Contribute to automation improvements (e.g., Alteryx macros, retry logic).
Required Skills & Qualifications:
- Strong, hands-on experience in Alteryx (monitoring, reruns, macros).
- Working knowledge of dbt (Data Build Tool) and Snowflake (basic SQL, Snowpark, data validation); a brief sketch follows this listing.
- Experience with Azure Data Factory (ADF) pipeline executions.
- Familiarity with SAP BW, workflow chaining, and cron-based job scheduling.
- Familiarity with data formats, languages, protocols, and architecture styles required to provide Azure-based integration solutions (for example, .NET, JSON, REST, and SOAP), including Azure Functions.
- Excellent communication skills in English (written and verbal).
- Ability to work independently and handle incident resolution with limited documentation.
Required Certifications:
- Alteryx Designer Core Certification
- SnowPro Core Certification
- dbt Fundamentals Course Certificate
- Microsoft Certified: Azure Data Engineer Associate
- ITIL v4 Foundation (or equivalent)
Preferred Certifications:
- Alteryx Designer Advanced Certification
- Alteryx Server Administration
- SnowPro Advanced: Data Engineer
- dbt Analytics Engineering Certification
- Microsoft Certified: Azure Fundamentals (AZ-900)
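As a hedged illustration only (not part of the listing), the sketch below shows the kind of first-pass Snowflake data validation such a support shift might run, using the snowflake-connector-python package; the account, credentials, and table names are placeholders.

```python
# Illustrative only: quick row-count freshness check against a Snowflake table,
# the sort of first-pass validation done before escalating a pipeline incident.
# Account, credentials, and object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.us-east-1",
    user="SUPPORT_USER",
    password="********",
    warehouse="SUPPORT_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    cur.execute(
        "SELECT COUNT(*) FROM daily_sales "
        "WHERE load_date = CURRENT_DATE()"
    )
    (row_count,) = cur.fetchone()
    if row_count == 0:
        print("No rows loaded today - escalate to Tier 2/3 support")
    else:
        print(f"Load looks healthy: {row_count} rows for today")
finally:
    conn.close()
```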
Backend and Data Pipeline Engineer
Posted 6 days ago
Job Description
The Role: Backend and Data Pipeline Engineer
The Team:
We are investing in technology to develop new products that help our customers drive their growth and transformation agenda. These include new data integration, advanced analytics, and modern applications that address new customer needs and are highly visible and strategic within the organization. Do you love building products on platforms while leveraging cutting edge technology? Do you want to deliver innovative solutions to complex problems? If so, be part of our mighty team of engineers and play a key role in driving our business strategies.
The Impact:
We stand at the crossroads of innovation through Data Products, bringing a competitive advantage to our business through the delivery of automotive forecasting solutions. Your work will contribute to the growth and success of our organization and provide valuable insights to our clients.
What’s in it for you:
We are looking for an innovative and mission-driven software/data engineer to make a significant impact by designing and developing AWS cloud-native solutions that enable analysts to forecast long- and short-term trends in the automotive industry. This role requires cutting-edge data and cloud-native technical expertise as well as the ability to work independently in a fast-paced, collaborative, and dynamic work environment.
Responsibilities:
- Design, develop, and maintain scalable data pipelines including complex algorithms
- Build and maintain backend services using Python, C#, or similar, ensuring responsiveness and high performance
- Ensure data quality and integrity through robust validation processes
- Strong understanding of data integration and data modeling concepts
- Lead data integration projects and mentor junior engineers
- Collaborate with cross-functional teams to gather data requirements
- Collaborate with data scientists and analysts to optimize data flow and storage for advanced analytics
- Take ownership of the modules you work on, deliver on time and with quality, ensure software development best practices
- Utilize Redis for caching and data storage to enhance application performance (a brief illustrative sketch follows this list).
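To illustrate the Flask-plus-Redis caching pattern named above, here is a minimal, hedged sketch under assumed names (the route, key scheme, and compute_forecast() are hypothetical, not the team's actual service):

```python
# Illustrative only: a Flask endpoint that caches an expensive lookup in Redis.
# The route, key scheme, and compute_forecast() are hypothetical placeholders.
import json

import redis
from flask import Flask, jsonify

app = Flask(__name__)
cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

def compute_forecast(region: str) -> dict:
    # Stand-in for an expensive query or model call.
    return {"region": region, "units": 125000}

@app.route("/forecast/<region>")
def forecast(region: str):
    cache_key = f"forecast:{region}"
    cached = cache.get(cache_key)
    if cached is not None:
        return jsonify(json.loads(cached))

    result = compute_forecast(region)
    # Cache for 10 minutes so repeated requests skip the expensive path.
    cache.setex(cache_key, 600, json.dumps(result))
    return jsonify(result)

if __name__ == "__main__":
    app.run(debug=True)
```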
What We’re Looking For:
- Bachelor's degree in computer science or a related field
- Strong analytical and problem-solving skills
- 7+ years of experience in Data Engineering/Advanced Analytics
- Proficiency in Python and experience with Flask for backend development.
- Strong knowledge of object-oriented programming.
- AWS Proficiency is a big plus: ECR.