865 Freelance Talend Middleware Pipeline Expert (SAP Integration & Kinaxis) Jobs in India

Freelance – Talend Middleware Pipeline Expert (SAP Integration & Kinaxis)

Nellore, Andhra Pradesh ThreatXIntel

Job Description

Company Description

ThreatXIntel is a startup cybersecurity company dedicated to protecting businesses and organizations from cyber threats. We offer a range of services, including cloud security, web and mobile security testing, cloud security assessment, DevSecOps, and more. Our mission is to provide exceptional cybersecurity services that give our clients peace of mind, so they can focus on growing their business.


Role Description

We are seeking an experienced Talend Middleware Pipeline Expert to lead an enterprise-wide data migration initiative, consolidating disparate SAP system data into a unified platform using Talend Data Integration. The environment is partially set up with Kinaxis RapidResponse and Talend but requires full deployment, optimization, and rollout to additional sites.

This role is ideal for someone who can make an immediate technical impact by designing, building, and optimizing ETL processes for near real-time integration between SAP and Kinaxis, ensuring high performance, data quality, and compliance.

Key Responsibilities

  • Design, develop, and maintain Talend ETL jobs between SAP and Kinaxis environments.
  • Integrate SAP data via IDocs, BAPIs, RFCs, or HANA queries (see the sketch after this list).
  • Collaborate with Kinaxis teams to manage data loads and interfaces.
  • Automate ETL processes to enable near real-time processing in RapidResponse.
  • Troubleshoot, optimize, and document data workflows for performance and quality.
  • Work in SAP ECC or S/4HANA environments with AWS or Azure hosting.
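
As a rough illustration of the RFC-based extraction leg of such a pipeline, the sketch below reads a few rows from an SAP table with the pyrfc library's standard RFC_READ_TABLE call. The connection parameters, table, and field list are placeholders, and a production Talend job would normally use its native SAP components rather than hand-written Python.

    # Minimal sketch, assuming pyrfc and the SAP NetWeaver RFC SDK are installed.
    # Connection details, table, and fields are placeholders.
    from pyrfc import Connection

    conn = Connection(
        ashost="sap.example.com",  # placeholder application server
        sysnr="00",
        client="100",
        user="SERVICE_USER",
        passwd="********",
    )

    # RFC_READ_TABLE returns delimited rows; here we stage a few material
    # master (MARA) fields as they might feed a Kinaxis data load.
    fields = ("MATNR", "MTART", "MATKL")
    result = conn.call(
        "RFC_READ_TABLE",
        QUERY_TABLE="MARA",
        DELIMITER="|",
        FIELDS=[{"FIELDNAME": f} for f in fields],
        ROWCOUNT=100,
    )

    rows = [
        dict(zip(fields, (v.strip() for v in r["WA"].split("|"))))
        for r in result["DATA"]
    ]
    print(rows[:3])
    conn.close()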

Required Skills & Experience

  • 5+ years of hands-on Talend middleware pipeline development for SAP integrations.
  • Proficiency in Talend Studio, Talend Cloud, and Talend Data Integration.
  • Experience with Kinaxis RapidResponse and Talend combined.
  • Strong SQL and API (REST/SOAP) development skills.
  • SAP integration knowledge (IDoc, BAPI, RFC, HANA queries).
  • Familiarity with AWS or Azure cloud environments.

Preferred

  • Kinaxis Certified Consultant.
  • DoD project support experience.


Job No Longer Available

This position is no longer listed on WhatJobs. The employer may be reviewing applications, may have filled the role, or may have removed the listing.

However, we have similar jobs available for you below.

Cyber Data Pipeline Engineer

Bengaluru, Karnataka Dexian India

Posted 1 day ago

Job Description

Experience - 7-14 years

Overview

We are seeking a skilled and motivated Data Pipeline Engineer to join our team. In this role, you will manage and maintain critical data pipeline platforms that collect, transform, and transmit cyber event data to downstream platforms such as Elasticsearch and Splunk. You will be responsible for ensuring the reliability, scalability, and performance of the pipeline infrastructure while building complex integrations with cloud and on-premises cyber systems.


Our key stakeholders are cyber teams, including security response, investigations, and insider threat.

Role Profile

A successful applicant will contribute to several important initiatives, including:

  • Collaborate with cyber teams to identify, onboard, and integrate new data sources into the platform.
  • Design and implement data mapping, transformation, and routing processes to meet analytics and monitoring requirements.
  • Develop automation tools that integrate with in-house configuration management frameworks and APIs.
  • Monitor the health and performance of the data pipeline infrastructure.
  • Serve as a top-level escalation point for complex troubleshooting, working with other infrastructure teams to resolve issues.
  • Create and maintain detailed documentation for pipeline architecture, processes, and integrations.

Required Skills

  • Hands-on experience deploying and managing large-scale dataflow products such as Cribl, Logstash, or Apache NiFi.
  • Hands-on experience integrating data pipelines with cloud platforms (e.g., AWS, Azure, Google Cloud) and on-premises systems.
  • Hands-on experience developing and validating field extraction using regular expressions (see the sketch after this list).
  • A solid understanding of operating systems and networking concepts: Linux/Unix system administration, HTTP, and encryption.
  • A good understanding of software version control and deployment/build tooling using DevOps SDLC practices (Git, Jenkins, Jira).
  • Strong analytical and troubleshooting skills.
  • Excellent verbal and written communication skills.
  • An appreciation of Agile methodologies, specifically Kanban.
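
As a hedged illustration of the regex-based field extraction called out above, the sketch below parses a firewall-style log line with named capture groups; the log format and field names are invented for the example.

    # Minimal sketch: extract and validate fields with named groups.
    # The log format and field names are illustrative only.
    import re

    LINE = "2024-05-01T12:34:56Z action=DENY src=10.0.0.5 dst=203.0.113.9 port=443"

    PATTERN = re.compile(
        r"(?P<timestamp>\S+)\s+"
        r"action=(?P<action>\w+)\s+"
        r"src=(?P<src_ip>[\d.]+)\s+"
        r"dst=(?P<dst_ip>[\d.]+)\s+"
        r"port=(?P<port>\d+)"
    )

    match = PATTERN.match(LINE)
    if match:
        print(match.groupdict())  # {'timestamp': ..., 'action': 'DENY', ...}
    else:
        print("extraction failed: pattern did not match")  # validation path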

Desired Skills

  • Enterprise experience with a distributed event streaming platform such as Apache Kafka, AWS Kinesis, Google Pub/Sub, or MQ.
  • Infrastructure automation and integration experience, ideally using Python and Ansible.
  • Familiarity with cybersecurity concepts, event types, and monitoring requirements.
  • Experience parsing and normalizing data in Elasticsearch using the Elastic Common Schema (ECS); see the sketch after this list.
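
As a rough sketch of the ECS normalization mentioned in the last item, the snippet below maps a raw vendor event onto a few real ECS fields (@timestamp, source.ip, event.action) and indexes it with the official Python Elasticsearch client; the vendor field names, index name, and cluster URL are assumptions.

    # Minimal sketch: normalize a raw event into ECS-style fields and index it.
    # Vendor keys, the index name, and the cluster URL are placeholders.
    from elasticsearch import Elasticsearch

    def to_ecs(raw: dict) -> dict:
        # Map vendor-specific keys onto Elastic Common Schema fields.
        return {
            "@timestamp": raw["ts"],
            "event": {"action": raw["act"], "category": ["network"]},
            "source": {"ip": raw["src"]},
            "destination": {"ip": raw["dst"], "port": int(raw["dport"])},
        }

    raw_event = {
        "ts": "2024-05-01T12:34:56Z", "act": "deny",
        "src": "10.0.0.5", "dst": "203.0.113.9", "dport": "443",
    }

    es = Elasticsearch("http://localhost:9200")  # placeholder cluster
    es.index(index="logs-network-default", document=to_ecs(raw_event))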


Senior - Dev-Data-Pipeline

Mumbai, Maharashtra KPMG India

Posted today

Job Description

About KPMG in India

KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms and are conversant with local laws, regulations, markets, and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada.

KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

Context

KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

We are creating a strategic solution architecture horizontal team to own this vision and translate and drive it across various verticals, business and technology capability block owners, and strategic projects.

Job Description

Role Objective:

The Senior ETL Developer will design, develop, and optimize Talend data pipelines, ensuring the seamless integration of data from multiple sources to provide actionable insights for informed decision-making across the organization. The role calls for a sound understanding of databases for storing structured and unstructured data with optimized modelling techniques, along with good exposure to the data catalog and data quality modules of a leading product (preferably Talend).

Location- Mumbai

Years of Experience - 3-5 yrs

Roles & Responsibilities:

Business Understanding: Collaborate with business analysts and stakeholders to understand business needs and translate them into ETL solutions.

Arch/Design Documentation: Develop comprehensive architecture and design documentation for the data landscape.

Dev Testing & Solution: Implement and oversee development testing to ensure the reliability and performance of the solution. Provide solutions to identified issues and continuously improve application performance.

Coding Standards, Compliance & Infosecurity: Adhere to coding standards and ensure compliance with information security protocols and best practices.

Non-functional Requirements: Address non-functional requirements such as performance, scalability, security, and maintainability in the design and development of the Talend-based ETL solution.

Technical Skills:

Core Tool Exposure - Talend Data Integration, Talend Data Catalog, Talend Data Quality, and relational databases (PostgreSQL, SQL Server, etc.)

Core Concepts - ETL, data load strategy, data modelling, data governance and management, query optimization and performance enhancement

Cloud Exposure - Experience working with at least one cloud service provider (AWS, Azure, GCP, OCI, etc.)

SQL Skills - Extensive knowledge and hands-on experience with SQL, query tuning, optimization, and best practices (see the sketch below)
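
As a small illustration of the query-tuning work this implies, the sketch below inspects a PostgreSQL query plan with EXPLAIN ANALYZE via psycopg2; the connection string and the orders table are hypothetical.

    # Minimal sketch: examine a real execution plan with psycopg2.
    # The DSN and the orders table are hypothetical placeholders.
    import psycopg2

    conn = psycopg2.connect("dbname=analytics user=etl host=localhost")
    with conn, conn.cursor() as cur:
        # EXPLAIN ANALYZE runs the query and reports actual timings and
        # row counts, the usual starting point for tuning and indexing.
        cur.execute(
            """
            EXPLAIN ANALYZE
            SELECT customer_id, SUM(amount)
            FROM orders
            WHERE order_date >= %s
            GROUP BY customer_id
            """,
            ("2024-01-01",),
        )
        for (line,) in cur.fetchall():
            print(line)
    conn.close()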

Soft Skills -

Very good communication and presentation skills

Must be able to articulate ideas clearly and convince key stakeholders

Should be able to guide and upskill team members

Good to Have:

Programming Language: Knowledge and hands-on experience with languages like Python and R.

Relevant certifications related to the role


Equal employment opportunity information

KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you.
Bachelors

Executive - Dev-Data-Pipeline

Mumbai, Maharashtra KPMG India

Posted today

Job Description

About KPMG in India

KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms and are conversant with local laws, regulations, markets, and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada.

KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

Context

KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

We are creating a strategic solution architecture horizontal team to own this vision and translate and drive it across various verticals, business and technology capability block owners, and strategic projects.

Job Description

Role Objective:

The Senior ETL Developer will design, develop, and optimize Talend data pipelines, ensuring the seamless integration of data from multiple sources to provide actionable insights for informed decision-making across the organization. The role calls for a sound understanding of databases for storing structured and unstructured data with optimized modelling techniques, along with good exposure to the data catalog and data quality modules of a leading product (preferably Talend).

Location- Mumbai

Years of Experience - 3-5 yrs

Roles & Responsibilities:

Business Understanding: Collaborate with business analysts and stakeholders to understand business needs and translate them into ETL solutions.

Arch/Design Documentation: Develop comprehensive architecture and design documentation for the data landscape.

Dev Testing & Solution: Implement and oversee development testing to ensure the reliability and performance of the solution. Provide solutions to identified issues and continuously improve application performance.

Coding Standards, Compliance & Infosecurity: Adhere to coding standards and ensure compliance with information security protocols and best practices.

Non-functional Requirements: Address non-functional requirements such as performance, scalability, security, and maintainability in the design and development of the Talend-based ETL solution.

Technical Skills:

Core Tool Exposure - Talend Data Integration, Talend Data Catalog, Talend Data Quality, and relational databases (PostgreSQL, SQL Server, etc.)

Core Concepts - ETL, data load strategy, data modelling, data governance and management, query optimization and performance enhancement

Cloud Exposure - Experience working with at least one cloud service provider (AWS, Azure, GCP, OCI, etc.)

SQL Skills - Extensive knowledge and hands-on experience with SQL, query tuning, optimization, and best practices

Soft Skills -

Very good communication and presentation skills

Must be able to articulate ideas clearly and convince key stakeholders

Should be able to guide and upskill team members

Good to Have:

Programming Language: Knowledge and hands-on experience with languages like Python and R.

Relevant certifications related to the role


Equal employment opportunity information

KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you.
Bachelors

Data Pipeline Support Engineer

Data Meaning

Posted today

Job Description

Location: India-based, remote work.
Time: 10:30 AM – 6:30 PM IST (aligned to US Central Time)
Rate: $20–$25/hour

About Data Meaning

Data Meaning is a front-runner in Business Intelligence and Data Analytics consulting, renowned for our high-quality consulting services throughout the US and LATAM.

Our expertise lies in delivering tailored solutions in Business Intelligence, Data Warehousing, and Project Management.

We are dedicated to bringing premier services to our diverse clientele.

We have a global team of 95+ consultants, all working remotely, embodying a collaborative, inclusive, and innovation-driven work culture.

Position Summary: We are seeking a proactive Data Pipeline Support Engineer based in India to work the 12:00–8:00 AM Central Time (US) shift. The ideal candidate must have strong hands-on experience with Azure Data Factory, Alteryx (including reruns and macros), and dbt.

This role requires someone detail-oriented, capable of independently managing early-morning support for critical workflows, and comfortable collaborating across time zones in a fast-paced data operations environment.

Responsibilities:

  • Monitor and support data jobs in Alteryx, dbt, Snowflake, and ADF during the 12:00–8:00 AM CT window.
  • Perform first-pass remediation (reruns, credential resets, basic troubleshooting).
  • Escalate unresolved or complex issues to nearshore Tier 2/3 support teams.
  • Log all incidents and resolutions in ticketing and audit systems (ServiceNow).
  • Collaborate with CT-based teams for smooth cross-timezone handoffs.
  • Contribute to automation improvements (e.g., Alteryx macros, retry logic); the ADF sketch after this list illustrates the pattern.
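
As a hedged sketch of the ADF rerun-and-monitor loop referenced above, the snippet below triggers a pipeline run and polls its status with the azure-mgmt-datafactory SDK; the subscription, resource group, factory, and pipeline names are placeholders.

    # Minimal sketch: trigger an ADF pipeline run and poll until it finishes.
    # Subscription, resource group, factory, and pipeline names are placeholders.
    import time

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
    RG, FACTORY = "rg-data-ops", "adf-prod"

    run = client.pipelines.create_run(RG, FACTORY, "NightlyLoad", parameters={})
    while True:
        status = client.pipeline_runs.get(RG, FACTORY, run.run_id).status
        if status not in ("Queued", "InProgress"):
            break
        time.sleep(30)  # poll every 30 seconds during the support window
    print(f"Run {run.run_id} finished with status: {status}")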

Required Skills & Qualifications:

  • Strong, hands-on experience in Alteryx (monitoring, reruns, macros).
  • Working knowledge of dbt (Data Build Tool) and Snowflake (basic SQL, Snowpark, data validation).
  • Experience with Azure Data Factory (ADF) pipeline executions.
  • Familiarity with SAP BW, workflow chaining, and cron-based job scheduling.
  • Familiarity with the data formats, languages, protocols, and architecture styles required to provide Azure-based integration solutions (for example, .NET, JSON, REST, and SOAP), including Azure Functions.
  • Excellent communication skills in English (written and verbal).
  • Ability to work independently and handle incident resolution with limited documentation.

Required Certifications:

  • Alteryx Designer Core Certification
  • SnowPro Core Certification
  • dbt Fundamentals Course Certificate
  • Microsoft Certified: Azure Data Engineer Associate
  • ITIL v4 Foundation (or equivalent)

Preferred Certifications:

  • Alteryx Designer Advanced Certification
  • Alteryx Server Administration
  • SnowPro Advanced: Data Engineer
  • dbt Analytics Engineering Certification
  • Microsoft Certified: Azure Fundamentals (AZ-900)



Backend and Data Pipeline Engineer

Karnataka JRD Systems

Posted 6 days ago

Job Viewed

Tap Again To Close

Job Description

The Role: Backend and Data Pipeline Engineer


The Team:

We are investing in technology to develop new products that help our customers drive their growth and transformation agenda. These include new data integration, advanced analytics, and modern applications that address new customer needs and are highly visible and strategic within the organization. Do you love building products on platforms while leveraging cutting-edge technology? Do you want to deliver innovative solutions to complex problems? If so, be part of our mighty team of engineers and play a key role in driving our business strategies.


The Impact:

We stand at the crossroads of innovation through Data Products, bringing a competitive advantage to our business through the delivery of automotive forecasting solutions. Your work will contribute to the growth and success of our organization and provide valuable insights to our clients.

What’s in it for you:

We are looking for an innovative and mission-driven software/data engineer to make a significant impact by designing and developing AWS cloud-native solutions that enable analysts to forecast long- and short-term trends in the automotive industry. This role requires cutting-edge data and cloud-native technical expertise, as well as the ability to work independently in a fast-paced, collaborative, and dynamic work environment.


Responsibilities:

- Design, develop, and maintain scalable data pipelines including complex algorithms

- Build and maintain backend services using Python, C#, or similar, ensuring responsiveness and high performance

- Ensure data quality and integrity through robust validation processes

- Strong understanding of data integration and data modeling concepts

- Lead data integration projects and mentor junior engineers

- Collaborate with cross-functional teams to gather data requirements

- Collaborate with data scientists and analysts to optimize data flow and storage for advanced analytics

- Take ownership of the modules you work on, deliver on time and with quality, ensure software development best practices

- Utilize Redis for caching and data storage solutions to enhance application performance (see the sketch below).
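
As a hedged sketch of the Redis caching pattern in the last bullet, the snippet below applies a cache-aside strategy to an expensive forecast computation behind a Flask endpoint; the route, key scheme, and compute_forecast stub are illustrative assumptions, not the team's actual service.

    # Minimal sketch: cache-aside with Flask and Redis.
    # The route, key scheme, TTL, and forecast stub are illustrative.
    import json

    import redis
    from flask import Flask, jsonify

    app = Flask(__name__)
    cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

    def compute_forecast(region: str) -> dict:
        # Stand-in for an expensive pipeline or model call.
        return {"region": region, "units": 12345}

    @app.route("/forecast/<region>")
    def forecast(region):
        key = f"forecast:{region}"
        hit = cache.get(key)
        if hit is not None:
            return jsonify(json.loads(hit))  # served from cache
        result = compute_forecast(region)
        cache.setex(key, 300, json.dumps(result))  # cache for 5 minutes
        return jsonify(result)

    if __name__ == "__main__":
        app.run(port=5000)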


What We’re Looking For:

- Bachelor’s degree in computer science, or a related field

- Strong analytical and problem-solving skills

- 7+ years of experience in Data Engineering/Advanced Analytics

- Proficiency in Python and experience with Flask for backend development.

- Strong knowledge of object-oriented programming.

- AWS proficiency is a big plus (e.g., ECR).
