15 Apache NiFi jobs in India

Apache NiFi Engineer

Chennai, Tamil Nadu GSSTech Group

Posted today

Job Description

We're looking for a highly skilled Integration Specialist with expertise in Apache NiFi, Java, and Spring Boot to support one of the top banks in the UAE from our offshore development center.

Education

Degree or postgraduate qualification in Computer Science or a related field (or equivalent industry experience)

Experience

  • Minimum 5 years' experience implementing end-to-end integration solutions using NiFi processors
  • Minimum 5 years' experience in Java and Spring Boot with microservices
  • Minimum 3 years' experience in application security, such as SSL certificates and cryptography
  • Minimum 2 years' experience in distributed architecture

Technical Skills

  • Excellent at designing and developing NiFi and MiNiFi flows using various processors, including failover scenarios
  • Excellent with SSL certificates, cryptography, and communication protocols such as SFTP and Site-to-Site
  • Well versed in distributed architecture using ZooKeeper
  • Excellent in Java and microservices
  • Familiar with distributed-services resiliency and monitoring in a production environment (see the illustrative sketch below)
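
For illustration only, not part of the posting: a minimal Python sketch of the kind of operational check implied by the SSL and monitoring bullets above, polling a NiFi instance's flow status over mutual TLS with the requests library. The host name, certificate paths, queue threshold, and response field names are assumptions and should be verified against the NiFi REST API documentation for the deployed version.

    import requests

    # Hypothetical NiFi host; adjust to the actual deployment.
    NIFI_STATUS_URL = "https://nifi.example.internal:8443/nifi-api/flow/status"

    def check_flow_status(cert="client.pem", key="client.key", ca="ca-bundle.pem"):
        # Mutual TLS: present a client certificate and validate the server
        # against the internal CA bundle.
        resp = requests.get(NIFI_STATUS_URL, cert=(cert, key), verify=ca, timeout=10)
        resp.raise_for_status()
        status = resp.json().get("controllerStatus", {})
        # A growing queue usually means a downstream processor or remote
        # Site-to-Site peer is unavailable and the failover path needs attention.
        if status.get("flowFilesQueued", 0) > 10_000:
            print("WARNING: FlowFiles are queuing up; check the failover path")
        return status

    if __name__ == "__main__":
        check_flow_status()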

Functional Skills

  • Experience following coding, security, unit-testing, and documentation standards and best practices
  • Experience in the banking/financial domain is highly desirable
  • Experience with Agile methodology
  • Ensure the quality of technical and application architecture and system design across the organization
  • Effectively research and benchmark technology against other best-in-class technologies

Soft Skills

  • Excellent communication skills, a positive attitude towards work, and an eagerness to learn new things
  • Self-motivated self-starter, able to own and drive work without supervision and to collaborate with teams across the organization
  • Excellent interpersonal skills, able to present ideas to senior management in IT and business alike
  • Able to train and mentor team members

Requirements

Notice Period: Immediate joiners or candidates with a maximum of 30 days' notice will be preferred.

About us

Global Software Solutions Group (GSS) is a leading, award-winning player in the field of real-time payments and has established partnerships with leading global software providers, with a vision to be a single-window provider of technology solutions to the banking industry. We are also the strategic resourcing vendor for ENBD and FAB. Our headquarters are in Dubai Internet City. Our key clients include FAB, Finance House, Al Maryah Community Bank, United Arab Bank, EDB, Lulu Exchange, Lari Exchange, and Deem Finance. Our website is gsstechgroup.com.


Freelance Interviewer - Apache NiFi

Thiruvananthapuram, Kerala Futuremug

Posted today

Job Description

Freelance Interviewer - Apache NiFi

Experience: 7 to 10 years


About the company

We are an HR tech company based in Trivandrum, offering hiring support to MNCs across India through interview, assessment, and recruitment services. We have a network of 4000+ experienced professionals who conduct interviews in their available time slots.

We're looking for experienced professionals across various domains who can take up freelance interviews for our clients. Interviews are conducted remotely, and schedules are flexible based on your availability. We are currently looking for interview panellists matching the description below.


Experience

  • Minimum 5 years' experience implementing end-to-end integration solutions using NiFi processors
  • Minimum 5 years' experience in Java and Spring Boot with microservices
  • Minimum 3 years' experience in application security, such as SSL certificates and cryptography
  • Minimum 2 years' experience in distributed architecture

Technical Skills

  • Excellent at designing and developing NiFi and MiNiFi flows using various processors, including failover scenarios
  • Excellent with SSL certificates, cryptography, and communication protocols such as SFTP and Site-to-Site
  • Well versed in distributed architecture using ZooKeeper
  • Excellent in Java and microservices
  • Familiar with distributed-services resiliency and monitoring in a production environment

Functional Skills

  • Experience following coding, security, unit-testing, and documentation standards and best practices
  • Experience in the banking/financial domain is highly desirable
  • Experience with Agile methodology
  • Ensure the quality of technical and application architecture and system design across the organization
  • Effectively research and benchmark technology against other best-in-class technologies

Soft Skills

  • Excellent communication skills, a positive attitude towards work, and an eagerness to learn new things
  • Self-motivated self-starter, able to own and drive work without supervision and to collaborate with teams across the organization
  • Excellent interpersonal skills, able to present ideas to senior management in IT and business alike
  • Able to train and mentor team members


Lead Data Engineer | Bangalore | Apache NiFi | 6+ Yrs Exp

Bengaluru, Karnataka Michael Page

Posted today

Job Description

  • Be part of a large conglomerate's central team
  • Stable and large-scale operations

About Our Client

A digital services company, part of a large Indian conglomerate, focuses on creating digital products and services for consumers and businesses.

Job Description

  • Architect and build scalable data ingestion, transformation, and processing pipelines on Azure Data Lake and Databricks (see the illustrative sketch after this listing).
  • Lead migration from legacy systems to modern, cloud-native data platforms.
  • Implement data governance and cataloging using Unity Catalog.
  • Ensure strict compliance with data privacy regulations (GDPR, DPDP) and work with InfoSec to embed security best practices.
  • Drive cloud cost optimization and performance tuning across Azure services.
  • Collaborate with cross-functional teams including Product, Analytics, DevOps, and InfoSec.
  • Mentor and guide engineers and analysts to deliver high-impact solutions.

The Successful Applicant

  • Proven expertise in Azure Cloud (Data Lake, Data Factory, Event Hubs, Key Vault).
  • Hands-on experience with Apache Spark (PySpark/Scala), Kafka, NiFi, Delta Lake, and Databricks.
  • Deep understanding of PII data encryption, tokenization, and access controls.
  • Familiarity with Unity Catalog or similar data governance tools.
  • Skilled in CI/CD, infrastructure as code (Terraform/ARM), and containerization (Docker/Kubernetes).
  • Strong analytical, communication, and leadership skills.
  • Experience delivering complex projects in Agile/Scrum environments.
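
For illustration only, not part of the posting: a hedged PySpark sketch of the kind of pipeline this role would own, reading raw events from Azure Data Lake on Databricks, tokenizing a PII column, and appending to a Delta table. The storage path, column names, and the three-level Unity Catalog table name are assumptions.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("ingest_customer_events").getOrCreate()

    # Hypothetical raw landing zone on Azure Data Lake Storage.
    raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/events/")

    cleaned = (
        raw
        .withColumn("event_date", F.to_date("event_ts"))
        # Hash the PII column so downstream consumers never see raw values.
        .withColumn("customer_id_hash", F.sha2(F.col("customer_id"), 256))
        .drop("customer_id")
    )

    # The three-level table name assumes a Unity Catalog metastore (catalog.schema.table).
    cleaned.write.format("delta").mode("append").saveAsTable("main.bronze.customer_events")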

Data Flow Engineer

Noida, Uttar Pradesh Clearwater Analytics

Posted today

Job Description

Job Summary:

The Data Flow Engineer's primary role is to create and manage data connections, perform validations, and execute transformations. Their work is integral to the ongoing process of iterative improvement, with a particular focus on enhancing auto-reconciliation within the system through advanced technology.

Responsibilities:

  • Import and validate file delivery for new clients.
  • Automate daily process monitoring and reporting.
  • Establish connections through external APIs and FTPs.
  • Ensure timely and dependable consumption of external portfolio data.
  • Normalize external datasets into a standardized Clearwater format, facilitating the intake process (see the illustrative sketch after this list).
  • Mine data from existing feeds to identify, design, and implement solutions that improve auto-reconciliation.
  • Execute improvements requested by Operations and Development groups.
  • Apply acquired skills, procedures, and decision-making best practices to resolve a variety of issues, such as normalizing new feeds and improving automation.
  • Understand, reference, and explain the general workflow, tools, and Clearwater value proposition.
  • Use critical thinking to address issues and offer solutions for both internal and external parties, ensuring best practices are employed.
  • Clearly and effectively communicate the technical aspects of Clearwater systems and our best practices to non-technical internal and external stakeholders.
  • Engage in light on-call duties.
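
For illustration only, not part of the posting: a small Python sketch of the normalization step referenced above, mapping a custodian's CSV position feed onto a standardized internal layout and validating it before intake. The column names and target layout are hypothetical, not Clearwater's actual schema.

    import pandas as pd

    # Map each custodian's column names onto one standardized internal layout.
    COLUMN_MAP = {
        "Security ID": "security_id",
        "Par Value": "par_value",
        "Trade Dt": "trade_date",
    }

    def normalize_feed(path: str) -> pd.DataFrame:
        df = pd.read_csv(path).rename(columns=COLUMN_MAP)[list(COLUMN_MAP.values())]
        # Basic validations before the file is accepted for intake.
        df["trade_date"] = pd.to_datetime(df["trade_date"], errors="raise")
        df["par_value"] = pd.to_numeric(df["par_value"], errors="raise")
        if df["security_id"].isna().any():
            raise ValueError("feed rejected: missing security identifiers")
        return df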

Required Skills:

  • Securities, accounting, and financial experience.
  • Strong understanding of SQL and relational database principles.
  • Experience with scripting languages such as Groovy, Perl, or Python.
  • Experience with industry-standard data transmission protocols preferred.
  • Strong computer skills, including proficiency in Microsoft Office.
  • Excellent attention to detail and strong documentation skills.
  • Outstanding verbal and written communication skills.
  • Strong organizational and interpersonal skills.
  • Exceptional problem-solving abilities.

Education and Experience:

  • Bachelor's degree in Math, Computer Information Systems, or other relevant degrees.
  • 2+ years of relevant experience.
  • Experience with industry-standard data transmission protocols.


Solution Architect (Network Traffic & Flow Data Systems)

Pune, Maharashtra Programming.com

Posted 1 day ago

Job Description

Job Title: Solution Architect (Network Traffic & Flow Data Systems)

Location: Pune, India (with travel to onsite)

Experience Required: 15+ years in solution architecture, with at least 5 years in telecom data systems, network traffic monitoring, or real-time data streaming platforms.

Overview:

We are seeking a senior Solution Architect to lead the design, integration, and delivery of a large-scale network traffic and data flow system.

This role is accountable for ensuring architectural integrity, zero-error tolerance, and robust fallback mechanisms across the entire solution lifecycle. The architect will oversee subscriber data capture, DPI, DR generation, Kafka integration, DWH ingestion, and secure API-based retrieval, ensuring compliance with security regulations.


Key Responsibilities:

  • Own the end-to-end architecture spanning subscriber traffic capture, DPI, DR generation, Kafka streaming, and data lake ingestion.
  • Design and document system architecture, data flow diagrams, and integration blueprints across DPI and traffic classification systems, nProbe, Kafka, Spark, and Cloudera CDP.
  • Implement fallback and error-handling mechanisms to ensure zero data loss and high availability across all layers.
  • Lead cross-functional collaboration with network engineers, Kafka developers, data platform teams, and security stakeholders.
  • Ensure data governance, encryption, and compliance using tools such as Apache Ranger, Atlas, SDX, and HashiCorp Vault.
  • Oversee API design and exposure for customer access, including advanced search, session correlation, and audit logging.
  • Drive SIT/UAT planning, performance benchmarking, and production rollout readiness.
  • Provide technical leadership across multiple vendors and internal teams, ensuring alignment with business requirements and regulatory standards.

Required Skills & Qualifications:

  • Proven experience in telecom-grade architecture involving DPI, IPFIX/NetFlow, and subscriber metadata enrichment.
  • Deep knowledge of Apache Kafka, Spark Structured Streaming, and Cloudera CDP (HDFS, Hive, Iceberg, Ranger); an illustrative sketch follows this listing.
  • Experience integrating nProbe with Kafka and downstream analytics platforms.
  • Strong understanding of QoE metrics, A/B party correlation, and application traffic classification.
  • Expertise in RESTful API design, schema management (Avro/JSON), and secure data access protocols.
  • Familiarity with network interfaces (Gn/Gi, RADIUS, DNS) and traffic filtering strategies.
  • Experience implementing fallback mechanisms, error queues, and disaster recovery strategies.
  • Excellent communication, documentation, and stakeholder management skills.
  • Cloudera Certified Architect, Kafka Developer, AWS or GCP Solution Architect, or security certifications (e.g., CISSP, CISM) will be advantageous.
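
For illustration only, not part of the posting: a hedged PySpark sketch of the Kafka-to-data-lake leg of the system described above, using Spark Structured Streaming with checkpointing for restart safety. The broker addresses, topic name, detail-record schema, and output paths are assumptions, not the actual design.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import StructType, StructField, StringType, LongType

    spark = SparkSession.builder.appName("dr_ingest").getOrCreate()

    # Minimal stand-in for a detail-record (DR) schema; a real deployment would
    # govern this centrally, for example with Avro and a schema registry.
    dr_schema = StructType([
        StructField("subscriber_id", StringType()),
        StructField("application", StringType()),
        StructField("bytes_up", LongType()),
        StructField("bytes_down", LongType()),
    ])

    records = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker1:9092")  # hypothetical brokers
        .option("subscribe", "dpi.detail-records")           # hypothetical topic
        .load()
        .select(F.from_json(F.col("value").cast("string"), dr_schema).alias("dr"))
        .select("dr.*")
    )

    # Checkpointing gives at-least-once delivery and restart safety, one piece
    # of the zero-data-loss requirement.
    (records.writeStream.format("parquet")
        .option("path", "/data/dr/bronze")                   # hypothetical landing path
        .option("checkpointLocation", "/data/dr/_checkpoints")
        .start()
        .awaitTermination())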
