24 Apache NiFi jobs in India

Apache NiFi Specialist

Hyderabad, Andhra Pradesh LTIMindtree

Posted today

Job Description

Primary Skill: Apache Kafka, NiFi Administration, Jenkins, Ansible, Linux

Secondary Skill: Python, PostgreSQL

YOE: 5-8 years

Job Description

  • Apache NiFi administration experience, such as building clusters (see the sketch below)
  • NiFi cluster version upgrades
  • Strong troubleshooting skills on Linux
  • Good knowledge of Kafka administration
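
A quick way to picture the administration side of this role: the sketch below polls a secured NiFi cluster for disconnected nodes over the REST API. The host, token handling, certificate path, and the exact endpoint and response fields are assumptions from memory (NiFi 1.x) and should be verified against the deployed version.

```python
import requests

# Hypothetical coordinator URL and bearer token (e.g. obtained from /access/token).
NIFI_API = "https://nifi-node1.example.internal:8443/nifi-api"
TOKEN = "..."  # placeholder; supply a real access token

resp = requests.get(
    f"{NIFI_API}/flow/cluster/summary",          # cluster summary endpoint (verify per version)
    headers={"Authorization": f"Bearer {TOKEN}"},
    verify="/etc/pki/tls/certs/nifi-ca.pem",     # cluster CA bundle (assumed path)
    timeout=10,
)
resp.raise_for_status()
summary = resp.json()["clusterSummary"]

print(f"connected nodes: {summary['connectedNodes']}")
if summary["connectedNodeCount"] < summary["totalNodeCount"]:
    print("WARNING: at least one node is disconnected")
```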

Apache NiFi Kafka Engineer

Bengaluru, Karnataka ₹104000 - ₹130878 Y CGI

Posted today

Job Description

Founded in 1976, CGI is among the largest independent IT and business consulting services firms in the world. With 94,000 consultants and professionals across the globe, CGI delivers an end-to-end portfolio of capabilities, from strategic IT and business consulting to systems integration, managed IT and business process services and intellectual property solutions. CGI works with clients through a local relationship model complemented by a global delivery network that helps clients digitally transform their organizations and accelerate results. CGI Fiscal 2024 reported revenue is CA$14.68 billion and CGI shares are listed on the TSX (GIB.A) and the NYSE (GIB). Learn more at

Job Title: Apache NiFi Kafka Engineer

Experience: 5-8 Years

Category: Software Development/ Engineering

Shift: General

Main location: India, Tamil Nadu, Chennai

Position ID: J

Employment Type: Full Time

Education Qualification: Bachelor's degree or higher in computer science or a related field, with a minimum of 5 years of relevant experience.

Position Description: We are seeking a highly experienced Senior Data Engineer with a strong background in building end-to-end ETL and ELT data pipelines and working across both AWS and GCP cloud platforms. The ideal candidate will have deep expertise in Big Data ecosystems, real-time and batch processing, and experience working with large-scale distributed systems.

Your future duties and responsibilities

  • Design, develop, and deploy end-to-end scalable data pipelines for ingesting, transforming, and storing structured, semi-structured, and unstructured data.
  • Work with large-scale data processing frameworks like PySpark, Spark SQL, and Spark Streaming to enable real-time and batch analytics.
  • Optimize and manage ETL pipelines, data lakes and data warehouses on AWS (S3, Athena, Redshift, RDS, Glue, EMR) and GCP (BigQuery, DataProc, GCS).
  • Build and optimize ETL/ELT workflows for ingesting, transforming, and delivering high-volume data from diverse sources to data lakes and data warehouses.
  • Develop data integration workflows using tools like Apache NiFi, Kafka, Sqoop, Hadoop, HDFS, Hive, Airflow.
  • Implement CDC, SCD Type 1, SCD Type 2, and real-time ingestion patterns (an SCD Type 2 sketch follows this list).
  • Leverage Hive for data querying and performance tuning using Partitioning, Bucketing, and Compression techniques.
  • Handle multiple data formats including Parquet, ORC, Avro, JSON, XML, and CSV.
  • Collaborate with Data Scientists, Analysts, and Product teams to translate business requirements into technical solutions.
  • Mentor junior engineers, conduct technical interviews, and contribute to building a high-performance data engineering team.
  • Exposure to CI/CD pipelines and container orchestration (Docker, Kubernetes).
  • Participate in Agile discussions, client meetings, sprint planning, and demos.
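
For the SCD Type 2 item above, a minimal stand-alone illustration in plain Python with hypothetical column names; production pipelines would typically express this as a Spark or SQL merge.

```python
from datetime import date

def scd2_upsert(dimension: list, incoming: dict, load_date: date) -> None:
    """Type 2 merge: expire the current row for the key if an attribute changed,
    then append a new current row so the history is preserved."""
    key = incoming["customer_id"]
    current = next(
        (r for r in dimension if r["customer_id"] == key and r["is_current"]),
        None,
    )
    if current is not None:
        if current["tier"] == incoming["tier"]:
            return                              # no change: keep the open row
        current["is_current"] = False           # close out the old version
        current["valid_to"] = load_date
    dimension.append({
        "customer_id": key,
        "tier": incoming["tier"],
        "valid_from": load_date,
        "valid_to": None,                       # open-ended current row
        "is_current": True,
    })

dim = [{"customer_id": 1, "tier": "silver",
        "valid_from": date(2023, 1, 1), "valid_to": None, "is_current": True}]
scd2_upsert(dim, {"customer_id": 1, "tier": "gold"}, date(2024, 6, 1))
print(dim)  # old row closed with valid_to, new 'gold' row marked current
```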

Required qualifications to be successful in this role

Must-Have Skills:

  • Proficient in PySpark, Spark Core, Spark SQL, and Spark Streaming.
  • Hands-on experience with AWS data services – S3, Redshift, Athena, Glue, EMR.
  • Strong exposure to Google Cloud Platform – BigQuery, DataProc, Cloud Spanner, GCS.
  • Expertise in Hive, HDFS, Kafka, Apache NiFi and Sqoop.
  • Experience with job orchestration tools like Apache Airflow or BMC Control-M.
  • Solid understanding of data modeling, data governance, and data quality best practices.
  • Strong problem-solving and communication skills with a collaborative mindset.

CGI is an equal opportunity employer. In addition, CGI is committed to providing accommodation for people with disabilities in accordance with provincial legislation. Please let us know if you require reasonable accommodation due to a disability during any aspect of the recruitment process and we will work with you to address your needs.

Together, as owners, let's turn meaningful insights into action.

Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you'll reach your full potential because…

You are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction.

Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise.

You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons.

Come join our team—one of the largest IT and business consulting services firms in the world.

Data Engineer – Apache NiFi

Bengaluru, Karnataka ₹600000 - ₹1800000 Y DIATOZ: Digital A to Z Solutions

Posted today

Job Description

Position Overview

We are seeking a highly skilled Data Engineer with strong expertise in Apache NiFi and Groovy scripting to design and implement robust data pipelines. The role involves building scalable data flows, automating ingestion and transformation, and integrating multiple enterprise systems using NiFi and Groovy.

Key Responsibilities

  • Design, develop, and maintain end-to-end data pipelines using Apache NiFi.
  • Write and maintain Groovy scripts for advanced data transformation, parsing, validation, and enrichment (a stand-in sketch follows this list).
  • Build reusable components and templates for data ingestion, routing, and processing.
  • Integrate NiFi with databases, APIs, messaging systems (Kafka, JMS, MQTT), and cloud platforms (AWS/Azure/GCP).
  • Implement data quality checks, schema validation, and error handling within NiFi flows.
  • Ensure data lineage, provenance, and governance compliance across all pipelines.
  • Monitor, troubleshoot, and optimize NiFi flows for scalability, performance, and reliability.
  • Work closely with Data Scientists, Analysts, and Architects to deliver business-ready datasets.
  • Support CI/CD automation for NiFi pipelines and version control using Git.
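
As a rough stand-in for the Groovy scripting item above (written here in Python for brevity; a Groovy ExecuteScript body would follow the same shape), the sketch parses CSV, validates required fields, and emits normalized JSON. Field names are hypothetical.

```python
import csv
import io
import json

REQUIRED = ("order_id", "amount", "currency")   # hypothetical required fields

def normalize(csv_text: str):
    """Split incoming CSV rows into (valid, rejected) normalized records."""
    valid, rejected = [], []
    for row in csv.DictReader(io.StringIO(csv_text)):
        if any(not row.get(field) for field in REQUIRED):
            rejected.append(row)                # e.g. route to a failure relationship
            continue
        valid.append({
            "orderId": row["order_id"].strip(),
            "amount": round(float(row["amount"]), 2),
            "currency": row["currency"].upper(),
        })
    return valid, rejected

sample = "order_id,amount,currency\n42,19.5,inr\n,3.0,usd\n"
ok, bad = normalize(sample)
print(json.dumps(ok, indent=2))
print(f"rejected rows: {len(bad)}")
```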

Required Qualifications

  • Bachelor's degree in Computer Science, Data Engineering, or related field.
  • 3+ years of hands-on experience in building pipelines with Apache NiFi.
  • Strong expertise in Groovy scripting for NiFi data transformations and custom logic.
  • Experience with data formats: JSON, XML, CSV, Avro, Parquet.
  • Solid understanding of databases (SQL/NoSQL) and API integrations.
  • Familiarity with Linux/Unix environments, shell scripting, and automation.
  • Experience with version control (Git/GitLab/Bitbucket).

Preferred / Nice to Have

  • Exposure to real-time streaming platforms (Kafka, Spark Streaming, Flink).
  • Experience with cloud data services (AWS Glue, S3, Kinesis, Azure Data Factory, GCP Dataflow).
  • Knowledge of DevOps practices (Docker, Kubernetes, Jenkins, Ansible).
  • Familiarity with monitoring/observability tools (Prometheus, Grafana, ELK stack).
  • Awareness of data governance, security, and compliance standards (GDPR, HIPAA).

Soft Skills

  • Strong analytical and problem-solving ability.
  • Ability to translate business requirements into data pipelines and flows.
  • Collaborative mindset for working with cross-functional teams.
  • Self-motivated and eager to learn emerging data engineering tools.

Compensation & Benefits

  • Competitive salary and benefits package.
  • Health insurance and employee wellness programs.
  • Hybrid/remote working options.
  • Professional development and certification sponsorship.
  • Collaborative and innovative work culture.

Freelance Interviewer - Apache NiFi

Thiruvananthapuram, Kerala Futuremug

Posted today

Job Description

Freelance Interviewer - Apache NiFi

Experience: 7 to 10 years


About the company

We are an HR Tech company based in Trivandrum, offering hiring support to MNCs across India through interview, assessment and recruitment services. We have a network of 4,000+ experienced professionals who take interviews in their available time slots.

We're looking for experienced professionals across various domains who can take up freelance interviews for our clients. Interviews are conducted remotely, and schedules are flexible based on your availability. We are currently looking for panel members matching the description below.


Experience

  • Minimum 5 years' experience in implementing end-to-end integration solutions using NiFi processors
  • Minimum 5 years' experience in Java and Spring Boot with microservices
  • Minimum 3 years' experience in application security, such as SSL certificates and cryptography
  • Minimum 2 years' experience in distributed architecture

Technical Skills

  • Excellent in designing and developing NiFi and MiNiFi flows using various processors, along with failover scenarios
  • Excellent in SSL certificates, communication protocols such as SFTP and Site-to-Site, and cryptography
  • Well versed in distributed architecture using ZooKeeper
  • Excellent in Java and microservices
  • Familiar with distributed services resiliency and monitoring in a production environment

Functional Skills

  • Experience in following best coding, security, unit-testing, and documentation standards and practices
  • Experience in the Banking/Financial domain is highly desired
  • Experience in Agile methodology
  • Ensure quality of technical and application architecture and design of systems across the organization
  • Effectively research and benchmark technology against other best-in-class technologies

Soft Skills

  • Excellent communication skills, a positive attitude towards work, and eagerness to learn new things
  • Self-motivated self-starter with the ability to own and drive things without supervision, working collaboratively with teams across the organization
  • Excellent interpersonal skills to interact with and present ideas to senior management in IT and Business alike
  • Able to train and mentor team members


Contract Data Engineer – Apache NiFi

Chennai, Tamil Nadu ₹1200000 - ₹3600000 Y FUND PIXEL ADVISORY LLP

Posted today

Job Description

Job description

We're Hiring: NiFi Data Engineer (Contract | 6–10 Years Experience)

Location: Hyderabad (Preferred) | Bangalore | Mumbai | Chennai | Ahmedabad | Pune

Work Mode: Hybrid – 2 Days/Week from Mindtrilogy Office (Mandatory)

Notice Period: Immediate Joiners Only

No. of Positions: 2

Job Type: Contract

About the Role

Mindtrilogy is seeking experienced NiFi Data Engineers (Contract) to join our data engineering team. In this role, you'll be responsible for designing and maintaining scalable data pipelines using Apache NiFi, enabling efficient data processing and transformation across our systems.

You'll work closely with cross-functional teams to power innovative solutions that drive real business outcomes.

Key Responsibilities

Core Responsibilities

Design and build robust, high-performance data pipelines using Apache NiFi

Write and maintain transformation scripts in Java and Python

Develop and manage ETL/ELT workflows for diverse data sources

Ensure reliable data ingestion, transformation, and distribution processes

Collaborate with Engineers, Developers, and Product Managers to align on data requirements

Cloud & Infrastructure

Utilize AWS services including S3, Glue, Athena, EMR, and AWS Batch

Apply data security best practices (e.g., SSL/TLS, client authentication, validation, logging)

Monitoring & Optimization

Implement monitoring, logging, and alerting for production pipelines

Troubleshoot and resolve performance and reliability issues proactively

Required Skills & Qualifications

Strong expertise in Apache NiFi and building scalable data pipelines

Proficient in Java, Python, and SQL

Experience with the AWS data stack (S3, Glue, EMR, Athena, Batch); see the sketch after this list

Solid scripting skills using Groovy or Python for custom data transformations

Strong grasp of data modeling, API design, and data governance

Familiarity with implementing security protocols and best practices
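
To make the AWS item above concrete, a small hedged sketch of running an Athena query over S3-backed data with boto3; the region, database, table, and result bucket are placeholders, not project specifics.

```python
import time
import boto3

athena = boto3.client("athena", region_name="ap-south-1")   # region is an assumption

started = athena.start_query_execution(
    QueryString="SELECT event_type, count(*) AS n FROM events GROUP BY event_type",
    QueryExecutionContext={"Database": "analytics"},                    # placeholder database
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
)
query_id = started["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])
```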

Soft Skills & Culture Fit

Strong communication and team collaboration skills

Self-driven with a proactive mindset

Ability to thrive in a hybrid work environment

Flexible, adaptable, and eager to take ownership

Apache NiFi Developer/Data Engineer

Ernakulam, Kerala ₹1500000 - ₹2800000 Y Assyst International

Posted today

Job Description

Apache NiFi, ETL/ELT processes, MySQL, PostgreSQL, AWS, Azure, GCP, Python, Bash, Hadoop, Spark, Docker, Kubernetes, GDPR, HIPAA

Lead Data Engineer | Bangalore | Apache NiFi | 6+ yrs Exp

Bengaluru, Karnataka Michael Page

Posted today

Job Description

  • Be part of a large conglomerate's central team
  • Stable and large-scale operations

About Our Client

A digital services company, part of a large Indian conglomerate, focuses on creating digital products and services for consumers and businesses.

Job Description

  • Architect and build scalable data ingestion, transformation, and processing pipelines on Azure Data Lake and Databricks.
  • Lead migration from legacy systems to modern, cloud-native data platforms.
  • Implement data governance and cataloging using Unity Catalog.
  • Ensure strict compliance with data privacy regulations (GDPR, DPDP) and work with InfoSec to embed security best practices.
  • Drive cloud cost optimization and performance tuning across Azure services.
  • Collaborate with cross-functional teams including Product, Analytics, DevOps, and InfoSec.
  • Mentor and guide engineers and analysts to deliver high-impact solutions.
The Successful Applicant

  • Proven expertise in Azure Cloud (Data Lake, Data Factory, Event Hubs, Key Vault).
  • Hands-on experience with Apache Spark (PySpark/Scala), Kafka, NiFi, Delta Lake, and Databricks.
  • Deep understanding of PII data encryption, tokenization, and access controls.
  • Familiarity with Unity Catalog or similar data governance tools.
  • Skilled in CI/CD, infrastructure as code (Terraform/ARM), and containerization (Docker/Kubernetes).
  • Strong analytical, communication, and leadership skills.
  • Experience delivering complex projects in Agile/Scrum environments.

Manager, Data Engineer - Apache NiFi, Python, PySpark, Hadoop, Cloudera platforms, and Airflow

Pune, Maharashtra Mastercard

Posted 1 day ago

Job Description

Our Purpose

Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary

Manager, Data Engineer - Apache NiFi, Python, PySpark, Hadoop, Cloudera platforms, and Airflow

Who is Mastercard?

Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential.

Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all.

Overview:

The Mastercard Account Level Management (ALM) platform empowers real-time card level decisioning. As consumers progress along their life stages as card holders, with increasing disposable income and more refined preferences, ALM provides services to issuers so they can effectively offer more relevant benefits and rewards at each stage, to drive loyalty and spend.
Role:

- Manage multiple scrum teams of software developers and testers to develop quality software solutions in a timely and cost-effective manner.
- Successfully lead definition, development and delivery of major cross-department initiatives with broad scope and long-term business implications.
- Provide technical leadership and direction to software development teams in the development of services and capabilities; understand, implement and enforce software development standards and engineering principles in the Big Data space.
- Work closely with product and architecture teams on product definition, technical design, and overall execution for the team.
- Ensure the project or effort is adequately staffed, trained and managed; ensure personnel have appropriate skills and behaviors; effectively communicate performance results; and, as necessary, manage each effort within approved manpower and budget guidelines.
- Technically manage and drive deliveries from scrum teams responsible for implementing, documenting, testing, maintaining and supporting applications in adherence with industry best practices, especially in the Big Data space.
- Develop high-quality, secure, scalable solutions based on technical requirements, specifications and design artifacts within expected time and budget.
- Work across on-premises and cloud environments (AWS, Azure, Databricks).
- Design and develop high-quality, data-driven applications and scalable data pipelines using Spark and Scala/Python/Java on Hadoop or object storage like MinIO (a sketch follows this list).
- Experience working with databases like Oracle and Netezza, with strong SQL knowledge.
- Expertise to guide the team in using SDLC tools such as Git-based version control systems, CI/CD pipelines using Jenkins, Checkmarx, test automation tools, etc.
- Proficient in working within an Agile/Scrum framework, including creating user stories with well-defined acceptance criteria and participating in sprint planning and reviews.
- Good to have: experience developing enterprise solutions (standalone applications, services and SDKs) using J2EE, JDBC (SQL and NoSQL), related technologies like Spring and the Spring framework, and microservices on cloud.
- Integrate content feeds via API, JSON, XML, and RSS from both internal and external sources.
- Support internal and external users of the applications/systems, perform production incident management, and participate in the on-call escalation pager support rotation.
- Document application components for support, and prepare training materials for Mastercard Quality Assurance and Quality Control processes.
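
A minimal PySpark sketch of the kind of batch pipeline described in the Spark item above; the paths, column names, and partition key are assumptions for illustration, not specifics of this role.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-batch").getOrCreate()

# Read raw JSON events from object storage (hypothetical path).
events = spark.read.json("s3a://example-bucket/raw/events/")

# Keep well-formed records, derive a partition column, project the fields we need.
cleaned = (
    events
    .filter(F.col("event_type").isNotNull())
    .withColumn("event_date", F.to_date("event_ts"))
    .select("account_id", "event_type", "event_date", "amount")
)

# Write curated, partitioned Parquet for downstream consumers.
(cleaned.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3a://example-bucket/curated/events/"))

spark.stop()
```
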
About You:

- Hands-on technical leader who brings considerable experience doing application development and managing teams using a broad range of technologies, and who dives deep into everything the team does.
- Demonstrates leadership, ethics and values to generate high trust with team, peers, and management.
- Must be high-energy, detail-oriented and proactive, with the ability to function under pressure in an independent environment.
- Must have a high degree of initiative and self-motivation, with a willingness and ability to learn and take on challenging opportunities.
- Strong communication skills, both verbal and written, with strong relationship-building, collaborative and organizational skills.
- Must be obsessed with results, and effectively communicate objectives and how success will be measured to the team and other stakeholders.
- Has strong decision-making skills, leads retrospectives, and continually improves as a result.
- Understands how to guide an engineer's career, including performance evaluation, coaching, and motivation.
Corporate Security Responsibility

All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:

+ Abide by Mastercard's security policies and practices;
+ Ensure the confidentiality and integrity of the information being accessed;
+ Report any suspected information security violation or breach; and
+ Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

Data Flow Engineer

Noida, Uttar Pradesh ₹80000 - ₹120000 Y Clearwater Analytics

Posted today

Job Description

Job Summary:

The Data Flow Engineer's primary role is to create and manage data connections, perform validations, and execute transformations. Their work is integral to the ongoing process of iterative improvement, with a particular focus on enhancing auto-reconciliation within the system through advanced technology.

Responsibilities:

  • Import and validate file delivery for new clients.
  • Automate daily process monitoring and reporting.
  • Establish connections through external APIs and FTPs (a sketch follows this list).
  • Ensure timely and dependable consumption of external portfolio data.
  • Normalize external datasets into a standardized Clearwater format, facilitating the intake process.
  • Mine data from existing feeds to identify, design, and implement solutions to improve auto-reconciliation.
  • Execute improvements requested by Operations and Development groups.
  • Apply acquired skills, procedures, and decision-making best practices to resolve various issues, such as normalizing new feeds and improving automation.
  • Understand and reference or explain the general workflow, tools, and Clearwater value proposition.
  • Use critical thinking to address issues and offer solutions for both internal and external parties, ensuring best practices are employed.
  • Clearly and effectively communicate the technical aspects of Clearwater systems and our best practices with non-technical internal and external stakeholders.
  • Engage in light on-call duties.
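
A hedged illustration of the "establish connections" and "normalize external datasets" duties above: fetch a custodian file over FTP and map it onto a standard layout. The host, credentials, file name, and column names are hypothetical.

```python
import csv
import io
from ftplib import FTP

def fetch_csv(host: str, user: str, password: str, remote_path: str) -> str:
    """Download a remote file over FTP and return it as text."""
    buf = io.BytesIO()
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.retrbinary(f"RETR {remote_path}", buf.write)
    return buf.getvalue().decode("utf-8")

def normalize_positions(raw_csv: str) -> list:
    """Map a custodian's column names onto a standardized internal layout."""
    rows = csv.DictReader(io.StringIO(raw_csv))
    return [
        {
            "security_id": r["CUSIP"].strip(),        # hypothetical source columns
            "quantity": float(r["UNITS"]),
            "market_value": float(r["MKT_VAL"]),
        }
        for r in rows
    ]

# Usage (placeholders):
# positions = normalize_positions(fetch_csv("ftp.custodian.example", "user", "secret", "positions.csv"))
```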

Required Skills:

  • Securities, accounting, and financial experience.
  • Strong understanding of SQL and relational database principles.
  • Experience with scripting languages such as Groovy, Perl or Python.
  • Experience with industry-standard data transmission protocols preferred.
  • Strong computer skills, including proficiency in Microsoft Office.
  • Excellent attention to detail and strong documentation skills.
  • Outstanding verbal and written communication skills.
  • Strong organizational and interpersonal skills.
  • Exceptional problem-solving abilities.

Education and Experience:

  • Bachelor's degree in Math, Computer Information Systems, or other relevant degrees.
  • 2+ years of relevant experience.
  • Experience with industry-standard data transmission protocols.
