419 Hadoop jobs in Delhi

Hadoop Administrator

Delhi, Delhi · ₹1500000 - ₹2500000 · Redian Software

Posted today


Job Description

Hadoop Admin with experience in HDFS, YARN, Hive, Spark tuning, Linux, scripting & AWS (EMR, Glue, Athena, S3).

Responsible for performance tuning, security, high availability (HA), automation, and cluster monitoring.
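A minimal sketch of the AWS-side health check such a role implies, using boto3 (the region, credentials, and cluster states here are illustrative assumptions, not from the posting):

```python
# Hypothetical sketch: poll Amazon EMR cluster health with boto3.
# Assumes AWS credentials are configured in the environment.
import boto3

emr = boto3.client("emr", region_name="ap-south-1")  # region is an assumption

# List clusters that are up, then inspect each for state and master DNS.
resp = emr.list_clusters(ClusterStates=["RUNNING", "WAITING"])
for summary in resp["Clusters"]:
    detail = emr.describe_cluster(ClusterId=summary["Id"])["Cluster"]
    state = detail["Status"]["State"]
    dns = detail.get("MasterPublicDnsName", "no master DNS yet")
    print(f"{detail['Name']}: {state} ({dns})")
```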


Hadoop Administrator

Delhi, Delhi · ₹600000 - ₹1200000 · Outworx Solutions

Posted today


Job Description

Level: L3

  • Administer, support Linux systems in large-scale production container environments.
  • Automate infrastructure using Ansible; manage Hadoop clusters and containers.
  • Monitor performance via Grafana/Prometheus (see the sketch after this list); ensure system configuration compliance.
  • Collaborate cross-functionally; strong communication and scripting skills are essential.
  • Support DR/BCP, maintain HIPAA/PHI compliance, ensure infrastructure security.
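To ground the monitoring bullet above, a minimal sketch that pulls a CPU metric from Prometheus's HTTP query API (the endpoint and PromQL expression are hypothetical):

```python
# Hypothetical sketch: query Prometheus for per-instance CPU usage.
import requests

PROMETHEUS = "http://prometheus.internal:9090"  # assumed endpoint
QUERY = '100 - avg by (instance) (rate(node_cpu_seconds_total{mode="idle"}[5m])) * 100'

resp = requests.get(f"{PROMETHEUS}/api/v1/query", params={"query": QUERY}, timeout=10)
resp.raise_for_status()
for result in resp.json()["data"]["result"]:
    instance = result["metric"].get("instance", "unknown")
    cpu_pct = float(result["value"][1])  # instant vector: [timestamp, value]
    print(f"{instance}: {cpu_pct:.1f}% CPU")
```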

ICT Expert/ Data Analysis Expert

Delhi, Delhi · ₹1500000 - ₹2000000 · National Institute for Smart Government (NISG)

Posted today


Job Description

ICT Expert / Data Analysis Expert:

Common functions:

i. To assist the Network Planning Group (NPG) on matters related to PM Gati Shakti and the National Logistics Policy (NLP).

ii. To provide expertise for the integration of interconnected multimodal transport networks and infrastructure for the efficient movement of people, goods, and services.

iii. To assist in improving decision-making through effective logistics data analytics and standardization through streamlining of processes.

iv. Assessment of project proposals included in the PM Gati Shakti National Master Plan in consultation with other domain-specific SMEs, officers of the TSU, Logistics Division, and Line Ministries/Departments.

v. Examination/evaluation of projects on PM Gati Shakti principles such as logistics efficiency, utility to economic clusters, an integrated approach to planning, a multimodal perspective, and an area development approach.

vi. Knowledge/awareness of Government policies, regulations, best practices, and principles, to the extent relevant to the subject matter/PM Gati Shakti/National Logistics Policy, with a focus on sustainable and inclusive development principles.

vii. Preparation of reports, presentations, and other communication materials to convey findings, recommendations, and implementation strategies, and to play a role in capacity building.

viii. Knowledge of project management tools and techniques, as well as digital technologies such as GIS, transportation modelling software, etc., during evaluation of project proposals.

ix. Economic nodes are important for the critical evaluation of infrastructure projects; accordingly, a basic understanding of economic nodes, i.e. SEZs, CFSs, ICDs, industrial nodes, Export Oriented Districts, etc., is desirable.

x. To collaborate with government agencies, private sector partners, and international organizations to develop innovative solutions for critical gaps in integrated planning.

xi. Any other work related to PM Gati Shakti, the Network Planning Group, and the National Logistics Policy in particular, and the logistics sector in general.

  • Expert assessment of all project proposals included in the PM Gati Shakti NMP, in consultation with other domain-specific SMEs and Directors in the Technical Support Unit (TSU) and Logistics Division, as well as with respective Line Ministries, from the standpoint of synchronization of efforts

  • Undertaking interaction with users, Central/State Governments, and other stakeholders to identify gap areas

  • Assessment of LEADS and sectoral reports for identification of gap areas
  • Assistance in appraisal/identification of telecom connectivity projects
  • To coordinate maintenance of ICT systems with the Logistics Division and interaction with BISAG-N for mapping of projects/new projects
  • Development of data analytics for decision support
  • Coordination for development of IT tools for decision support
  • Coordination w.r.t. data collection, updating, and storage/maintenance
  • Preparation of monthly analytical reports on TSU functioning
  • Monitoring and tracking of action points identified in EGoS and NPG meetings, Inter-Ministerial Meetings, and meetings with State Governments
  • Generate customized reports as per compliance requirements for Parliament / PMO / Cabinet Secretariat / Office of C&IM, etc.

Freelancer - Python with Data analysis

Delhi, Delhi · ₹400000 - ₹600000 · Biz Tech Consultants

Posted today


Job Description

We are looking for freelancers experienced in Python with strong expertise in Statistics or Data Analysis.

Skills include: Python (Pandas, NumPy, SciPy, Scikit-learn), R, SQL, statistical modeling, hypothesis testing, regression, and classification.
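As a small, self-contained illustration of that stack (synthetic data, not a client task), pairing a SciPy hypothesis test with a scikit-learn regression:

```python
# Sketch: a two-sample t-test (SciPy) and a linear regression (scikit-learn)
# on synthetic data.
import numpy as np
from scipy import stats
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Hypothesis test: do the two samples share a mean?
a = rng.normal(0.0, 1.0, 200)
b = rng.normal(0.3, 1.0, 200)
t_stat, p_value = stats.ttest_ind(a, b)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# Regression: fit y = 2x + noise and recover the slope.
X = rng.uniform(0, 10, size=(200, 1))
y = 2 * X[:, 0] + rng.normal(0, 0.5, 200)
model = LinearRegression().fit(X, y)
print(f"slope ~ {model.coef_[0]:.2f}, intercept ~ {model.intercept_:.2f}")
```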


Data Scientist - Natural Language Processing

East Of Kailash, Delhi · Esri

Posted today


Job Description

Overview

Esri is the world leader in geographic information systems (GIS) and developer of ArcGIS, the leading mapping and analytics software used in 75 percent of Fortune 500 companies. At the Esri R&D Center-New Delhi, we are applying cutting-edge AI and deep learning techniques to revolutionize geospatial analysis and derive insight from imagery and location data. We are passionate about applying data science and artificial intelligence to solve some of the world's biggest challenges.

Our team develops tools, APIs, and AI models for geospatial analysts and data scientists, enabling them to leverage the latest research in spatial data science, AI and geospatial deep learning.

As a Data Scientist, you will develop deep learning models using libraries such as PyTorch and create APIs and tools for training and deploying them on satellite imagery. If you are passionate about deep learning applied to remote sensing and GIS, developing AI and deep learning models, and love maps or geospatial datasets/imagery, this is the place to be!
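For flavor, here is a generic PyTorch training step of the kind such work builds on; this is a sketch with a stand-in model and synthetic image chips, not Esri's actual tooling or APIs:

```python
# Sketch: one supervised training step on synthetic 64x64 RGB image chips.
import torch
import torch.nn as nn

model = nn.Sequential(          # stand-in for a pretrained backbone + new head
    nn.Flatten(),
    nn.Linear(3 * 64 * 64, 128),
    nn.ReLU(),
    nn.Linear(128, 10),         # 10 hypothetical land-cover classes
)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

images = torch.randn(8, 3, 64, 64)   # synthetic batch
labels = torch.randint(0, 10, (8,))

model.train()
optimizer.zero_grad()
loss = loss_fn(model(images), labels)
loss.backward()
optimizer.step()
print(f"batch loss: {loss.item():.3f}")
```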

Responsibilities

  • Develop tools, APIs and pretrained models for geospatial AI
  • Integrate ArcGIS with popular deep learning libraries such as PyTorch
  • Fine-tune large language models (LLMs) for geospatial AI tasks and develop AI agents and assistants
  • Develop APIs and model architectures for natural language processing and deep learning on unstructured text
  • Author and maintain geospatial data science samples using ArcGIS and machine learning/deep learning libraries
  • Curate and pre/post-process data for deep learning models and transform it into geospatial information
  • Perform comparative studies of various deep learning model architectures
Requirements

  • 2 to 6 years of experience with Python in data science and deep learning
  • Self-learner with coursework in and extensive knowledge of machine learning and deep learning
  • Experience with Python machine learning and deep learning libraries such as PyTorch, Scikit-learn, NumPy, Pandas
  • Expertise in one or more of these areas: transformer-based models; large language models and experience building applications using them; NLP tasks such as recommender systems, summarization, and more
  • Experience in data visualization in Jupyter Notebooks using matplotlib and other libraries
  • Experience with hyperparameter-tuning and training models to a high level of accuracy
  • Bachelor's in computer science, engineering, or related disciplines from IITs and other top-tier engineering colleges
  • Existing work authorization for India
Recommended Qualifications

  • Familiarity with ArcGIS suite of products and concepts of GIS
  • Familiarity and experience using langchain/AutoGPT/BabyAGI

About Esri

At Esri, diversity is more than just a word on a map. When employees of different experiences, perspectives, backgrounds, and cultures come together, we are more innovative and ultimately a better place to work. We believe in having a diverse workforce that is unified under our mission of creating positive global change. We understand that diversity, equity, and inclusion is not a destination but an ongoing process. We are committed to the continuation of learning, growing, and changing our workplace so every employee can contribute to their life's best work. Our commitment to these principles extends to the global communities we serve by creating positive change with GIS technology. For more information on Esri's Racial Equity and Social Justice initiatives, please visit our website.

If you don't meet all of the preferred qualifications for this position, we encourage you to still apply!

Esri is an equal opportunity employer (EOE) and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law. If you need reasonable accommodation for any part of the employment process, please email and let us know the nature of your request and your contact information. Please note that only those inquiries concerning a request for reasonable accommodation will be responded to from this e-mail address.

Esri takes our responsibility to protect your privacy seriously. We are committed to respecting your privacy by providing transparency in how we acquire and use your information, giving you control of your information and preferences, and holding ourselves to the highest national and international standards, including CCPA and GDPR compliance.

Requisition ID: -


Big Data Engineer

Delhi, Delhi · ₹150000 - ₹200000 · Qcentrio

Posted today


Job Description

Work Location: Pan India

Experience: 6+ years

Notice Period: Immediate to 30 days

Mandatory Skills: Big Data, Python, SQL, Spark/PySpark, AWS Cloud
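As a rough illustration of the mandatory stack, a minimal PySpark job that reads from S3, aggregates with SQL, and writes partitioned Parquet (the bucket and paths are hypothetical):

```python
# Sketch: S3 -> Spark SQL aggregate -> partitioned Parquet on S3.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/raw/orders/")  # hypothetical path
orders.createOrReplaceTempView("orders")

daily = spark.sql("""
    SELECT order_date, SUM(amount) AS total_amount, COUNT(*) AS order_count
    FROM orders
    GROUP BY order_date
""").withColumn("load_ts", F.current_timestamp())

daily.write.mode("overwrite").partitionBy("order_date") \
    .parquet("s3://example-bucket/curated/daily_orders/")
```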

JD and required skills & responsibilities:

• Actively participate in all phases of the software development lifecycle, including requirements gathering, functional and technical design, development, testing, roll-out, and support.

• Solve complex business problems by utilizing a disciplined development methodology.

• Produce scalable, flexible, efficient, and supportable solutions using appropriate technologies.

• Analyse the source and target system data, and map the transformations that meet the requirements.

• Interact with the client and onsite coordinators during different phases of a project.

• Design and implement product features in collaboration with business and technology stakeholders.

• Anticipate, identify, and solve issues concerning data management to improve data quality.

• Clean, prepare, and optimize data at scale for ingestion and consumption.

• Support the implementation of new data management projects and re-structure the current data architecture.

• Implement automated workflows and routines using workflow scheduling tools.

• Understand and use continuous integration, test-driven development, and production deployment frameworks.

• Participate in design, code, test plans, and dataset implementation performed by other data engineers in support of maintaining data engineering standards.

• Analyze and profile data for the purpose of designing scalable solutions.

• Troubleshoot straightforward data issues and perform root cause analysis to proactively resolve product issues.

Required Skills:

• 5+ years of relevant experience developing data and analytics solutions.

• Experience building data lake solutions leveraging one or more of the following: AWS EMR, S3, Hive & PySpark.

• Experience with relational SQL.

• Experience with scripting languages such as Python.

• Experience with source control tools such as GitHub and the related dev process.

• Experience with workflow scheduling tools such as Airflow.

• In-depth knowledge of AWS Cloud (S3, EMR, Databricks).

• A passion for data solutions.

• A strong problem-solving and analytical mindset.

• Working experience in the design, development, and testing of data pipelines.

• Experience working with Agile teams.

• Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders.

• Able to quickly pick up new programming languages, technologies, and frameworks.

• Bachelor's degree in Computer Science.


Big Data Engineer

Delhi, Delhi · ₹90000 - ₹120000 · Qcentrio

Posted today


Job Description

We are seeking an experienced and driven Data Engineer with 5+ years of hands-on experience in building scalable data infrastructure and systems. You will play a key role in designing and developing robust, high-performance ETL pipelines and managing large-scale datasets to support critical business functions. This role requires deep technical expertise, strong problem-solving skills, and the ability to thrive in a fast-paced, evolving environment.

Key Responsibilities:

• Design, develop, and maintain scalable and reliable ETL/ELT pipelines for processing large volumes of data (terabytes and beyond).

• Model and structure data for performance, scalability, and usability.

• Work with cloud infrastructure (preferably Azure) to build and optimize data workflows.

• Leverage distributed computing frameworks like Apache Spark and Hadoop for large-scale data processing.

• Build and manage data lake/lakehouse architectures in alignment with best practices.

• Optimize ETL performance and manage cost-effective data operations.

• Collaborate closely with cross-functional teams including data science, analytics, and software engineering.

• Ensure data quality, integrity, and security across all stages of the data lifecycle.

Required Skills & Qualifications:

• 7 to 10 years of relevant experience in big data engineering.

• Advanced proficiency in Python.

• Strong skills in SQL for complex data manipulation and analysis.

• Hands-on experience with Apache Spark, Hadoop, or similar distributed systems.

• Proven track record of handling large-scale datasets (TBs) in production environments.

• Cloud development experience with Azure (preferred), AWS, or GCP.

• Solid understanding of data lake and data lakehouse architectures.

• Expertise in ETL performance tuning and cost optimization techniques.

• Knowledge of data structures, algorithms, and modern software engineering practices.

Soft Skills:

• Strong communication skills with the ability to explain complex technical concepts clearly and concisely.

• Self-starter who learns quickly and takes ownership.

• High attention to detail with a strong sense of data quality and reliability.

• Comfortable working in an agile, fast-changing environment with incomplete requirements.

Preferred Qualifications:

• Experience with tools like Apache Airflow, Azure Data Factory, or similar (a brief Airflow sketch follows the skills line below).

• Familiarity with CI/CD and DevOps in the context of data engineering.

• Knowledge of data governance, cataloging, and access control principles.

Skills: Python, SQL, AWS, Azure, Hadoop
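To ground the Apache Airflow item above, a minimal sketch of a two-task DAG; the DAG id is hypothetical, the task bodies are stubs, and the `schedule` argument assumes Airflow 2.4+:

```python
# Sketch: a two-task extract -> transform DAG in Apache Airflow.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from the source system")  # stub

def transform():
    print("clean and aggregate the extracted data")  # stub

with DAG(
    dag_id="daily_etl",              # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t1 >> t2                         # run transform after extract
```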


Senior Big Data Engineer

Delhi, Delhi · Veltris

Posted today


Job Description

Veltris is a Digital Product Engineering Services partner committed to driving technology-enabled transformation across enterprises, businesses, and industries. We specialize in delivering next-generation solutions for sectors including healthcare, technology, communications, manufacturing, and finance.

With a focus on innovation and acceleration, Veltris empowers clients to build, modernize, and scale intelligent products that deliver connected, AI-powered experiences. Our experience-centric approach, agile methodologies, and exceptional talent enable us to streamline product development, maximize platform ROI, and drive meaningful business outcomes across both digital and physical ecosystems.

In a strategic move to strengthen our healthcare offerings and expand industry capabilities, Veltris has acquired BPK Technologies. This acquisition enhances our domain expertise, broadens our go-to-market strategy, and positions us to deliver even greater value to enterprise and mid-market clients in healthcare and beyond.

Position: Senior Big Data Engineer

Must have Big Data analytics platform experience.

• Key stacks: Spark, Druid, Drill, ClickHouse.

• 8+ years of experience in Python/Java, CI/CD, infrastructure & cloud, and Terraform, plus depth in:

  o Big Data pipelines: Spark, Kafka, Glue, EMR, Hudi, Schema Registry, Data Lineage (see the sketch after this list).

  o Graph DBs: Neo4j, Neptune, JanusGraph, Dgraph.
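As a minimal sketch of the Kafka leg of such a pipeline, using the kafka-python client (the broker address and topic are hypothetical):

```python
# Sketch: publish JSON events to a Kafka topic with kafka-python.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="kafka.internal:9092",                  # assumed broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("clickstream-events", {"user_id": 42, "action": "page_view"})
producer.flush()  # block until the message is actually delivered
```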

Preferred Qualifications:

• Master's degree (M.Tech/MS) or Ph.D. in Computer Science, Information Technology, Data Science, Artificial Intelligence, Machine Learning, Software Engineering, or a related technical field.

• Candidates with an equivalent combination of education and relevant industry experience will also be considered.

Disclaimer:

The information provided herein is for general informational purposes only and reflects the current strategic direction and service offerings of Veltris. While we strive for accuracy, Veltris makes no representations or warranties regarding the completeness, reliability, or suitability of the information for any specific purpose. Any statements related to business growth, acquisitions, or future plans, including the acquisition of BPK Technologies, are subject to change without notice and do not constitute a binding commitment. Veltris reserves the right to modify its strategies, services, or business relationships at its sole discretion. For the most up-to-date and detailed information, please contact Veltris directly.

GCP Big Data Engineer

Delhi, Delhi · Talentmatics

Posted 11 days ago


Job Description

We are seeking an experienced GCP Big Data Engineer with 8–10 years of expertise in designing, developing, and optimizing large-scale data processing solutions. The ideal candidate will bring strong leadership capabilities, technical depth, and a proven track record of delivering end-to-end big data solutions in cloud environments.

Key Responsibilities:

• Lead and mentor teams in designing scalable and efficient ETL pipelines on Google Cloud Platform (GCP).
• Drive best practices for data modeling, data integration, and data quality management.
• Collaborate with stakeholders to define data engineering strategies aligned with business goals.
• Ensure high performance, scalability, and reliability in data systems using SQL and PySpark.

Must-Have Skills:

• GCP expertise in data engineering services (BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage); a brief sketch follows this list.
• Strong programming in SQL & PySpark.
• Hands-on experience in ETL pipeline design, development, and optimization.
• Strong problem-solving and leadership skills with experience guiding data engineering teams.
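A minimal sketch of the BigQuery piece of that stack, using the google-cloud-bigquery client (the project, dataset, and table names are hypothetical):

```python
# Sketch: run a small aggregation in BigQuery and print the rows.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

query = """
    SELECT event_date, COUNT(*) AS events
    FROM `example-project.analytics.events`
    GROUP BY event_date
    ORDER BY event_date DESC
    LIMIT 7
"""
for row in client.query(query).result():
    print(row.event_date, row.events)
```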

Qualification:

• Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
• Relevant certifications in GCP Data Engineering preferred.
