72 Data Architect jobs in Delhi

Data Architect

Delhi, Delhi Deloitte

Posted 4 days ago

Job Description

Your potential, unleashed.


India’s impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realise your potential amongst cutting-edge leaders and organisations shaping the future of the region and, indeed, the world beyond.


At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose, and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.


The team

As a member of the Operations Transformations team, you will embark on an exciting and fulfilling journey with a group of intelligent, innovative, and globally aware individuals.

We work in conjunction with various institutions, solving key business problems across a broad spectrum of roles and functions, all set against the backdrop of constant industry change.


Your work profile


Job Title: Data Architect

Skills

  • Design, develop, and maintain scalable data pipelines and architecture for data integration and transformation.
  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and ensure architecture aligns with business goals.
  • Utilize Python and PySpark to process, transform, and analyze large volumes of structured and unstructured data (see the illustrative sketch after this list).
  • Define and enforce data modeling standards and best practices.
  • Ensure the security, reliability, and performance of data systems.
  • Work with cloud-based data platforms (e.g., AWS, Azure, GCP) and big data technologies as required.
  • Develop and maintain metadata, data catalogs, and data lineage documentation.
  • Monitor and troubleshoot performance issues related to data pipelines and architecture.
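
As a rough illustration of the PySpark-based processing called out in the skills above, here is a minimal sketch of a cleansing-and-curation step. The bucket paths, column names, and schema are hypothetical and not taken from the posting.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal PySpark curation sketch (hypothetical paths and columns).
spark = SparkSession.builder.appName("orders_curation").getOrCreate()

# Ingest raw, semi-structured order events from a landing zone.
raw = spark.read.json("s3://example-bucket/raw/orders/")

# Cleanse and conform: drop rows missing keys, normalise types,
# derive a partition-friendly date column, and de-duplicate on the key.
curated = (
    raw.dropna(subset=["order_id", "order_ts"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
)

# Write the curated layer partitioned by date for analytics consumers.
curated.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/orders/"
)
```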


Qualifications:

  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • 5 to 8 years of hands-on experience in Data Architect roles.
  • Strong proficiency in Python and/or PySpark for data transformation and ETL processes.
  • Experience with distributed data processing frameworks like Apache Spark.
  • Experience working with relational and NoSQL databases (e.g., PostgreSQL, Cassandra, MongoDB).
  • Familiarity with data governance, security, and compliance principles.
  • Experience with CI/CD pipelines, version control (e.g., Git), and Agile methodologies.


How you’ll grow


Connect for impact


Our exceptional team of professionals across the globe is solving some of the world’s most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.


Empower to lead


You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.


Inclusion for all


At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters.





Drive your career


At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one-size-fits-all career path, and global, cross-business mobility and upskilling/reskilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.



Everyone’s welcome… entrust your happiness to us

Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here’s a glimpse of things that are in store for you.


Interview tips


We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable. To help you with your interview, we suggest that you do your research, know some background about the organisation and the business area you’re applying to. Check out recruiting tips from Deloitte professionals.


Data Architect

New Delhi, Delhi R Systems

Posted 11 days ago

Job Description

Job Title: Data Engineering Architect

Experience: 10-16 Years

Location: Pune & Noida

Work Mode: Hybrid, Full-time


Key Responsibilities


• Data Migration & Modernization

• Lead the migration of data pipelines, models, and workloads in Redshift.

• Design and implement landing, staging, and curated data zones to support scalable ingestion and consumption patterns.

• Evaluate and recommend tools and frameworks for migration, including file formats, ingestion tools, and orchestration.

• Design and build robust ETL/ELT pipelines using Python, SQL, and orchestration tools.

• Support both batch and streaming pipelines, with real-time processing via RudderStack or Spark Structured Streaming.

• Build modular, reusable, and testable pipeline components that handle high volume and ensure data integrity.

• Define and implement data modeling strategies (star, snowflake, normalization/denormalization) for analytics and BI layers.

• Implement strategies for data versioning, late-arriving data, and slowly changing dimensions.

• Implement automated data validation and anomaly detection using tools like dbt tests, Great Expectations, or custom checks (see the sketch after this list).

• Build logging and alerting into pipelines to monitor SLA adherence, data freshness, and pipeline health.

• Contribute to data governance initiatives including metadata tracking, data lineage, and access control.
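
To illustrate the automated validation mentioned above, here is a minimal sketch of a custom data-quality check written in Python. The table columns and thresholds are hypothetical; in practice the same rules could be expressed declaratively as dbt tests or Great Expectations suites.

```python
import pandas as pd

def validate_orders(df: pd.DataFrame) -> list[str]:
    """Return a list of data-quality failures for a (hypothetical) orders extract."""
    failures = []

    # Completeness: key columns must not contain nulls.
    for col in ("order_id", "order_ts", "amount"):
        nulls = int(df[col].isna().sum())
        if nulls:
            failures.append(f"{col}: {nulls} null values")

    # Uniqueness: order_id is the business key.
    dupes = int(df["order_id"].duplicated().sum())
    if dupes:
        failures.append(f"order_id: {dupes} duplicate values")

    # Validity: amounts must be non-negative.
    negative = int((df["amount"] < 0).sum())
    if negative:
        failures.append(f"amount: {negative} negative values")

    # Freshness: the newest record should be less than 24 hours old.
    lag = pd.Timestamp.now(tz="UTC") - pd.to_datetime(df["order_ts"], utc=True).max()
    if lag > pd.Timedelta(hours=24):
        failures.append(f"order_ts: latest record is {lag} old")

    return failures
```

A pipeline step would run such a check on every new batch and alert or fail the run when the returned list is non-empty, which also provides a natural hook for the SLA-adherence and data-freshness monitoring described above.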

Required Skills & Experience


• 10+ years in data engineering roles with increasing responsibility.

• Proven experience leading data migration or re-platforming projects.

• Strong command of Python and SQL for data pipeline development.

• Experience working with dbt models.

• Hands-on experience with modern data platforms like PostgreSQL and Redshift.

• Proficient in building streaming pipelines with tools like Kafka and RudderStack.

• Deep understanding of data modeling, partitioning, indexing, and query optimization.

• Expertise with Apache Airflow for ETL orchestration (see the DAG sketch after this list).

• Comfortable working with large datasets, resolving performance bottlenecks, and optimizing table structures.

• Experience in designing data validation frameworks and implementing DQ rules.

• Strong understanding of GitHub and code migration techniques.

• Familiarity with reporting tools like Tableau and Power BI.

• Knowledge of the financial domain, preferably loans.
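
To illustrate the Apache Airflow orchestration referenced above, here is a minimal, Airflow 2.x-style DAG sketch. The DAG id, schedule, and task callables are hypothetical placeholders for the actual extract, load, and transform steps.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical task callables; in a real pipeline these would trigger the
# ingestion job, the Redshift load, and the dbt/SQL transformations.
def extract(**context): ...
def load(**context): ...
def transform(**context): ...

with DAG(
    dag_id="orders_daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)

    # Linear dependency chain: extract -> load -> transform.
    t_extract >> t_load >> t_transform
```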




Data Architect

New Delhi, Delhi Antal International

Posted today

Job Description

Summary role description:

Hiring a Data Architect for one of the world's leading independent providers of audit & assurance, tax, and advisory services.

Company description:

Our client is a well-established, U.S.-headquartered firm and one of the world's leading independent providers of audit & assurance, tax, and advisory services. With a global presence in 146 countries, they are equipped to support organizations of all sizes, both public and private, in tackling today’s complex challenges. The firm specializes in Audit, Tax, Advisory Services, Consulting, Operations, Supply Chain, Public Sector, Manufacturing, and Strategy. As part of their growth strategy, they are expanding their team in India to strengthen their market presence and tap into emerging opportunities.

Role details:

  • Title / Designation: Data Architect
  • Location: Delhi
  • Work Mode: 5 days in office

Role & responsibilities:

  • Create database solutions, evaluate requirements, and prepare design reports.
  • Design and implement database models to store and retrieve data.
  • Identify structural needs by assessing operations, applications, and programming.
  • Ensure database implementation complies with regulations.
  • Install and organize information systems.
  • Prepare database design and architecture reports for management.
  • Oversee data migration from legacy systems.
  • Monitor performance through regular tests, troubleshooting, and feature integration.
  • Recommend improvements for new and existing systems.
  • Train staff, provide individual support, and resolve system issues promptly.

Candidate requirements:

  • Bachelor's in engineering/technology or an equivalent degree, or MCA with a BCA/equivalent degree.
  • Minimum 10+ years of experience in data modelling, PostgreSQL, and SQL Server, with 2-3 years of experience in the data architecture space.
  •  Minimum 5+ years of experience in government projects
  • Experience in the water sector is good to have.
  • Preferred candidates from Tier-1 or Tier-2 colleges.

Selection process:

  • Two Technical Interviews
  • HR Discussions

Recruiter details:

  • Janu –


    Data Architect

    New Delhi, Delhi ERM

    Posted today

    Job Description

    An exciting opportunity has emerged for a seasoned Data Architect to become a vital member of our ERM Technology team. You will report to the Lead Enterprise Architect and join a dynamic team focused on delivering corporate and technology strategic initiatives. The role demands high-level analytical, problem-solving, and communication skills, along with a strong commitment to customer service. As the Data Architect for ERM, you will work closely with both business and technology stakeholders, utilizing your expertise in business intelligence, analytics, data engineering, data management, and data integration to significantly advance our data strategy and ecosystem.

    Key responsibilities include:

  •  Empowered to define the data and information management architecture for ERM.
  •  Collaborate with product owners, engineers, data scientists, and business stakeholders to understand data needs across the full product lifecycle.
  •  Ensure a shared understanding of our data, including its quality, ownership, and lineage throughout its lifecycle, from initial capture via client interaction to final consumption by internal and external processes and stakeholders.
  • Ensure that our data landscape effectively meets corporate and regulatory reporting requirements.
  •  Establish clear ownership and governance for comprehensive data domain models, encompassing both data in motion and data at rest.
  •  Provide expert guidance on solution architecture, engineering principles, and the implementation of data applications utilizing both existing and cutting-edge technology platforms.
  • Build a robust data community by collaborating with architects and engineers, leveraging this community to implement solutions that enhance client and business outcomes through data.

    The successful candidate will have:

  • Proven experience as an enterprise data architect.
  • Experience in end-to-end implementation of data-intensive analytics-based projects encompassing data acquisition, ingestion, integration, transformation and consumption. 
  • Proven experience in the design, development, and implementation of data engineering technologies.
  • Strong knowledge of data management and governance principles.
  •  A strong understanding of Azure and AWS service landscapes, particularly data services.
  • Proven experience with various data modelling techniques.
  • Understanding of big data architectures and emerging trends in technology.
  • A solid familiarity with Agile methodologies, test-driven development, source control management, and automated testing.

    ERM does not accept recruiting agency resumes. Please do not forward resumes to our jobs alias, ERM employees or any other company location. ERM is not responsible for any fees related to unsolicited resumes.

    ERM is proud to be an Equal Employment Opportunity employer. We do not discriminate based upon race, religion, color, national origin, gender, sexual orientation, gender identity, age, marital status or disability status.


    Data Architect

    New Delhi, Delhi Talent Nexa Consulting

    Posted today

    Job Description

    We're Hiring for Our Client! | Data Architect | Remote | Up to 1 Cr + ESOPs



    Our client, a leading B2B/SaaS company, is looking for a Data Architect to drive innovation, scalability, and efficiency in data & platform engineering. This is a high-impact role where you'll play a crucial part in shaping the company's data infrastructure and strategy.



    What to Expect in This Role:
    • Hands-on individual contributor role initially, with the opportunity to grow into a management leadership position.
    • Architect and optimize high-scale, distributed data systems.
    • Design and implement scalable data pipelines, APIs, and data infrastructure.
    • Define and execute the data strategy & roadmap, collaborating with cross-functional teams.
    • Ensure data governance, security, and compliance best practices.
    • Preference will be given to candidates from product-based companies due to the complexity and scale of the role.



    Key Requirements:

    - Minimum 12 years in software/data engineering (2+ years in an architect role)

    - Strong expertise in distributed systems, big data, and cloud platforms (AWS/GCP/Azure)

    - Hands-on experience with Kafka, Spark, SQL, NoSQL, Data Lakes, ETL Pipelines (see the streaming sketch after this list)

    - Deep understanding of system design, algorithms, and data governance

    - Education: B.Tech from a Tier 1 institute preferred

    - Product-based company experience preferred
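
As a rough sketch of the Kafka-plus-Spark stack listed above, the following reads a Kafka topic with Spark Structured Streaming and lands it in a data lake as Parquet. The broker address, topic name, and storage paths are hypothetical, and the Spark Kafka connector package is assumed to be on the classpath.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_stream").getOrCreate()

# Subscribe to a hypothetical Kafka topic as a streaming source.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka delivers key/value as binary; decode the payload and keep the event time.
decoded = events.select(
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp").alias("event_ts"),
)

# Append the decoded stream to the data lake with checkpointing for recovery.
query = (
    decoded.writeStream.format("parquet")
    .option("path", "s3://example-bucket/lake/events/")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/events/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```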


    Why Join?

    • Start as an individual contributor and transition into a leadership position as the team scales.
    • Opportunity to work on large-scale distributed data systems that power high-performance applications.
    • Be part of a team that values innovation, problem-solving, and cutting-edge technology.
    • Competitive compensation up to 1 Cr + ESOPs and a flexible remote work environment.

    Send your resume to
