143 Data Architect jobs in Noida
Data Architect
Posted 4 days ago
Job Description
Your potential, unleashed.
India’s impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realise your potential amongst cutting-edge leaders and organisations shaping the future of the region, and indeed, the world beyond.
At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose, and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.
The team
As a member of the Operations Transformations team, you will embark on an exciting and fulfilling journey with a group of intelligent, innovative, and globally aware individuals.
We work in conjunction with various institutions, solving key business problems across a broad spectrum of roles and functions, all set against the backdrop of constant industry change.
Your work profile
Job Title: Data Architect
Skills
- Design, develop, and maintain scalable data pipelines and architecture for data integration and transformation.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and ensure architecture aligns with business goals.
- Utilize Python and PySpark to process, transform, and analyze large volumes of structured and unstructured data (a minimal sketch follows this list).
- Define and enforce data modeling standards and best practices.
- Ensure the security, reliability, and performance of data systems.
- Work with cloud-based data platforms (e.g., AWS, Azure, GCP) and big data technologies as required.
- Develop and maintain metadata, data catalogs, and data lineage documentation.
- Monitor and troubleshoot performance issues related to data pipelines and architecture.
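For illustration, here is a minimal PySpark sketch of the kind of pipeline work described above; the bucket paths and the event_id/event_ts columns are hypothetical assumptions, not details from the posting.

```python
# Minimal PySpark sketch: ingest raw JSON, standardize, deduplicate,
# and publish partitioned Parquet for analytics consumers.
# Paths and column names (event_id, event_ts) are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curate-events").getOrCreate()

# Ingest raw, semi-structured JSON into a DataFrame.
raw = spark.read.json("s3://example-bucket/raw/events/")

# Standardize timestamps, drop null keys, deduplicate on the business
# key, and derive a partition column.
curated = (
    raw
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .withColumn("event_date", F.to_date("event_ts"))
    .filter(F.col("event_id").isNotNull())
    .dropDuplicates(["event_id"])
)

# Write partitioned Parquet for downstream analytics consumers.
(curated.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-bucket/curated/events/"))
```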
Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 5 to 8 years of hands-on experience in data architecture roles.
- Strong proficiency in Python and/or PySpark for data transformation and ETL processes.
- Experience with distributed data processing frameworks like Apache Spark.
- Experience working with relational and NoSQL databases (e.g., PostgreSQL, Cassandra, MongoDB).
- Familiarity with data governance, security, and compliance principles.
- Experience with CI/CD pipelines, version control (e.g., Git), and Agile methodologies.
How you’ll grow
Connect for impact
Our exceptional team of professionals across the globe are solving some of the world’s most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.
Empower to lead
You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.
Inclusion for all
At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters.
Drive your career
At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one-size-fits-all career path, and global, cross-business mobility and upskilling/reskilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.
Everyone’s welcome… entrust your happiness to us
Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here’s a glimpse of things that are in store for you.
Interview tips
We want job seekers exploring opportunities at Deloitte to feel prepared, confident, and comfortable. To help you with your interview, we suggest that you do your research and learn some background about the organisation and the business area you’re applying to. Check out recruiting tips from Deloitte professionals.
Data Architect
Posted 11 days ago
Job Description
Job Title: Data Engineering Architect
Experience: 10-16 Years
Location: Pune & Noida
Work Mode: Hybrid, Full-time
Key Responsibilities
• Data Migration & Modernization
• Lead the migration of data pipelines, models, and workloads to Redshift.
• Design and implement landing, staging, and curated data zones to support scalable ingestion and consumption patterns.
• Evaluate and recommend tools and frameworks for migration, including file formats, ingestion tools, and orchestration.
• Design and build robust ETL/ELT pipelines using Python, SQL, and orchestration tools.
• Support both batch and streaming pipelines, with real-time processing via RudderStack or Spark Structured Streaming.
• Build modular, reusable, and testable pipeline components that handle high volume and ensure data integrity.
• Define and implement data modeling strategies (star, snowflake, normalization/denormalization) for analytics and BI layers.
• Implement strategies for data versioning, late-arriving data, and slowly changing dimensions.
• Implement automated data validation and anomaly detection (using tools like dbt tests, Great Expectations, or custom checks; a minimal custom check is sketched after this list).
• Build logging and alerting into pipelines to monitor SLA adherence, data freshness, and pipeline health.
• Contribute to data governance initiatives including metadata tracking, data lineage, and access control.
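As a sketch of the "custom checks" route mentioned in the validation bullet above, the following checks freshness, completeness, and uniqueness on a pandas batch. The event_id/event_ts columns and the thresholds are illustrative assumptions, and event_ts is assumed to be stored as timezone-aware UTC; dbt tests or Great Expectations would cover the same ground declaratively.

```python
# A minimal "custom checks" sketch: freshness, null-rate, and uniqueness
# assertions on a batch. Columns and thresholds are illustrative.
import pandas as pd


def validate_batch(df: pd.DataFrame, ts_col: str = "event_ts",
                   key_col: str = "event_id",
                   max_lag_hours: int = 24,
                   max_null_rate: float = 0.01) -> list:
    """Return a list of human-readable failures; an empty list means pass."""
    failures = []

    # Freshness: the newest record must be recent enough to meet the SLA.
    # Assumes ts_col holds timezone-aware UTC timestamps.
    lag = pd.Timestamp.now(tz="UTC") - df[ts_col].max()
    if lag > pd.Timedelta(hours=max_lag_hours):
        failures.append(f"stale data: newest {ts_col} lags by {lag}")

    # Completeness: the business key should rarely be null.
    null_rate = df[key_col].isna().mean()
    if null_rate > max_null_rate:
        failures.append(f"{key_col} null rate {null_rate:.2%} over threshold")

    # Uniqueness: the business key should not repeat within a batch.
    if df[key_col].dropna().duplicated().any():
        failures.append(f"duplicate {key_col} values in batch")

    return failures
```

In a pipeline, a non-empty return value would typically fail the run and fire an alert, which is how the logging/alerting responsibility above gets enforced in practice.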
Required Skills & Experience
• 10+ years in data engineering roles with increasing responsibility.
• Proven experience leading data migration or re-platforming projects.
• Strong command of Python and SQL for data pipeline development.
• Experience working with dbt models.
• Hands-on experience with modern data platforms like PostgreSQL and Redshift.
• Proficient in building streaming pipelines with tools like Kafka and RudderStack.
• Deep understanding of data modeling, partitioning, indexing, and query optimization.
• Expertise with Apache Airflow for ETL orchestration (a minimal DAG sketch follows this list).
• Comfortable working with large datasets, resolving performance bottlenecks, and optimizing table structures.
• Experience in designing data validation frameworks and implementing DQ rules.
• Strong understanding of GitHub and code migration techniques.
• Familiarity with reporting tools like Tableau and Power BI.
• Knowledge of the financial domain, preferably loans.
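To make the orchestration expectation concrete, here is a minimal Airflow 2.x TaskFlow sketch; the daily schedule, task bodies, and storage paths are illustrative assumptions rather than details from the posting.

```python
# Minimal Airflow 2.x (2.4+) TaskFlow DAG: extract -> transform -> load.
# Schedule, paths, and task bodies are illustrative placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_etl():
    @task
    def extract() -> str:
        # Pull a batch from the source system; return its staging path.
        return "s3://example-bucket/staging/batch.parquet"

    @task
    def transform(path: str) -> str:
        # Clean and model the batch (e.g., a dbt run or Spark job).
        return path.replace("staging", "curated")

    @task
    def load(path: str) -> None:
        # COPY the curated batch into the warehouse (e.g., Redshift).
        print(f"loading {path}")

    # TaskFlow wires task dependencies through return values.
    load(transform(extract()))


example_etl()
```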
Data Architect
Posted today
Job Description
An exciting opportunity has emerged for a seasoned Data Architect to become a vital member of our ERM Technology team. You will report to the Lead Enterprise Architect and join a dynamic team focused on delivering corporate and technology strategic initiatives. The role demands high-level analytical, problem-solving, and communication skills, along with a strong commitment to customer service. As the Data Architect for ERM, you will work closely with both business and technology stakeholders, utilizing your expertise in business intelligence, analytics, data engineering, data management, and data integration to significantly advance our data strategy and ecosystem.
Key responsibilities include:
The successful candidate will have:
ERM does not accept recruiting agency resumes. Please do not forward resumes to our jobs alias, ERM employees or any other company location. ERM is not responsible for any fees related to unsolicited resumes.
ERM is proud to be an Equal Employment Opportunity employer. We do not discriminate based upon race, religion, color, national origin, gender, sexual orientation, gender identity, age, marital status or disability status.
Data Architect
Posted today
Job Description
We're Hiring for Our Client! | Data Architect | Remote | Up to 1 Cr + ESOPs
Our client, a leading B2B/SaaS company, is looking for a Data Architect to drive innovation, scalability, and efficiency in data & platform engineering. This is a high-impact role where you'll play a crucial part in shaping the company's data infrastructure and strategy.
- Hands-on individual contributor role initially, with the opportunity to grow into a management/leadership position.
- Architect and optimize high-scale, distributed data systems.
- Design and implement scalable data pipelines, APIs, and data infrastructure.
- Define and execute the data strategy & roadmap, collaborating with cross-functional teams.
- Ensure data governance, security, and compliance best practices.
- Preference will be given to candidates from product-based companies due to the complexity and scale of the role.
Key Requirements:
- Minimum 12 years in software/data engineering (2+ years in an architect role)
- Strong expertise in distributed systems, big data, and cloud platforms (AWS/GCP/Azure)
- Hands-on experience with Kafka, Spark, SQL, NoSQL, data lakes, and ETL pipelines (a streaming sketch follows this list)
- Deep understanding of system design, algorithms, and data governance
- Education: B.Tech from a Tier 1 institute preferred
- Product-based company experience preferred
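As a hedged sketch tying the Kafka and Spark requirements together, the following Spark Structured Streaming job reads from a hypothetical Kafka topic and lands it in a data-lake zone. The broker, topic, and paths are assumptions, and the job needs the spark-sql-kafka connector package available at submit time.

```python
# Spark Structured Streaming from Kafka to a landing zone (sketch).
# Requires the spark-sql-kafka connector; names/paths are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

# Subscribe to a Kafka topic; Kafka delivers key/value as binary.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
    .select(F.col("value").cast("string").alias("payload"),
            F.col("timestamp"))
)

# Append the decoded stream to the lake, checkpointing so the job
# can recover exactly where it left off after a failure.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3://example-bucket/landing/events/")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/events/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```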
Why Join?
- Initial individual contributor role, transitioning into a leadership position as the team scales.
- Opportunity to work on large-scale distributed data systems that power high-performance applications.
- Be part of a team that values innovation, problem-solving, and cutting-edge technology.
- Competitive compensation up to 1 Cr + ESOPs and a flexible remote work environment.
Send your resume to
Data Architect
Posted today
Job Description
Education: BE/B.Tech/Master of Computer Applications
Technical:
Design and implement effective database solutions and data models to store and retrieve data.
Hands-on experience in the design of reporting schemas and data marts and the development of reporting solutions.
Prepare scalable database designs and architecture in terms of defining multi-tenant schemas, data ingestion, data transformation, and data aggregation models.
Should have expertise and working experience in at least two ETL tools among Informatica, SSIS, Talend, and Matillion.
Should have expertise and working experience in at least two DBMS/appliances among Redshift, SQL Server, PostgreSQL, and Oracle.
Should have strong Data Warehousing, Reporting and Data Integration fundamentals.
Advanced technical expertise with SQL (a minimal reporting-query sketch follows this list).
Experience with AWS/Azure cloud data stores and their DB/DW-related service offerings.
Should have knowledge and experience of big data technologies (Hadoop ecosystem) and NoSQL databases.
Should have technical expertise and working experience in at least two reporting tools among Power BI, Tableau, Jaspersoft, and QlikView/QlikSense.
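To make the SQL/reporting expectation concrete, here is a minimal sketch that runs an aggregate reporting query against a PostgreSQL-compatible warehouse (Redshift speaks the same wire protocol) via psycopg2; the DSN and the star-schema tables fact_sales/dim_date are hypothetical.

```python
# Minimal reporting-query sketch against a PostgreSQL/Redshift-style
# warehouse. DSN and star-schema table names are illustrative.
import psycopg2

REPORT_SQL = """
    SELECT d.calendar_month,
           SUM(f.amount) AS total_amount
    FROM   fact_sales f
    JOIN   dim_date d ON d.date_key = f.date_key
    GROUP  BY d.calendar_month
    ORDER  BY d.calendar_month;
"""

# The connection context manager commits (or rolls back) the transaction.
with psycopg2.connect("dbname=analytics user=reporter host=localhost") as conn:
    with conn.cursor() as cur:
        cur.execute(REPORT_SQL)
        for month, total in cur.fetchall():
            print(month, total)
```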