104 Data Architect jobs in Delhi
Data Architect
Posted today
Job Description
Your potential, unleashed.
India’s impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realise your potential amongst cutting-edge leaders and organisations shaping the future of the region and, indeed, the world beyond.
At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.
The team
As a member of the Operations Transformations team, you will embark on an exciting and fulfilling journey with a group of intelligent, innovative, and globally aware individuals.
We work in conjunction with various institutions, solving key business problems across a broad spectrum of roles and functions, all set against the backdrop of constant industry change.
Your work profile
Job Title: Data Architect
Skills
- Design, develop, and maintain scalable data pipelines and architecture for data integration and transformation.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and ensure architecture aligns with business goals.
- Utilize Python and PySpark to process, transform, and analyze large volumes of structured and unstructured data (a minimal PySpark sketch follows this list).
- Define and enforce data modeling standards and best practices.
- Ensure the security, reliability, and performance of data systems.
- Work with cloud-based data platforms (e.g., AWS, Azure, GCP) and big data technologies as required.
- Develop and maintain metadata, data catalogs, and data lineage documentation.
- Monitor and troubleshoot performance issues related to data pipelines and architecture.
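To make the Python/PySpark expectation above concrete, here is a minimal, hedged sketch of the kind of ingest-and-transform step such a pipeline typically contains; the paths, column names, and app name are hypothetical placeholders, not details from this posting.

```python
# Minimal PySpark sketch: ingest raw events, standardise, and write a curated table.
# Paths, column names, and the target location are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("curate_events").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/events/")     # assumed landing zone

curated = (
    raw
    .withColumn("event_ts", F.to_timestamp("event_time"))    # normalise timestamps
    .withColumn("event_date", F.to_date("event_ts"))         # derive partition column
    .dropDuplicates(["event_id"])                             # basic de-duplication
    .filter(F.col("event_id").isNotNull())                    # drop unusable rows
)

(curated
 .write
 .mode("overwrite")
 .partitionBy("event_date")
 .parquet("s3://example-bucket/curated/events/"))             # assumed curated zone
```

On a real engagement the source and sink locations, schema handling, and de-duplication rules would come from the client's platform standards rather than the placeholders used here.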
Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 5 to 8 years of hands-on experience in Data Architect roles.
- Strong proficiency in Python and/or PySpark for data transformation and ETL processes.
- Experience with distributed data processing frameworks like Apache Spark.
- Experience working with relational and NoSQL databases (e.g., PostgreSQL, Cassandra, MongoDB).
- Familiarity with data governance, security, and compliance principles.
- Experience with CI/CD pipelines, version control (e.g., Git), and Agile methodologies.
How you’ll grow
Connect for impact
Our exceptional team of professionals across the globe are solving some of the world’s most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.
Empower to lead
You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.
Inclusion for all
At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters.
Drive your career
At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one-size-fits-all career path, and global, cross-business mobility and up-skilling / re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.
Everyone’s welcome… entrust your happiness to us
Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here’s a glimpse of things that are in store for you.
Interview tips
We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable. To help you with your interview, we suggest that you do your research, know some background about the organisation and the business area you’re applying to. Check out recruiting tips from Deloitte professionals.
Data Architect
Posted today
Job Description
Role & Responsibilities
- Lead and mentor a team of data engineers, ensuring high performance and career growth.
- Architect and optimize scalable data infrastructure, ensuring high availability and reliability.
- Drive the development and implementation of data governance frameworks and best practices.
- Work closely with cross-functional teams to define and execute a data roadmap.
- Optimize data processing workflows for performance and cost efficiency.
- Ensure data security, compliance, and quality across all data platforms.
- Foster a culture of innovation and technical excellence within the data team.
Ideal Candidate
- 12-17 years of experience in software/data engineering, with at least 3 years in a leadership role; must be able to join within a 60-day notice period.
- Expertise in backend development with programming languages such as Java, PHP, Python, Node.JS, GoLang, JavaScript, HTML, and CSS.
- Proficiency in SQL, Python, and Scala for data processing and analytics.
- Strong understanding of cloud platforms (AWS, GCP, or Azure) and their data services.
- Strong foundation and expertise in HLD and LLD, as well as design patterns, preferably using Spring Boot or Google Guice
- Experience in big data technologies such as Spark, Hadoop, Kafka, and distributed computing frameworks (a minimal Kafka-to-Spark streaming sketch follows this list).
- Hands-on experience with data warehousing solutions such as Snowflake, Redshift, or BigQuery
- Deep knowledge of data governance, security, and compliance (GDPR, SOC2, etc.).
- Experience in NoSQL databases like Redis, Cassandra, MongoDB, and TiDB.
- Familiarity with automation and DevOps tools like Jenkins, Ansible, Docker, Kubernetes, Chef, Grafana, and ELK.
- Proven ability to drive technical strategy and align it with business objectives.
- Strong leadership, communication, and stakeholder management skills.
- Must have spent a minimum of 3 years at each previous company.
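Since Spark and Kafka appear together in the list above, the following minimal sketch shows what a basic Structured Streaming ingest might look like. It assumes a reachable Kafka broker, the spark-sql-kafka connector on the classpath, and entirely hypothetical topic, schema, and sink paths.

```python
# Minimal Spark Structured Streaming sketch: Kafka topic -> parsed JSON -> parquet sink.
# Broker, topic, schema, and paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, LongType

spark = SparkSession.builder.appName("kafka_ingest").getOrCreate()

event_schema = StructType([
    StructField("user_id", StringType()),
    StructField("action", StringType()),
    StructField("ts", LongType()),
])

stream = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")        # assumed broker address
    .option("subscribe", "user-events")                       # assumed topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")                                            # flatten parsed JSON
)

query = (
    stream.writeStream
    .format("parquet")
    .option("path", "/tmp/user_events")                       # assumed sink path
    .option("checkpointLocation", "/tmp/checkpoints/user_events")
    .outputMode("append")
    .start()
)
# query.awaitTermination()  # uncomment to block the driver until the stream stops
```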
Preferred Qualifications:
- Experience in machine learning infrastructure or MLOps is a plus.
- Exposure to real-time data processing and analytics.
- Interest in data structures, algorithm analysis and design, multicore programming, and scalable architecture.
- Prior experience in a SaaS or high-growth tech company.
Data Architect
Posted today
Job Description
Mandatory: Strong Data Architect, Lead Data Engineer, or Engineering Manager / Director profile
- Mandatory (Experience 1) - Must have 10+ years of experience in Data Engineering roles, with at least 2+ years in a leadership role.
- Mandatory (Experience 2) - Must have 7+ years of experience in hands-on tech development with Java (highly preferred) or Python, Node.JS, GoLang.
- Mandatory (Experience 3) - Must have recent 4+ years of experience with high-growth product startups, and should have implemented Data Engineering systems from an early stage in the company.
- Mandatory (Experience 4) - Must have strong experience in large data technologies and tools like HDFS, YARN, Map-Reduce, Hive, Kafka, Spark, Airflow, Presto, etc. (a minimal Airflow sketch follows this list).
- Mandatory (Experience 5) - Strong expertise in HLD and LLD, to design scalable, maintainable data architectures.
- Mandatory (Team Management) - Must have managed a team of at least 5 Data Engineers (leadership role should be evident in the CV).
- Mandatory (Education) - Must be from Tier-1 colleges, IIT preferred.
- Mandatory (Company) - B2B product companies with high data traffic.
Preferred Companies: MoEngage, Whatfix, Netcore Cloud, Clevertap, Hevo Data, Snowflake, Chargebee, Databricks, Dataweave, Wingman, Postman, Zoho, HighRadius, Freshworks, Mindtickle.
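For the Airflow requirement in the mandatory list above, this is a minimal, hedged sketch of a daily DAG with two dependent tasks; the DAG id, task ids, and callables are hypothetical placeholders and assume an Airflow 2.x installation.

```python
# Minimal Airflow sketch: a daily DAG with an extract task feeding a transform task.
# DAG id, task ids, and the Python callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull raw data from the source system")      # placeholder logic


def transform():
    print("clean and aggregate the extracted data")    # placeholder logic


with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",   # `schedule=` in Airflow 2.4+
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task  # transform runs only after extract succeeds
```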
Role & Responsibilities
- Lead and mentor a team of data engineers, ensuring high performance and career growth.
- Architect and optimize scalable data infrastructure, ensuring high availability and reliability.
- Drive the development and implementation of data governance frameworks and best practices.
- Work closely with cross-functional teams to define and execute a data roadmap.
- Optimize data processing workflows for performance and cost efficiency.
- Ensure data security, compliance, and quality across all data platforms.
- Foster a culture of innovation and technical excellence within the data team.
Ideal Candidate
- 10+ years of experience in software/data engineering, with at least 3 years in a leadership role.
- Expertise in backend development with programming languages such as Java, PHP, Python, Node.JS, GoLang, JavaScript, HTML, and CSS.
- Proficiency in SQL, Python, and Scala for data processing and analytics.
- Strong understanding of cloud platforms (AWS, GCP, or Azure) and their data services.
- Strong foundation and expertise in HLD and LLD, as well as design patterns, preferably using Spring Boot or Google Guice.
- Experience in big data technologies such as Spark, Hadoop, Kafka, and distributed computing frameworks.
- Hands-on experience with data warehousing solutions such as Snowflake, Redshift, or BigQuery.
- Deep knowledge of data governance, security, and compliance (GDPR, SOC2, etc.).
- Experience in NoSQL databases like Redis, Cassandra, MongoDB, and TiDB.
- Familiarity with automation and DevOps tools like Jenkins, Ansible, Docker, Kubernetes, Chef, Grafana, and ELK.
- Proven ability to drive technical strategy and align it with business objectives.
- Strong leadership, communication, and stakeholder management skills.
Preferred Qualifications:
- Experience in machine learning infrastructure or MLOps is a plus.
- Exposure to real-time data processing and analytics.
- Interest in data structures, algorithm analysis and design, multicore programming, and scalable architecture.
- Prior experience in a SaaS or high-growth tech company.
Job Types: Full-time, Permanent
Pay: ₹7,000, ₹9,000,000.00 per year
Ability to commute/relocate:
- Delhi, Delhi: Reliably commute or planning to relocate before starting work (Preferred)
Experience:
- Data Architect: 10 years (Preferred)
Work Location: In person
Data Architect
Posted today
Job Description
Data Architect Azure Databricks
Band: C2/D1, 10+ years of experience
Must have: SQL, PySpark, and Python; Lakehouse architecture and Delta tables; experience as a Data Architect with functional data lakes; knows how to set up a data warehouse for a group or analytics team; has worked in and out with the Azure tech stack and databases; data modelling, engineering pipelines, and getting them deployed (a minimal Delta table sketch follows below).
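As a rough illustration of the Lakehouse/Delta-table stack named above, the sketch below writes a small DataFrame as a Delta table on ADLS and queries it with SQL; the storage path and table name are hypothetical, and it assumes a Delta-enabled Spark runtime such as Azure Databricks.

```python
# Minimal Delta Lake sketch: write a DataFrame as a Delta table, then query it.
# The ADLS path and table name are hypothetical; assumes a Delta-enabled Spark runtime
# (e.g., Azure Databricks), where the `delta` format is available out of the box.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("lakehouse_demo").getOrCreate()

orders = spark.createDataFrame(
    [(1, "2024-01-01", 250.0), (2, "2024-01-02", 99.9)],
    ["order_id", "order_date", "amount"],
)

(orders
 .write
 .format("delta")
 .mode("overwrite")
 .save("abfss://lake@examplestorage.dfs.core.windows.net/silver/orders"))  # assumed path

# Register and query the table via SQL, as an analytics team typically would.
spark.sql(
    "CREATE TABLE IF NOT EXISTS silver_orders "
    "USING DELTA LOCATION 'abfss://lake@examplestorage.dfs.core.windows.net/silver/orders'"
)
spark.sql(
    "SELECT order_date, SUM(amount) AS revenue FROM silver_orders GROUP BY order_date"
).show()
```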
Location: Gurgaon
Data Architect
Posted today
Job Description
Location: Remote
Experience : Years
We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will possess a profound understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering impactful results in large-scale environments.
About the Role:
As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through the intricacies of data integration, standardization, and activation within the Adobe ecosystem. This is an exciting opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.
Key Responsibilities:
Interface directly with Adobe enterprise customers to meticulously gather requirements, design tailored data solutions, and provide expert architectural recommendations.
Foster seamless collaboration with both onshore and offshore engineering teams to ensure efficient and high-quality solution delivery.
Produce comprehensive and clear technical specification documents for the effective implementation of data solutions.
Design and structure sophisticated data models to enable advanced customer-level analytics and reporting.
Develop and implement robust customer ID mapping processes to create a unified and holistic customer view across diverse data sources (a minimal identity-stitching sketch follows this list).
Automate data movement, cleansing, and transformation processes using various scripting languages and tools.
Lead client calls and proactively manage project deliverables, ensuring timelines and expectations are met.
Continuously identify and propose innovative solutions to address complex customer data challenges and optimize data workflows.
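The customer ID mapping responsibility above is, at its core, identity resolution; the following minimal sketch shows one common approach (union-find over records that share an identifier) using entirely hypothetical sources and values, not anything from the AEP engagement described here.

```python
# Minimal identity-stitching sketch: union-find over records that share an email,
# producing one unified customer id per connected group. All data is hypothetical.
from collections import defaultdict

records = [
    {"source": "crm", "id": "C-1", "email": "a@example.com"},
    {"source": "web", "id": "W-9", "email": "a@example.com"},
    {"source": "crm", "id": "C-2", "email": "b@example.com"},
    {"source": "web", "id": "W-3", "email": "b@example.com"},
    {"source": "web", "id": "W-4", "email": "c@example.com"},
]

parent = {}

def find(x):
    parent.setdefault(x, x)
    while parent[x] != x:              # path-halving keeps the forest shallow
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def union(a, b):
    parent[find(a)] = find(b)

# Link every record key to its shared email key so matching records end up together.
for r in records:
    union((r["source"], r["id"]), ("email", r["email"]))

# Assign a stable unified id per connected component.
unified = defaultdict(list)
for r in records:
    unified[find((r["source"], r["id"]))].append(r)

for i, group in enumerate(unified.values(), start=1):
    ids = [f'{g["source"]}:{g["id"]}' for g in group]
    print(f"customer_{i}: {ids}")
```

In practice the match keys (email, device id, loyalty number) and precedence rules would be defined with the client, and the stitching would run at scale rather than in memory.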
Requirements:
10+ years of demonstrable experience in data transformation and ETL processes across large and complex datasets.
5+ years of hands-on experience in data modeling (Relational, Dimensional, Big Data paradigms).
Profound expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts.
Proficiency with industry-leading ETL tools (e.g., Informatica, etc.).
Proven experience with reporting and business intelligence tools such as Tableau and Power BI.
In-depth knowledge of customer-centric datasets (e.g., CRM, Call Center, Marketing Automation platforms, Web Analytics).
Exceptional communication, time management, and multitasking skills, with the ability to articulate complex technical concepts clearly to diverse audiences.
Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field.
Data Architect
Posted today
Job Description
Skills:
PostgreSQL, SQL Server
Should be good at data modelling
Water sector experience will be an added advantage
Required Candidate profile
Bachelor's in Engineering/Technology or any other equivalent degree, or MCA with BCA or an equivalent degree.
Notice requirement - Immediate to 30 days
Data Architect
Posted today
Job Description
Job Title: Data Architect – Snowflake & Matillion
Skills: Snowflake, Matillion, Data Modeling, SQL, Python/Bash, Cloud Platforms (AWS/Azure)
Experience Required: 5+ Years
Location: Remote
Notice Period: Immediate to 15 days preferred
Job Description
We are seeking an experienced Data Architect to lead data architecture and modeling with a focus on Snowflake and Matillion. The ideal candidate is proactive, self-directed, collaborative, and passionate about building scalable cloud-based data platforms.
Key Responsibilities:
- Design and implement end-to-end data architecture leveraging Snowflake and Matillion for data ingestion, transformation, and storage.
- Define and maintain data modeling standards, data flows, and architecture best practices (dimensional, normalized, star/snowflake schemas).
- Lead evaluation and adoption of tools in the modern data stack, ensuring scalability and alignment with business goals.
- Collaborate with data engineers, analysts, and stakeholders to define requirements and create robust data pipelines.
- Ensure data security, access controls, and compliance (GDPR, HIPAA, SOC 2).
- Optimize Snowflake performance via clustering, caching, query tuning, and cost management (a minimal tuning sketch follows this list).
- Oversee data integration from APIs, databases, flat files, and third-party platforms.
- Establish data quality and metadata management practices.
- Provide subject matter expertise in Matillion orchestration, reusable job frameworks, and performance optimization.
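For the Snowflake performance bullet above, here is a hedged sketch of how clustering, a clustering-health check, and a simple cost control might be applied through the snowflake-connector-python package; the account, credentials, warehouse, table, and clustering key are all hypothetical placeholders.

```python
# Minimal Snowflake tuning sketch via snowflake-connector-python.
# Account, credentials, warehouse, table, and clustering key are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()

# Add a clustering key so large range scans on order_date prune micro-partitions.
cur.execute("ALTER TABLE fact_orders CLUSTER BY (order_date)")

# Check how well the table is clustered on that key.
cur.execute("SELECT SYSTEM$CLUSTERING_INFORMATION('fact_orders', '(order_date)')")
print(cur.fetchone()[0])

# Keep cost in check: auto-suspend the warehouse after 60 seconds of inactivity.
cur.execute("ALTER WAREHOUSE ANALYTICS_WH SET AUTO_SUSPEND = 60")

cur.close()
conn.close()
```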
Qualification & Certification
Required Qualifications:
- 5+ years of experience in data engineering/architecture.
- Proven expertise with Snowflake (multi-cluster warehouses, role-based access, data sharing).
- Strong pipeline development experience with Matillion ETL.
- Strong data modeling & scalable cloud data platforms experience.
- Proficiency in SQL and scripting (Python or Bash).
- Hands-on experience with AWS/Azure.
- Familiarity with CI/CD, Git, and Infrastructure-as-Code (Terraform/CloudFormation).
Preferred Qualifications:
- Experience with dbt, Airflow, and other orchestration tools.
- Knowledge of BI/reporting tools (Power BI, Tableau, Looker).
- Familiarity with data governance/data catalog tools (Alation, Collibra, Atlan).
- Background in ML platforms or real-time pipelines.
- Industry domain knowledge (finance, healthcare, retail – nice to have).
Soft Skills / Nice to Have:
- Excellent communication & analytical skills.
- Strong customer-centric mindset and structured approach.
- Ability to work independently and in fast-paced environments.
- Strong organizational skills with time management discipline.
- Fluent in English (written & spoken).
About InfoBeans
InfoBeans is a global digital transformation and product engineering company, enabling businesses to thrive through innovation, agility, and cutting-edge technology solutions. With over 1,700 team members across the globe, we specialize in custom software development, enterprise solutions, cloud, AI/ML, UX, automation, and digital transformation services.
At InfoBeans, we live by our core purpose of “Creating WOW!”—for our clients, team members, and the community. Our collaborative culture, growth opportunities, and people-first approach make us one of the most trusted and rewarding workplaces.
Know more about us
Data Architect
Posted today
Job Description
We are looking for a seasoned Data Architect to lead and shape enterprise-wide data initiatives. This role requires a strategic leader with deep technical expertise and strong stakeholder management skills.
Key Responsibilities
- Act as the architectural authority for Data Integration, Data Governance, and MDM programs.
- Define and oversee High-Level and Low-Level Solution Designs across business, application, and data domains.
- Lead solution roadmaps covering as-is vs. to-be states, business value realization, and scalability.
- Partner with business stakeholders, project managers, and engineering teams to ensure alignment of IT solutions with business needs.
- Provide leadership and direction to teams while ensuring governance and delivery success.
- Engage in presales activities to support the Data & Analytics Center of Excellence.
Required Qualifications
- 15+ years of experience in Data Engineering & Architecture roles.
- Bachelor’s degree in Computer Science, Engineering, or related field.
- Proven expertise in Data Integration, Data Governance, and Master Data Management (MDM).
- Strong knowledge of data modeling, governance, flows, quality, security, privacy, scalability, and performance.
- Excellent stakeholder management and communication skills, with the ability to influence at senior levels.
- Demonstrated leadership experience guiding teams and ensuring delivery of large-scale data programs.
Location: Kochi/ Remote
Please note: We are looking for early joiners.
Interested? Share your resume to