186 Data Architect jobs in Noida
Data Architect
Posted 4 days ago
Job Description
Position Overview:
As a Data Architect, you are responsible for designing and managing scalable, secure, and high-performance data architectures that support GEDU and customer needs. This role ensures that GEDU’s data assets are structured and managed in a way that enables the business to generate insights, make data-driven decisions, and maintain data integrity across GEDU and its customers. The Data Architect will work closely with business leaders, data engineers, data scientists, and IT teams to align the data architecture with GEDU’s strategic goals.
Key Responsibilities:
Data Architecture Design:
- Design, develop, and maintain the enterprise data architecture, including data models, database schemas, and data flow diagrams.
- Develop a data strategy and roadmap that aligns with GEDU business objectives and ensures the scalability of data systems.
- Architect both transactional (OLTP) and analytical (OLAP) databases, ensuring optimal performance and data consistency.
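For illustration only, a minimal sketch of the OLTP/OLAP split described above, using SQLite via Python; every table and column name is a hypothetical stand-in, not something specified by the posting:

```python
# Minimal sketch: contrast a normalized OLTP schema with a denormalized
# OLAP star schema. All names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- OLTP: normalized, write-optimized, one row per business event.
CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(customer_id),
    order_ts    TEXT,
    amount      REAL
);

-- OLAP: star schema, read-optimized for aggregation.
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INT, month INT);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    customer_id INTEGER,
    revenue     REAL
);
""")

# A typical OLAP read: aggregate revenue by month across the star schema.
rows = conn.execute("""
    SELECT d.year, d.month, SUM(f.revenue)
    FROM fact_sales f JOIN dim_date d USING (date_key)
    GROUP BY d.year, d.month
""").fetchall()
conn.close()
```

The normalized tables favor consistent writes, while the star schema favors the scan-and-aggregate reads typical of analytics.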
Data Integration & Management:
- Oversee the integration of disparate data sources into a unified data platform, leveraging ETL/ELT processes and data integration tools (a minimal ETL sketch follows this list).
- Design and implement data warehousing solutions, data lakes, and/or data marts that enable efficient storage and retrieval of large datasets.
- Ensure proper data governance, including the definition of data ownership, security, and privacy controls in accordance with compliance standards (GDPR, HIPAA, etc.).
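A rough sketch of an ETL flow of the kind described above, assuming a hypothetical orders.csv source and a local SQLite target:

```python
# Minimal ETL sketch: extract from a CSV, apply a light transform, load to
# SQLite. File name, columns, and target table are hypothetical.
import csv
import sqlite3

def extract(path: str):
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    for row in rows:
        # Normalize types and drop incomplete records.
        if row.get("amount"):
            yield (row["order_id"], row["customer_id"], float(row["amount"]))

def load(records, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS staging_orders (order_id, customer_id, amount)")
    conn.executemany("INSERT INTO staging_orders VALUES (?, ?, ?)", records)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")
    load(transform(extract("orders.csv")), conn)
    conn.close()
```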
Collaboration with Stakeholders:
- Work closely with business stakeholders, including analysts, developers, and executives, to understand data requirements and ensure that the architecture supports analytics and reporting needs.
- Collaborate with DevOps and engineering teams to optimize database performance and support large-scale data processing pipelines.
Technology Leadership:
- Guide the selection of data technologies, including databases (SQL/NoSQL), data processing frameworks (Hadoop, Spark), cloud platforms (Azure is a must), and analytics tools.
- Stay updated on emerging data management technologies, trends, and best practices, and assess their potential application within the organization.
Data Quality & Security:
- Define data quality standards and implement processes to ensure the accuracy, completeness, and consistency of data across all systems (a minimal validation sketch follows this list).
- Establish protocols for data security, encryption, and backup/recovery to protect data assets and ensure business continuity.
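A minimal sketch of automated data-quality checks, reusing the hypothetical staging_orders table from the ETL sketch above; the rules shown are illustrative assumptions:

```python
# Minimal data-quality sketch: completeness and consistency checks over a
# table. Table name and rules are hypothetical.
import sqlite3

def quality_report(conn, table: str) -> dict:
    total = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    null_amounts = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE amount IS NULL"
    ).fetchone()[0]
    negative = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE amount < 0"
    ).fetchone()[0]
    return {
        "rows": total,
        "completeness": 1 - null_amounts / total if total else 1.0,
        "consistency_violations": negative,
    }

conn = sqlite3.connect("warehouse.db")
print(quality_report(conn, "staging_orders"))
conn.close()
```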
Mentorship & Leadership:
- Lead and mentor data engineers, data modelers, and other technical staff in best practices for data architecture and management.
- Provide strategic guidance on data-related projects and initiatives, ensuring that all efforts are aligned with the enterprise data strategy.
Extensive Data Architecture Expertise:
- Over 7 years of experience in data architecture, data modeling, and database management.
- Proficiency in designing and implementing relational (SQL) and non-relational (NoSQL) database solutions.
- Strong experience with data integration tools (Azure tools are a must; other third-party tools are a plus), ETL/ELT processes, and data pipelines.
Advanced Knowledge of Data Platforms:
- Expertise in the Azure cloud data platform (Data Lake, Synapse) is a must; experience with other platforms such as AWS (Redshift, S3) and/or Google Cloud Platform (BigQuery, Dataproc) is a bonus.
- Experience with big data technologies (Hadoop, Spark) and distributed systems for large-scale data processing.
- Hands-on experience with data warehousing solutions and BI tools (e.g., Power BI, Tableau, Looker).
Data Governance & Compliance:
- Strong understanding of data governance principles, data lineage, and data stewardship.
- Knowledge of industry standards and compliance requirements (e.g., GDPR, HIPAA, SOX) and the ability to architect solutions that meet these standards.
Technical Leadership:
- Proven ability to lead data-driven projects, manage stakeholders, and drive data strategies across the enterprise.
- Strong programming skills in languages such as Python, SQL, R, or Scala.
Pre-Sales Responsibilities:
- Stakeholder Engagement: Work with product stakeholders to analyze functional and non-functional requirements, ensuring alignment with business objectives.
- Solution Development: Develop end-to-end solutions involving multiple products, ensuring security and performance benchmarks are established, achieved, and maintained.
- Proof of Concepts (POCs): Develop POCs to demonstrate the feasibility and benefits of proposed solutions.
- Client Communication: Communicate system requirements and solution architecture to clients and stakeholders, providing technical assistance and guidance throughout the pre-sales process.
- Technical Presentations: Prepare and deliver technical presentations to prospective clients, demonstrating how proposed solutions meet their needs and requirements.
Data Architect
Posted 4 days ago
Job Description
Experience: 9+ years
Architect experience with Azure or AWS and Databricks
Notice: Immediate joiners preferred
Job Location: Noida, Mumbai, Pune, Bangalore, Gurgaon, Kochi (Hybrid Work)
Job Description:
• Develop a detailed project plan outlining tasks, timelines, milestones, and dependencies.
• Solutions architecture design and implementation
• Understand the source systems and outline the Azure Data Factory (ADF) structure; design and schedule pipelines using ADF (a minimal pipeline-definition sketch follows this list).
• Foster collaboration and communication within the team to ensure smooth workflow.
• Optimize application performance.
• Monitor and manage the allocation of resources to ensure tasks are adequately staffed.
• Create detailed technical specification, business requirement, and unit test report documents.
• Ensure that the project adheres to best practices, coding standards, and technical requirements.
• Collaborate with technical leads to address technical issues and mitigate risks.
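For orientation, a hedged sketch of the JSON shape an ADF pipeline definition takes, built here as a Python dict; the pipeline, activity, and dataset names are hypothetical, and in practice the definition would be deployed via the Azure portal, ARM templates, or the ADF SDK/REST API, with scheduling attached through a separate trigger resource:

```python
# Hedged sketch of an ADF pipeline definition with a single Copy activity.
# All names are hypothetical placeholders.
import json

pipeline = {
    "name": "CopyDailyOrders",
    "properties": {
        "activities": [
            {
                "name": "CopyFromBlobToSql",
                "type": "Copy",
                "inputs": [{"referenceName": "SourceBlobDataset", "type": "DatasetReference"}],
                "outputs": [{"referenceName": "SinkSqlDataset", "type": "DatasetReference"}],
                "typeProperties": {
                    "source": {"type": "DelimitedTextSource"},
                    "sink": {"type": "AzureSqlSink"},
                },
            }
        ]
    },
}

print(json.dumps(pipeline, indent=2))
```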
Data Architect
Posted 2 days ago
Job Description
Key Responsibilities
Design and implement scalable and robust data architectures for data warehousing and enterprise data platforms.
Develop and optimize data pipelines and ETL/ELT/CDC (Change Data Capture) workflows using tools such as Fivetran and Cloud Composer (a minimal Composer DAG sketch follows this list).
Collaborate with data scientists, product managers, and business stakeholders to define data requirements and create logical and physical data models.
Manage and administer various database systems, including BigQuery, SAP HANA, and PostgreSQL.
Ensure data quality, integrity, and security across all data platforms and pipelines.
Work with our AI/ML teams to design data serving layers and feature stores that support Vertex AI workloads.
Design and develop reporting frameworks and data marts to support business intelligence needs.
Integrate data platforms with various enterprise systems (CRMs, ERPs) and third-party APIs.
Define and implement data governance, master data management, and data cataloging strategies.
Contribute to the full data lifecycle: requirements gathering, architecture, data modeling, development, testing, and deployment.
Troubleshoot and resolve data platform issues to ensure high availability and optimal performance.
Document technical designs, data lineage, and architecture for cross-functional reference.
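As a minimal illustration of orchestration on Cloud Composer (managed Airflow), here is a hypothetical daily DAG; the DAG id, schedule, and transform step are assumptions, written against Airflow 2.4+ syntax:

```python
# Hedged sketch of a Cloud Composer (Airflow 2.4+) DAG; the DAG id,
# schedule, and transform step are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def run_transform(**context):
    # Placeholder for the real ELT/CDC step (e.g., kick off a warehouse job).
    print("transforming partition", context["ds"])

with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="transform", python_callable=run_transform)
```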
Required Qualifications
Bachelor’s or Master’s degree in Computer Science, Software Engineering, Data Science, or a related field.
Proficiency in one or more backend languages/frameworks, with a strong preference for Python or Go.
Experience with building RESTful APIs and designing microservices for data delivery (a minimal endpoint sketch follows this list).
Solid grasp of data modeling fundamentals, including Kimball and Inmon methodologies.
Proficiency in writing complex SQL queries and experience with SQL and NoSQL databases.
Familiarity with data warehousing concepts and best practices, including CDC.
Strong version-control habits (Git) and experience with CI/CD pipelines.
Excellent problem-solving, communication, and collaboration skills.
Passion for continuous learning and adapting to emerging data technologies.
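A minimal sketch of a RESTful data-delivery endpoint of the kind listed above, using FastAPI; the model, route, and in-memory store are hypothetical stand-ins for a real service:

```python
# Hedged sketch of a data-delivery microservice endpoint; the Record model
# and _STORE dict are hypothetical stand-ins for a real database.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Record(BaseModel):
    id: int
    value: float

_STORE = {1: Record(id=1, value=42.0)}  # stand-in for a real data store

@app.get("/records/{record_id}", response_model=Record)
def get_record(record_id: int) -> Record:
    if record_id not in _STORE:
        raise HTTPException(status_code=404, detail="record not found")
    return _STORE[record_id]

# Run locally with: uvicorn main:app --reload  (assuming this file is main.py)
```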
Preferred Qualifications
Hands-on experience designing and deploying production-grade data warehouses.
Deep experience with Google Cloud Platform (GCP):
BigQuery for large-scale analytical workloads (a minimal query sketch follows this list).
Cloud Composer for orchestrating complex data pipelines.
Vertex AI for AI/ML model serving and feature stores.
Experience with other cloud providers (AWS, Azure) and their data services.
Working knowledge of data governance frameworks, master data management, and data cataloging tools.
Experience with data ingestion tools like Fivetran.
Business-intelligence expertise in building dashboards and reports with Power BI or Tableau.
Familiarity with other data technologies such as SAP HANA.
Understanding of MLOps concepts and their application to data pipelines.
Contributions to open-source data projects or technical blogging/presentations.
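For illustration, a hedged sketch of running an analytical query against BigQuery from Python; the project, dataset, and table names are hypothetical, and the google-cloud-bigquery client plus ambient GCP credentials are assumed:

```python
# Hedged sketch of a BigQuery analytical query; `my-project.sales.orders`
# is a hypothetical table. Requires google-cloud-bigquery and credentials.
from google.cloud import bigquery

client = bigquery.Client()  # picks up project/credentials from the environment

query = """
    SELECT DATE(order_ts) AS day, SUM(amount) AS revenue
    FROM `my-project.sales.orders`
    GROUP BY day
    ORDER BY day
"""
for row in client.query(query).result():
    print(row.day, row.revenue)
```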
Data Architect
Posted 4 days ago
Job Description
About the client
A boutique AI and data engineering firm, founded in 2018 and headquartered in Gurgaon, India, that empowers Fortune 1000 companies through AI-powered decision intelligence, especially in finance and supply chain transformation. The firm has delivered over US $2 billion in financial benefits while fostering a highly developed people-first culture.
Key Responsibilities:
You will act as a key member of the consulting team, helping clients reinvent their corporate finance function by leveraging advanced analytics. You will work directly with senior client stakeholders to design and implement data strategy in the finance space across multiple use cases, viz. controllership, FP&A, and GPO. You will be responsible for developing technical solutions that deliver scalable analytics, leveraging cloud and big data technologies, and will collaborate with Business Consultants and Product Owners to design and implement them. Communication and organization skills are key for this position.
-Design and drive end-to-end data and analytics solution architecture from concept to delivery on Google Cloud Platform (GCP)
-Design, develop, and support conceptual, logical, and physical data models for advanced analytics and ML-driven solutions
-Ensure integration of industry-accepted data architecture principles, standards, guidelines, and concepts with other domains, along with coordinated roll-out and adoption strategies
-Drive the design, sizing, provisioning, and setup of GCP environments and related services such as BigQuery, Dataflow, and Cloud Storage (a minimal provisioning sketch follows this list). Provide mentoring and guidance on GCP-based data architecture to engineering, analytics, and business teams
-Review solution requirements and architecture for appropriate technology selection, efficient resource utilization, and effective integration across systems and technologies
-Advise on emerging GCP trends and services, and recommend adoption strategies to maintain a competitive edge
-Actively participate in pre-sales engagements, PoCs, and contribute to publishing thought leadership content
-Collaborate closely with the founders and leadership team to shape and drive the organization’s cloud and data strategy
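As a minimal illustration of programmatic GCP environment setup, a hedged sketch that creates a Cloud Storage bucket and stages a file into it; the bucket name, region, and object path are hypothetical:

```python
# Hedged sketch of programmatic GCP setup: create a Cloud Storage bucket and
# stage a local file into it. Names are hypothetical; requires
# google-cloud-storage and appropriate IAM permissions.
from google.cloud import storage

client = storage.Client()  # project/credentials from the environment
bucket = client.create_bucket("example-analytics-landing", location="asia-south1")

blob = bucket.blob("raw/orders/2024-01-01.csv")
blob.upload_from_filename("orders.csv")  # assumes orders.csv exists locally
print("staged:", blob.name)
```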
Skills & Experience Required
-Demonstrated experience in delivering multiple data and analytics solutions on GCP
-Hands-on experience with data ingestion, processing and orchestration tools such as Dataflow, Pub/Sub, Dataproc, Cloud Composer, and Data Fusion
-Deep expertise with GCP data warehousing and analytics services including BigQuery, Cloud Storage, and Looker
-Familiarity with Data Mesh, Data Fabric, data products, and data contracts, and experience with data mesh implementation
-Strong understanding and practical experience with different data modelling techniques (relational, star, snowflake, Data Vault, etc.) and with transactional, time-series, and unstructured datasets
-Experience with enterprise data management, i.e. data quality, metadata management, data governance, and data observability using GCP-native or third-party tools
-Experience in the design and implementation of event-driven architectures; tools like Google Pub/Sub or Kafka will be an added advantage (a minimal Pub/Sub sketch follows this list)
-Familiarity with AI and GenAI concepts and Vertex AI. Solid grasp of operational dependencies and integration across applications, networks, security, and infrastructure in the cloud, including IAM, VPCs, VPNs, firewall rules, GKE, and service accounts
-Strong foundation in computer science or software engineering, with expertise in software development methodologies, DevOps/DataOps, and CI/CD
-Practical experience using tools such as Terraform, Cloud Build, GitHub Actions, and scripting via Python, Bash, etc.
-Familiar with software development lifecycle and cloud-native application development
-Remains hands-on with technology and stays current with industry trends and GCP service evolution
-Demonstrated hands-on experience coding in Python, SQL, and Spark, with the flexibility to pick up new languages or technologies quickly. A GCP Professional Data Engineer certification will be an added advantage
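A minimal sketch of the event-driven publish step mentioned above, using Google Cloud Pub/Sub; the project, topic, and payload are hypothetical:

```python
# Hedged sketch of publishing an event to Google Cloud Pub/Sub; project and
# topic names are hypothetical. Requires google-cloud-pubsub and credentials.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "orders-events")

event = {"order_id": 123, "status": "CREATED"}
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
print("published message id:", future.result())
```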
Interested candidates can apply by sharing their resume or by applying via the LinkedIn job post.