148 Data Architect jobs in Mumbai
Data Architect - Hadoop/Big Data
Posted today
Job Description
• As an individual contributor, code and test modules (a minimal sketch follows this list).
• Interact and collaborate directly with software developers, product managers, and business analysts to ensure proper development and quality of service applications and products.
• Ability to develop in an Agile environment.
• Work closely with Leads and Architects to understand the requirements and translate that into code.
• Mentor junior engineers if required.
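By way of illustration, here is a minimal sketch of the kind of module-level coding and testing this role describes, written in PySpark (an assumption; the posting says only Hadoop/Big Data). The function and column names (clean_orders, order_id, status) are hypothetical.

```python
# Hypothetical sketch: a small, testable PySpark transformation module.
from pyspark.sql import SparkSession, DataFrame
from pyspark.sql import functions as F


def clean_orders(df: DataFrame) -> DataFrame:
    """Drop rows with null order ids and normalise the status column."""
    return (
        df.filter(F.col("order_id").isNotNull())
          .withColumn("status", F.lower(F.trim(F.col("status"))))
    )


if __name__ == "__main__":
    spark = SparkSession.builder.appName("clean-orders-test").getOrCreate()
    # Tiny inline check standing in for a proper unit test.
    sample = spark.createDataFrame(
        [(1, " SHIPPED "), (None, "pending")], ["order_id", "status"]
    )
    result = clean_orders(sample).collect()
    assert len(result) == 1 and result[0]["status"] == "shipped"
    spark.stop()
```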
Architect – Big Data
Posted today
Job Description
• Robust Business Intelligence development experience
• Experience with AWS BI Services (QuickSight, EMR, Glue, etc.)
• Aurora, Spark, and MySQL experience is a plus (a hedged query sketch follows).
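As a hedged illustration of working against this AWS BI stack, the sketch below starts an Athena query from Python with boto3 and polls for completion. The region, database, table, and S3 output location are placeholders, not details from the posting.

```python
# Sketch: run an Athena query and fetch results. All identifiers are
# illustrative assumptions (region, database, table, output bucket).
import time
import boto3

athena = boto3.client("athena", region_name="ap-south-1")  # Mumbai, as an example

start = athena.start_query_execution(
    QueryString="SELECT region, SUM(revenue) AS total FROM sales GROUP BY region",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
)
query_id = start["QueryExecutionId"]

# Poll until the query finishes; production code would add a timeout.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)
    status = state["QueryExecution"]["Status"]["State"]
    if status in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if status == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    print(rows)
```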
Data Architect
Posted 1 day ago
Job Description
Role: Technical Architect - Data
Experience Level: 10 to 15 Years
Work location: Mumbai, Bangalore, Trivandrum (Hybrid)
Notice Period: Any
Role & Responsibilities:
- More than 8 years of experience in Technical, Solutioning, and Analytical roles.
- 5+ years of experience building and managing Data Lake, Data Warehouse, Data Integration, Data Migration, and Business Intelligence/Artificial Intelligence solutions on the cloud (GCP/AWS/Azure).
- Ability to understand business requirements, translate them into functional and non-functional areas, and define non-functional boundaries in terms of Availability, Scalability, Performance, Security, Resilience, etc.
- Experience architecting, designing, and implementing end-to-end data pipelines and data integration solutions for varied structured and unstructured data sources and targets (see the pipeline sketch after this list).
- Experience working in distributed computing and enterprise environments such as Hadoop and GCP/AWS/Azure cloud.
- Well versed in cloud data integration and ETL technologies such as Spark (PySpark/Scala), Dataflow, Dataproc, and EMR.
- Experience with traditional ETL tools such as Informatica, DataStage, OWB, or Talend.
- Deep knowledge of one or more cloud and on-premises databases such as Cloud SQL, Cloud Spanner, Bigtable, RDS, Aurora, DynamoDB, Oracle, Teradata, MySQL, DB2, or SQL Server.
- Exposure to NoSQL databases such as MongoDB, CouchDB, Cassandra, or graph databases.
- Experience architecting and designing scalable cloud data warehouse solutions on BigQuery or Redshift.
- Experience with one or more data integration, storage, and data pipeline toolsets such as S3, Cloud Storage, Athena, Glue, Sqoop, Flume, Hive, Kafka, Pub/Sub, Kinesis, Dataflow, Dataproc, Airflow, Composer, Spark SQL, Presto, and EMRFS.
- Preferred: experience with machine learning frameworks such as TensorFlow or PyTorch.
- Good understanding of cloud solutions for IaaS, PaaS, and SaaS, and of containers and microservices architecture and design.
- Ability to compare products and tools across technology stacks on the Google, AWS, and Azure clouds.
- Good understanding of BI reporting and dashboarding and of one or more associated toolsets such as Looker, Tableau, Power BI, SAP BO, Cognos, or Superset.
- Understanding of security features and policies in one or more cloud environments (GCP/AWS/Azure).
- Experience with business transformation projects migrating on-premises data solutions to clouds such as GCP/AWS/Azure.
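The following is a minimal sketch of one such end-to-end pipeline, under stated assumptions: PySpark (for example on Dataproc) reading raw JSON from Cloud Storage and writing an aggregate to BigQuery via the spark-bigquery connector, which must be on the cluster's classpath. Bucket, dataset, and column names are illustrative.

```python
# Sketch: Cloud Storage -> transform -> BigQuery. Assumes the spark-bigquery
# connector is available; all names below are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("gcs-to-bq-etl").getOrCreate()

# Read raw events from a hypothetical bucket.
raw = spark.read.json("gs://example-bucket/raw/events/*.json")

# Aggregate to a small daily summary table.
daily = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .groupBy("event_date", "event_type")
       .count()
)

# Write to BigQuery through a temporary GCS staging bucket.
(daily.write.format("bigquery")
      .option("table", "example_dataset.daily_event_counts")
      .option("temporaryGcsBucket", "example-bucket-tmp")
      .mode("overwrite")
      .save())
```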
Role:
- Lead multiple data engagements on GCP Cloud for data lakes, data engineering, data migration, data warehouse, and business intelligence.
- Interface with multiple stakeholders within IT and business to understand the data requirements.
- Take complete responsibility for the successful delivery of all allocated projects on the parameters of schedule, quality, and customer satisfaction.
- Responsible for the design and development of distributed, high-volume, multi-threaded batch, real-time, and event-processing systems.
- Implement processes and systems to validate data and monitor data quality, ensuring production data is always accurate and available for the key stakeholders and business processes that depend on it (see the DAG sketch after this list).
- Work with the pre-sales team on RFPs and RFIs, helping them create data solutions.
- Mentor young talent within the team; define and track their growth parameters.
- Contribute to building assets and accelerators.
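As a sketch of the data-validation responsibility above, here is a minimal Airflow (Cloud Composer) DAG that runs a daily quality check and fails loudly when it does not pass. The check itself is a placeholder; a real one would query the warehouse.

```python
# Hypothetical daily data-quality DAG; the DAG id and check are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def check_row_counts(**context):
    # Placeholder: a real check would query BigQuery/Redshift and compare
    # today's row count against an expected threshold.
    todays_rows = 1_000  # assume fetched from the warehouse
    if todays_rows == 0:
        raise ValueError("Data quality check failed: no rows loaded today")


with DAG(
    dag_id="daily_data_quality",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="check_row_counts", python_callable=check_row_counts)
```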
Other Skills:
- Strong communication and articulation skills.
- Good leadership skills.
- Should be a good team player.
- Good analytical and problem-solving skills.
Data Architect
Posted today
Job Description
We currently have an open position for a Databricks Architect with our client, a US-based boutique consulting firm.
About Us: A boutique consulting firm specializing in data and AI, with deep expertise in the Databricks ecosystem. We help clients build scalable data platforms and AI solutions that drive real business impact. Our expertise in implementation, optimization, and enablement empowers clients to harness the full potential of their data, unlocking significant competitive advantages and fostering innovation.
Key Responsibilities:
1. Databricks Solution Architecture: Design and implement scalable, secure, and efficient Databricks solutions that meet client requirements.
2. Data Engineering: Develop data pipelines, architect data lakes, and implement data warehousing solutions using Databricks (see the Delta Lake sketch after this posting).
3. Data Analytics: Collaborate with data scientists and analysts to develop and deploy machine learning models and analytics solutions on Databricks.
4. Performance Optimization: Optimize Databricks cluster performance, ensuring efficient resource utilization and cost-effectiveness.
5. Security and Governance: Implement Databricks security features, ensure data governance, and maintain compliance with industry regulations.
6. Client Engagement: Work closely with clients to understand their business requirements, provide technical guidance, and deliver high-quality Databricks solutions.
7. Thought Leadership: Stay up to date with the latest Databricks features, best practices, and industry trends, and share knowledge with the team.
Requirements:
1. Databricks Experience: 5+ years of experience working with Databricks, including platform architecture, data engineering, and data analytics.
2. Technical Skills: Proficiency in languages such as Python, Scala, or Java, and experience with Databricks APIs, Spark, and Delta Lake.
3. Data Engineering: Strong background in data engineering, including data warehousing, ETL, and data governance.
4. Leadership: Proven experience leading technical teams, mentoring junior engineers, and driving technical initiatives.
5. Communication: Excellent communication and interpersonal skills, with the ability to work effectively with clients and internal stakeholders.
Good to Have:
1. Certifications: Databricks Certified Professional or similar certifications.
2. Cloud Experience: Experience working with cloud platforms such as AWS, Azure, or GCP.
3. Machine Learning: Knowledge of machine learning concepts and experience with popular ML libraries.
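A minimal, hypothetical sketch of the data engineering work described in responsibility 2: reading raw data, cleaning it, and writing a Delta table. The paths and table names (/mnt/raw/transactions, silver.transactions) are assumptions, and the code presumes a Databricks runtime where Delta Lake is preconfigured.

```python
# Assumes a Databricks runtime: getOrCreate() returns the cluster's existing
# session, and Delta Lake support is already configured.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical raw input path.
raw = spark.read.json("/mnt/raw/transactions/")

# Deduplicate and keep only positive amounts; column names are illustrative.
cleaned = (
    raw.dropDuplicates(["txn_id"])
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("amount") > 0)
)

# Delta gives ACID writes and time travel; assumes a `silver` schema exists.
cleaned.write.format("delta").mode("overwrite").saveAsTable("silver.transactions")
```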
Data Architect
Posted today
Job Description
About the client: A boutique AI and data engineering firm, founded in 2018 and headquartered in Gurgaon, India, that empowers Fortune 1000 companies through AI-powered decision intelligence, especially in finance and supply chain transformation, delivering over US $2 billion in financial benefits while fostering a highly developed, people-first culture.
Key Responsibilities: You will act as a key member of the consulting team, helping clients reinvent their corporate finance function by leveraging advanced analytics. You will work closely with senior client stakeholders to design and implement data strategy in the finance space across multiple use cases, viz. controllership, FP&A, and GPO. You will be responsible for developing technical solutions that deliver scalable analytics leveraging cloud and big data technologies, and will collaborate with Business Consultants and Product Owners to design and implement technical solutions. Communication and organization skills are key for this position.
- Design and drive end-to-end data and analytics solution architecture from concept to delivery on Google Cloud Platform (GCP)
- Design, develop, and support conceptual, logical, and physical data models for advanced analytics and ML-driven solutions
- Ensure integration of industry-accepted data architecture principles, standards, guidelines, and concepts with other domains, along with coordinated roll-out and adoption strategies
- Drive the design, sizing, provisioning, and setup of GCP environments and related services such as BigQuery, Dataflow, and Cloud Storage; provide mentoring and guidance on GCP-based data architecture to engineering, analytics, and business teams
- Review solution requirements and architecture for appropriate technology selection, efficient resource utilization, and effective integration across systems and technologies
- Advise on emerging GCP trends and services, and recommend adoption strategies to maintain a competitive edge
- Actively participate in pre-sales engagements and PoCs, and contribute to publishing thought leadership content
- Collaborate closely with the founders and leadership team to shape and drive the organization's cloud and data strategy
Skills & Experience Required:
- Demonstrated experience delivering multiple data and analytics solutions on GCP
- Hands-on experience with data ingestion, processing, and orchestration tools such as Dataflow, Pub/Sub, Dataproc, Cloud Composer, and Data Fusion
- Deep expertise with GCP data warehousing and analytics services, including BigQuery, Cloud Storage, and Looker
- Familiarity with Data Mesh, Data Fabric, data products, and data contracts, and experience with data mesh implementation
- Strong understanding and practical experience with different data modelling techniques (Relational, Star, Snowflake, Data Vault, etc.) and with transactional, time-series, and unstructured datasets
- Experience with enterprise data management, i.e. data quality, metadata management, data governance, and data observability using GCP-native or third-party tools
- Experience designing and implementing event-driven architecture with tools like Google Pub/Sub or Kafka is an added advantage (see the Pub/Sub sketch after this posting)
- Familiarity with AI and Gen AI concepts and Vertex AI
- Solid grasp of operational dependencies and integration across applications, networks, security, and infrastructure in the cloud, including IAM, VPCs, VPNs, firewall rules, GKE, and service accounts
- Strong foundation in computer science or software engineering, with expertise in software development methodologies, Dev/DataOps, and CI/CD
- Practical experience with tools such as Terraform, Cloud Build, and GitHub Actions, and scripting in Python, Bash, etc.
- Familiarity with the software development lifecycle and cloud-native application development
- Remains hands-on with technology and stays current with industry trends and GCP service evolution
- Demonstrated hands-on experience coding in Python, SQL, and Spark, with the flexibility to pick up new languages or technologies quickly
- A GCP Professional Data Engineer certification is an added advantage
Interested candidates can apply by sharing their resume at or via the job post.
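As a hedged sketch of the event-driven pattern mentioned in the skills list, the snippet below publishes a JSON event with the google-cloud-pubsub client. The project ID, topic ID, and event shape are placeholders.

```python
# Sketch: publish a finance event to a hypothetical Pub/Sub topic.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# "example-project" and "finance-events" are illustrative assumptions.
topic_path = publisher.topic_path("example-project", "finance-events")

event = {"entity": "invoice", "id": "INV-001", "status": "posted"}
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
print(f"Published message id: {future.result()}")
```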
Data Architect
Posted today
Job Description
Roles and Responsibilities
We are looking for a skilled Azure Data Engineer/Architect to join our Data Lake team. The candidate should have experience building and optimising large data platforms, preferably for the lending business.
Experience required: – years
Skillsets
Architecting data platform solutions
Experience delivering large-scale enterprise data warehouse solutions
Strong written and oral communication skills
Knowledge of the BFSI domain is an added advantage
Design and develop batch data pipelines independently to enable faster business analysis (a minimal sketch follows this list)
Experience in data lake / data warehousing projects from an end-to-end delivery perspective
Experience with data modelling (relational and dimensional)
Working experience in the Microsoft Azure cloud, preferably with Azure Data Lake, Azure Data Factory, SQL DW (Synapse), and Spark
4+ years of experience writing complex SQL queries, procedures, views, functions, and other database objects
Minimum 3 years of experience in cloud computing, preferably Microsoft Azure
Experience working with big data frameworks, especially Spark
Experience with R and Python is an added advantage
Nice to have: Talend/SSIS knowledge
Azure administration knowledge is an added advantage
Proficient understanding of code versioning tools
Excellent analytical and organisational skills
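A minimal sketch of the kind of batch pipeline this posting describes, assuming a PySpark cluster (for example Synapse Spark) already authorised against the storage account: read raw lending data from ADLS Gen2 and write curated, partitioned Parquet. The storage account, containers, and column names are hypothetical.

```python
# Sketch: ADLS Gen2 (abfss://) batch pipeline; all identifiers are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("loans-batch").getOrCreate()

# Read raw loan records from a hypothetical data lake container.
loans = spark.read.parquet(
    "abfss://raw@examplelake.dfs.core.windows.net/lending/loans/"
)

# Keep valid disbursements and stamp the load date for partitioning.
curated = (
    loans.filter(F.col("disbursed_amount") > 0)
         .withColumn("load_date", F.current_date())
)

curated.write.mode("overwrite").partitionBy("load_date").parquet(
    "abfss://curated@examplelake.dfs.core.windows.net/lending/loans/"
)
```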