148 Data Architect jobs in Mumbai

Data Architect - Hadoop/Big Data

Mumbai, Maharashtra Anicalls (Pty) Ltd

Posted today


Job Description

• As an individual contributor, design modules and apply creative problem-solving using appropriate tools and technologies.
• As an individual contributor, code and test modules.
• Interact and collaborate directly with software developers, product managers, and business analysts to ensure proper development and quality of service applications and products.
• Ability to do development in an Agile environment.
• Work closely with Leads and Architects to understand the requirements and translate that into code.
• Mentor junior engineers if required.

Architect – Big Data

Mumbai, Maharashtra Anicalls (Pty) Ltd

Posted today


Job Description

• Experience using Python
• Robust Business Intelligence development experience
• Experience with AWS BI services (QuickSight, EMR, Glue, etc.); a minimal boto3 sketch follows this list
• Experience with Aurora, Spark, and MySQL is a plus
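A concrete, hedged illustration of touching the AWS BI stack from Python: the sketch below lists QuickSight dashboards with boto3. The account id and region are placeholders, and AWS credentials are assumed to be configured in the environment.

```python
import boto3

# Assumes AWS credentials are configured (env vars, profile, or IAM role).
quicksight = boto3.client("quicksight", region_name="ap-south-1")

# Placeholder account id; the caller needs QuickSight list permissions.
response = quicksight.list_dashboards(AwsAccountId="123456789012")

for dashboard in response.get("DashboardSummaryList", []):
    print(dashboard["DashboardId"], dashboard["Name"])
```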

Data Architect

Mumbai, Maharashtra Quantiphi

Posted 1 day ago


Job Description

Role: Technical Architect - Data

Experience Level: 10 to 15 Years

Work location: Mumbai, Bangalore, Trivandrum (Hybrid)

Notice Period: Any


Role & Responsibilities:


  • More than 8 years of experience in technical, solutioning, and analytical roles.
  • 5+ years of experience building and managing data lakes, data warehouses, data integration, data migration, and business intelligence/artificial intelligence solutions on the cloud (GCP/AWS/Azure).
  • Ability to understand business requirements, translate them into functional and non-functional areas, and define non-functional boundaries in terms of availability, scalability, performance, security, resilience, etc.
  • Experience architecting, designing, and implementing end-to-end data pipelines and data integration solutions for varied structured and unstructured data sources and targets.
  • Experience working in distributed computing and enterprise environments such as Hadoop and GCP/AWS/Azure cloud.
  • Well versed in cloud data integration and ETL technologies such as Spark, PySpark/Scala, Dataflow, Dataproc, and EMR.
  • Experience with traditional ETL tools such as Informatica, DataStage, OWB, and Talend.
  • Deep knowledge of one or more cloud and on-premise databases such as Cloud SQL, Cloud Spanner, Bigtable, RDS, Aurora, DynamoDB, Oracle, Teradata, MySQL, DB2, and SQL Server.
  • Exposure to NoSQL databases such as MongoDB, CouchDB, Cassandra, and graph databases.
  • Experience architecting and designing scalable cloud data warehouse solutions on BigQuery or Redshift.
  • Experience with one or more data integration, storage, and data pipeline toolsets such as S3, Cloud Storage, Athena, Glue, Sqoop, Flume, Hive, Kafka, Pub/Sub, Kinesis, Dataflow, Dataproc, Airflow, Composer, Spark SQL, Presto, and EMRFS.
  • Preferred: experience with machine learning frameworks such as TensorFlow and PyTorch.
  • Good understanding of cloud solutions for IaaS, PaaS, and SaaS, and of containers and microservices architecture and design.
  • Ability to compare products and tools across the Google, AWS, and Azure technology stacks.
  • Good understanding of BI reporting and dashboarding and of one or more associated toolsets such as Looker, Tableau, Power BI, SAP BO, Cognos, and Superset.
  • Understanding of security features and policies in one or more cloud environments (GCP/AWS/Azure).
  • Experience in business transformation projects migrating on-premise data solutions to the cloud (GCP/AWS/Azure).


Role:


  • Lead multiple data engagements on GCP for data lakes, data engineering, data migration, data warehousing, and business intelligence.
  • Interface with multiple stakeholders within IT and the business to understand the data requirements.
  • Take complete responsibility for the successful delivery of all allocated projects on the parameters of schedule, quality, and customer satisfaction.
  • Take responsibility for the design and development of distributed, high-volume, multi-threaded batch, real-time, and event processing systems.
  • Implement processes and systems to validate data and monitor data quality, ensuring production data is always accurate and available for the key stakeholders and business processes that depend on it (a minimal validation sketch follows this list).
  • Work with the pre-sales team on RFPs and RFIs, helping them by creating data solutions.
  • Mentor young talent within the team; define and track their growth parameters.
  • Contribute to building assets and accelerators.
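To ground the data-validation responsibility above, here is a minimal PySpark sketch of a batch load gated by simple null-key checks. It is an illustrative pattern under assumed inputs, not Quantiphi's actual method; the file paths and the order_id/amount columns are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_batch").getOrCreate()

# Hypothetical landing-zone input; replace with a real source.
raw = spark.read.option("header", True).csv("/data/raw/orders.csv")

# Data-quality gate: refuse to publish the batch if required keys are missing.
bad = raw.filter(F.col("order_id").isNull() | F.col("amount").isNull()).count()
if bad:
    raise ValueError(f"{bad} rows failed null checks; aborting load")

# Light cleanup before publishing to the curated zone.
cleaned = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
)

cleaned.write.mode("overwrite").parquet("/data/curated/orders/")
```

In production the same gate usually feeds a monitoring system instead of raising, so stakeholders see failed batches rather than the pipeline silently dropping data.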


Other Skills:


  • Strong communication and articulation skills.
  • Good leadership skills.
  • A good team player.
  • Good analytical and problem-solving skills.


Data Architect

Thane, Maharashtra BL Consultants

Posted today


Job Description

Currently we have an open position with our client, a US-based boutique consulting firm, for a Databricks Architect.

About Us:

It is a boutique consulting firm specializing in data and AI, with deep expertise in the Databricks ecosystem. We help clients build scalable data platforms and AI solutions that drive real business impact. Our expertise in implementation, optimization, and enablement empowers clients to harness the full potential of their data, unlocking significant competitive advantages and fostering innovation.

Key Responsibilities:

1. Databricks Solution Architecture: Design and implement scalable, secure, and efficient Databricks solutions that meet client requirements.
2. Data Engineering: Develop data pipelines, architect data lakes, and implement data warehousing solutions using Databricks.
3. Data Analytics: Collaborate with data scientists and analysts to develop and deploy machine learning models and analytics solutions on Databricks.
4. Performance Optimization: Optimize Databricks cluster performance, ensuring efficient resource utilization and cost-effectiveness.
5. Security and Governance: Implement Databricks security features, ensure data governance, and maintain compliance with industry regulations.
6. Client Engagement: Work closely with clients to understand their business requirements, provide technical guidance, and deliver high-quality Databricks solutions.
7. Thought Leadership: Stay up to date with the latest Databricks features, best practices, and industry trends, and share knowledge with the team.

Requirements:

1. Databricks Experience: 5+ years of experience working with Databricks, including platform architecture, data engineering, and data analytics.
2. Technical Skills: Proficiency in languages such as Python, Scala, or Java, and experience with Databricks APIs, Spark, and Delta Lake (a minimal Delta Lake sketch follows this listing).
3. Data Engineering: Strong background in data engineering, including data warehousing, ETL, and data governance.
4. Leadership: Proven experience leading technical teams, mentoring junior engineers, and driving technical initiatives.
5. Communication: Excellent communication and interpersonal skills, with the ability to work effectively with clients and internal stakeholders.

Good to Have:

1. Certifications: Databricks Certified Professional or similar certifications.
2. Cloud Experience: Experience working with cloud platforms such as AWS, Azure, or GCP.
3. Machine Learning: Knowledge of machine learning concepts and experience with popular ML libraries.
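Since the requirements above call out Spark and Delta Lake specifically, here is a minimal sketch of the core Delta pattern, an idempotent merge/upsert into a managed table. This is an illustrative sketch under assumed inputs, not the firm's implementation; the table path and the order_id/amount columns are hypothetical.

```python
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

# Assumes a Spark runtime with Delta Lake available (a Databricks cluster,
# or open-source Spark with the delta-spark package installed).
spark = (
    SparkSession.builder.appName("orders_upsert")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Hypothetical incremental batch; in practice this comes from a landing zone.
updates = spark.createDataFrame([(1, 120.0), (2, 75.5)], ["order_id", "amount"])

target_path = "/tmp/delta/orders"  # hypothetical table location

if DeltaTable.isDeltaTable(spark, target_path):
    # Upsert: update existing orders, insert new ones, exactly once per key.
    (
        DeltaTable.forPath(spark, target_path).alias("t")
        .merge(updates.alias("s"), "t.order_id = s.order_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )
else:
    # First load: create the Delta table from the initial batch.
    updates.write.format("delta").save(target_path)
```

The merge keyed on order_id is what makes reruns safe, which is what the performance and governance responsibilities above depend on.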


Data Architect

Mumbai, Maharashtra InvokHR Solutions

Posted today


Job Description

About the client:

A boutique AI and data engineering firm, founded in 2018 and headquartered in Gurgaon, India, that empowers Fortune 1000 companies through AI-powered decision intelligence, especially in finance and supply chain transformation, delivering over US $2 billion in financial benefits while fostering a highly developed, people-first culture.

Key Responsibilities:

You will act as a key member of the consulting team, helping clients reinvent their corporate finance function by leveraging advanced analytics. You will work closely and directly with senior client stakeholders, designing and implementing data strategy in the finance space across multiple use cases, viz. controllership, FP&A, and GPO. You will be responsible for developing technical solutions that deliver scalable analytics leveraging cloud and big data technologies, and you will collaborate with business consultants and product owners to design and implement those solutions. Communication and organization skills are key for this position.

  • Design and drive end-to-end data and analytics solution architecture from concept to delivery on Google Cloud Platform (GCP).
  • Design, develop, and support conceptual, logical, and physical data models for advanced analytics and ML-driven solutions.
  • Ensure integration of industry-accepted data architecture principles, standards, guidelines, and concepts with other domains, along with coordinated roll-out and adoption strategies.
  • Drive the design, sizing, provisioning, and setup of GCP environments and related services such as BigQuery, Dataflow, and Cloud Storage; provide mentoring and guidance on GCP-based data architecture to engineering, analytics, and business teams.
  • Review solution requirements and architecture for appropriate technology selection, efficient resource utilization, and effective integration across systems and technologies.
  • Advise on emerging GCP trends and services, and recommend adoption strategies to maintain a competitive edge.
  • Actively participate in pre-sales engagements and PoCs, and contribute to publishing thought-leadership content.
  • Collaborate closely with the founders and leadership team to shape and drive the organization's cloud and data strategy.

Skills & Experience Required:

  • Demonstrated experience delivering multiple data and analytics solutions on GCP.
  • Hands-on experience with data ingestion, processing, and orchestration tools such as Dataflow, Pub/Sub, Dataproc, Cloud Composer, and Data Fusion.
  • Deep expertise with GCP data warehousing and analytics services, including BigQuery, Cloud Storage, and Looker (a minimal BigQuery client sketch follows this listing).
  • Familiarity with Data Mesh, Data Fabric, data products, and data contracts, and experience with data mesh implementation.
  • Strong understanding and practical experience with different data modelling techniques (relational, star, snowflake, Data Vault, etc.) and with transactional, time-series, and unstructured datasets.
  • Experience with enterprise data management, i.e. data quality, metadata management, data governance, and data observability, using GCP-native or third-party tools.
  • Experience designing and implementing event-driven architectures with tools such as Google Pub/Sub or Kafka is an added advantage.
  • Familiarity with AI and GenAI concepts and Vertex AI.
  • Solid grasp of operational dependencies and integration across applications, networks, security, and infrastructure in the cloud, including IAM, VPCs, VPNs, firewall rules, GKE, and service accounts.
  • Strong foundation in computer science or software engineering, with expertise in software development methodologies, DevOps/DataOps, and CI/CD.
  • Practical experience with tools such as Terraform, Cloud Build, and GitHub Actions, and with scripting in Python, Bash, etc.
  • Familiarity with the software development lifecycle and cloud-native application development.
  • Remains hands-on with technology and stays current with industry trends and GCP service evolution.
  • Demonstrated hands-on experience coding in Python, SQL, and Spark, with the flexibility to pick up new languages or technologies quickly.
  • A GCP Professional Data Engineer certification is an added advantage.

Interested candidates can apply by sharing their resume or by applying via the job post.
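To make the BigQuery expectation concrete, here is a minimal sketch using the google-cloud-bigquery Python client. The project, dataset, table, and columns are hypothetical placeholders, and credentials are assumed to come from Application Default Credentials.

```python
from google.cloud import bigquery

# Assumes Application Default Credentials are configured, e.g. via
# `gcloud auth application-default login`.
client = bigquery.Client(project="example-project")  # hypothetical project

# Hypothetical finance-domain aggregation; table and columns are illustrative.
sql = """
    SELECT cost_center, SUM(amount) AS total_spend
    FROM `example-project.finance.ledger_entries`
    WHERE entry_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY cost_center
    ORDER BY total_spend DESC
"""

# Run the query and stream the aggregated rows back.
for row in client.query(sql).result():
    print(f"{row.cost_center}: {row.total_spend:.2f}")
```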



Data Architect

Mumbai, Maharashtra Skill Ventory

Posted today


Job Description

Roles and Responsibilities

We are looking for a skilled Azure Data Engineer / Architect to join our Data Lake team. The candidate should have experience building and optimising large data platforms, preferably for the lending business.


Experience required: – Yrs


Skillsets

  • Architecting data platform solutions.
  • Experience in the delivery of large-scale enterprise data warehouse solutions.
  • Strong written and oral communication skills.
  • Knowledge of the BFSI domain is an added advantage.
  • Design and develop batch data pipelines independently to enable faster business analysis.
  • Experience in data lake / data warehousing projects from an end-to-end delivery perspective.
  • Experience with data modelling (relational and dimensional).
  • Working experience with the Microsoft Azure cloud, preferably with components such as Azure Data Lake, Azure Data Factory, SQL DW (Synapse), and Spark; a Spark SQL sketch follows this list.
  • 4+ years of experience writing complex SQL queries, procedures, views, functions, and database objects.
  • Minimum 3 years of experience in cloud computing, preferably Microsoft Azure.
  • Experience working with big data frameworks, especially Spark.
  • Experience with R and Python would be an added advantage.
  • Nice to have: Talend/SSIS knowledge.
  • Azure admin knowledge is an added advantage.
  • Proficient understanding of code versioning tools.
  • Excellent analytical and organisational skills.

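As an illustration of the "complex SQL" expectation in the list above, here is a minimal sketch assuming a PySpark session; a query of the same shape would run on Synapse with minor dialect changes. The loans data and its columns are hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("loan_sql_demo").getOrCreate()

# Hypothetical lending-domain rows; in practice read from the data lake.
spark.createDataFrame(
    [("C1", "2024-01-15", 5000.0), ("C1", "2024-02-10", 7500.0),
     ("C2", "2024-01-20", 3000.0)],
    ["customer_id", "disbursed_on", "amount"],
).createOrReplaceTempView("loans")

# CTE plus window functions: each customer's loans ranked by size,
# alongside a running total in disbursement order.
result = spark.sql("""
    WITH ranked AS (
        SELECT customer_id, disbursed_on, amount,
               RANK() OVER (PARTITION BY customer_id
                            ORDER BY amount DESC) AS size_rank,
               SUM(amount) OVER (PARTITION BY customer_id
                                 ORDER BY disbursed_on) AS running_total
        FROM loans
    )
    SELECT * FROM ranked WHERE size_rank <= 2
""")
result.show()
```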


Data Architect

Thane, Maharashtra Pro5.ai

Posted today


Job Description

Permanent
We are seeking a talented Data Architect to design, develop, and govern our data architecture and pipelines, joining as soon as possible; ideally someone who has worked on multiple projects with GCP BigQuery and Dataproc technologies. The candidate will need to be collaborative and organised, think out of the box, and be ready to pursue new opportunities. Most importantly, this role is for an individual who is passionate about making a difference through healthcare.

Key Responsibilities:

  • Design and implement scalable and robust data architectures for data warehousing and enterprise data platforms.
  • Develop and optimize data pipelines and ETL/ELT/CDC (Change Data Capture) workflows using tools such as Fivetran and Cloud Composer (a minimal Composer/Airflow sketch follows this listing).
  • Collaborate with data scientists, product managers, and business stakeholders to define data requirements and create logical and physical data models.
  • Manage and administer various database systems, including BigQuery, SAP HANA, and PostgreSQL.
  • Ensure data quality, integrity, and security across all data platforms and pipelines.
  • Work with our AI/ML teams to design data serving layers and feature stores that support Vertex AI workloads.
  • Design and develop reporting frameworks and data marts to support business intelligence needs.
  • Integrate data platforms with various enterprise systems (CRMs, ERPs) and third-party APIs.
  • Define and implement data governance, master data management, and data cataloging strategies.
  • Contribute to the full data lifecycle: requirements gathering, architecture, data modeling, development, testing, and deployment.
  • Troubleshoot and resolve data platform issues to ensure high availability and optimal performance.
  • Document technical designs, data lineage, and architecture for cross-functional reference.

Required Qualifications:

  • Bachelor's or Master's degree in Computer Science, Software Engineering, Data Science, or a related field.
  • Proficiency in one or more backend languages/frameworks, with a strong preference for Python or Go.
  • Experience building RESTful APIs and designing microservices for data delivery.
  • Solid grasp of data modeling fundamentals, including the Kimball and Inmon methodologies.
  • Proficiency in writing complex SQL queries and experience with SQL and NoSQL databases.
  • Familiarity with data warehousing concepts and best practices, including CDC.
  • Strong version-control habits (Git) and experience with CI/CD pipelines.
  • Excellent problem-solving, communication, and collaboration skills.
  • Passion for continuous learning and adapting to emerging data technologies.

Preferred Qualifications:

  • Hands-on experience designing and deploying production-grade data warehouses.
  • Deep experience with Google Cloud Platform (GCP): BigQuery for large-scale analytical workloads, Cloud Composer for orchestrating complex data pipelines, and Vertex AI for AI/ML model serving and feature stores.
  • Experience with other cloud providers (AWS, Azure) and their data services.
  • Working knowledge of data governance frameworks, master data management, and data cataloging tools.
  • Experience with data ingestion tools like Fivetran.
  • Business intelligence expertise in building dashboards and reports with Power BI or Tableau.
  • Familiarity with other data technologies such as SAP HANA.
  • Understanding of MLOps concepts and their application to data pipelines.
  • Contributions to open-source data projects or technical blogging/presentations.
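Because Cloud Composer is managed Apache Airflow, the orchestration work named above usually reduces to writing DAGs. Below is a minimal sketch of that pattern; the DAG id, tasks, and callables are hypothetical placeholders, not from the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull an incremental batch from a source system
    # (e.g. triggered alongside a Fivetran sync in a real pipeline).
    print("extracting batch")


def load():
    # Placeholder: load the curated batch into the warehouse (e.g. BigQuery).
    print("loading into warehouse")


# A standard Airflow DAG like this is what gets deployed to the
# Composer environment's dags/ folder.
with DAG(
    dag_id="orders_daily",            # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",       # `schedule=` in newer Airflow releases
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task         # load runs only after extract succeeds
```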
     
