Data Architect

Indore, Madhya Pradesh Persistent Systems

Posted 1 day ago


Job Description

About Position:


We are seeking a highly skilled and experienced Azure Databricks Architect to lead the design, development, and implementation of scalable data solutions using Azure Databricks. This role involves working closely with cross-functional teams to architect data pipelines, optimize performance, and ensure secure and efficient data processing across cloud platforms.


  • Role: Data Architect
  • Location: Pune
  • Experience: 12 to 17 Years
  • Job Type: Full Time Employment


What You'll Do:


  • Design and implement scalable data architectures using Azure Databricks, Delta Lake, and Azure Synapse Analytics.
  • Lead data migration strategies and cloud-based analytics transformation initiatives.
  • Collaborate with stakeholders to gather requirements and translate them into technical solutions.
  • Develop and optimize Spark-based data processing workflows using PySpark, Scala, and SQL (see the sketch after this list).
  • Ensure data security, governance, and compliance using Azure Purview and other tools.
  • Integrate Databricks with Azure Data Factory, Azure Data Lake, and other Azure services.
  • Monitor and tune performance of Databricks clusters and jobs.
  • Provide technical leadership and mentorship to data engineers and developers.
  • Document architecture designs, processes, and best practices.
  • Stay updated with the latest Databricks features, Azure services, and industry trends.
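
As a concrete, hedged illustration of the Spark workflow item above, the sketch below shows a minimal Databricks batch job in PySpark: raw JSON landed in ADLS is cleaned and written as a partitioned Delta table. The storage account, paths, and column names are hypothetical placeholders, not part of the posting.

```python
# Minimal sketch (assumed paths/columns) of a Databricks batch workflow:
# raw JSON in ADLS -> cleaned, partitioned Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-bronze-to-silver").getOrCreate()

raw_path = "abfss://raw@examplelake.dfs.core.windows.net/orders/"        # hypothetical
silver_path = "abfss://silver@examplelake.dfs.core.windows.net/orders/"  # hypothetical

orders = (
    spark.read.json(raw_path)
    .withColumn("order_date", F.to_date("order_ts"))
    .dropDuplicates(["order_id"])
    .filter(F.col("amount") > 0)
)

# Delta provides ACID writes, time travel, and OPTIMIZE/Z-ORDER for later tuning.
(
    orders.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save(silver_path)
)
```

Writing to Delta rather than plain Parquet is what enables the Lakehouse features (ACID, time travel, optimization) referenced elsewhere in this posting.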


Expertise You'll Bring:


  • Proven experience architecting and implementing large-scale data solutions using Azure Databricks and Apache Spark.
  • Deep understanding of Lakehouse architecture, Delta Lake, and data lake optimization strategies.
  • Strong programming skills in Python, Scala, and SQL for data engineering and analytics.
  • Hands-on experience with Azure Data Factory, Azure Synapse Analytics, and Azure Data Lake Storage.
  • Expertise in data modeling, ETL/ELT pipeline design, and real-time data processing.
  • Familiarity with CI/CD practices, DevOps tools, and infrastructure-as-code (e.g., Terraform, ARM templates).
  • Knowledge of data governance, security, and compliance frameworks within Azure.
  • Ability to lead technical discussions, mentor teams, and collaborate with cross-functional stakeholders.
  • Experience integrating Databricks with Power BI, MLflow, and other analytics or ML tools.


Benefits:


  • Competitive salary and benefits package
  • Culture focused on talent development with quarterly growth opportunities and company-sponsored higher education and certifications
  • Opportunity to work with cutting-edge technologies
  • Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
  • Annual health check-ups
  • Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents


Values-Driven, People-Centric & Inclusive Work Environment:


Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds.


  • We support hybrid work and flexible hours to fit diverse lifestyles.
  • Our office is accessibility-friendly, with ergonomic setups and assistive technologies to support employees with physical disabilities.
  • If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment.


Let’s unleash your full potential at Persistent - persistent.com/careers


“Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind.”


Data Architect

Indore, Madhya Pradesh Infogain

Posted 1 day ago


Job Description

Key Responsibilities:

  • Design, develop, and optimize scalable data pipelines and workflows using Databricks and Apache Spark on AWS (see the sketch after this list).
  • Collaborate with data scientists, analysts, and other stakeholders to transform data into actionable insights.
  • Implement data ingestion processes from various sources including S3, RDS, Redshift, and external APIs.
  • Optimize Spark jobs for performance and cost efficiency.
  • Monitor and troubleshoot data workflows, ensuring data quality and integrity.
  • Automate deployment and orchestration of data pipelines using AWS services like Lambda, Step Functions, or Glue.
  • Maintain security and governance standards in cloud data environments.
  • Document processes, data lineage, and architecture designs.
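
A rough sketch of the S3-based ingestion step described above, assuming a Databricks/Spark runtime with S3 access already configured; bucket names and fields are illustrative only.

```python
# Rough sketch: JSON events from S3 -> date-partitioned Parquet in a curated bucket.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("s3-events-ingest").getOrCreate()

events = (
    spark.read.json("s3://example-raw-bucket/events/")
    .withColumn("event_date", F.to_date("event_ts"))
)

# Repartitioning on the partition column before writing keeps file counts sane,
# which is one of the cheaper Spark performance/cost levers.
(
    events.repartition("event_date")
    .write.mode("append")
    .partitionBy("event_date")
    .parquet("s3://example-curated-bucket/events/")
)
```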

Data Architect

Indore, Madhya Pradesh Tata Consultancy Services

Posted 1 day ago


Job Description

Greetings from TCS Human Resources Team!

Skill: GCP Data Architect

Years of experience: 8 to 10 Years

Location: PAN India


Mandatory Skills:

- Python: Programming language

- Airflow: Data pipeline orchestration

- Spark: Big data processing

- SQL: Database querying

- Dataform & DBT: Data architecture and modeling

- Architecture Design: Scalable data solutions

Design and implement scalable data architectures on GCP, leveraging these key skills.
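
To make the skills above concrete, here is a hedged sketch of how they commonly fit together in a Cloud Composer (Airflow 2.x) pipeline: a Python extract task feeding a BigQuery SQL transform. The DAG id, project, dataset, table, and SQL are hypothetical placeholders.

```python
# Hypothetical daily DAG: extract to GCS, then build a partitioned BigQuery fact table.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator


def extract_to_gcs(**_):
    # Placeholder for pulling a daily batch from a source system into GCS.
    print("extracted batch to gs://example-bucket/staging/")


with DAG(
    dag_id="example_gcp_daily_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_gcs", python_callable=extract_to_gcs)

    load = BigQueryInsertJobOperator(
        task_id="build_daily_fact",
        configuration={
            "query": {
                "query": (
                    "SELECT * FROM `example-project.example_ds.staging_orders` "
                    "WHERE order_date = DATE('{{ ds }}')"
                ),
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "example_ds",
                    "tableId": "fact_orders${{ ds_nodash }}",  # overwrite one daily partition
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    extract >> load
```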


Data Architect

Indore, Madhya Pradesh Mindfire Solutions

Posted 14 days ago


Job Description

About the Job

We are seeking a highly skilled and strategic Data Architect to design, develop, and manage the organisation’s data architecture. The ideal candidate will ensure that data assets are structured, integrated, and governed to support analytics, reporting, and business intelligence initiatives across the enterprise. This role will work closely with business stakeholders, data engineers, analysts, and IT teams to ensure data is accurate, secure, scalable, and aligned with organisational goals.


Core Responsibilities


Data Strategy & Architecture

- Define and maintain the enterprise data architecture, ensuring alignment with business strategy and technology standards.

- Design data models, databases, and integration patterns that enable high-quality reporting and analytics.

- Establish standards for data storage, retrieval, transformation, and integration across systems.


Data Governance & Quality

- Develop and enforce policies for data governance, security, and compliance (e.g., HIPAA, GDPR, CCPA as applicable).

- Implement data quality frameworks to ensure accuracy, consistency, and completeness of information.
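
As a minimal, assumption-laden example of such a framework, the PySpark sketch below runs a few completeness, uniqueness, and validity rules against a hypothetical curated table and fails the job if any rule is violated.

```python
# Minimal rule-based data-quality checks over a hypothetical curated table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
customers = spark.read.table("curated.customers")  # hypothetical table

total = customers.count()
checks = {
    "email_not_null": customers.filter(F.col("email").isNotNull()).count() == total,
    "customer_id_unique": customers.select("customer_id").distinct().count() == total,
    "country_code_valid": customers.filter(F.length("country_code") == 2).count() == total,
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # Failing loudly keeps bad data out of downstream reporting.
    raise ValueError(f"Data quality checks failed: {failed}")
```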


Collaboration & Leadership

- Partner with cross-functional teams (business, engineering, analytics, IT) to understand requirements and translate them into scalable data solutions.

- Provide technical leadership and mentorship to data engineers and developers.

- Collaborate with stakeholders to define metadata, master data management, and data lineage practices.


Technology & Innovation

- Evaluate and recommend modern data management tools, cloud platforms, and emerging technologies.

- Lead initiatives in data modernization, cloud migration, and big data/AI integration.

- Ensure high performance and scalability of enterprise data platforms.


Required Skills

- Experience with Azure Data Services (Azure Fabric, Azure Data Factory, Azure Synapse, Azure Data Lake).

- Knowledge of data security and compliance frameworks.

- Experience supporting AI/ML initiatives through data readiness.

- Prior experience in the retail or healthcare industry is a plus.


Qualifications

- Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Science, or related field.

- Proven experience (10+ years) in data architecture, database design, and data modeling.

- Strong expertise with SQL, NoSQL, ETL tools, and cloud data platforms (Azure, AWS, or GCP).

- Hands-on experience with data integration, APIs, data lakes, and data warehouses.

- Knowledge of data governance, master data management, and metadata management.

- Familiarity with analytics, BI tools (e.g., Power BI, Tableau), and reporting frameworks.

- Strong problem-solving, analytical, and communication skills.


Data Architect

Indore, Madhya Pradesh G10X

Posted 19 days ago


Job Description

Job Opportunity: Senior Data Architect (15+ Years Experience)

We are looking for a seasoned Data Architect to lead and shape enterprise-wide data initiatives. This role requires a strategic leader with deep technical expertise and strong stakeholder management skills.


Key Responsibilities

  • Act as the architectural authority for Data Integration, Data Governance, and MDM programs.
  • Define and oversee High-Level and Low-Level Solution Designs across business, application, and data domains.
  • Lead solution roadmaps covering as-is vs. to-be states, business value realization, and scalability.
  • Partner with business stakeholders, project managers, and engineering teams to ensure alignment of IT solutions with business needs.
  • Provide leadership and direction to teams while ensuring governance and delivery success.
  • Engage in presales activities to support the Data & Analytics Center of Excellence.

Required Qualifications

  • 15+ years of experience in Data Engineering & Architecture roles.
  • Bachelor’s degree in Computer Science, Engineering, or related field.
  • Proven expertise in Data Integration, Data Governance, and Master Data Management (MDM).
  • Strong knowledge of data modeling, governance, flows, quality, security, privacy, scalability, and performance.
  • Excellent stakeholder management and communication skills, with the ability to influence at senior levels.
  • Demonstrated leadership experience guiding teams and ensuring delivery of large-scale data programs.


Location: Kochi/ Remote


Please note: We are looking for early joiners.


Interested? Share your resume to


GCP Data Architect

Indore, Madhya Pradesh Tata Consultancy Services

Posted 3 days ago


Job Description

Job Title: GCP Data Architect

Experience: 7 to 12 years

Location: Pan India

Virtual Drive: 10 AM to 4 PM

Date: 11th Oct 2025

Greetings from TCS!

Job Description:

  • Design and Implement Data Architectures: Architect and build scalable, end-to-end data solutions on GCP, encompassing data ingestion, transformation, storage, and consumption.
  • Develop Data Pipelines: Design and develop ETL/ELT data pipelines using tools like Apache Airflow (Cloud Composer) and programming languages such as Python and SQL for batch and real-time processing.
  • Create Data Models: Build logical and physical data models, including dimensional modelling and schema design, to support data warehousing, data lakes, and analytics.
  • Ensure Data Quality and Governance: Establish and enforce data governance, security, and quality standards, implementing data validation and testing procedures.
  • Collaborate with Stakeholders: Work with data engineers, business analysts, data scientists, and product owners to translate business requirements into technical data solutions.
  • Optimize GCP Services: Optimize the performance and cost-effectiveness of GCP services, particularly BigQuery, for analytics and data storage (see the sketch after this list).
  • Provide Technical Guidance: Lead architectural reviews, provide technical guidance on cloud-native data strategies, and mentor engineering teams on GCP best practices.
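
As a small illustration of the BigQuery optimization item above, the sketch below creates a date-partitioned, clustered table through the Python client; partition pruning and clustering are the usual first levers for cost and performance. Project, dataset, and column names are made up.

```python
# Illustrative only: partitioning plus clustering as a first BigQuery tuning lever.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

ddl = """
CREATE TABLE IF NOT EXISTS `example-project.analytics.events`
(
  event_id STRING,
  user_id  STRING,
  event_ts TIMESTAMP,
  payload  JSON
)
PARTITION BY DATE(event_ts)   -- prunes scanned bytes for date-bounded queries
CLUSTER BY user_id            -- co-locates rows for selective filters
OPTIONS (partition_expiration_days = 365)
"""

client.query(ddl).result()  # blocks until the DDL job completes
```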

Required Skills and Knowledge

  • Google Cloud Platform (GCP): Expertise with GCP services like BigQuery, Cloud Storage, Cloud SQL, and Cloud Composer.
  • Data Modelling: Proficiency in designing data models for data warehouses and data lakes.
  • ETL/ELT: Experience with designing and building data pipelines using tools like Apache Airflow.
  • Programming: Strong skills in SQL and Python for data processing and development.
  • Data Governance: Understanding and ability to implement data governance, metadata management, and security policies.
  • Collaboration: Strong communication skills to work with cross-functional teams and explain complex technical concepts.

Principal/ Senior Data Architect

Indore, Madhya Pradesh Aays

Posted 1 day ago


Job Description

Position: Principal/ Senior Data Architect

Desired experience: 10+ years


Your team

You will act as a key member of the consulting team, helping clients reinvent their corporate finance function by leveraging advanced analytics. You will work closely with clients' senior stakeholders to design and implement data strategy in the finance space, covering multiple use cases such as controllership, FP&A, and GPO. You will be responsible for developing technical solutions that deliver scalable analytics leveraging cloud and big data technologies. You will also collaborate with Business Consultants and Product Owners to design and implement technical solutions. Communication and organisation skills are key for this position.


Aays is a fast-growing analytics consulting firm. We solve some of the most complex corporate finance problems faced by large and fast-growing companies in the world, utilising big data, cloud computing, and AI/ML. We are an employee-first company: we firmly believe that happy employees create happy customers, which in turn results in a successful business. Come and join us for a life-changing journey.


Responsibilities:

• Design and drive end-to-end data and analytics solution architecture from concept to delivery.

• Design, develop, and support conceptual/logical/physical data models for analytics solutions.

• Ensure that industry-accepted data architecture principles, standards, guidelines, and concepts are integrated with those of allied disciplines, and that coordinated roll-out and adoption strategies are in place.

• Drive the design, sizing, setup, etc. of Azure environments and related services

• Provide mentoring on data architecture design and requirements to development and business teams

• Review the solution requirements and architecture to ensure selection of appropriate technology, efficient use of resources, and integration of multiple systems and technologies.

• Advise on new technology trends and their possible adoption to maintain competitive advantage.

• Participate in pre-sales activities and publish thought leaderships

• Work closely with the founders to drive the technology strategy for the organisation

• Help and lead technology team recruitments in various areas of data analytics


Experience Needed:

• Demonstrated experience delivering multiple data solutions

• Demonstrated experience with ETL development both on-premises and in the cloud using SSIS, Data Factory, and related Microsoft and other ETL technologies.

• Demonstrated in-depth skills with SQL Server, Azure Synapse, Azure Databricks, HDInsight, and Azure Data Lake, with the ability to configure and administer all aspects of SQL Server.

• Demonstrated experience with different data models, including normalised, denormalised, star, and snowflake models (see the sketch after this list); has worked with transactional, temporal, time-series, and structured and unstructured data.

• Data Quality Management (Microsoft DQS and other data quality and governance tools) and Data Architecture standardization experience

• Deep understanding of the operational dependencies of applications, networks, systems, security, and policy, both on-premises and in the cloud (VMs, networking, VPN (ExpressRoute), Active Directory, storage (Blob, etc.), Windows/Linux).

• Advanced study / knowledge in the field of computer science or software engineering along with advanced knowledge of software development and methodologies (Microsoft development lifecycle including OOP principles, Visual Studio, SDKs, PowerShell, CLI).

• Familiar with the principles and practices involved in the development and maintenance of software solutions and architectures, and in service delivery (Microsoft and Azure DevOps, Azure Automation).

• Has a strong technical background and stays current with technology and industry developments.
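
For illustration only, the sketch below creates a tiny star schema (one dimension, one fact) on SQL Server via pyodbc, in line with the data-model item above; the connection string, schema, and columns are placeholders, not a prescribed design.

```python
# Sketch under assumptions: a minimal star schema created on SQL Server via pyodbc.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=example-server;"
    "DATABASE=analytics;Trusted_Connection=yes;TrustServerCertificate=yes;"
)
cur = conn.cursor()

cur.execute("""
CREATE TABLE dbo.dim_customer (
    customer_key INT IDENTITY(1,1) PRIMARY KEY,
    customer_id  NVARCHAR(50) NOT NULL,
    segment      NVARCHAR(50),
    valid_from   DATE NOT NULL,
    valid_to     DATE NULL          -- type 2 slowly changing dimension window
)
""")

cur.execute("""
CREATE TABLE dbo.fact_invoice (
    invoice_id   NVARCHAR(50) NOT NULL,
    customer_key INT NOT NULL REFERENCES dbo.dim_customer(customer_key),
    invoice_date DATE NOT NULL,
    amount       DECIMAL(18, 2) NOT NULL
)
""")

conn.commit()
conn.close()
```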


Sr. Data Architect - Snowflake & SAP BW/ECC (Urgent Hire)

Indore, Madhya Pradesh Reflections Info Systems

Posted 4 days ago


Job Description

We are looking for a Sr. Data Architect with 10+ years of experience.


  • A minimum of 10 years of experience in data engineering, encompassing the development and scaling of data warehouse and data lake platforms.
  • The candidate should possess a strong background in both Snowflake and SAP BW/ECC environments, demonstrating leadership in technical design, architecture, and implementation of complex data solutions, and should understand SAP ABAP programs, reports, interfaces, and enhancements.
  • Working hours: 8 hours, with a few hours of overlap with the EST time zone. This overlap is mandatory, as meetings happen during these hours. Working hours will be 12 PM - 9 PM.


Primary Skills :

  • Extensive experience in designing and implementing data solutions using Snowflake and DBT.
  • Proficiency in data modeling, schema design, and optimization within Snowflake environments.
  • Strong understanding of SAP BW/ECC systems, including data extraction, transformation, and loading (ETL/ELT) processes. Experience in integrating SAP data into cloud data platforms.
  • Strong understanding of cloud data warehousing concepts and best practices, particularly with Snowflake.
  • Expertise in Python/Java/Scala, SQL, ETL processes, and data integration techniques, with a focus on Snowflake.
  • Familiarity with other cloud platforms and data technologies (e.g., AWS, Azure, GCP)
  • Proficiency in SAP ABAP (Advanced Business Application Programming) development.
  • Experience in designing, developing, and implementing SAP ABAP programs, reports, interfaces, and enhancements
  • Demonstrated experience in implementing data governance frameworks and DataOps practices.
  • Working experience in SAP environments


Responsibilities include:

  • Mandatory Skills: Snowflake experience, data architecture experience, ETL process experience, and large data migration solutioning experience.
  • Lead the design and architecture of data solutions leveraging Snowflake, ensuring scalability, performance, and reliability.
  • Collaborate with stakeholders to understand business requirements and translate them into technical specifications and data models.
  • Develop and maintain data architecture standards, guidelines, and best practices, including data governance principles and DataOps methodologies.
  • Oversee the implementation of data pipelines, ETL processes, and data governance frameworks within Snowflake environments.
  • Provide technical guidance and mentorship to data engineering teams, fostering skill development and knowledge sharing.
  • Conduct performance tuning and optimization of Snowflake databases and queries (see the sketch after this list).
  • Stay updated on emerging trends and advancements in Snowflake, cloud data technologies, data governance, and DataOps practices.
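
A hedged sketch of the Snowflake tuning loop referenced above: surface the heaviest recent queries from ACCOUNT_USAGE, then apply one possible remediation such as a clustering key. Credentials and object names are placeholders.

```python
# Find the heaviest recent queries, then (optionally) add a clustering key.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="EXAMPLE_USER",
    password="***",
    warehouse="ANALYTICS_WH",
)
cur = conn.cursor()

cur.execute("""
    SELECT query_id, total_elapsed_time / 1000 AS seconds, bytes_scanned
    FROM snowflake.account_usage.query_history
    WHERE start_time > DATEADD('day', -7, CURRENT_TIMESTAMP())
    ORDER BY total_elapsed_time DESC
    LIMIT 10
""")
for query_id, seconds, bytes_scanned in cur.fetchall():
    print(query_id, seconds, bytes_scanned)

# One option for large, selectively filtered tables (evaluate cost before enabling):
cur.execute("ALTER TABLE analytics.public.sales CLUSTER BY (sale_date)")
conn.close()
```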

Senior Data Architect - Cloud Solutions

Indore, Madhya Pradesh 452001 WhatJobs ₹1,900,000 Annually

Posted 16 days ago


Job Description

Full-time
Our client, a dynamic enterprise technology company, is seeking a highly skilled and experienced Senior Data Architect to lead the design and implementation of robust, scalable, and secure data solutions. This is a fully remote position, offering the flexibility to work from anywhere while contributing to critical projects. The ideal candidate will have a deep understanding of data warehousing, data modeling, ETL/ELT processes, and cloud data platforms. You will be responsible for defining data strategies, establishing data governance frameworks, and ensuring data integrity and accessibility across the organization.

Key Responsibilities:
  • Design and develop enterprise-level data architectures, focusing on cloud-based solutions (AWS, Azure, GCP).
  • Create and maintain conceptual, logical, and physical data models for various data stores (data warehouses, data lakes, operational databases).
  • Define and implement ETL/ELT processes for data ingestion, transformation, and loading.
  • Establish and enforce data governance policies, standards, and best practices.
  • Collaborate with business stakeholders, data engineers, and analysts to understand data requirements and deliver effective solutions.
  • Evaluate and recommend new data technologies and tools to enhance data infrastructure.
  • Ensure data security, privacy, and compliance with relevant regulations.
  • Provide technical leadership and mentorship to data engineering teams.
  • Develop and maintain comprehensive data architecture documentation.
  • Troubleshoot and resolve complex data-related issues.
  • Effectively communicate architectural designs and decisions to both technical and non-technical audiences within a remote team setting.
Qualifications:
  • Master's or Bachelor's degree in Computer Science, Information Technology, or a related quantitative field.
  • Minimum of 8-10 years of experience in data architecture, data modeling, and data engineering.
  • Proven expertise in designing and implementing solutions on major cloud platforms (AWS, Azure, or GCP).
  • Strong knowledge of various database technologies (e.g., SQL, NoSQL), data warehousing concepts (e.g., Kimball, Inmon), and data lake architectures.
  • Experience with big data technologies (e.g., Spark, Hadoop) and data streaming technologies (e.g., Kafka); see the sketch after this list.
  • Proficiency in data modeling tools and techniques.
  • Excellent understanding of data governance, data quality, and data security principles.
  • Strong analytical, problem-solving, and strategic thinking skills.
  • Exceptional communication, collaboration, and leadership abilities.
  • Experience in managing cross-functional projects in a remote environment.
  • Relevant cloud certifications (e.g., AWS Certified Data Analytics – Specialty, Azure Data Engineer Associate) are a plus.
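
As an illustrative sketch of the Spark and Kafka streaming experience called out above (not part of the posting's requirements), the snippet below consumes a Kafka topic with Spark Structured Streaming and lands it as Parquet. Brokers, topic, and paths are placeholders, and the spark-sql-kafka connector is assumed to be available on the cluster.

```python
# Illustrative streaming job: Kafka topic -> Parquet on a data lake.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka-events-stream").getOrCreate()

stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092,broker2:9092")
    .option("subscribe", "customer-events")
    .option("startingOffsets", "latest")
    .load()
    .select(
        F.col("key").cast("string"),
        F.col("value").cast("string"),
        "timestamp",
    )
)

query = (
    stream.writeStream.format("parquet")
    .option("path", "s3://example-lake/streaming/customer-events/")
    .option("checkpointLocation", "s3://example-lake/checkpoints/customer-events/")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```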
This role supports our client's digital transformation initiatives in Indore, Madhya Pradesh, IN and globally, and is perfect for a visionary architect.

Fabric Platform Data Architect (Remote)

Indore, Madhya Pradesh Thinkgrid Labs

Posted 1 day ago


Job Description

Thinkgrid Labs is at the forefront of innovation and technology. Our expert team of software engineers, architects, and UI/UX designers specialises in crafting bespoke web, mobile, and cloud applications, data platforms, AI solutions, and intelligent bots. Thinkgrid Labs is expanding its data practice to stand up Microsoft Fabric for a US health insurer. You'll architect the platform foundation, set governance guardrails, and ensure the environment is secure, compliant, cost-efficient, and usable for everyone from execs to engineers.


Job Title: Fabric Platform Architect

Location: Remote

Working Hours: 2 PM IST to 11 PM IST

Experience Required: 8–12 years overall; 3+ years in Azure data platform governance/Fabric admin

Education: Bachelor’s or Master’s degree in Computer Science, Health Informatics, or Business.

Preferred certifications: DP-600, DP-500, AZ-305, SC-100/SC-300.


Who are you?

  • Platform Architecture (Fabric or Equivalent): You’ve designed multi-tenant or multi-domain lakehouse platforms (e.g., Fabric, Databricks + ADLS/Delta, Azure Synapse, Snowflake, AWS Glue/Lake Formation, GCP BigQuery), including workspace/project topology, environment isolation, and capacity planning.
  • Fabric Authority: Deep, hands-on expertise with Microsoft Fabric architecture—tenants, domains, capacity groups, workspaces, OneLake, Lakehouse/Warehouse, Real-Time Analytics.
  • Governance-First: You design pragmatic guardrails for rights/roles/budgets/access; you operationalise RLS/CLS and data domains at scale.
  • Catalogue & Lineage Champion: Comfortable configuring Microsoft Purview—collections, glossary, scanners, policies, and lineage end-to-end.
  • Security & Compliance Mindset: Strong Entra ID/Azure IAM; you bake in HIPAA/PII controls, auditing, and least privilege by default.
  • Cost-Savvy Operator: Capacity planning, workloads, reserved capacity, and FinOps reporting come naturally.
  • Stakeholder Communicator: Equally at home in steering committees with execs and deep dives with engineers. (We value client-facing clarity and excellent written/verbal communication.)


What you will be doing?

  • Platform Foundation (Owner): Design and configure Fabric tenants, domains, capacity groups, and workspaces; define dev/test/prod environment strategy and workspace topology.
  • Governance & Access: Establish RBAC models, security groups, RLS/CLS patterns, budgeting guardrails, naming/tags, and approval workflows.
  • Catalogue, Glossary & Lineage:  Stand up Purview collections; implement glossary governance, scanner rules, sensitivity labels, and end-to-end lineage across Fabric artefacts.
  • Security, Privacy & Audit: Map HIPAA/PHI controls; integrate with Entra, Azure Policy, and Defender for Cloud; enable audit logs to be sent to Log Analytics and configured for retention; drive periodic access reviews.
  • Cost Management & FinOps: Right-size capacity, monitor usage, forecast spend, set budgets/alerts, and recommend workload isolation or reservations to optimise TCO.
  • CI/CD & Change Control: Enable source control (Git mode), deployment pipelines, and release promotion; define SOW-aligned change processes and acceptance criteria for platform changes.
  • Operational Excellence: Define SLOs/alerts/runbooks, backup/DR and region strategy; lead incident/RCAs; maintain RAID and platform roadmaps.
  • Enablement & Communication: Run discovery/architecture reviews, steering/QBRs, and training for analysts/engineering teams; produce clear docs, playbooks, and decision logs.


Must-have skills

  • Microsoft Fabric administration (tenancy, domains, capacities, workspaces, OneLake)
  • Microsoft Purview (catalogue, glossary, scans, policies, lineage)
  • Azure IAM/Entra ID (RBAC, conditional access), RLS/CLS patterns (see the sketch after this list)
  • Cost management and FinOps awareness (capacity planning, budgets, alerts)
  • Excellent stakeholder communication (execs, security, data engineering)
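
The sketch below illustrates one way the RLS pattern above can be operationalised, assuming the Fabric Warehouse SQL endpoint accepts SQL Server-style row-level security DDL over pyodbc; the connection string, schema, mapping table, and predicate are all hypothetical and not drawn from the posting.

```python
# Hypothetical RLS setup: filter rows by a user-to-region mapping table.
# Assumes the security schema and security.user_region table already exist.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example-workspace.datawarehouse.fabric.microsoft.com;"
    "DATABASE=claims_wh;Authentication=ActiveDirectoryInteractive;"
)
cur = conn.cursor()

# Predicate function: a user only sees rows whose region is mapped to their identity.
cur.execute("""
CREATE FUNCTION security.fn_region_filter(@region AS VARCHAR(16))
RETURNS TABLE
WITH SCHEMABINDING
AS RETURN
    SELECT 1 AS allow_access
    FROM security.user_region AS ur
    WHERE ur.region = @region AND ur.user_name = USER_NAME()
""")

# Bind the predicate to the table as a filter policy.
cur.execute("""
CREATE SECURITY POLICY security.region_policy
ADD FILTER PREDICATE security.fn_region_filter(region) ON dbo.claims
WITH (STATE = ON)
""")

conn.commit()
conn.close()
```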


Benefits

  • 5-day work week
  • 100% remote setup with flexible work culture and international exposure
  • Opportunity to work on mission-critical healthcare projects impacting providers and patients globally