591 Data Architect jobs in Bengaluru
Azure Big Data Architect
Posted today
Job Description
Tiger Analytics is a global analytics consulting firm. With data and technology at the core of our solutions, we are solving some of the toughest problems out there. Our culture is modeled around expertise and mutual respect, with a team-first mindset.
Cloud and Big Data Architect
Posted today
Job Description
Role: Cloud and Big data Architect
Role Description
We are looking for skilled and passionate Azure Cloud experts with a strong focus on Big Data management and data quality to join our team at Société Générale Global Solution Centre in Bengaluru.
This is a full-time hybrid role for Digital & Data Solutions (DDS), offering flexibility for remote work.
Required Skills & Experience:
- Hands-on experience in Azure for a minimum of 3–4 years, with overall professional experience of less than 10 years.
- Proficient in Big Data technologies (e.g., Hadoop, Spark, Kafka) and their integration with cloud environments.
- Strong understanding of data management principles and best practices, including data governance and data lifecycle management.
- Knowledge of Data quality and Governance frameworks and tools to ensure accuracy, completeness, and reliability of data.
- Scripting knowledge in PowerShell, Python, and Shell scripting.
- Good knowledge of CI/CD integration tools such as Jenkins, Argo CD, and GitHub Actions.
- Azure certification (AZ-104 / AZ-305 / AZ-204) and DevOps experience are a plus.
- Experience working with SQL and NoSQL databases.
- Finance and banking experience will be highly appreciated.
Roles & Responsibilities:
- Strong understanding of Big Data architecture, cloud adoption strategies, and solution design.
- Experience with hybrid cloud environments and migrating on-premises applications and databases to the cloud.
- Ensure data governance and applicative implementation standards on the data lake store. Review and capture non-compliance with data standards, propose remediation, and follow up with upstream and downstream stakeholders (see the data-quality sketch after this list).
- Stay updated with the latest trends in Azure cloud technologies, Data management and governance, and industry best practices to inform and improve solutions.
- Excellent communication and collaboration skills, with a strong ability to work in a team-oriented environment.
- Analyze existing operational standards, processes, and governance to identify and implement improvements.
- Conduct POCs in Data and Cloud Platform to ensure that suggested solutions and technologies meet the requirements.
- Strong analytical and problem-solving skills, with the ability to troubleshoot complex issues in a fast-paced environment.
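Illustrative only (not part of the posting): a minimal PySpark sketch of the kind of data-quality check described above, flagging columns whose completeness falls below a threshold. The table path, columns, and threshold are hypothetical.

```python
# Hypothetical completeness check on a data-lake table (illustrative sketch).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-completeness").getOrCreate()
df = spark.read.parquet("/mnt/datalake/customers")  # hypothetical lake path

THRESHOLD = 0.95  # minimum acceptable non-null ratio per column
total = df.count()

for col in df.columns:
    non_null = df.filter(F.col(col).isNotNull()).count()
    ratio = non_null / total if total else 0.0
    if ratio < THRESHOLD:
        # In practice, a non-compliance like this would be captured and
        # routed to upstream/downstream stakeholders for remediation.
        print(f"{col}: completeness {ratio:.2%} is below {THRESHOLD:.0%}")
```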
Why Join Us:
We are committed to creating a diverse environment and are proud to be an equal opportunity employer. All qualified applicants receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.
Azure Big Data Architect Engineer
Posted today
Job Description
• Azure Databricks
• Python, Scala, PySpark, Spark
• Hive / Hive LLAP / HBase / Cosmos DB
• Azure Active Directory Domain Services
• Apache Ranger / Apache Ambari
• Azure Key Vault
• Expertise in HDInsight (Minimum 2-3 years’ experience with multiple implementations)
• Expertise in Cloud Native and Open Cloud Architecture
Big Data Technical Architect
Posted today
Job Description
Hello,
We are hiring for "Big Data Architect" for the Bangalore location.
Exp: 10+ years
Loc: Bangalore
Work Mode: Hybrid
Notice Period: Immediate joiners only (notice period served or currently serving)
NOTE: Apply only if you are an immediate joiner with 10+ years of relevant experience as per the JD.
Job Description:
We are seeking a technically proficient Big Data Technical Architect with 10+ years of experience and deep expertise in cloud platforms (AWS, Azure) and Business Intelligence (BI) tools such as Tableau, Power BI, and Qlik. The ideal candidate will have a solid understanding of modern data architecture patterns, data lake and lakehouse strategies, robust ETL/ELT design, and real-time data ingestion, along with hands-on experience with tools such as Snowflake, AWS Glue, Microsoft Fabric, Spark, and Python. This role requires a strategic thinker with excellent communication skills to interface with clients and internal stakeholders and lead enterprise-level data initiatives.
Roles & Responsibilities
Key Responsibilities:
- Architect and guide implementation of Data Lakes, Lakehouses, and Data Warehouses using tools such as Snowflake, Microsoft Fabric, and Delta Lake.
- Design and implement scalable, secure, and high-performing Big Data architectures across AWS and Azure.
- Develop robust ETL/ELT pipelines using modern data services like AWS Glue, Azure Data Factory, Spark, and custom scripts in Python/SQL/PLSQL (see the pipeline sketch after this list).
- Integrate structured and unstructured data sources using API integrations, event-driven pipelines, real-time data ingestion, and batch processing.
- Lead the BI and analytics layer strategy using tools such as Tableau, Power BI, and Qlik Sense for enterprise reporting and dashboarding.
- Design and implement data models (conceptual, logical, physical) that support both operational and analytical requirements.
- Establish and enforce data governance, data security, and data quality standards across platforms.
- Drive initiatives in data observability, monitoring data pipelines, identifying issues, and ensuring SLA adherence.
- Serve as a technical SME and advisor to both internal teams and clients, translating business needs into technical solutions.
- Lead architectural reviews and provide guidance on data best practices and cloud optimization.
- Develop and deliver technical presentations to executive and non-technical stakeholders.
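Illustrative only (not part of the posting): a minimal PySpark batch ETL sketch of the pipeline pattern named above (extract raw JSON, transform, load partitioned Parquet). The bucket paths and column names are hypothetical.

```python
# Hypothetical batch ETL: raw JSON -> cleaned, partitioned Parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw landing zone (hypothetical path).
raw = spark.read.json("s3://raw-zone/orders/")

# Transform: type the timestamp, drop malformed rows, derive a partition key.
clean = (raw
         .withColumn("order_ts", F.to_timestamp("order_ts"))
         .filter(F.col("order_id").isNotNull())
         .withColumn("order_date", F.to_date("order_ts")))

# Load: partitioned columnar store for downstream BI and analytics.
(clean.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://curated-zone/orders/"))
```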
Domain Experience (Good to Have):
- Exposure to BFSI domain, including understanding of risk management, regulatory compliance (Basel III, PCI DSS), fraud detection, and financial data workflows.
- Familiarity with Retail data challenges such as supply chain analytics, customer behavior tracking, inventory management, and omni-channel reporting.
- Experience with Pharma and Healthcare sectors, including clinical data management, regulatory compliance (HIPAA, GDPR), patient analytics, and drug safety data.
- Ability to adapt data architecture and BI solutions to domain-specific requirements across these industries, supporting both operational and strategic business goals.
Required Skills:
- Bachelor of Engineering (B.E./B.Tech) degree in Computer Science, Information Technology, Electronics, or a related field.
- Strong hands-on experience with cloud platforms and related data technologies:
  - AWS: S3, AWS Glue, Redshift, Lambda, Kinesis Data Streams & Firehose, Managed Kafka (MSK), EMR (Spark), Athena, IAM, KMS.
  - Azure: Data Lake Storage Gen2, Synapse Analytics, Data Factory, Event Hubs, Stream Analytics, Managed Kafka, Databricks, Azure Functions, Active Directory, Key Vault.
- Proven expertise in building and optimizing ETL/ELT pipelines using AWS Glue, Azure Data Factory, Apache Spark, and scripting languages like Python, SQL, and PL/SQL.
- Solid experience with data lake and lakehouse strategies, and hands-on experience with modern data warehouse platforms such as Snowflake and Microsoft Fabric.
- Skilled in real-time data ingestion and streaming technologies like Apache Kafka, AWS Kinesis, Azure Event Hubs, and Spark Streaming (see the streaming sketch after this list).
- Deep understanding of data modeling concepts (conceptual, logical, physical) and best practices for both OLTP and OLAP systems.
- Expertise in business intelligence tools such as Tableau, Power BI, and Qlik Sense for enterprise-grade dashboards and analytics.
- Strong grasp of data governance, data security (encryption, access control), data quality frameworks, and data observability tools like Monte Carlo, Datadog, or Great Expectations.
- Familiarity with relevant data privacy and regulatory compliance standards (GDPR, CCPA, HIPAA, PCI DSS).
- Excellent client-facing communication skills with the ability to explain complex technical concepts to non-technical stakeholders.
- Proven leadership and mentoring capabilities in guiding cross-functional teams.
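Illustrative only (not part of the posting): a minimal Spark Structured Streaming sketch of real-time ingestion from Kafka, assuming the spark-sql-kafka connector is on the classpath; the broker, topic, and sink paths are hypothetical.

```python
# Hypothetical streaming ingestion: Kafka topic -> Parquet sink.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("events-stream").getOrCreate()

events = (spark.readStream
          .format("kafka")  # requires the spark-sql-kafka connector package
          .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
          .option("subscribe", "events")                     # hypothetical topic
          .load()
          .selectExpr("CAST(value AS STRING) AS payload",
                      "timestamp AS ingest_ts"))

query = (events.writeStream
         .format("parquet")
         .option("path", "s3://stream-zone/events/")         # hypothetical sink
         .option("checkpointLocation", "s3://stream-zone/_checkpoints/events/")
         .trigger(processingTime="1 minute")
         .start())
query.awaitTermination()
```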
Big Data Solution Architect
Posted today
Job Description
Description
EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.
We are looking for Solution Architects for data-driven projects to join our Data Practice team in India. Together we design and drive a wide range of solutions that generate value from data, taking advantage of scalable platforms, cutting-edge technologies, and machine learning algorithms. We provide a solid architecture framework, educational programs, and a strong SA community to support our new Architects in a deep dive into the data domain.
Responsibilities
Requirements
We offer
Data Architect
Posted 1 day ago
Job Description
We are looking for experienced Data Architects who can join us within 30 days.
If you have strong expertise in data architecture, cloud platforms, and big data technologies, this is your chance to work on exciting enterprise-level projects.
Key Skills & Experience:
- System Architect – SAFe Train, Data Engineering & BI (Power BI)
- Data & Reporting Platform (Power BI, DAX, SQL – AHDRP)
- Cloud Technologies – Microsoft Azure preferred (other clouds considered)
- Data Management – Hadoop, Spark, Kafka
- Databases – Azure SQL, Cosmos DB, Couchbase
- CI/CD & Data Pipeline development
- Programming – Java, Scala, Skube, Python
- DevOps – AUDs, KeyVault, ArgoCD
- Reporting & Analytics – Power Query, MDX, Power BI, DAX
- Automation – Robot Framework, Selenium, Octane
What You’ll Do:
- Architect and implement scalable, secure Azure-based data solutions
- Lead process improvements, automation, and infrastructure optimization
- Collaborate with stakeholders across business, analytics, and IT teams
- Design and build robust ETL/ELT pipelines for diverse data sources
- Ensure data governance, security compliance, and performance optimization
- Translate business requirements into efficient technical solutions
- Mentor and guide cross-functional teams
Ideal Candidate Profile:
- 10+ years in Data Architecture or similar roles
- Deep expertise in Azure and big data ecosystems
- Proven track record in data modeling, pipeline automation & governance
- Strong collaboration & communication skills
- Full working proficiency in English
- Bonus – ART leadership experience
Education – Degree in Computer Science or equivalent experience; Azure Architect certifications are a plus
Data Architect
Posted 1 day ago
Job Description
Data Architect
Job Summary:
We are seeking a highly skilled Senior Data Architect with extensive experience in data modeling, Azure Databricks, Azure OpenAI, and associated services. The ideal candidate will be responsible for designing and implementing a scalable, secure, and reliable global data store that supports both batch and near-real-time processing using a medallion architecture. This role requires a deep understanding of data governance, particularly through Unity Catalog and DLT, and proficiency in enabling data access via API and Power BI.
Key Responsibilities:
- Design and implement a robust data architecture that supports scalable data storage and processing.
- Develop and maintain data models that facilitate efficient data retrieval and analysis.
- Utilize Azure Databricks and Azure OpenAI to enhance data processing capabilities and drive innovation.
- Implement a medallion architecture to optimize data flow and ensure high-quality data is available for analytics (see the sketch after this section).
- Ensure data governance and compliance through the effective use of Unity Catalog.
- Collaborate with cross-functional teams to define data access requirements and ensure seamless integration with APIs and Power BI.
- Monitor and optimize data performance, ensuring reliability and security across the data ecosystem.
- Stay current with industry trends and emerging technologies to continuously improve data architecture and practices.
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Architect or similar role, with a strong portfolio of successful data architecture projects.
- Expertise in data modeling techniques and best practices.
- Proficiency in Azure Databricks, Azure OpenAI, and associated Azure services.
- Strong understanding of medallion architecture and its application in data processing.
- Experience with data governance frameworks, particularly Unity Catalog.
- Familiarity with API development and Power BI for data visualization.
- Excellent problem-solving skills and the ability to work collaboratively in a team environment.
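Illustrative only (not part of the posting): a minimal Delta Live Tables sketch of the bronze/silver/gold medallion flow described above. It assumes a Databricks DLT pipeline where `spark` and `dlt` are available; the table names, landing path, and expectation rule are hypothetical.

```python
# Hypothetical medallion flow as a Delta Live Tables pipeline (Databricks).
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Bronze: raw events landed as-is via Auto Loader")
def bronze_events():
    return (spark.readStream.format("cloudFiles")  # `spark` is provided by the DLT runtime
            .option("cloudFiles.format", "json")
            .load("/mnt/landing/events"))          # hypothetical landing path

@dlt.table(comment="Silver: typed, de-duplicated events")
@dlt.expect_or_drop("valid_id", "event_id IS NOT NULL")  # simple DQ expectation
def silver_events():
    return (dlt.read_stream("bronze_events")
            .withColumn("event_ts", F.to_timestamp("event_ts"))
            .dropDuplicates(["event_id"]))

@dlt.table(comment="Gold: daily aggregates served to BI")
def gold_daily_counts():
    return (dlt.read("silver_events")
            .groupBy(F.to_date("event_ts").alias("event_date"))
            .count())
```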
Data Architect
Posted 1 day ago
Job Description
About the Company
Re:Sources is the backbone of Publicis Groupe, the world's third-largest communications group. Formed in 1998 as a small team to service a few Publicis Groupe firms, Re:Sources has grown to 5,000+ people servicing a global network of prestigious advertising, public relations, media, healthcare and digital marketing agencies. We provide technology solutions and business services including finance, accounting, legal, benefits, procurement, tax, real estate, treasury and risk management to help Publicis Groupe agencies do what they do best: create and innovate for their clients.
In addition to providing essential, everyday services to our agencies, Re:Sources develops and implements platforms, applications and tools to enhance productivity, encourage collaboration and enable professional and personal development. We continually transform to keep pace with our ever-changing communications industry and thrive on a spirit of innovation felt around the globe. With our support, Publicis Groupe agencies continue to create and deliver award-winning campaigns for their clients.
Job Location: Gurgaon, Bengaluru, Pune
Responsibilities
Must have skills:
- Strong written and verbal communication skills
- Strong experience in implementing graph database technologies (property graph)
- Strong experience in leading data modelling activities for a production graph database solution
- Strong experience in Cypher (or TinkerPop Gremlin), with an understanding of query tuning (see the Cypher sketch at the end of this posting)
- Strong experience working with data integration technologies, specifically Azure services: ADF, ETL tools, JSON, Hop, or other ETL orchestration tools
- Strong experience using PySpark, Scala, and Databricks
- 10+ years’ experience in the design and implementation of complex distributed systems architectures
- Strong experience with Master Data Management solutions
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra
- Experience with stream-processing systems: Storm, Spark Streaming, etc.
- Strong knowledge of Azure-based services
- Strong understanding of RDBMS data structures, Azure Tables, Blob storage, and other data sources
- Experience with GraphQL
- Experience with high-availability and disaster-recovery solutions
- Experience with test-driven development
- Understanding of Jenkins and CI/CD processes using ADF and Databricks
- Strong analytical skills for working with unstructured datasets
- Strong analytical skills for triage and troubleshooting
- Results-oriented and able to work across the organization as an individual contributor
Good to have skills:
- Knowledge of graph data science, such as graph embeddings
- Knowledge of Neo4j HA architecture for critical applications (clustering, multiple data centers, etc.)
- Experience working with Event Hubs and streaming data
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with Redis
- Understanding of ML models and experience building ML pipelines with MLflow and Airflow
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
- Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management
- A successful history of manipulating, processing, and extracting value from large disconnected datasets
- Working knowledge of message queuing, stream processing, and highly scalable Azure-based data stores
- Strong project management and organizational skills
- Experience supporting and working with cross-functional teams in a dynamic environment
Qualifications
- Bachelor's degree in engineering, computer science, information systems, or a related field from an accredited college or university; Master's degree from an accredited college or university is preferred. Or equivalent work experience.
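Illustrative only (not part of the posting): a minimal sketch of running a property-graph Cypher query from Python via the official Neo4j driver; the endpoint, credentials, labels, relationship type, and property names are hypothetical.

```python
# Hypothetical property-graph query through the Neo4j Python driver (v5+).
from neo4j import GraphDatabase

URI = "bolt://localhost:7687"   # hypothetical endpoint
AUTH = ("neo4j", "password")    # hypothetical credentials

FIND_OWNERS = """
MATCH (p:Person)-[:OWNS]->(a:Account)
WHERE a.balance > $min_balance
RETURN p.name AS owner, a.id AS account
LIMIT 25
"""

with GraphDatabase.driver(URI, auth=AUTH) as driver:
    records, summary, keys = driver.execute_query(FIND_OWNERS, min_balance=10_000)
    for record in records:
        print(record["owner"], record["account"])
```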
Data Architect
Posted 1 day ago
Job Description
ONLY IMMEDIATE JOINERS (0-10 days)
Data Architect / Data Modeler
Position Overview
Experienced Data Architect / Data Modeler with strong expertise in Financial Services and, preferably, exposure to Private Equity. The role involves designing enterprise-level data models, defining data architecture standards, and ensuring the scalability, performance, and governance of data across the organization. The ideal candidate will work closely with business stakeholders, technology teams, and data analysts to enable effective decision-making through robust data architecture.
Key Responsibilities
- Design and implement enterprise data architecture to support business intelligence, analytics, and operational needs.
- Develop conceptual, logical, and physical data models that align with business requirements.
- Collaborate with business and technical stakeholders to define data standards, taxonomies, and metadata frameworks.
- Ensure data integrity, quality, and security across systems.
- Partner with Data Engineers and Analysts to optimize ETL/ELT pipelines and data integration workflows.
- Translate financial services and private equity business requirements into scalable data solutions.
- Evaluate and recommend data management tools, platforms, and technologies (data warehouses, lakes, cloud solutions).
- Establish and enforce best practices in data modeling, governance, and compliance.
Candidate Profile
- 12+ years of experience in data architecture and data modeling.
- Strong expertise in Financial Services domain (banking, investment management, capital markets).
- Experience in Private Equity is highly desirable.
- Proficiency in data modeling tools (PowerDesigner, or similar).
- Strong SQL and database design skills across relational (SQL Server, PostgreSQL) and NoSQL platforms.
- Hands-on experience with cloud platforms (Azure) and modern data warehouses (Snowflake, BigQuery, Azure Synapse).
- Knowledge of ETL/ELT pipelines, data governance, and master data management (MDM).
- Familiarity with reporting and analytics tools (Tableau, Power BI, Looker).
Education
- Bachelor’s degree in computer science, IT, or a related discipline.
Please send your CV at with CTC and notice period details. We are looking for immediate joiners only.
Data Architect
Posted 1 day ago
Job Description
Data Architect | Full-Time | Trivandrum/Bangalore/Chennai/Kochi
Job Title: Data Architect
Job Type: Full-Time
Location: Trivandrum/Bangalore/Chennai/Kochi
Exp: 10+ years
Job Description:
• Design and implement scalable, secure, and cost-effective data architectures using GCP.
• Lead the design and development of data pipelines with BigQuery, Dataflow, and Cloud Storage.
• Architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP.
• Ensure data architecture aligns with business goals, governance, and compliance requirements.
• Collaborate with stakeholders to define data strategy and roadmap.
• Design and deploy BigQuery solutions for optimized performance and cost efficiency (see the sketch after this list).
• Build and maintain ETL/ELT pipelines for large-scale data processing.
• Leverage Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration.
• Implement best practices for data security, privacy, and compliance in cloud environments.
• Integrate machine learning workflows with data pipelines and analytics tools.
• Define data governance frameworks and manage data lineage.
• Lead data modeling efforts to ensure consistency, accuracy, and performance across systems.
• Optimize cloud infrastructure for scalability, performance, and reliability.
• Mentor junior team members and ensure adherence to architectural standards.
• Collaborate with DevOps teams to implement Infrastructure as Code (Terraform, Cloud Deployment Manager).
• Ensure high availability and disaster recovery solutions are built into data systems.
• Conduct technical reviews, audits, and performance tuning for data solutions.
• Design solutions for multi-region and multi-cloud data architectures.
• Stay updated on emerging technologies and trends in data engineering and GCP.
• Drive innovation in data architecture, recommending new tools and services on GCP.
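Illustrative only (not part of the posting): a minimal parameterized BigQuery query from Python using the google-cloud-bigquery client, with a date-bounded scan that keeps costs down on a partitioned table; the project, dataset, table, and columns are hypothetical.

```python
# Hypothetical parameterized BigQuery query (google-cloud-bigquery client).
from google.cloud import bigquery

client = bigquery.Client()  # assumes GCP credentials are configured

QUERY = """
SELECT order_date, SUM(amount) AS revenue
FROM `my-project.sales.orders`              -- hypothetical table
WHERE order_date BETWEEN @start AND @end    -- bounded scan limits cost
GROUP BY order_date
ORDER BY order_date
"""
job_config = bigquery.QueryJobConfig(query_parameters=[
    bigquery.ScalarQueryParameter("start", "DATE", "2024-01-01"),
    bigquery.ScalarQueryParameter("end", "DATE", "2024-01-31"),
])

for row in client.query(QUERY, job_config=job_config).result():
    print(row.order_date, row.revenue)
```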
Certifications:
• Google Cloud Certification is preferred.
Primary Skills:
• 7+ years of experience in data architecture, with at least 3 years in GCP environments.
• Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services.
• Strong experience in data warehousing, data lakes, and real-time data pipelines.
• Proficiency in SQL, Python, or other data processing languages.
• Experience with cloud security, data governance, and compliance frameworks.
• Strong problem-solving skills and the ability to architect solutions for complex data environments.
• Google Cloud Certification (Professional Data Engineer, Professional Cloud Architect) preferred.
• Leadership experience and the ability to mentor technical teams.
• Excellent communication and collaboration skills.