Azure Big Data Architect

Bengaluru, Karnataka Tiger Analytics

Posted today

Job Description

Tiger Analytics is a global analytics consulting firm. With data and technology at the core of our solutions, we are solving some of the toughest problems out there. Our culture is modeled around expertise and mutual respect with a team-first mindset.
  • Working at Tiger, you’ll be at the heart of this AI revolution. You’ll work with teams that push the boundaries of what is possible and build solutions that energize and inspire. We are headquartered in Silicon Valley and have delivery centers across the globe.
  • Azure Big Data Architect. About the role: We are looking for Data Architects (Sr. Data Architect and Pr. Data Architect levels) to be based out of our Chennai office or remote. This role involves a combination of hands-on contribution, customer engagement, and technical team management. As a Data Architect, you will:
  • Design, architect, deploy, and maintain solutions on MS Azure using different Cloud & Big Data technologies.
  • Manage the full life cycle of Data Lake / Big Data solutions, from requirement gathering and analysis to platform selection, architecture design, and deployment.
  • Be responsible for implementing solutions which can scale on Cloud.
  • Collaborate with a team of business domain experts, data scientists, and application developers to develop Big Data solutions.
  • Explore and learn new technologies for creative business problem solving and mentor a team of Data Engineers.
  • Required Experience, Skills & Competencies:
  • Strong hands-on experience in implementing Data Lakes with technologies such as Data Factory (ADF), ADLS, Databricks, Azure Synapse Analytics, Event Hubs & Stream Analytics, Cosmos DB, and Purview.
  • Experience using big data technologies such as Hadoop (CDH or HDP), Spark, Airflow, NiFi, Kafka, Hive, HBase, MongoDB, Neo4j, Elasticsearch, Impala, and Sqoop.
  • Strong programming and debugging skills in Python or Scala/Java. Experience building REST services is good to have.
  • Experience supporting BI and Data Science teams in consuming data in a secure and governed manner.
  • Good understanding of and experience with CI/CD using Git and Jenkins / Azure DevOps.
  • Experience setting up cloud-computing infrastructure solutions.
  • Hands-on experience with, or exposure to, NoSQL databases and data modelling in Hive.
  • 9+ years of technical experience, with at least 2 years on MS Azure and 2 years on Hadoop (CDH/HDP).
  • B.Tech/B.E. from a reputed institute preferred.
  • Designation will be commensurate with expertise/experience. Compensation packages are among the best in the industry.
    This advertiser has chosen not to accept applicants from your region.

    Cloud and Big data Architect

    Bengaluru, Karnataka · ₹1,500,000 - ₹2,500,000 · Societe Generale Global Solution Centre

    Posted today

    Job Description

    Role: Cloud and Big data Architect

    Role Description

    We are looking for skilled and passionate Azure Cloud experts with a strong focus on Big Data management and data quality skills to join our team at Société Générale Global Solution Centre in Bengaluru.

    This is a full-time hybrid role for Digital & Data Solutions (DDS), offering flexibility for remote work.

    Required Skills & Experience:

    • Hands-on experience in Azure for a minimum of 3-4 years, with overall professional experience of less than 10 years.
    • Proficient in Big Data technologies (e.g., Hadoop, Spark, Kafka) and their integration with cloud environments.
    • Strong understanding of data management principles and best practices, including data governance and data lifecycle management.
    • Knowledge of Data quality and Governance frameworks and tools to ensure accuracy, completeness, and reliability of data.
    • Scripting knowledge in PowerShell, Python, and Shell scripting.
    • Good knowledge of CI/CD integration tools such as Jenkins, Argo CD, and GitHub Actions.
    • Azure certification (AZ-104 / AZ-305 / AZ-204) and DevOps experience are a plus.
    • Experience working with SQL and NoSQL databases.
    • Finance and banking experience will be highly appreciated.
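The data quality skills called for above (accuracy, completeness, reliability) can be illustrated with a minimal completeness check. This is a plain-Python sketch; the field names, sample records, and thresholds are hypothetical, not part of the role description:

```python
# Minimal data-quality sketch: completeness checks against per-field rules.
# Field names and thresholds below are illustrative only.

def completeness(records, field):
    """Fraction of records where `field` is present and non-empty."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def run_checks(records, rules):
    """Evaluate {field: min_completeness} rules; return the failing fields."""
    return [f for f, threshold in rules.items()
            if completeness(records, f) < threshold]

trades = [
    {"id": 1, "amount": 100.0, "currency": "EUR"},
    {"id": 2, "amount": 250.5, "currency": ""},
    {"id": 3, "amount": None, "currency": "USD"},
]

failures = run_checks(trades, {"id": 1.0, "amount": 1.0, "currency": 1.0})
print(failures)  # amount and currency fall below the 100% threshold
```

In a real Azure setup these rules would typically live in a governance tool rather than ad-hoc code, but the check itself is the same shape.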

    Roles & Responsibilities:

    • Strong understanding of Big Data architecture, cloud adoption strategies, and solution design.
    • Experience with hybrid cloud environments and migrating on-premises applications and databases to the cloud.
    • Ensure data governance and applicative implementation standards on the data lake store. Review and capture non-compliance with data standards, propose remediation, and follow up with upstream and downstream stakeholders.
    • Stay updated with the latest trends in Azure cloud technologies, Data management and governance, and industry best practices to inform and improve solutions.
    • Excellent communication and collaboration skills, with a strong ability to work in a team-oriented environment.
    • Analyze existing operational standards, processes, and governance to identify and implement improvements.
    • Conduct POCs in Data and Cloud Platform to ensure that suggested solutions and technologies meet the requirements.
    • Stay updated with the latest trends in cloud technologies, Data management services, and industry best practices to inform and improve solutions.
    • Strong analytical and problem-solving skills, with the ability to troubleshoot complex issues in a fast-paced environment.

    Why Join Us:

    We are committed to creating a diverse environment and are proud to be an equal opportunity employer. All qualified applicants receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status


    Azure Big Data Architect engineer

    Bengaluru, Karnataka Anicalls (Pty) Ltd

    Posted today

    Job Description

    • Azure Data Factory
    • Azure Databricks
    • Python, Scala, PySpark, Spark
    • Hive / Hive LLAP / HBase / Cosmos DB
    • Azure Active Directory Domain Services
    • Apache Ranger / Apache Ambari
    • Azure Key Vault
    • Expertise in HDInsight (Minimum 2-3 years’ experience with multiple implementations)
    • Expertise in Cloud Native and Open Cloud Architecture

    Big Data Technical Architect

    Bengaluru, Karnataka · ₹1,200,000 - ₹3,600,000 · Srs Business Solutions India

    Posted today

    Job Description

    Hello,

    We are hiring for "Big Data Architect" for the Bangalore location.

    Exp: 10+ years

    Loc: Bangalore

    Work Mode: Hybrid

    Notice Period: Immediate joiners only (notice period served or currently serving).

    Apply only if you have 10+ years of relevant experience as per the JD.

    Job Description:

    We are seeking a technically proficient Big Data Technical Architect with 10+ years of experience and deep expertise in cloud platforms (AWS, Azure) and Business Intelligence (BI) tools such as Tableau, Power BI, and Qlik. The ideal candidate will have a solid understanding of modern data architecture patterns, data lake and lakehouse strategies, robust ETL/ELT design, real-time data ingestion, and hands-on experience with tools such as Snowflake, AWS Glue, Microsoft Fabric, Spark, and Python. This role requires a strategic thinker with excellent communication skills to interface with clients and internal stakeholders and lead enterprise-level data initiatives.

    Roles & Responsibilities

    Key Responsibilities:

    • Architect and guide implementation of Data Lakes, Lakehouses, and Data Warehouses using tools such as Snowflake, Microsoft Fabric, and Delta Lake.
    • Design and implement scalable, secure, and high-performing Big Data architectures across AWS and Azure.
    • Develop robust ETL/ELT pipelines using modern data services like AWS Glue, Azure Data Factory, Spark, and custom scripts in Python/SQL/PLSQL.
    • Integrate structured and unstructured data sources using API integrations, event-driven pipelines, real-time data ingestion, and batch processing.
    • Lead the BI and analytics layer strategy using tools such as Tableau, Power BI, and Qlik Sense for enterprise reporting and dashboarding.
    • Design and implement data models (conceptual, logical, physical) that support both operational and analytical requirements.
    • Establish and enforce data governance, data security, and data quality standards across platforms.
    • Drive initiatives in data observability, monitoring data pipelines, identifying issues, and ensuring SLA adherence.
    • Serve as a technical SME and advisor to both internal teams and clients, translating business needs into technical solutions.
    • Lead architectural reviews and provide guidance on data best practices and cloud optimization.
    • Develop and deliver technical presentations to executive and non-technical stakeholders.
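The data-observability and SLA-adherence responsibility above can be sketched as a freshness check: compare each pipeline's last successful run against its SLA. The pipeline names and the 2-hour SLA below are hypothetical examples, not from this posting:

```python
# Illustrative data-observability sketch: a pipeline-freshness SLA check.
from datetime import datetime, timedelta

def breached_slas(last_success, sla_hours, now):
    """Return pipelines whose last successful run is older than their SLA."""
    return [name for name, ts in last_success.items()
            if now - ts > timedelta(hours=sla_hours[name])]

now = datetime(2024, 1, 1, 12, 0)
last_success = {
    "orders_ingest":  datetime(2024, 1, 1, 11, 30),  # 30 min ago: within SLA
    "customer_merge": datetime(2024, 1, 1, 8, 0),    # 4 h ago: breach
}
sla_hours = {"orders_ingest": 2, "customer_merge": 2}

print(breached_slas(last_success, sla_hours, now))  # ['customer_merge']
```

Tools like Monte Carlo or Great Expectations (named later in this posting) automate this class of check at scale.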

    Domain Experience (Good to Have):

    • Exposure to BFSI domain, including understanding of risk management, regulatory compliance (Basel III, PCI DSS), fraud detection, and financial data workflows.
    • Familiarity with Retail data challenges such as supply chain analytics, customer behavior tracking, inventory management, and omni-channel reporting.
    • Experience with Pharma and Healthcare sectors, including clinical data management, regulatory compliance (HIPAA, GDPR), patient analytics, and drug safety data.
    • Ability to adapt data architecture and BI solutions to domain-specific requirements across these industries, supporting both operational and strategic business goals.

    Required Skills:

    • Bachelor of Engineering (B.E./B.Tech) degree in Computer Science, Information Technology, Electronics, or a related field.
    • Strong hands-on experience with cloud platforms and related data technologies:
      o AWS: S3, AWS Glue, Redshift, Lambda, Kinesis Data Streams & Firehose, Managed Kafka (MSK), EMR (Spark), Athena, IAM, KMS.
      o Azure: Data Lake Storage Gen2, Synapse Analytics, Data Factory, Event Hubs, Stream Analytics, Managed Kafka, Databricks, Azure Functions, Active Directory, Key Vault.
    • Proven expertise in building and optimizing ETL/ELT pipelines using AWS Glue, Azure Data Factory, Apache Spark, and scripting languages like Python, SQL, and PL/SQL.
    • Solid experience with data lake and lakehouse strategies, and hands-on experience with modern data warehouse platforms such as Snowflake and Microsoft Fabric.
    • Skilled in real-time data ingestion and streaming technologies like Apache Kafka, AWS Kinesis, Azure Event Hubs, and Spark Streaming.
    • Deep understanding of data modeling concepts (conceptual, logical, physical) and best practices for both OLTP and OLAP systems.
    • Expertise in business intelligence tools such as Tableau, Power BI, and Qlik Sense for enterprise-grade dashboards and analytics.
    • Strong grasp of data governance, data security (encryption, access control), data quality frameworks, and data observability tools like Monte Carlo, DataDog, or Great Expectations.
    • Familiarity with relevant data privacy and regulatory compliance standards (GDPR, CCPA, HIPAA, PCI DSS).
    • Excellent client-facing communication skills with the ability to explain complex technical concepts to non-technical stakeholders.
    • Proven leadership and mentoring capabilities in guiding cross-functional teams.

    Qualifications:

    • Bachelor of Engineering (B.E./B.Tech) degree in Computer Science, Information Technology, Electronics, or a related field.

    Big Data Solution Architect

    Bengaluru, Karnataka Epam

    Posted today

    Job Description

    Description

    EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.

    We are looking for Solution Architects for data-driven projects to join our Data Practice team in India. Together we design and drive lots of solutions that generate value from data, taking advantage of scalable platforms, cutting-edge technologies, and machine learning algorithms. We provide a solid architecture framework, educational programs, and a strong SA community to support our new Architects in a deep dive into the data domain.

    #LI-DNI #REF-IN-WOMEN

    Responsibilities

  • Design data analytics solutions by utilising the big data technology stack
  • Create and present solution architecture documents with deep technical details
  • Work closely with business in identifying solution requirements and key case-studies/scenarios for future solutions
  • Conduct solution architecture review/audit, calculate and present ROI
  • Lead implementation of the solutions from establishing project requirements and goals to solution "go-live"
  • Participate in the full cycle of pre-sale activities: direct communications with customers, RFP processing, the development of proposals for implementation and design of the solution, presentation for proposed solution architecture to the customer and participate in technical meetings with customer representatives
  • Create and follow personal education plan in the technology stack and solution architecture
  • Maintain a strong understanding of industry trends and best practices
  • Get involved in engaging new clients to further drive EPAM business in the big data space

    Requirements

  • Minimum of 12+ years of experience
  • Strong hands-on experience as a Big Data Architect with a solid design/development background with Java, Scala, or Python
  • Experience delivering data analytics projects and architecture guidelines
  • Experience in big data solutions on premises and on the cloud (Amazon Web Services, Microsoft Azure, Google Cloud)
  • Production project experience in at least one of the big data technologies
  • Batch processing: Hadoop and MapReduce/Spark/Hive
  • NoSQL databases: Cassandra/HBase/Accumulo/Kudu
  • Knowledge of Agile development methodology, Scrum in particular
  • Experience in direct customer communications and pre-selling business-consulting engagements to clients within large enterprise environments
  • Experience working within a consulting business and pre-sales experience would be highly attractive

    We offer

  • Opportunity to work on technical challenges that may impact across geographies
  • Vast opportunities for self-development: online university, knowledge sharing opportunities globally, learning opportunities through external certifications
  • Opportunity to share your ideas on international platforms
  • Sponsored Tech Talks & Hackathons
  • Unlimited access to LinkedIn learning solutions
  • Possibility to relocate to any EPAM office for short and long-term projects
  • Focused individual development
  • Benefit package: health benefits, retirement benefits, paid time off, flexible benefits
  • Forums to explore beyond work passion (CSR, photography, painting, sports, etc.)

    Data Architect

    Bengaluru, Karnataka ScaleneWorks People Solutions LLP

    Posted 1 day ago

    Job Viewed

    Tap Again To Close

    Job Description

    We are looking for experienced Data Architects who can join us within 30 days.

    If you have strong expertise in data architecture, cloud platforms, and big data technologies, this is your chance to work on exciting enterprise-level projects.

    Key Skills & Experience:

    • System Architect – SAFe Train, Data Engineering & BI (Power BI)
    • Data & Reporting Platform (Power BI, DAX, SQL – AHDRP)
    • Cloud Technologies – Microsoft Azure preferred (other clouds considered)
    • Data Management – Hadoop, Spark, Kafka
    • Databases – Azure SQL, Cosmos DB, Couchbase
    • CI/CD & Data Pipeline development
    • Programming – Java, Scala, Skube, Python
    • DevOps – AUDs, KeyVault, ArgoCD
    • Reporting & Analytics – Power Query, MDX, Power BI, DAX
    • Automation – Robot Framework, Selenium, Octane

    What You’ll Do:

    • Architect and implement scalable, secure Azure-based data solutions
    • Lead process improvements, automation, and infrastructure optimization
    • Collaborate with stakeholders across business, analytics, and IT teams
    • Design and build robust ETL/ELT pipelines for diverse data sources
    • Ensure data governance, security compliance, and performance optimization
    • Translate business requirements into efficient technical solutions
    • Mentor and guide cross-functional teams

    Ideal Candidate Profile:

    • 10+ years in Data Architecture or similar roles
    • Deep expertise in Azure and big data ecosystems
    • Proven track record in data modeling, pipeline automation & governance
    • Strong collaboration & communication skills
    • Full working proficiency in English
    • Bonus – ART leadership experience

    Education – Degree in Computer Science or equivalent experience; Azure Architect certifications are a plus


    Data Architect

    Bengaluru, Karnataka Tech Mahindra

    Posted 1 day ago

    Job Description

    Data Architect

    Job Summary: We are seeking a highly skilled Senior Data Architect with extensive experience in data modeling, Azure Databricks, Azure OpenAI, and associated services. The ideal candidate will be responsible for designing and implementing a scalable, secure, and reliable global data store that supports both batch and near-real-time processing using a medallion architecture. This role requires a deep understanding of data governance, particularly through Unity Catalog and DLT, and proficiency in enabling data access via APIs and Power BI.

    Key Responsibilities:
    • Design and implement a robust data architecture that supports scalable data storage and processing.
    • Develop and maintain data models that facilitate efficient data retrieval and analysis.
    • Utilize Azure Databricks and Azure OpenAI to enhance data processing capabilities and drive innovation.
    • Implement a medallion architecture to optimize data flow and ensure high-quality data is available for analytics.
    • Ensure data governance and compliance through the effective use of Unity Catalog.
    • Collaborate with cross-functional teams to define data access requirements and ensure seamless integration with APIs and Power BI.
    • Monitor and optimize data performance, ensuring reliability and security across the data ecosystem.
    • Stay current with industry trends and emerging technologies to continuously improve data architecture and practices.
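The medallion architecture named in this posting layers data as bronze (raw), silver (cleansed), and gold (business-level aggregates). A minimal plain-Python sketch of the idea follows; in practice this would run on Azure Databricks over Delta tables, and the record layout and cleansing rules here are purely illustrative:

```python
# Medallion-architecture sketch (bronze -> silver -> gold) on toy data.

bronze = [  # raw ingested events: duplicates and bad records included
    {"order_id": "A1", "amount": "100.0"},
    {"order_id": "A1", "amount": "100.0"},   # duplicate
    {"order_id": "A2", "amount": None},      # bad record
    {"order_id": "A3", "amount": "50.5"},
]

def to_silver(rows):
    """Silver layer: deduplicate on order_id, drop nulls, cast types."""
    seen, out = set(), []
    for r in rows:
        if r["amount"] is None or r["order_id"] in seen:
            continue
        seen.add(r["order_id"])
        out.append({"order_id": r["order_id"], "amount": float(r["amount"])})
    return out

def to_gold(rows):
    """Gold layer: aggregate to a business-level metric."""
    return {"order_count": len(rows),
            "total_amount": sum(r["amount"] for r in rows)}

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'order_count': 2, 'total_amount': 150.5}
```

Each hop improves quality while keeping the raw layer replayable, which is the property the architecture is valued for.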

    Qualifications:
    • Bachelor’s degree in Computer Science, Information Technology, or a related field.
    • Proven experience as a Data Architect or similar role, with a strong portfolio of successful data architecture projects.
    • Expertise in data modeling techniques and best practices.
    • Proficiency in Azure Databricks, Azure OpenAI, and associated Azure services.
    • Strong understanding of medallion architecture and its application in data processing.
    • Experience with data governance frameworks, particularly Unity Catalog.
    • Familiarity with API development and Power BI for data visualization.
    • Excellent problem-solving skills and the ability to work collaboratively in a team environment.


    Data Architect

    Bengaluru, Karnataka Publicis Re:Sources

    Posted 1 day ago

    Job Description

    About the Company


    Re:Sources is the backbone of Publicis Groupe, the world's third-largest communications group. Formed in 1998 as a small team to service a few Publicis Groupe firms, Re:Sources has grown to 5,000+ people servicing a global network of prestigious advertising, public relations, media, healthcare and digital marketing agencies. We provide technology solutions and business services including finance, accounting, legal, benefits, procurement, tax, real estate, treasury and risk management to help Publicis Groupe agencies do what they do best: create and innovate for their clients.

    In addition to providing essential, everyday services to our agencies, Re:Sources develops and implements platforms, applications and tools to enhance productivity, encourage collaboration and enable professional and personal development. We continually transform to keep pace with our ever-changing communications industry and thrive on a spirit of innovation felt around the globe. With our support, Publicis Groupe agencies continue to create and deliver award-winning campaigns for their clients.


    Job Location: Gurgaon, Bengaluru, Pune


    Responsibilities


    Must have skills:


    • Strong written and verbal communication skills
    • Strong experience in implementing Graph database technologies (property graph)
    • Strong experience in leading data modelling activities for a production graph database solution
    • Strong experience in Cypher (or Tinkerpop Gremlin) with understanding of tuning
    • Strong experience working with data integration technologies, specifically Azure Services, ADF, ETLs, JSON, Hop or ETL orchestration tools.
    • Strong experience using PySpark, Scala, and Databricks
    • 10+ years’ experience in design and implementation of complex distributed systems architectures
    • Strong experience with Master Data Management solutions
    • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
    • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
    • Strong knowledge of Azure-based services
    • Strong understanding of RDBMS data structure, Azure Tables, Blob, and other data sources
    • Experience with GraphQL
    • Experience in high availability and disaster recovery solutions
    • Experience with test driven development
    • Understanding of Jenkins and CI/CD processes using ADF and Databricks.
    • Strong analytical skills related to working with unstructured datasets.
    • Strong analytical skills necessary to triage and troubleshoot
    • Results-oriented and able to work across the organization as an individual contributor
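The property-graph model this role centers on (as in Neo4j) attaches properties to both nodes and relationships, and query languages like Cypher pattern-match over them. A plain-Python stand-in can make the model concrete; the labels, names, and `match` helper below are illustrative inventions, not Neo4j API:

```python
# Property-graph sketch: nodes and edges both carry properties.

nodes = {
    "p1": {"label": "Person", "name": "Asha"},
    "p2": {"label": "Person", "name": "Ravi"},
    "c1": {"label": "Company", "name": "Acme"},
}
edges = [  # (source, relationship type, target, relationship properties)
    ("p1", "WORKS_AT", "c1", {"since": 2019}),
    ("p2", "WORKS_AT", "c1", {"since": 2021}),
]

def match(rel_type, target_label):
    """Roughly Cypher's MATCH (n)-[:REL]->(m:Label) RETURN n.name."""
    return sorted(
        nodes[src]["name"]
        for src, rel, dst, _props in edges
        if rel == rel_type and nodes[dst]["label"] == target_label
    )

print(match("WORKS_AT", "Company"))  # ['Asha', 'Ravi']
```

A real graph database adds indexes, traversal planning, and tuning (the Cypher/Gremlin tuning experience the posting asks for), but the data model is exactly this.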


    Good to have skills:


    • Knowledge in graph data science, such as graph embedding
    • Knowledge in Neo4J HA Architecture for Critical Applications (Clustering, Multiple Data Centers, etc.)
    • Experience in working with EventHub, streaming data.
    • Experience with big data tools: Hadoop, Spark, Kafka, etc.
    • Experience with Redis
    • Understanding of ML models and experience in building ML pipelines with MLflow and Airflow.
    • Bachelor's degree in engineering, computer science, information systems, or a related field from an accredited college or university; Master's degree from an accredited college or university is preferred. Or equivalent work experience.
    • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
    • Build processes supporting data transformation, data structures, metadata, dependency and workload management.
    • A successful history of manipulating, processing and extracting value from large disconnected datasets.
    • Working knowledge of message queuing, stream processing, and highly scalable Azure based data stores.
    • Strong project management and organizational skills.
    • Experience supporting and working with cross-functional teams in a dynamic environment.


    Qualifications


    • Bachelor's degree in engineering, computer science, information systems, or a related field from an accredited college or university; Master's degree from an accredited college or university is preferred. Or equivalent work experience.



    Data Architect

    Bengaluru, Karnataka ThoughtFocus

    Posted 1 day ago

    Job Description

    ONLY IMMEDIATE JOINERS (0-10 days)


    Data Architect / Data Modeler

    Position Overview

    Experienced Data Architect / Data Modeler with strong expertise in Financial Services and, preferably, exposure to Private Equity. The role involves designing enterprise-level data models, defining data architecture standards, and ensuring the scalability, performance, and governance of data across the organization. The ideal candidate will work closely with business stakeholders, technology teams, and data analysts to enable effective decision-making through robust data architecture.


    Key Responsibilities

    • Design and implement enterprise data architecture to support business intelligence, analytics, and operational needs.
    • Develop conceptual, logical, and physical data models that align with business requirements.
    • Collaborate with business and technical stakeholders to define data standards, taxonomies, and metadata frameworks.
    • Ensure data integrity, quality, and security across systems.
    • Partner with Data Engineers and Analysts to optimize ETL/ELT pipelines and data integration workflows.
    • Translate financial services and private equity business requirements into scalable data solutions.
    • Evaluate and recommend data management tools, platforms, and technologies (data warehouses, lakes, cloud solutions).
    • Establish and enforce best practices in data modeling, governance, and compliance.


    Candidate Profile

    • 12+ years of experience in data architecture and data modeling.
    • Strong expertise in Financial Services domain (banking, investment management, capital markets).
    • Experience in Private Equity is highly desirable.
    • Proficiency in data modeling tools (PowerDesigner, or similar).
    • Strong SQL and database design skills across relational (SQL Server, PostgreSQL) and NoSQL platforms.
    • Hands-on experience with cloud platforms (Azure) and modern data warehouses (Snowflake, BigQuery, Azure Synapse).
    • Knowledge of ETL/ELT pipelines, data governance, and master data management (MDM).
    • Familiarity with reporting and analytics tools (Tableau, Power BI, Looker).

    Education

    • Bachelor’s degree in computer science, IT, or a related discipline.


    Please send your CV with CTC and notice period details. We are looking for immediate joiners only.


    Data Architect

    Bengaluru, Karnataka Best Infosystems Ltd.

    Posted 1 day ago

    Job Description

    Data Architect_Full-Time_Trivandrum/Bangalore/Chennai/Kochi


    Job Title: Data Architect

    Job Type: Full-Time

    Location: Trivandrum/Bangalore/Chennai/Kochi

    Exp: 10+ years


    Job Description:

    • Design and implement scalable, secure, and cost-effective data architectures using GCP.

    • Lead the design and development of data pipelines with BigQuery, Dataflow, and Cloud Storage.

    • Architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP.

    • Ensure data architecture aligns with business goals, governance, and compliance requirements.

    • Collaborate with stakeholders to define data strategy and roadmap.

    • Design and deploy BigQuery solutions for optimized performance and cost efficiency.

    • Build and maintain ETL/ELT pipelines for large-scale data processing.

    • Leverage Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration.

    • Implement best practices for data security, privacy, and compliance in cloud environments.

    • Integrate machine learning workflows with data pipelines and analytics tools.

    • Define data governance frameworks and manage data lineage.

    • Lead data modeling efforts to ensure consistency, accuracy, and performance across systems.

    • Optimize cloud infrastructure for scalability, performance, and reliability.

    • Mentor junior team members and ensure adherence to architectural standards.

    • Collaborate with DevOps teams to implement Infrastructure as Code (Terraform, Cloud Deployment Manager).

    • Ensure high availability and disaster recovery solutions are built into data systems.

    • Conduct technical reviews, audits, and performance tuning for data solutions.

    • Design solutions for multi-region and multi-cloud data architecture.

    • Stay updated on emerging technologies and trends in data engineering and GCP.

    • Drive innovation in data architecture, recommending new tools and services on GCP.
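The real-time integration pattern above (Cloud Pub/Sub feeding Dataflow or Cloud Functions) follows a publish/subscribe shape. A stdlib-only sketch using `queue` as a stand-in for a topic can show the flow; the topic, message fields, and toy enrichment are hypothetical, and a real deployment would use the Cloud Pub/Sub client libraries instead:

```python
# Pub/Sub-style ingestion sketch using stdlib queue as a stand-in topic.
import json
import queue

topic = queue.Queue()  # stands in for a Pub/Sub topic

def publish(message: dict):
    """Producer side: messages travel as serialized strings."""
    topic.put(json.dumps(message))

def consume_all():
    """Subscriber side: drain the topic, parse, and apply a toy enrichment."""
    out = []
    while not topic.empty():
        event = json.loads(topic.get())
        event["amount_usd"] = round(event["amount_eur"] * 1.1, 2)  # illustrative rate
        out.append(event)
    return out

publish({"order_id": "A1", "amount_eur": 100.0})
publish({"order_id": "A2", "amount_eur": 50.0})
print(consume_all())
```

The decoupling shown here (producers never call consumers directly) is what makes the pattern scale to streaming workloads.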


    Certifications:

    • Google Cloud Certification is Preferred.


    Primary Skills:

    • 7+ years of experience in data architecture, with at least 3 years in GCP environments.

    • Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services.

    • Strong experience in data warehousing, data lakes, and real-time data pipelines.

    • Proficiency in SQL, Python, or other data processing languages.

    • Experience with cloud security, data governance, and compliance frameworks.

    • Strong problem-solving skills and ability to architect solutions for complex data environments.

    • Google Cloud Certification (Professional Data Engineer, Professional Cloud Architect) preferred.

    • Leadership experience and ability to mentor technical teams.

    • Excellent communication and collaboration skills.
