179 Data Architect jobs in Mumbai
Data Architect - Hadoop/Big Data
Posted today
Job Description
• As an individual contributor, code and test modules.
• Interact and collaborate directly with software developers, product managers, and business analysts to ensure proper development and quality of service applications and products.
• Ability to do development in an Agile environment.
• Work closely with Leads and Architects to understand the requirements and translate that into code.
• Mentor junior engineers if required.
Data Architect
Posted 1 day ago
Job Description
Role: Technical Architect - Data
Experience Level: 10 to 15 Years
Work location: Mumbai, Bangalore, Trivandrum (Hybrid)
Notice Period: Any
Role & Responsibilities:
- More than 8 years of experience in Technical, Solutioning, and Analytical roles.
- 5+ years of experience in building and managing Data Lakes, Data Warehouse, Data Integration, Data Migration and Business Intelligence/Artificial Intelligence solutions on Cloud (GCP/AWS/Azure).
- Ability to understand business requirements, translate them into functional and non-functional areas, define non-functional boundaries in terms of Availability, Scalability, Performance, Security, Resilience etc.
- Experience in architecting, designing, and implementing end to end data pipelines and data integration solutions for varied structured and unstructured data sources and targets.
- Experience of having worked in distributed computing and enterprise environments like Hadoop, GCP/AWS/Azure Cloud.
- Well versed in various data integration and ETL technologies on the cloud, such as Spark (PySpark/Scala), Dataflow, DataProc, EMR, etc.
- Experience of having worked with traditional ETL tools like Informatica/DataStage/OWB/Talend, etc.
- Deep knowledge of one or more Cloud and On-Premise databases like Cloud SQL, Cloud Spanner, Bigtable, RDS, Aurora, DynamoDB, Oracle, Teradata, MySQL, DB2, SQL Server, etc.
- Exposure to any of the NoSQL databases like MongoDB, CouchDB, Cassandra, graph databases, etc.
- Experience in architecting and designing scalable data warehouse solutions on the cloud, on BigQuery or Redshift.
- Experience in having worked on one or more data integration, storage, and data pipeline toolsets like S3, Cloud Storage, Athena, Glue, Sqoop, Flume, Hive, Kafka, Pub-Sub, Kinesis, Dataflow, DataProc, Airflow, Composer, Spark SQL, Presto, EMRFS, etc.
- Preferred: experience of having worked on Machine Learning frameworks like TensorFlow, PyTorch, etc.
- Good understanding of Cloud solutions for IaaS, PaaS, SaaS, Containers and Microservices Architecture and Design.
- Ability to compare products and tools across technology stacks on Google, AWS, and Azure Cloud.
- Good understanding of BI Reporting and Dashboarding and one or more toolsets associated with it like Looker, Tableau, Power BI, SAP BO, Cognos, Superset, etc.
- Understanding of Security features and Policies in one or more Cloud environments like GCP/AWS/Azure.
- Experience of having worked in business transformation projects for movement of On-Premise data solutions to Clouds like GCP/AWS/Azure.
Role:
- Lead multiple data engagements on GCP Cloud for data lakes, data engineering, data migration, data warehouse, and business intelligence.
- Interface with multiple stakeholders within IT and business to understand the data requirements.
- Take complete responsibility for the successful delivery of all allocated projects on the parameters of Schedule, Quality, and Customer Satisfaction.
- Responsible for design and development of distributed, high volume multi-thread batch, real-time, and event processing systems.
- Implement processes and systems to validate data and monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.
- Work with the Pre-Sales team on RFP, RFIs and help them by creating solutions for data.
- Mentor young talent within the team; define and track their growth parameters.
- Contribute to building Assets and Accelerators.
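The data-validation and quality-monitoring responsibility above can be sketched as a minimal rule-based check. This is only an illustration of the idea, not any specific toolset from the posting; the rule names and record fields (`customer_id`, `amount`) are hypothetical.

```python
# Minimal data-quality sketch: each rule returns True when a record passes.
# Field names and thresholds are illustrative placeholders.

def not_null(field):
    return lambda rec: rec.get(field) is not None

def in_range(field, lo, hi):
    return lambda rec: rec.get(field) is not None and lo <= rec[field] <= hi

RULES = {
    "customer_id_present": not_null("customer_id"),
    "amount_in_range": in_range("amount", 0, 1_000_000),
}

def run_quality_checks(records, rules):
    """Return per-rule failure counts, a simple data-quality metric."""
    failures = {name: 0 for name in rules}
    for rec in records:
        for name, rule in rules.items():
            if not rule(rec):
                failures[name] += 1
    return failures

records = [
    {"customer_id": "C1", "amount": 120.0},
    {"customer_id": None, "amount": 80.0},
    {"customer_id": "C3", "amount": -5.0},
]
print(run_quality_checks(records, RULES))
# {'customer_id_present': 1, 'amount_in_range': 1}
```

In production this kind of check would typically run inside the pipeline itself and feed monitoring dashboards rather than print to stdout.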
Other Skills:
- Strong Communication and Articulation Skills.
- Good Leadership Skills.
- Should be a good team player.
- Good Analytical and Problem-solving skills.
Data Architect
Posted today
Job Description
Key Responsibilities:
Platform Management & Operations:
- Oversee the day-to-day operations, administration, and L2/L3 support for Dell Boomi, webMethods, JS7, and Xtract Universal platforms, ensuring high availability, reliability, security, and optimal performance across our global infrastructure.
- Manage Service Level Agreements (SLAs), operational procedures (SOPs), monitoring, alerting, and incident/problem management for the middleware and scheduler environments, adhering to ITIL best practices.
- Plan and execute platform patching, upgrades, disaster recovery testing, and capacity management activities in coordination with global teams.
Integration Development & Delivery:
- Lead the design, development, testing, and deployment of new integrations and enhancements on Boomi and webMethods platforms according to business requirements from various functional areas.
- Establish and enforce development standards, best practices, code review processes, and governance for the integration development lifecycle (SDLC).
- Oversee the development, maintenance, and optimization of jobs and complex workflows within the multiple scheduler platforms.
Platform Optimization & Strategy:
- Continuously evaluate and optimize the performance, efficiency, security posture, and cost-effectiveness of the middleware platforms and integration processes.
- Develop and maintain the strategic roadmap for PerkinElmer's middleware and integration technologies, aligning with overall IT and business strategies, including cloud adoption, API-led integration patterns, and event-driven architectures.
- Manage vendor relationships and licenses for key middleware and scheduler software (Dell, Software AG, SOS GmbH, Theobald Software).
Architecture & Cross-Functional Alignment:
- Act as the lead architect for integration solutions, ensuring designs are scalable, secure, resilient, maintainable, and adhere to PerkinElmer's enterprise architecture standards.
- Collaborate closely with global Enterprise Application leads (e.g., SAP, Salesforce, Data Warehouse, etc.), Infrastructure teams, Security teams, and business stakeholders to understand requirements and design effective end-to-end integration solutions.
- Provide integration expertise, architectural guidance, and resource estimation for platform modernization efforts and other capital projects impacting the Enterprise Applications landscape.
Leadership & Team Management:
- Build, lead, mentor, and develop a high-performing team of middleware engineers and administrators based in Pune and potentially coordinating with resources in other regions.
- Manage resource allocation, and project prioritization for the middleware function.
- Foster a culture of technical excellence, innovation, collaboration, security awareness, and continuous improvement within the team.
- Communicate effectively with senior leadership and stakeholders regarding platform status, strategy, risks, and project updates.
Data Architect
Posted today
Job Description
We are looking for an experienced Senior Data Architect with strong hands-on expertise in Databricks and Master Data Management (MDM) for a 30-day on-site assignment at our client's location.
- Location: Mumbai (Thane)
- Mode of Work: Onsite (Client Location)
- Start Date: Immediate joiners preferred
We are seeking an experienced and visionary Senior Databricks Architect with 12+ years of experience to lead the design, development, and implementation of cutting-edge data solutions on both Azure and AWS cloud platforms. This role is pivotal in shaping our data strategy, driving digital transformation, and ensuring the delivery of high-quality, scalable, and secure data systems. The ideal candidate will possess deep expertise in Databricks Lakehouse architecture, robust knowledge of cloud data services, and proven experience in data consulting, Master Data Management (MDM), and data quality frameworks.
Responsibilities:
- Strategic Data Leadership:
- Define and champion the enterprise data strategy, roadmap, and architectural principles aligning with business objectives.
- Lead the adoption of Databricks Lakehouse architecture across Azure and AWS environments.
- Provide expert consulting to business stakeholders and technical teams on data-related challenges, opportunities, and best practices.
- Drive innovation by evaluating and integrating new technologies and approaches within the data ecosystem.
- Architectural Design & Implementation:
- Architect end-to-end scalable, high-performance, and cost-effective data solutions using Databricks on both Azure (Azure Data Lake Storage, Azure Data Factory, Azure Synapse Analytics) and AWS (S3, Glue, Lambda, Redshift).
- Design and optimize complex ETL/ELT pipelines for batch and real-time data processing, leveraging Apache Spark and Databricks.
- Develop and enforce architectural standards, patterns, and best practices for data modeling, data warehousing, and data lake implementations.
- Oversee the technical design, development, and deployment of data solutions, ensuring adherence to architectural guidelines.
- Data Governance & Quality:
- Design and implement robust Master Data Management (MDM) strategies and solutions to ensure data consistency, accuracy, and completeness across critical business entities.
- Develop and enforce data quality frameworks, including data profiling, cleansing, validation, and monitoring mechanisms.
- Establish and maintain data governance policies, metadata management, and data lineage tracking.
- Ensure data security, privacy (e.g., GDPR, CCPA), and compliance within all data solutions.
- Team Leadership & Mentorship:
- Provide technical leadership, guidance, and mentorship to data engineers, data scientists, and other technical team members.
- Conduct architectural reviews and ensure the quality and integrity of delivered solutions.
- Foster a culture of continuous learning and improvement within the data team.
- Stakeholder Collaboration:
- Collaborate closely with business leaders, product owners, and cross-functional teams to translate complex business requirements into technical data solutions.
- Communicate complex technical concepts clearly and concisely to both technical and non-technical audiences.
- Manage and prioritize competing demands and lead discussions to achieve consensus on architectural decisions.
Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.
- 10+ years of progressive experience in data architecture, with at least 5 years in a senior or lead architect role.
- Expert-level proficiency in Databricks Lakehouse Platform, including Databricks Workspace, Delta Lake, Unity Catalog, Databricks SQL, and Databricks Machine Learning.
- Deep hands-on experience with both Azure and AWS cloud platforms and their respective data services:
- Azure: Azure Data Lake Storage (ADLS Gen2), Azure Data Factory (ADF), Azure Synapse Analytics, Azure Event Hubs, Azure Stream Analytics, Azure SQL Database, Azure Cosmos DB.
- AWS: S3, AWS Glue, AWS Lambda, Amazon Redshift, Amazon Kinesis, Amazon EMR, Amazon DynamoDB.
- Proven experience in designing and implementing large-scale data systems (data warehouses, data lakes, data meshes).
- Strong expertise in Master Data Management (MDM) concepts, tools, and implementation methodologies.
- Extensive experience in developing and implementing data quality frameworks and data governance strategies.
- Proficiency in programming languages such as Python (PySpark), Scala, and SQL.
- Strong understanding of data modelling techniques (dimensional modeling, data vault, 3NF).
- Experience with real-time streaming data technologies (e.g., Kafka, Spark Streaming, Azure Event Hubs/Kafka, Kinesis).
- Familiarity with DevOps practices, CI/CD pipelines (e.g., Azure DevOps, GitHub Actions, Jenkins), and infrastructure as code (e.g., Terraform).
- Excellent analytical, problem-solving, and communication skills.
- Ability to work effectively in a fast-paced, agile environment and manage multiple priorities.
Preferred Qualifications:
- Databricks Certified Architect or Data Engineer certifications.
- Azure or AWS Solution Architect certifications.
- Experience with data visualization tools (e.g., Power BI, Tableau).
- Knowledge of advanced analytics, machine learning, and AI concepts.
- Experience in a consulting environment.
This is a critical role for our organization, offering the opportunity to make a significant impact on our data landscape and drive innovation. If you are a passionate and experienced Databricks architect with a strong background in Azure and AWS, we encourage you to apply.
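As an illustration of the MDM match-and-merge concepts this posting asks for, here is a minimal deduplication sketch with a crude "first non-empty value wins" survivorship rule. The matching key (a normalized email) and the record fields are hypothetical, and real MDM tools use far richer fuzzy, multi-attribute matching.

```python
# Minimal master-data match-and-merge sketch.
# Records that match on a normalized email are merged into one "golden" record.
from collections import OrderedDict

def match_key(rec):
    # Normalize the matching attribute; real MDM uses fuzzy/multi-attribute matching.
    return rec["email"].strip().lower()

def merge(records):
    # Survivorship rule: the first non-empty value seen for each field wins.
    golden = {}
    for rec in records:
        for field, value in rec.items():
            if field not in golden or golden[field] in (None, ""):
                golden[field] = value
    return golden

def build_golden_records(records):
    groups = OrderedDict()
    for rec in records:
        groups.setdefault(match_key(rec), []).append(rec)
    return [merge(group) for group in groups.values()]

source = [
    {"email": "Ana@Example.com", "name": "Ana", "phone": ""},
    {"email": "ana@example.com ", "name": "", "phone": "555-0101"},
    {"email": "bo@example.com", "name": "Bo", "phone": "555-0102"},
]
print(build_golden_records(source))
# Two golden records: Ana's two rows collapse into one with both name and phone.
```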
Data Architect
Posted today
Job Description
Position Purpose
The Data Architect supports the work of ensuring that systems are designed, upgraded, managed, decommissioned and archived in compliance with data policy across the full data life cycle. This includes complying with the data strategy, undertaking the design of data models and supporting the management of metadata. The Data Architect's mission will include a focus on GDPR, contributing to privacy impact assessments and the Record of Processing Activities relating to personal data.
The scope is CIB EMEA and CIB ASIA
Responsibilities
Direct Responsibilities
Engage with key business stakeholders to assist with establishing fundamental data governance processes
Define key data quality metrics and indicators and facilitate the development and implementation of supporting standards
Help to identify and deploy enterprise data best practices such as data scoping, metadata standardization, data lineage, data deduplication, mapping and transformation and business validation
Structures the information in the Information System (any data modelling tool like Abacus), i.e. the way information is grouped, as well as the navigation methods and the terminology used within the Information Systems of the entity, as defined by the lead data architects.
Creates and manages data models (Business Flows of Personal Data with process involved) in all their forms, including conceptual models, functional database designs, message models and others in compliance with the data framework policy
Allows people to step logically through the Information System (be able to train them to use tools like Abacus)
Contribute to and enrich the Data Architecture framework through the material collected during analysis, projects and IT validations. Update all records in Abacus collected from stakeholder interviews and meetings.
Skill Area
Expected
Communicating between the technical and the non-technical
Is able to communicate effectively across organisational, technical and political boundaries, understanding the context. Makes complex and technical information and language simple and accessible for non- technical audiences. Is able to advocate and communicate what a team does to create trust and authenticity, and can respond to challenge.
Able to effectively translate and accurately communicate across technical and non- technical stakeholders as well as facilitating discussions within a multidisciplinary team, with potentially difficult dynamics.
Data Modelling (Business Flows of Data in Abacus)
Produces data models and understands where to use different types of data models. Understands different tools and is able to compare between different data models.
Able to reverse engineer a data model from a live system. Understands industry recognized data modelling patterns and standards.
Understands the concepts and principles of data modelling and is able to produce, maintain and update relevant data models for specific business needs.
Data Standards (Rules defined to manage/ maintain Data)
Develops and sets data standards for an organisation.
Communicates the business benefit of data standards, championing and governing those standards across the organisation.
Develops data standards for a specific component. Analyses where data standards have been applied or breached and undertakes an impact analysis of that breach.
Metadata Management
Understands a variety of metadata management tools. Designs and maintains the appropriate metadata repositories to enable the organization to understand their data assets.
Works with metadata repositories to complete and maintain them, ensuring information remains accurate and up to date.
The objective is to manage one's own learning and contribute to domain knowledge building.
Turning business problems into data design
Works with business and technology stakeholders to translate business problems into data designs. Creates optimal designs through iterative processes, aligning user needs with organisational objectives and system requirements.
Designs data architecture by dealing with specific business problems and aligning it to enterprise-wide standards and principles. Works within the context of well understood architecture and identifies appropriate patterns.
Contributing Responsibilities
It is expected that the data architect applies knowledge and experience of the capability, including tools and technique and adopts those that are more appropriate for the environment.
The Data Architect needs to have the knowledge of:
The Functional & Application Architecture, Enterprise Architecture and Architecture rules and principles
The activities Global Market and/or Global Banking
Market meta-models, taxonomies and ontologies (such as FpML, CDM, ISO 20022)
Skill Area
Expected
Data Communication
Uses the most appropriate medium to visualise data to tell compelling and actionable stories relevant for business goals.
Presents, communicates and disseminates data appropriately and with high impact.
Able to create basic visuals and presentations.
Data Governance
Understands data governance and how it works in relation to other organisational governance structures. Participates in or delivers the assurance of a service.
Understands what data governance is required and contributes to that governance.
Data Innovation
Recognises and exploits business opportunities to ensure more efficient and effective performance of organisations. Explores new ways of conducting business and organisational processes.
Aware of opportunities for innovation with new tools and uses of data
Technical & Behavioral Competencies
Able to effectively translate and accurately communicate across technical and non- technical stakeholders as well as facilitating discussions within a multidisciplinary team, with potentially difficult dynamics.
Able to create basic visuals and presentations.
Experience in working with Enterprise Tools (like Abacus, Informatica, Collibra, big data platforms, etc.)
Experience in working with BI Tools (like Power BI)
Good understanding of Excel (formulas and functions)
Specific Qualifications (if required)
Preferred: BE/ BTech, BSc-IT, BSc-Comp, MSc-IT, MSc Comp, MCA
Data Architect
Posted today
Job Description
- Analyze and understand customers' use cases and data sources; extract, transform and load data from a multitude of customer enterprise sources and ingest it into Adobe Experience Platform
- Design and build data ingestion pipelines into the platform using PySpark
- Ensure ingestion is designed and implemented in a performant manner to support the throughput and latency needed.
- Develop and test complex SQLs to extract, analyze and report on the data ingested into the Adobe Experience Platform.
- Ensure the SQLs are implemented in compliance with best practices so that they are performant.
- Migrate platform configurations, including the data ingestion pipelines and SQL, across various sandboxes.
- Debug any issues reported on data ingestion, SQL or any other functionalities of the platform and resolve the issues.
- Support Data Architects in implementing data model in the platform.
- Contribute to the innovation charter and develop intellectual property for the organization.
- Present on advanced features and complex use case implementations at multiple forums.
- Attend regular scrum events or equivalent and provide update on the deliverables.
- Work independently across multiple engagements with minimal supervision.
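The ingest-then-analyze workflow described above can be sketched in miniature with the Python standard library. SQLite stands in here for the platform's query service, and the table and columns are hypothetical; the posting's actual target, Adobe Experience Platform, exposes its own ingestion APIs and SQL service.

```python
import sqlite3

# Stand-in for an ingestion pipeline: load event rows, then analyze them with SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, event_type TEXT, value REAL)")

rows = [
    ("u1", "purchase", 30.0),
    ("u1", "purchase", 70.0),
    ("u2", "view", 0.0),
    ("u2", "purchase", 15.0),
]
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)

# Aggregate query: total purchase value per user, keeping only users above a threshold.
query = """
    SELECT user_id, SUM(value) AS total
    FROM events
    WHERE event_type = 'purchase'
    GROUP BY user_id
    HAVING SUM(value) > 20
    ORDER BY total DESC
"""
print(conn.execute(query).fetchall())  # [('u1', 100.0)]
```

Keeping the query parameterized and reviewed against an execution plan is what "performant SQL" typically means in practice at larger scale.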
Data Architect
Posted today
Job Description
Project Role Description : Define the data requirements and structure for the application. Model and design the application data structure, storage and integration.
Must have skills : Microsoft Azure Data Services
Good to have skills : NA
Minimum 7.5 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the data architecture aligns with the overall business objectives and technical specifications. You will collaborate with various stakeholders to gather requirements and translate them into effective data solutions, while also addressing any challenges that arise during the development process. Your role will be pivotal in ensuring that the data architecture is robust, scalable, and capable of supporting the application's needs.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Develop and maintain documentation related to data architecture and design.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in Microsoft Azure Data Services.
- Good To Have Skills: Experience with data modeling tools and techniques.
- Strong understanding of data integration processes and ETL methodologies.
- Familiarity with cloud-based data storage solutions and architectures.
- Experience in performance tuning and optimization of data systems.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Microsoft Azure Data Services.
- This position is based in Mumbai.
- A 15 years full time education is required.
About Accenture
We work with one shared purpose: to deliver on the promise of technology and human ingenuity. Every day, more than 775,000 of us help our stakeholders continuously reinvent. Together, we drive positive change and deliver value to our clients, partners, shareholders, communities, and each other. We believe that delivering value requires innovation, and innovation thrives in an inclusive and diverse environment. We actively foster a workplace free from bias, where everyone feels a sense of belonging and is respected and empowered to do their best work. At Accenture, we see well-being holistically, supporting our people's physical, mental, and financial health. We also provide opportunities to keep skills relevant through certifications, learning, and diverse work experiences. We're proud to be consistently recognized as one of the World's Best Workplaces™. Join Accenture to work at the heart of change. Visit us at
Equal Employment Opportunity Statement
We believe that no one should be discriminated against because of their differences. All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, military/veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by applicable law. Our rich diversity makes us more innovative, more competitive, and more creative, which helps us better serve our clients and our communities.
Data Architect
Posted today
Job Description
Roles and Responsibilities
We are looking for a skilled Azure Data Engineer / Architect to join our Data Lake team. The candidate shall have experience in building and optimising large data platforms, preferably for the lending business.
Exp required – Yrs
Skillsets
Architecting Data platform solution
Experience in delivery of large scale enterprise data warehouse solution
Strong written and oral communication skills
Knowledge in BFSI domain will be added advantage
Design and develop batch data pipelines independently to enable faster business analysis
Experience in Data Lake / Data Warehousing projects from an end-to-end delivery perspective.
Experience with Data modelling (relational and dimensional)
Working experience in the Microsoft Azure cloud, preferably with components: Azure Data Lake, Azure Data Factory, SQL DW (Synapse), Spark
4+ years of experience in writing complex SQL queries, procedures, views, functions and database objects.
Minimum 3 years of experience required in cloud computing, preferably Microsoft Azure.
Experience working with big data frameworks especially Spark.
Experience in R and Python would be an added advantage.
Nice to have Talend/ SSIS knowledge
Azure admin knowledge will be added advantage
Proficient understanding of code versioning tools.
Excellent analytical and organisational skills
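The batch data pipelines this posting describes can be sketched as a chain of small extract, transform and load steps. The stage names and sample data below are illustrative only and are not tied to Azure Data Factory or Synapse, which orchestrate this pattern at much larger scale.

```python
# Minimal batch-pipeline sketch: compose extract -> transform -> load steps.

def extract():
    # Stand-in for reading one batch of raw rows from a data lake.
    return [{"id": 1, "amount": "120.50"}, {"id": 2, "amount": "80.25"}]

def transform(rows):
    # Typical cleansing step: type conversion plus a derived column.
    return [
        {**r, "amount": float(r["amount"]),
         "bucket": "high" if float(r["amount"]) > 100 else "low"}
        for r in rows
    ]

def load(rows, sink):
    # Stand-in for writing the curated batch to a warehouse table.
    sink.extend(rows)
    return len(rows)

def run_pipeline(sink):
    return load(transform(extract()), sink)

sink = []
loaded = run_pipeline(sink)
print(loaded, sink[0]["bucket"])  # 2 high
```

Keeping each stage a pure function like this makes the pipeline easy to unit-test independently, which is what end-to-end delivery of a warehouse project usually demands.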