13 Data Architect jobs in Chennai

Data Architect

Chennai, Tamil Nadu Ford Motor Company

Posted 1 day ago


Job Description

**Role & Responsibilities**
+ Utilize Google Cloud Platform & Data Services to modernize legacy applications.
+ Understand technical business requirements and define architecture solutions that align with Ford Motor & Credit Companies' Patterns and Standards.
+ Collaborate and work with global architecture teams to define analytics cloud platform strategy and build Cloud analytics solutions within enterprise data factory.
+ Provide Architecture leadership in design & delivery of new Unified data platform on GCP.
+ Understand complex data structures in analytics space as well as interfacing application systems. Develop and maintain conceptual, logical & physical data models. Design and guide Product teams on Subject Areas and Data Marts to deliver integrated data solutions.
+ Leverage cloud AI/ML Platforms to deliver business and technical requirements.
+ Provide architectural guidance for optimal solutions, taking regional regulatory needs into consideration.
+ Provide architecture assessments on technical solutions and make recommendations that meet business needs and align with architectural governance and standards.
+ Guide teams through the enterprise architecture processes and advise teams on cloud-based design, development, and data mesh architecture.
+ Provide advisory and technical consulting across all initiatives including PoCs, product evaluations and recommendations, security, architecture assessments, integration considerations, etc.
**Required Skills and Selection Criteria:**
+ Google Professional Solution Architect certification.
+ 8+ years of relevant work experience in analytics application and data architecture, with deep understanding of cloud hosting concepts and implementations.
+ 5+ years' experience in Data and Solution Architecture in analytics space. Solid knowledge of cloud data architecture, data modelling principles, and expertise in Data Modeling tools.
+ Experience in migrating legacy analytics applications to Cloud platform and business adoption of these platforms to build insights and dashboards through deep knowledge of traditional and cloud Data Lake, Warehouse and Mart concepts.
+ Good understanding of domain driven design and data mesh principles.
+ Experience with designing, building, and deploying ML models to solve business challenges using Python/BQML/Vertex AI on GCP.
+ Knowledge of enterprise frameworks and technologies. Strong in architecture design patterns, with experience in secure interoperability standards and methods, architecture tools, and processes.
+ Deep understanding of traditional and cloud data warehouse environments, with hands-on programming experience building data pipelines on the cloud in a highly distributed and fault-tolerant manner. Experience using Dataflow, Pub/Sub, Kafka, Cloud Run, Cloud Functions, BigQuery, Dataform, Dataplex, etc. (see the sketch after this list).
+ Strong understanding on DevOps principles and practices, including continuous integration and deployment (CI/CD), automated testing & deployment pipelines.
+ Good understanding of cloud security best practices and familiarity with security tools and techniques such as Identity and Access Management (IAM), encryption, and network security. Strong understanding of microservices architecture.
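To make the pipeline expectations above concrete, here is a minimal sketch of a streaming ingest from Pub/Sub into BigQuery using Apache Beam's Python SDK (which Dataflow executes). The project, topic, table, and schema names are placeholders for illustration only, not Ford's actual resources.

```python
# Minimal Apache Beam (Python) sketch: stream JSON events from Pub/Sub into
# BigQuery. Add runner/project options to execute on Dataflow.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/app-events")          # placeholder topic
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-project:analytics.app_events",                      # placeholder table
                schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```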
**Nice to Have**
+ Bachelor's degree in Computer science/engineering, Data science or related field.
+ Strong leadership, communication, interpersonal, organizing, and problem-solving skills
+ Good presentation skills with ability to communicate architectural proposals to diverse audiences (user groups, stakeholders, and senior management).
+ Experience in Banking and Financial Regulatory Reporting space.
+ Ability to work on multiple projects in a fast-paced and dynamic environment.
+ Exposure to multiple, diverse technologies, platforms, and processing environments.
**Requisition ID** : 47962

Data Architect

Chennai, Tamil Nadu Montra Electric

Posted 3 days ago


Job Description

About the Role:

We are seeking a seasoned Data Engineering Architect with 15+ years of experience designing and scaling enterprise-grade data platforms to lead the architecture of our Connected Electric Vehicle (EV) Platform.

This role requires deep expertise in cloud-native (AWS & GCP) data ecosystems, real-time streaming infrastructure, and architecting solutions for the high-velocity, high-volume telemetry generated by Electric Vehicles.


Skill Sets Required:

  • 15+ years of experience in Data Architecture and Engineering
  • Deep understanding of structured and unstructured data ecosystems
  • Hands-on experience with ETL, ELT, stream processing, querying, and data modeling
  • Proficiency in tools and languages such as Java, Spark, Kafka, Airflow, and Python
  • Proficiency in SQL and MongoDB
  • Proven experience in building and maintaining data warehouses and data lakes
  • Experience in implementing data quality standards and performing exploratory data analysis (EDA)
  • Experience with any public cloud platform: GCP, AWS, or Azure
  • Strong knowledge of data governance, privacy, and compliance standards
  • Expertise in designing high-performance and cost-optimized data engineering pipelines leveraging Medallion Architecture (see the sketch after this list)
  • A strategic mindset with the ability to execute hands-on when needed
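As a point of reference for the streaming and Medallion items above, the following is a hedged sketch of a bronze-layer ingest: raw EV telemetry streamed from Kafka into a Delta table with Spark Structured Streaming. The broker, topic, and lake paths are assumptions for illustration, not Montra's platform, and the job needs the Kafka and Delta Lake packages on the Spark classpath.

```python
# Hypothetical bronze-layer ingest for a Medallion-style lakehouse: land raw
# vehicle telemetry from Kafka into Delta; parsing/cleansing belongs in silver.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("ev-telemetry-bronze").getOrCreate()

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder brokers
    .option("subscribe", "vehicle-telemetry")           # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

bronze = raw.selectExpr(
    "CAST(key AS STRING) AS vehicle_id",
    "CAST(value AS STRING) AS payload",
    "timestamp AS ingest_ts",
)

query = (
    bronze.writeStream
    .format("delta")
    .option("checkpointLocation", "/lake/checkpoints/telemetry_bronze")  # placeholder path
    .outputMode("append")
    .start("/lake/bronze/vehicle_telemetry")                             # placeholder path
)
query.awaitTermination()
```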

Good to Have:

  • Knowledge of GraphQL/Graph databases
  • Exposure to connected vehicle platforms
  • Experience in the EV or automotive domain
  • Knowledge of API development
  • Exposure to data mesh and data fabric concepts

Responsibilities:

  • Design scalable, secure, and high-performance data architectures across cloud and on-premise environments for our connected vehicle platform
  • Lead the end-to-end architecture of data pipelines and real-time streaming solutions using Kafka, Kinesis, Google Pub/Sub, Apache Flink, or Dataflow.
  • Establish best practices for tiered storage across hot, warm, and cold data using BigQuery, Redshift, S3/GCS, and Delta Lake.
  • Collaborate with cross-functional teams and business stakeholders to ensure data solutions meet analytical and operational requirements
  • Enable AI/ML readiness through clean, governed, and performant data pipelines for predictive models.
  • Evaluate and recommend tools and technologies to improve the data platform by conducting proof of concepts (PoCs)
  • Ensure compliance with data security, privacy, and regulatory standards
  • Mentor and guide data engineers and solution architects
  • Establish best practices and coding standards for data engineering teams
  • Stay current with emerging technologies and drive adoption of modern data practices
  • Lead and contribute to technical discussions and decision-making


Qualifications:

  • Bachelor’s or Master’s degree in any discipline
  • 15+ years of industry experience, with a minimum of 10 years in data engineering

Data Architect

Chennai, Tamil Nadu Hotfoot Technology Solutions

Posted 3 days ago


Job Description

Hands-On | Micro Data Lakes | Enterprise Data Strategy


Are you a hands-on Data Architect who thrives on solving complex data problems across structured and unstructured sources? Do you enjoy designing micro data lakes and driving enterprise-wide data strategy? If so, we want to hear from you.


What You Will Do


Design and build micro data lakes tailored to the lending domain

Define and implement enterprise data strategies including modelling, lineage, and governance

Architect and build robust data pipelines for batch and real-time data ingestion

Develop strategies for extracting, transforming, and storing data from APIs, PDFs, logs, databases, and more (an illustrative sketch follows this list)

Establish best practices for data quality, metadata management, and data lifecycle control

Hands-on implementation of processes, strategies, and tools to create differentiated products – MUST HAVE

Collaborate with engineering and product teams to align data architecture with business goals

Evaluate and integrate modern data platforms and tools such as Databricks, Spark, Kafka, Snowflake, AWS, GCP, Azure

Mentor data engineers and advocate for engineering excellence in data practices
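To illustrate one such extraction strategy in the simplest terms, here is a hedged sketch that pages through a REST API and lands the records as Parquet in a raw zone. The endpoint, field names, and lake path are hypothetical.

```python
# Hypothetical API-to-data-lake ingestion: page through a REST endpoint and
# land the records as partitioned Parquet in a raw/staging zone.
from datetime import date

import requests
import pandas as pd

API_URL = "https://api.example.com/v1/loan-applications"   # placeholder endpoint


def extract(page_size: int = 500) -> pd.DataFrame:
    """Page through the API and return all records as a DataFrame."""
    records, page = [], 1
    while True:
        resp = requests.get(API_URL, params={"page": page, "size": page_size}, timeout=30)
        resp.raise_for_status()
        batch = resp.json().get("items", [])
        if not batch:
            break
        records.extend(batch)
        page += 1
    return pd.DataFrame(records)


if __name__ == "__main__":
    df = extract()
    df["ingest_date"] = date.today().isoformat()
    # Writing to object storage (e.g. s3://... or gs://...) additionally needs s3fs/gcsfs.
    df.to_parquet("raw/loan_applications/", partition_cols=["ingest_date"])
```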



What You Bring


10+ years of experience in data architecture and engineering

Deep understanding of structured and unstructured data ecosystems

Hands-on experience with ETL, ELT, stream processing, querying and data modelling

Proficiency in tools and languages such as Spark, Kafka, Airflow, SQL, Amundsen, Glue Catalog and Python

Expertise in cloud-native data platforms including AWS, Azure, or GCP

Strong grounding in data governance, privacy, and compliance standards

A strategic mindset with the ability to execute hands-on when needed



Nice to Have


Exposure to the lending domain

Exposure to ML pipelines or AI integrations

Background in fintech, lending, or regulatory data environments



What We Offer


An opportunity to lead data-first transformation, create products that accelerate AI adoption

Autonomy to design, build, and scale modern data architecture

A forward-thinking, collaborative, and tech-driven culture

Access to the latest tools and technologies in the data ecosystem



Location: Chennai

Experience: 10-15 Years | Full-Time | Work From Office


GCP Data Architect 

Chennai, Tamil Nadu Tata Consultancy Services

Posted today


Job Description

Key Responsibilities

  • Design and Implement Data Architectures: Architect and build scalable, end-to-end data solutions on GCP, encompassing data ingestion, transformation, storage, and consumption.
  • Develop Data Pipelines: Design and develop ETL/ELT data pipelines using tools like Apache Airflow (Cloud Composer) and programming languages such as Python and SQL for batch and real-time processing (see the sketch after this list).
  • Create Data Models: Build logical and physical data models, including dimensional modelling and schema design, to support data warehousing, data lakes, and analytics.
  • Ensure Data Quality and Governance: Establish and enforce data governance, security, and quality standards, implementing data validation and testing procedures.
  • Collaborate with Stakeholders: Work with data engineers, business analysts, data scientists, and product owners to translate business requirements into technical data solutions.
  • Optimize GCP Services: Optimize the performance and cost-effectiveness of GCP services, particularly BigQuery, for analytics and data storage.
  • Provide Technical Guidance: Lead architectural reviews, provide technical guidance on cloud-native data strategies, and mentor engineering teams on GCP best practices.
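For orientation, here is a minimal sketch of what a daily ELT step might look like as a Cloud Composer (Airflow 2.x) DAG driving a BigQuery job; the project, dataset, table names, and SQL are assumptions for illustration only.

```python
# Hypothetical Cloud Composer / Airflow 2.x DAG: one daily BigQuery ELT step.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

TRANSFORM_SQL = """
CREATE OR REPLACE TABLE `my-project.analytics.daily_orders` AS  -- placeholder names
SELECT order_id, customer_id, DATE(order_ts) AS order_date, amount
FROM `my-project.raw.orders`
WHERE DATE(order_ts) = CURRENT_DATE()
"""

with DAG(
    dag_id="daily_orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    tags=["elt", "bigquery"],
) as dag:
    transform = BigQueryInsertJobOperator(
        task_id="transform_orders",
        configuration={"query": {"query": TRANSFORM_SQL, "useLegacySql": False}},
    )
```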

Required Skills and Knowledge

  • Google Cloud Platform (GCP): Expertise with GCP services like BigQuery, Cloud Storage, Cloud SQL, and Cloud Composer.
  • Data Modelling: Proficiency in designing data models for data warehouses and data lakes.
  • ETL/ELT: Experience with designing and building data pipelines using tools like Apache Airflow.
  • Programming: Strong skills in SQL and Python for data processing and development.
  • Data Governance: Understanding and ability to implement data governance, metadata management, and security policies.
  • Collaboration: Strong communication skills to work with cross-functional teams and explain complex technical concepts.

Azure Data Architect

Chennai, Tamil Nadu HCLTech

Posted 3 days ago


Job Description

Job title


Azure Data Architect


Experience: 10+ Years


Overview of Role / Role Purpose


As an Azure Data Architect you will be a key member of the Global GTS M&A team. The role requires proficiency in investigating, decomposing, and combining data from different sources and of different types, and ideal candidates are prepared to design appropriate solutions to support merger data needs. The Azure Data Architect will partner with other team members on creating a best-practices approach to capturing data from disparate applications into an Azure Data Lake for cleaning, curation, and mapping into various application destinations, meeting Gallagher-approved standards. They will identify and explore new data sources and will be primarily focused on developing integration processes between systems and creating thorough diagrams and documentation. They will also be responsible for identifying datasets targeted for on-prem migration and architecting the requisite platform for hosting that data within the Gallagher datacenter.



Core Responsibilities


  • Identifies, designs, and implements internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Investigates data, data sources, and performs data quality analysis.
  • Collaborates with other Architects and Data Engineers to help adopt best-practices in data system creation, data integrity, test design, analysis, validation, and documentation.
  • Creates and maintains optimal data pipeline architecture for extraction, transformation, and loading of data from various data sources – both internal and external.
  • Assembles large, complex sets of data that meet non-functional and functional business requirements.
  • Builds industrialized analytic datasets and delivery mechanisms that utilize the data pipeline to deliver actionable insights.
  • Communicates and maintains master data, metadata, data management repositories, logical data models, data standards.
  • Works with stakeholders including the executive, product, data and technology teams to support their data infrastructure needs while assisting with data-related technical issues.



Key Skills and Experience

  • Excellent analysis and problem-solving skills
  • Effective time-management skills
  • Ability to adapt to ever-changing system environments and understand all business applications
  • Desire to learn new technologies and adapt requirements for future cloud solutions
  • Good organizational skills and a strict attention to detail and proven ability to follow through
  • Strong interpersonal skills
  • Ability to develop effective working relationships with stakeholders
  • Hands-on experience with designing, moving, and transforming data to/from Microsoft SQL Server, Azure SQL, Azure SQL DW, Dynamics CRM, and Dynamics 365 (see the sketch after this list).
  • Hands-on experience with integration technologies such as SQL Server Integration Services, Azure Data Factory, Azure Logic Apps, Power Automate, Azure Integration Services, or similar technologies.
  • Mastery of relational SQL and NoSQL databases
  • Mastery of Snowflake in both AWS and Azure clouds
  • Experienced with data wrangling and preparation for use within PowerBI
  • Strong written and verbal communication skills including the ability to effectively collaborate with multi-disciplinary groups and all organizational levels.
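By way of illustration, here is a hedged sketch of one simple data-movement pattern referenced above: copying a table from on-prem SQL Server into Azure SQL with pandas and SQLAlchemy over ODBC. Server names, credentials, and the table are placeholders; in practice this is the kind of step Azure Data Factory or SSIS would orchestrate.

```python
# Hypothetical chunked table copy: on-prem Microsoft SQL Server -> Azure SQL.
# Requires pyodbc with "ODBC Driver 17 for SQL Server" installed.
import urllib.parse

import pandas as pd
from sqlalchemy import create_engine


def mssql_engine(server: str, database: str, user: str, password: str):
    """Build a SQLAlchemy engine over ODBC for a SQL Server-compatible target."""
    odbc = (
        f"DRIVER={{ODBC Driver 17 for SQL Server}};SERVER={server};"
        f"DATABASE={database};UID={user};PWD={password}"
    )
    return create_engine("mssql+pyodbc:///?odbc_connect=" + urllib.parse.quote_plus(odbc))


source = mssql_engine("onprem-sql.corp.local", "SalesDB", "etl_user", "***")             # placeholders
target = mssql_engine("myserver.database.windows.net", "AzureSalesDB", "etl_user", "***")

# Stream the table in chunks to keep memory bounded.
for chunk in pd.read_sql("SELECT * FROM dbo.Customers", source, chunksize=10_000):
    chunk.to_sql("Customers", target, schema="stg", if_exists="append", index=False)
```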



Education / Qualifications


  • Bachelor's degree in Computer Science, Information Technology, or a related field and minimum 3 years of prior relevant experience.

Senior Data Architect

Chennai, Tamil Nadu Luxoft

Posted 3 days ago


Job Description

Notice period: 30 days max


We are a leading international Bank that is going through a significant transformation of its front-to-back operations, marked as one of the bank's top 3 transformation agendas.


F2B Business Architecture is a small central global team in CIB-CTB that supports the delivery of these key front-to-back (F2B) transformation priorities of the management board. The Data Architecture team will play the central role of defining the data model that will align the business processes, ensure data lineage and effective controls, and implement efficient client strategy and reporting solutions. This will require building strong relationships with key stakeholders and helping deliver tangible value.


The role will report to the India Head of Investment Bank and Cross Product F2B Operations.


Responsibilities:

- Be part of the CTB team to define and manage Data models used to implement solutions to automate F2B business processes and controls

- Ensure the models follow the bank's data modelling standards and principles and influence them as necessary

- Actively partner with various functional leads & teams to socialize the data models towards adoption and execution of front-to-back solutions


Mandatory Skills Description:

- 10+ years in financial services, preferably in strategy and solutions within the Corporate and Investment Banking domain.

- Strong Data analysis skills, SQL/Python experience, and the ability to build data models are desirable.

- Must have implemented data models.

- Worked on architectural design by coordinating with both business and technology teams.

- Excellent working knowledge of the functional side and business data.

- Strong knowledge of transaction banking domain processes and controls for the banking & trading business, in order to drive conversations with business SMEs. Experience in developing models for transaction banking products is preferable.

- Experience working in an enterprise agile environment in a matrix organization.

- Critical problem-solving skills, able to think tactically and strategically.


Nice-to-Have Skills Description:

Good Tech stack


Languages:

English: C2 Proficient


Solutions Data Architect bpost

Chennai, Tamil Nadu Radial

Posted 1 day ago


Job Description

Solutions Data Architect bpost
**Job Number:** JO-
**Location (City, State):** Chennai, India
**Employee Group:** Regular
**Shift:** Day
**Travel:** 0%
**Site Name:** TECHNOLOGIES INDIA PRIVATE LIMITED
**Is Remote Eligible:** No
**Pay:** INR 3,130,100.00 - INR 5,321,170.00 per year
The Solutions Architect's primary focus is to ensure the technical integrity of the Event Driven Architecture platform and to formulate the technical direction for the strategic cloud investments. The Solutions Architect drives the estimation, analysis, and design, and supports the implementation and operations of a slew of microservices owned by the team. The Solutions Architect will work closely with senior tech and business leadership as well as engineering and ops teams, driving the vision and the technical strategic objectives throughout the SDLC. The Solutions Architect remains current in all Parcel and Logistics related technologies in support of enterprise applications and infrastructure. Working as a Solutions Architect, your passion for technology and thirst for innovation will help shape the future of our digital transformation now and for years to come.

**Responsibilities:**
+ Analyze, design, and lead technical solutions fulfilling core business requirements in our migration journey from a legacy messaging solution to Confluent Kafka on the AWS platform, keeping in mind solution scalability, availability, security, extensibility, and maintainability, as well as risk assumptions and cost considerations (a hedged sketch follows this description).
+ Actively participate in proof-of-concept implementations of new applications and services.
+ Research, evaluate, and recommend third-party software packages and services to enhance digital transformation capabilities.
+ Promote the technical vision and sound engineering principles among technology department staff members and across the global team.
+ Occasionally assist in production escalations, systems operations, and problem resolution.
+ Assist team members in adopting new Kafka and real-time data streaming solutions, and mentor the team to remain current with the latest tech trends in the global marketplace.
+ Perform and mentor conceptual, logical, and physical data modeling; drive the team to maintain semantic models.
+ Guide teams in adopting data warehousing, data lake, and data mesh architectures.
+ Drive process, policy, and standard improvements related to architecture, design, and development principles.
+ Assist business leadership in the prioritization of business capabilities and go-to-market decisions.
+ Collaborate with cross-functional teams and business teams as required to drive the strategy and initiatives forward.
+ Lead architecture teams in digital capabilities and competency building, mentoring junior team members.

**Qualifications:**
+ 15+ years of software engineering experience, with 5+ years in hands-on architecture roles.
+ Ability to define platform strategy, target-state architecture, and implementation roadmaps for enterprise-scale applications to migrate to Kafka.
+ Proven hands-on architecture and design experience in microservice architecture and Domain Driven Design concepts and principles, including 12-factor app and other enterprise design patterns.
+ Ability to establish enterprise architectural blueprints and cloud deployment topologies.
+ Highly experienced in designing high-traffic services serving 1k+ transactions per second, or similar high-transaction-volume distributed systems with resilient high availability and fault tolerance.
+ Experience developing event-driven, message-driven asynchronous systems such as Kafka, Kinesis, etc.
+ Experience with AWS services such as Lambda, ECS, EKS, EC2, S3, DynamoDB, RDS, VPCs, Route 53, and ELB.
+ Experience in Enterprise Java and the Spring Boot ecosystem.
+ Experience with Oracle databases, NoSQL DBs, and distributed caching.
+ Experience with Data Architecture, Data Modelling, Data Lake, and Data Mesh implementations.
+ Extensive experience implementing system integrations utilizing API gateways, JSON, Avro, and XML libraries.
+ Excellent written, verbal, presentation, and process facilitation skills.
+ AWS architecture certification or equivalent working expertise with the AWS platform.
+ B.S. or M.S. in computer science or a related technical area preferred.
+ Experience in the financial, payment processing, or e-commerce domains is a plus.

**Travel:** This position is based out of the GDC in Chennai, India, and occasional travel to Belgium might be required for business interactions and training.
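The following is a minimal, hypothetical illustration of producing an event to Confluent Kafka of the kind the migration above implies. The broker, topic, and payload are placeholders, not Radial's or bpost's actual infrastructure.

```python
# Hypothetical producer: publish a parcel-status event to a Kafka topic using
# the confluent-kafka client, with a delivery callback for error handling.
import json

from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "broker:9092"})   # placeholder brokers


def delivery_report(err, msg):
    """Called once per message to confirm delivery or surface an error."""
    if err is not None:
        print(f"Delivery failed for key={msg.key()}: {err}")
    else:
        print(f"Delivered to {msg.topic()} [{msg.partition()}] at offset {msg.offset()}")


event = {"parcel_id": "P-12345", "status": "OUT_FOR_DELIVERY"}   # illustrative payload
producer.produce(
    topic="parcel-events",                                       # placeholder topic
    key=event["parcel_id"],
    value=json.dumps(event).encode("utf-8"),
    callback=delivery_report,
)
producer.flush()
```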
Radial is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.
Radial is committed to ensuring that its online application process provides an equal employment opportunity to all job seekers, including individuals with disabilities. If you believe you need a reasonable accommodation in order to search for a job opening or to submit an application, please contact us by emailing . We will work to assist disabled job seekers whose disability prevents them from being able to apply online.

Solution Leader - Senior Data Architect

Chennai, Tamil Nadu i4 Consulting : Reimagining HR Blueprints

Posted 14 days ago


Job Description

full-time

As a Senior Data Architect in the BFSI domain, you will be responsible for designing and governing data architecture across core banking, Wealth Management, and financial platforms. You'll lead initiatives that span legacy systems (mainframes) to modern cloud-native and NoSQL solutions, ensuring regulatory compliance, data lineage, and high availability for mission-critical applications.

Key Responsibilities:

  • Architect scalable data solutions for core banking, Wealth Management, risk management, fraud detection, and regulatory reporting.
  • Design data models for transactional systems, data warehouses, and real-time analytics platforms.
  • Lead data migration from legacy systems (e.g., DB2, VSAM) to cloud platforms like Snowflake, Azure Synapse, or Google BigQuery; lead data migrations from Oracle to Postgres and SQL Server to Postgres (see the sketch after this list).
  • Collaborate with compliance teams to ensure data governance, auditability, and regulatory alignment (e.g., Basel III, IFRS, GDPR).
  • Build and optimize ETL/ELT pipelines for high-volume financial data using tools like Informatica, Talend, Apache NiFi, or Azure Data Factory.
  • Integrate data across core banking systems, Wealth Management systems, CRM, insurance platforms, and digital channels.
  • Define metadata standards, lineage tracking, and master data management frameworks.
  • Mentor data engineers and analysts; contribute to architectural reviews and BFSI-specific best practices.
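As a hedged illustration of the Oracle-to-Postgres migrations listed above, here is a minimal chunked table-copy sketch. Connection details, the table, and the column list are placeholders; a real migration would also handle type mapping, constraints, and reconciliation checks.

```python
# Hypothetical chunked copy of one table from Oracle to PostgreSQL.
import oracledb
import psycopg2
from psycopg2.extras import execute_values

src = oracledb.connect(user="etl", password="***", dsn="orahost/ORCLPDB1")       # placeholders
dst = psycopg2.connect(host="pghost", dbname="corebank", user="etl", password="***")

BATCH = 5_000
with src.cursor() as read_cur, dst.cursor() as write_cur:
    read_cur.execute("SELECT account_id, customer_id, balance, currency FROM accounts")
    while True:
        rows = read_cur.fetchmany(BATCH)
        if not rows:
            break
        execute_values(
            write_cur,
            "INSERT INTO accounts (account_id, customer_id, balance, currency) VALUES %s",
            rows,
        )
        dst.commit()

src.close()
dst.close()
```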

Domain Expertise:

  • Deep understanding of BFSI data domains: customer, account, transaction, trading, portfolios, positions, credit, risk, claims, policy, and underwriting.
  • Experience with financial data modeling, including ledger structures, risk hierarchies, and regulatory taxonomies.

Data Modeling & Architecture:

  • Proficiency in Erwin, PowerDesigner, or Toad Data Modeler.
  • Experience with 3NF, dimensional modeling, data vault, and canonical models.




Preferred Qualifications:

  • Bachelor's/Master's degree in Computer Science, Information Systems, or Finance
  • Certifications: GCP Certification, AWS Data Analytics, Azure Data Engineer, CDMP (DAMA), TOGAF
  • Experience with financial regulations and reporting standards (e.g., BCBS 239, FATCA)




Snowflake Data Warehouse

600086 Chennai, Tamil Nadu 2coms

Posted 8 days ago


Job Description

Job Title: Application Lead – Snowflake Data Warehouse
Location: Chennai, India
Experience: Minimum 5 years in Snowflake Data Warehouse
Education: 15 years of full-time education required

Job Summary:

As an Application Lead, you will be responsible for leading the design, development, and configuration of data-driven applications, with a focus on Snowflake Data Warehouse. Acting as the primary point of contact, you will collaborate with cross-functional teams to ensure application requirements are met while maintaining alignment with business goals. You will guide your team throughout the development lifecycle, resolve technical challenges, and ensure delivery excellence in both performance and quality.

Roles & Responsibilities:

Act as the Subject Matter Expert (SME) for Snowflake Data Warehouse solutions.

Lead and manage a development team, ensuring high performance and collaboration.

Take responsibility for team-level decisions and accountability for deliverables.

Collaborate with multiple teams across the organization to drive key architectural and strategic decisions.

Provide innovative and scalable solutions to data-related challenges, both within the immediate team and across projects.

Facilitate knowledge sharing, promote adoption of best practices, and support ongoing team development.

Monitor project milestones, ensure timely delivery of application components, and maintain a focus on quality and efficiency.

Drive improvements in data warehousing processes and contribute to continuous optimization.

Professional & Technical Skills:

Must-Have Skills:

Strong proficiency in Snowflake Data Warehouse with at least 5 years of hands-on experience (an illustrative sketch follows the skills lists below).

Deep understanding of cloud-based data solutions and scalable architecture.

Good-to-Have Skills:

Experience with ETL processes and data integration tools (e.g., Informatica, Talend, Matillion).

Proficiency in SQL and data modeling techniques (e.g., dimensional, star-schema).

Knowledge of performance tuning and optimization for data warehouse solutions.
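As a minimal point of reference for the Snowflake skills above, here is a hedged sketch of running SQL from Python with the snowflake-connector-python package. The account identifier, credentials, warehouse, and table names are placeholders only.

```python
# Hypothetical Snowflake access from Python: refresh a small aggregate table
# and read back a few rows. All identifiers below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",   # placeholder account identifier
    user="APP_LEAD",
    password="***",
    warehouse="ANALYTICS_WH",
    database="SALES_DB",
    schema="MART",
)

try:
    cur = conn.cursor()
    cur.execute(
        """
        CREATE OR REPLACE TABLE DAILY_REVENUE AS
        SELECT order_date, SUM(amount) AS revenue
        FROM ORDERS
        GROUP BY order_date
        """
    )
    for order_date, revenue in cur.execute(
        "SELECT * FROM DAILY_REVENUE ORDER BY order_date DESC LIMIT 5"
    ):
        print(order_date, revenue)
finally:
    conn.close()
```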


Cloud Data Solution Architect

Chennai, Tamil Nadu Cognizant

Posted 1 day ago


Job Description

**Job Summary**
**Cloud Data Solution Architect Job Description**
**Architect (same for all 3 clouds: AWS/GCP/Azure): 9-15 Years**
The successful candidate will be responsible for leading the practice's ambitious growth initiative with our most strategic cloud partners. This opportunity is scoped for high-performing, high-potential subject matter experts who are self-directed, entrepreneurial, highly collaborative, organized, diligent, and thoughtful. This role requires domain expertise combined with the ability to task-organize, focus on the details, and sustain and drive progress over time.
For the right candidate, this role offers an opportunity to contribute to rapid revenue growth in one of the most exciting and dynamic segments of the market. This is an ideal opportunity for technologists seeking cross vertical and/or cross industry expertise. Among other career-enhancing considerations, this role provides opportunities to directly and indirectly manage teams.
The entire AIA leadership team is personally committed to the success of the architects on-boarded through this focused cloud hiring initiative.
**Principal Responsibilities**
- Lead joint solution/offering architecture discussions with partner architects/stakeholders and support development
· RFPs: Craft and manage solution architecture and proposal defense for reactive and proactive proposal responses for data platforms and solutions on cloud
· Turnkey Proposals: Drive turnkey proposals for data platforms and analytics; build cost summaries and value propositions as part of cloud modernization and data portfolio rationalization
- Provide leadership for the transformation of customer requirements into visions, strategies, and roadmaps to implement data platforms, multi-cloud, and cloud data warehouses
- Independently lead client design workshops and provide tradeoffs and recommendations
- Contribute to the Cognizant Data community by developing assets, thought leadership, etc.
**Technical Qualifications**
- Preferred cloud competencies:
· Certified Solution Architect on one of the clouds
· Aware of PaaS services relating to Data, Analytics and end-point API interfaces - EMR/Dataproc/Databricks etc.
· Able to compare and contrast the best technologies by layer - e.g., stream processing, messaging, search, etc.
· High-level awareness of other partner technologies, which are alternative to native services like Snowflake, Cloudera, Talend, Informatica, Confluent, etc.
**Key competencies**
· Excellent consulting, communication, influencing and facilitation skills, in particular problem solving/troubleshooting activities.
· Lead key data architecture decisions, design of architectural roadmaps, database, data access and data technology architectures
· Provide expertise on the overall data eco-system's engineering best practices, standards, architectural approaches, and complex technical resolutions
· Holistic knowledge of solution architecture, including:
  - Data layers and purpose
  - Serverless compute
  - Processing run time - ETL vs Spark run time, etc.
  - Security controls around authentication, authorizations, encryption and certificates
  - Recommend the migration tools and design data migration strategy
  - Metadata and data catalogs
· Excellent communication skills, preparing PowerPoint presentations, executive readouts
**Key Experience**
Successfully built and defended solution architecture with different levels of technical stakeholders - Business heads, enterprise architects, tech leads, etc.
+ Demonstrate proficiency in designing and deploying solutions on Azure and AWS platforms.
+ Possess strong analytical and problem-solving skills to address complex technical challenges.
+ Exhibit excellent communication and interpersonal skills to collaborate effectively with diverse teams.
+ Show a proven track record of delivering successful cloud projects within specified timelines.
+ Have a deep understanding of cloud security principles and best practices.
+ Be adept at using infrastructure as code tools for efficient cloud resource management.
+ Display a commitment to staying updated with the latest cloud technologies and trends.
If interested, please share your CV to with the email subject "Cloud Data Solution Architect".
Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.