1451 Data Engineer jobs in Hyderabad

Staff, Data Engineer - Data Architecture [T500-20225]

Hyderabad, Andhra Pradesh · Costco IT · ₹2,000,000 – ₹2,500,000 yearly

Posted today


Job Description

About Costco Wholesale

Costco Wholesale is a multi-billion-dollar global retailer with warehouse club operations in eleven countries. They provide a wide selection of quality merchandise, plus the convenience of specialty departments and exclusive member services, all designed to make shopping a pleasurable experience for their members.

About Costco Wholesale India

At Costco Wholesale India, we foster a collaborative space, working to support Costco Wholesale in developing innovative solutions that improve members’ experiences and make employees’ jobs easier. Our employees play a key role in driving and delivering innovation to establish IT as a core competitive advantage for Costco Wholesale.


Position Title: Staff, Data Engineer

Job Description:

Roles & Responsibilities:

  • Shape and drive enterprise-wide data architecture strategy: Define and evolve the long-term technical vision for scalable, resilient data infrastructure across multiple business units and domains.
  • Lead large-scale, cross-functional initiatives: Architect and guide the implementation of data platforms and pipelines that enable analytics, AI/ML, and BI at an organizational scale.
  • Pioneer advanced and forward-looking solutions: Introduce novel approaches in real-time processing, hybrid/multi-cloud, and AI/ML integration to transform how data is processed and leveraged across the enterprise.
  • Mentor and develop senior technical leaders: Influence Principal Engineers, Engineering Managers, and other Staff Engineers; create a culture of deep technical excellence and innovation.
  • Establish cross-org technical standards: Define and enforce best practices for data modeling, pipeline architecture, governance, and compliance at scale.
  • Solve the most complex, ambiguous challenges: Tackle systemic issues in data scalability, interoperability, and performance that impact multiple teams or the enterprise as a whole.
  • Serve as a strategic advisor to executive leadership: Provide technical insights to senior executives on data strategy, emerging technologies, and long-term investments.
  • Represent the organization as a thought leader: Speak at industry events/conferences, publish thought leadership, contribute to open source and standards bodies, and lead partnerships with external research or academic institutions.


Technical Skills:

  • 15+ years of experience
  • Mastery of data architecture and distributed systems at enterprise scale: Deep experience with Google Cloud Platform (GCP).
  • Advanced programming and infrastructure capabilities: Expertise in writing database queries and in Python or Java, along with infrastructure-as-code tools such as Terraform or Cloud Deployment Manager.
  • Leadership in streaming and big data systems: Authority in tools such as BigQuery, Dataflow, Dataproc, and Pub/Sub for both batch and streaming workloads.
  • Enterprise-grade governance and compliance expertise: Design and implement standards for data quality, lineage, security, privacy (e.g., GDPR, HIPAA), and auditability across the organization.
  • Strategic integration with AI/ML ecosystems: Architect platforms that serve advanced analytics and AI workloads (Vertex AI, TFX, MLflow).
  • Exceptional ability to influence across all levels: Communicate technical vision to engineers, influence strategic direction with executives, and drive alignment across diverse stakeholders.
  • Recognized industry leader: Demonstrated track record through conference presentations, publications, open-source contributions, or standards development.


Must Have Skills:

  • Deep expertise in data architecture, distributed systems, and GCP.
  • Python or Java, plus infrastructure-as-code (e.g., Terraform).
  • Big data tools: BigQuery (expert level, including performance tuning and UDFs), Dataflow, Dataproc, and Pub/Sub (batch and streaming).
  • Data governance, privacy, and compliance (e.g., GDPR, HIPAA).
  • Expert-level data modeling and architecture, including experience with hybrid architectures.
  • Expert-level SQL skills.
  • Deep understanding of BigQuery, including partitioning, clustering, and performance optimization.
  • Hands-on experience writing Cloud Functions, Cloud Composer workflows, Cloud Run services, and Dataflow Flex Templates.
  • Thorough understanding of cloud architecture concepts.
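The BigQuery requirements above (partitioning, clustering, performance optimization) ultimately come down to table DDL choices. A minimal sketch, using a hypothetical `analytics.events` table, of generating the kind of `CREATE TABLE` statement those bullets describe:

```python
def partitioned_table_ddl(table, columns, partition_col, cluster_cols):
    """Build a BigQuery CREATE TABLE statement with time partitioning
    and clustering, the two main levers for reducing scanned bytes."""
    col_list = ",\n  ".join(f"{name} {typ}" for name, typ in columns)
    return (
        f"CREATE TABLE {table} (\n  {col_list}\n)\n"
        # Partition pruning: queries filtered on the date skip other partitions.
        f"PARTITION BY DATE({partition_col})\n"
        # Clustering co-locates rows with similar values within each partition.
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )

# Hypothetical schema for illustration only.
ddl = partitioned_table_ddl(
    "analytics.events",
    [("event_ts", "TIMESTAMP"), ("member_id", "INT64"), ("sku", "STRING")],
    partition_col="event_ts",
    cluster_cols=["member_id", "sku"],
)
print(ddl)
```

Filtering such a table on `DATE(event_ts)` lets BigQuery prune entire partitions, which is the usual first step in the performance tuning the listing asks about.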

Data Engineer L3 - Data Architecture [T500-20144]

Hyderabad, Andhra Pradesh Costco IT

Posted today


Job Description

About Costco Wholesale

Costco Wholesale is a multi-billion-dollar global retailer with warehouse club operations in eleven countries. They provide a wide selection of quality merchandise, plus the convenience of specialty departments and exclusive member services, all designed to make shopping a pleasurable experience for their members.


About Costco Wholesale India

At Costco Wholesale India, we foster a collaborative space, working to support Costco Wholesale in developing innovative solutions that improve members’ experiences and make employees’ jobs easier. Our employees play a key role in driving and delivering innovation to establish IT as a core competitive advantage for Costco Wholesale.


Position Title: Data Engineer L3

Job Description:

Roles & Responsibilities:

  • Lead the design and implementation of enterprise data platforms: Architect and oversee the deployment of scalable, reliable, and secure data infrastructure for large organizations.
  • Drive innovation and adoption of new technologies: Research and integrate cutting-edge tools and frameworks for data ingestion, processing, and governance.
  • Mentor and guide junior and mid-level data engineers: Provide technical leadership, code reviews, and career development support.
  • Collaborate with stakeholders across teams: Align data engineering initiatives with business objectives and ensure cross-functional alignment.
  • Establish and enforce data engineering best practices: Define standards for pipeline architecture, data quality, security, and compliance across the organization.
  • Present findings and recommendations to senior leadership: Communicate technical concepts and business impacts to executives and decision-makers.


Technical Skills:

  • 8 – 12 years of experience
  • Expert-level proficiency in programming, automation, and orchestration: Mastery of Python and workflow orchestration tools.
  • Deep understanding of data storage, processing, and governance: Advanced knowledge of data warehousing, Lakehouse architectures, and real-time streaming.
  • Proven ability to build and deploy scalable data systems: Design and implement robust, production-grade data platforms on GCP.
  • Experience with big data technologies: Use Dataflow, Dataproc, Pub/Sub, or similar for large-scale data processing.
  • Strong security and compliance expertise: Implement and enforce security controls, encryption, audit logging, and compliance standards for data systems.
  • Excellent communication and presentation skills: Articulate technical concepts and business value to diverse audiences.


Must Have Skills:

  • Python and orchestration tools (e.g., Airflow, Cloud Composer).
  • Data architecture: data lakes, warehouses, and streaming (e.g., Pub/Sub, Dataflow, Dataproc).
  • Experience with GCP and production-grade data platform deployment.
  • Data security, compliance, and governance standards.
  • Data modeling skills: experience with dimensional and relational modeling, and the ability to design scalable data models.
  • Expert-level SQL skills.
  • Deep understanding of BigQuery, including partitioning, clustering, and performance optimization.
  • Hands-on experience writing Cloud Functions, Cloud Composer workflows, Cloud Run services, and Dataflow Flex Templates.
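Orchestration tools such as Airflow and Cloud Composer model a pipeline as a DAG of dependent tasks. A minimal stdlib sketch (not Airflow itself; the task names are hypothetical) of how such a scheduler derives a valid execution order:

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on, mirroring an
# Airflow-style DAG: extract feeds both transform and a quality check,
# and load waits on both of those.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"extract"},
    "load": {"transform", "quality_check"},
}

# static_order() yields tasks so every dependency runs before its dependents.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

In a real Composer deployment the same dependency structure would be declared with operators and `>>` arrows; the scheduling principle is identical.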

Data Engineer L3 - Data Architecture [T500-20144]

Hyderabad, Andhra Pradesh Costco IT

Posted 1 day ago

Job Viewed

Tap Again To Close

Job Description

About Costco Wholesale
Costco Wholesale is a multi-billion-dollar global retailer with warehouse club operations in eleven countries. They provide a wide selection of quality merchandise, plus the convenience of specialty departments and exclusive member services, all designed to make shopping a pleasurable experience for their members.

About Costco Wholesale India
At Costco Wholesale India, we foster a collaborative space, working to support Costco Wholesale in developing innovative solutions that improve members’ experiences and make employees’ jobs easier. Our employees play a key role in driving and delivering innovation to establish IT as a core competitive advantage for Costco Wholesale.

Position Title: Data Engineer L3
Job Description:
Roles & Responsibilities:
Lead the design and implementation of enterprise data platforms: Architect and oversee the deployment of scalable, reliable, and secure data infrastructure for large organizations.
Drive innovation and adoption of new technologies: Research and integrate cutting-edge tools and frameworks for data ingestion, processing, and governance.
Mentor and guide junior and mid-level data engineers: Provide technical leadership, code reviews, and career development support.
Collaborate with stakeholders across teams: Align data engineering initiatives with business objectives and ensure cross-functional alignment.
Establish and enforce data engineering best practices: Define standards for pipeline architecture, data quality, security, and compliance across the organization.
Present findings and recommendations to senior leadership: Communicate technical concepts and business impacts to executives and decision-makers.

Technical Skills:
8 – 12 years of experience
Expert-level proficiency in programming, automation, and orchestration: Mastery of Python, and workflow orchestration tools.
Deep understanding of data storage, processing, and governance: Advanced knowledge of data warehousing, Lakehouse architectures, and real-time streaming.
Proven ability to build and deploy scalable data systems: Design and implement robust, production-grade data platforms on GCP.
Experience with big data technologies: Use Dataflow, Dataproc, Pub/sub, or similar for large-scale data processing.
Strong security and compliance expertise: Implement and enforce security controls, encryption, audit logging for data systems, and compliance standards or data systems.
Excellent communication and presentation skills: Articulate technical concepts and business value to diverse audiences.

Must Have Skills:
Python, orchestration tools (e.g. Airflow, Cloud Composer)
Data architecture: data lakes, warehouses, streaming (e.g. Pub/Sub, Dataflow, Dataproc)
Experience with GCP and production-grade data platform deployment
Data security, compliance, and governance standards
Data modeling skills - Experience with different data modeling techniques (dimensional modeling, relational modeling, should be able to design scalable data models)
SQL Skills level - Expert
Deep understanding of bigquery, have experience on partitioning, clustering and performance optimizations
Experience on Cloud function, Composer and Cloud run, dataflow flex templates - should be able to write.
This advertiser has chosen not to accept applicants from your region.

Data Engineer L3 - Data Architecture [T500-20144]

Hyderabad, Andhra Pradesh Costco IT

Posted today

Job Viewed

Tap Again To Close

Job Description

About Costco Wholesale

Costco Wholesale is a multi-billion-dollar global retailer with warehouse club operations in eleven countries. They provide a wide selection of quality merchandise, plus the convenience of specialty departments and exclusive member services, all designed to make shopping a pleasurable experience for their members.

About Costco Wholesale India

At Costco Wholesale India, we foster a collaborative space, working to support Costco Wholesale in developing innovative solutions that improve members’ experiences and make employees’ jobs easier. Our employees play a key role in driving and delivering innovation to establish IT as a core competitive advantage for Costco Wholesale.

Position Title: Data Engineer L3

Job Description:

Roles & Responsibilities:

  • Lead the design and implementation of enterprise data platforms: Architect and oversee the deployment of scalable, reliable, and secure data infrastructure for large organizations.
  • Drive innovation and adoption of new technologies: Research and integrate cutting-edge tools and frameworks for data ingestion, processing, and governance.
  • Mentor and guide junior and mid-level data engineers: Provide technical leadership, code reviews, and career development support.
  • Collaborate with stakeholders across teams: Align data engineering initiatives with business objectives and ensure cross-functional alignment.
  • Establish and enforce data engineering best practices: Define standards for pipeline architecture, data quality, security, and compliance across the organization.
  • Present findings and recommendations to senior leadership: Communicate technical concepts and business impacts to executives and decision-makers.

Technical Skills:

  • 8 – 12 years of experience
  • Expert-level proficiency in programming, automation, and orchestration: Mastery of Python, and workflow orchestration tools.
  • Deep understanding of data storage, processing, and governance: Advanced knowledge of data warehousing, Lakehouse architectures, and real-time streaming.
  • Proven ability to build and deploy scalable data systems: Design and implement robust, production-grade data platforms on GCP.
  • Experience with big data technologies: Use Dataflow, Dataproc, Pub/sub, or similar for large-scale data processing.
  • Strong security and compliance expertise: Implement and enforce security controls, encryption, audit logging for data systems, and compliance standards or data systems.
  • Excellent communication and presentation skills: Articulate technical concepts and business value to diverse audiences.

Must Have Skills:

  • Python, orchestration tools (e.g. Airflow, Cloud Composer)
  • Data architecture: data lakes, warehouses, streaming (e.g. Pub/Sub, Dataflow, Dataproc)
  • Experience with GCP and production-grade data platform deployment
  • Data security, compliance, and governance standards
  • Data modeling skills - Experience with different data modeling techniques (dimensional modeling, relational modeling, should be able to design scalable data models)
  • SQL Skills level - Expert
  • Deep understanding of bigquery, have experience on partitioning, clustering and performance optimizations
  • Experience on Cloud function, Composer and Cloud run, dataflow flex templates - should be able to write.

This advertiser has chosen not to accept applicants from your region.
Be The First To Know

About the latest Data engineer Jobs in Hyderabad !

Data Engineer L3 - Data Architecture [T500-20144]

Hyderabad, Andhra Pradesh Costco IT

Posted 5 days ago

Job Viewed

Tap Again To Close

Job Description

About Costco Wholesale

Costco Wholesale is a multi-billion-dollar global retailer with warehouse club operations in eleven countries. They provide a wide selection of quality merchandise, plus the convenience of specialty departments and exclusive member services, all designed to make shopping a pleasurable experience for their members.


About Costco Wholesale India

At Costco Wholesale India, we foster a collaborative space, working to support Costco Wholesale in developing innovative solutions that improve members’ experiences and make employees’ jobs easier. Our employees play a key role in driving and delivering innovation to establish IT as a core competitive advantage for Costco Wholesale.


Position Title: Data Engineer L3

Job Description:

Roles & Responsibilities:

  • Lead the design and implementation of enterprise data platforms: Architect and oversee the deployment of scalable, reliable, and secure data infrastructure for large organizations.
  • Drive innovation and adoption of new technologies: Research and integrate cutting-edge tools and frameworks for data ingestion, processing, and governance.
  • Mentor and guide junior and mid-level data engineers: Provide technical leadership, code reviews, and career development support.
  • Collaborate with stakeholders across teams: Align data engineering initiatives with business objectives and ensure cross-functional alignment.
  • Establish and enforce data engineering best practices: Define standards for pipeline architecture, data quality, security, and compliance across the organization.
  • Present findings and recommendations to senior leadership: Communicate technical concepts and business impacts to executives and decision-makers.


Technical Skills:

  • 8 – 12 years of experience
  • Expert-level proficiency in programming, automation, and orchestration: Mastery of Python, and workflow orchestration tools.
  • Deep understanding of data storage, processing, and governance: Advanced knowledge of data warehousing, Lakehouse architectures, and real-time streaming.
  • Proven ability to build and deploy scalable data systems: Design and implement robust, production-grade data platforms on GCP.
  • Experience with big data technologies: Use Dataflow, Dataproc, Pub/sub, or similar for large-scale data processing.
  • Strong security and compliance expertise: Implement and enforce security controls, encryption, audit logging for data systems, and compliance standards or data systems.
  • Excellent communication and presentation skills: Articulate technical concepts and business value to diverse audiences.


Must Have Skills:

  • Python, orchestration tools (e.g. Airflow, Cloud Composer)
  • Data architecture: data lakes, warehouses, streaming (e.g. Pub/Sub, Dataflow, Dataproc)
  • Experience with GCP and production-grade data platform deployment
  • Data security, compliance, and governance standards
  • Data modeling skills: experience with multiple modeling techniques (dimensional and relational modeling) and the ability to design scalable data models
  • Expert-level SQL skills
  • Deep understanding of BigQuery, including partitioning, clustering, and performance optimization
  • Hands-on experience with Cloud Functions, Cloud Composer, and Cloud Run, and the ability to write Dataflow Flex Templates
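Two of the must-have skills above, dimensional modeling and expert SQL, can be illustrated with a minimal star schema. The sketch below uses SQLite purely for illustration; the table and column names are invented, and BigQuery-specific features such as partitioning and clustering are out of scope here:

```python
import sqlite3

# A minimal star-schema sketch: one fact table keyed to two dimensions.
# All names are illustrative, not from the job posting.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, day TEXT);
    CREATE TABLE fact_sales  (
        product_id INTEGER REFERENCES dim_product(product_id),
        date_id    INTEGER REFERENCES dim_date(date_id),
        amount     REAL
    );
""")
cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "widget"), (2, "gadget")])
cur.execute("INSERT INTO dim_date VALUES (1, '2024-01-01')")
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 1, 10.0), (1, 1, 5.0), (2, 1, 7.5)])

# The typical dimensional query shape: aggregate the fact table,
# grouped by an attribute from a dimension.
cur.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name ORDER BY p.name
""")
print(cur.fetchall())  # [('gadget', 7.5), ('widget', 15.0)]
```

The design choice the bullet points at is the same regardless of engine: denormalize descriptive attributes into dimension tables so that fact-table scans stay narrow and aggregations stay simple.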
This advertiser has chosen not to accept applicants from your region.

Big Data Engineer, Data Modeling

Hyderabad, Andhra Pradesh data.ai

Posted today


Job Description

What can you tell your friends when they ask you what you do?

We’re looking for an experienced Big Data Engineer who can create innovative new products in the analytics and data space. You will participate in developing the world's #1 mobile app analytics service. Together with the team, you will build out new product features and applications using agile methodologies and open-source technologies. You will work directly with Data Scientists, Data Engineers, Product Managers, and Software Architects, and will be on the front lines of coding new and exciting analytics and data mining products. You should be passionate about what you do and excited to join an entrepreneurial start-up.

To ensure we execute on our values, we are looking for someone who has a passion for:

As a Big Data Engineer, you will be in charge of model implementation and maintenance, building clean, robust, and maintainable data processing programs that can support these projects on huge amounts of data. This includes:

  • Design and implement complex data product components based on requirements, proposing viable technical solutions.
  • Write data programs using Python (e.g., PySpark), maintaining high-quality work while dealing confidently with data mining challenges.
  • Explore promising new technologies in the Big Data ecosystem (for example, the Hadoop ecosystem) and share them with the team from your professional perspective.
  • Get up to speed in the data science and machine learning domain, implementing analysis components in a distributed computing environment (e.g., MapReduce implementation) with instruction from Data Scientists.
  • Be comfortable conducting detailed discussions with Data Scientists regarding specific questions related to specific data models.
  • You should be a strong problem solver with proven experience in big data.
  • You should recognize yourself in the following…

  • Hands-on experience and deep knowledge of the Hadoop ecosystem.
  • Must: PySpark, MapReduce, HDFS.
  • Plus: Storm, Kafka.
  • Must have 2+ years of Linux environment development experience.
  • Proficient in programming with Python and Scala; experience with Pandas, scikit-learn, or other data science and data analysis toolsets is a big plus.
  • Experience in data pipeline design & automation.
  • Having a background in data mining, analytics & data science components implementation, and machine learning domain, familiarity with common algorithms and libs is a plus.
  • Passion for cloud computing (AWS in particular) and distributed systems.
  • You must be a great problem solver with the ability to dive deeply into complex problems and emerge with clear and pragmatic solutions.
  • Good communication skills and the ability to cooperate globally.
  • Major in Math or Computer Science.
  • You are driven by passion for innovation that pushes us closer to our vision in everything we do. Centering around our purpose and our hunger for new innovations is the foundation that allows us to grow and unlock the potential in AI.
  • You are an Ideal Team Player: You are hungry and no, we are not talking about food here. You are humble, yet love to succeed, especially as a team! You are smart, and not just book smart, you have a great read on people.
  • This position is located in Hyderabad, India.
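The MapReduce requirement above can be sketched in plain Python. This is only a conceptual illustration of the map, shuffle, and reduce phases that PySpark and Hadoop MapReduce implement at scale; it is not Spark code, and the input lines are invented:

```python
from itertools import chain
from collections import defaultdict

# A MapReduce-style word count in plain Python, illustrating the shape
# of the computation that PySpark/Hadoop distribute across a cluster.
def mapper(line):
    # Map phase: emit (word, 1) pairs for each word in the line.
    return [(word, 1) for word in line.split()]

def reducer(pairs):
    # Shuffle + reduce phase: group by key and sum the counts.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["big data big wins", "data pipelines"]
result = reducer(chain.from_iterable(mapper(line) for line in lines))
print(result)  # {'big': 2, 'data': 2, 'wins': 1, 'pipelines': 1}
```

In a distributed engine the mapper runs per partition and the shuffle moves equal keys to the same reducer, but the per-key logic is exactly what this sketch shows.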

    We are hiring for our engineering team at our data.ai India subsidiary entity, which is in the process of being established. While we await approval from the Indian government, new hires will be interim employees of Innova Solutions, our Global Employer of Record.


    Lead Data Engineer-Solution Architecture

    Hyderabad, Andhra Pradesh Chubb

    Posted today


    Job Description

    About Chubb


    Chubb is a world leader in insurance. With operations in 54 countries and territories, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance and life insurance to a diverse group of clients. The company is defined by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength and local operations globally. Parent company Chubb Limited is listed on the New York Stock Exchange (NYSE: CB) and is a component of the S&P 500 index. Chubb employs approximately 40,000 people worldwide. Additional information can be found at: .


    About Chubb India


    At Chubb India, we are on an exciting journey of digital transformation driven by a commitment to engineering excellence and analytics. We are proud to share that we have been officially certified as a Great Place to Work® for the third consecutive year, a reflection of the culture at Chubb where we believe in fostering an environment where everyone can thrive, innovate, and grow.

    With a team of over 2500 talented professionals, we encourage a start-up mindset that promotes collaboration, diverse perspectives, and a solution-driven attitude. We are dedicated to building expertise in engineering, analytics, and automation, empowering our teams to excel in a dynamic digital landscape.


    We offer an environment where you will be part of an organization that is dedicated to solving real-world challenges in the insurance industry. Together, we will work to shape the future through innovation and continuous learning.


    Position Details

    • Job Title : Lead Data Engineer-Solution Architecture
    • Function/Department : Technology
    • Location : Hyderabad/Bangalore/Bhubaneswar
    • Employment Type : Full Time


    Role Overview


    Qualifications:

    • Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or a related field; Master’s degree preferred
    • Minimum of 10 years’ experience in data architecture or data engineering roles, with a significant focus in P&C insurance domains preferred.
    • Proven track record of successful implementation of data architecture within large-scale transformation programs or projects
    • Comprehensive knowledge of data modelling techniques and methodologies, including data normalization and denormalization practices
    • Hands-on expertise across a wide variety of database (Azure SQL, MongoDB, Cosmos), data transformation (Informatica IICS, Databricks), change data capture, and data streaming (Apache Kafka) technologies
    • Proven Expertise with data warehousing concepts, ETL processes, and data integration tools (e.g., Informatica, Databricks, ADF)
    • Experience with cloud-based data architectures and platforms (e.g. ADLS, Synapse, Snowflake, Azure SQL Database)
    • Familiarity with .NET Core and Python FastAPI or similar; hands on experience preferred.
    • Expertise in ensuring data security patterns (e.g. tokenization, encryption, obfuscation)
    • Familiarity with authentication and authorization methods and frameworks (e.g. OAuth 2.0).
    • Knowledge of insurance policy operations, regulations, and compliance frameworks specific to Consumer lines
    • Understanding of advanced analytics, AI, and machine learning concepts as they pertain to data architecture
    • Skilled in asynchronous programming patterns.
    • Familiarity with containerization and microservices frameworks, such as Docker and Kubernetes.
    • Proficient in utilizing Azure or other cloud services, including AKS, Cosmos NoSQL, Cognitive Search, SQL Database, ADLS, App Insights, and API Management.
    • Familiar with DevSecOps practices and CI/CD tools, including Git, Azure DevOps, and Jenkins.
    • Familiar with Kafka or similar messaging technologies.
    • Familiar with GIS / geospatial systems and terminology preferred.
    • Strong analytical and problem-solving capabilities.
    • Experienced in producing technical documentation to support system design.
    • Excellent communication and collaboration skills, with the ability to work effectively in cross-functional teams.
    • Familiarity with Agile methodologies and experience working in Agile project environments, including ceremonies and tools like JIRA.
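As an illustration of one data-security pattern named in the qualifications above, tokenization can be sketched with a keyed hash. This is a simplified example under stated assumptions: the secret key and function names are hypothetical, and production systems would typically use a token vault or format-preserving encryption with a managed key:

```python
import hmac
import hashlib

# Placeholder key for illustration only; a real deployment would fetch
# this from a key management service, never hard-code it.
SECRET_KEY = b"replace-with-managed-secret"

def tokenize(value: str) -> str:
    """Replace a sensitive value with a deterministic, non-reversible token."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()

token = tokenize("4111-1111-1111-1111")
# The same input always maps to the same token, so joins and lookups
# on the tokenized column still work...
assert token == tokenize("4111-1111-1111-1111")
# ...but the original value cannot be recovered from the token alone.
print(len(token))  # 64 hex characters
```

Deterministic tokens preserve referential integrity across tables while keeping the raw value out of analytical stores, which is the trade-off this pattern is chosen for.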


    Why Join Us?

    • Be at the forefront of digital transformation in the insurance industry.
    • Lead impactful initiatives that simplify claims processing and enhance customer satisfaction.
    • Work alongside experienced professionals in a collaborative, innovation-driven environment.


    Why Chubb?


    Join Chubb to be part of a leading global insurance company!


    Our constant focus on employee experience along with a start-up-like culture empowers you to achieve impactful results.

    • Industry leader: Chubb is a world leader in the insurance industry, powered by underwriting and engineering excellence
    • A Great Place to work: Chubb India has been recognized as a Great Place to Work® for the years , and
    • Laser focus on excellence : At Chubb we pride ourselves on our culture of greatness where excellence is a mindset and a way of being. We constantly seek new and innovative ways to excel at work and deliver outstanding results
    • Start-Up Culture : Embracing the spirit of a start-up, our focus on speed and agility enables us to respond swiftly to market requirements, while a culture of ownership empowers employees to drive results that matter
    • Growth and success : As we continue to grow, we are steadfast in our commitment to provide our employees with the best work experience, enabling them to advance their careers in a conducive environment


    Employee Benefits


    Our company offers a comprehensive benefits package designed to support our employees’ health, well-being, and professional growth. Employees enjoy flexible work options, generous paid time off, and robust health coverage, including treatment for dental and vision related requirements. We invest in the future of our employees through continuous learning opportunities and career advancement programs, while fostering a supportive and inclusive work environment. Our benefits include:


    • Savings and Investment plans: We provide specialized benefits like Corporate NPS (National Pension Scheme), Employee Stock Purchase Plan (ESPP), Long-Term Incentive Plan (LTIP), Retiral Benefits and Car Lease that help employees optimally plan their finances
    • Upskilling and career growth opportunities: With a focus on continuous learning, we offer customized programs that support upskilling like Education Reimbursement Programs, Certification programs and access to global learning programs.
    • Health and Welfare Benefits: We care about our employees’ well-being in and out of work and have benefits like Hybrid Work Environment, Employee Assistance Program (EAP), Yearly Free Health campaigns and comprehensive Insurance benefits.


    Application Process


    Our recruitment process is designed to be transparent and inclusive.

    • Step 1 : Submit your application via the Chubb Careers Portal / Linkedin.
    • Step 2 : Engage with our recruitment team for an initial discussion.
    • Step 3 : Participate in HackerRank assessments/technical/functional interviews and assessments (if applicable).
    • Step 4 : Final interaction with Chubb leadership.


    Join Us

    With you, Chubb is better. Whether you are solving challenges on a global stage or creating innovative solutions for local markets, your contributions will help shape the future. If you value integrity, innovation, and inclusion, and are ready to make a difference, we invite you to be part of Chubb India’s journey.


    Apply Now :
