1451 Data Engineer jobs in Hyderabad
Staff, Data Engineer - Data Architecture [T500-20225]
Posted today
Job Description
About Costco Wholesale
Costco Wholesale is a multi-billion-dollar global retailer with warehouse club operations in eleven countries. It provides a wide selection of quality merchandise, plus the convenience of specialty departments and exclusive member services, all designed to make shopping a pleasurable experience for its members.
About Costco Wholesale India
At Costco Wholesale India, we foster a collaborative space, working to support Costco Wholesale in developing innovative solutions that improve members’ experiences and make employees’ jobs easier. Our employees play a key role in driving and delivering innovation to establish IT as a core competitive advantage for Costco Wholesale.
Position Title: Staff, Data Engineer
Job Description:
Roles & Responsibilities:
- Shape and drive enterprise-wide data architecture strategy: Define and evolve the long-term technical vision for scalable, resilient data infrastructure across multiple business units and domains.
- Lead large-scale, cross-functional initiatives: Architect and guide the implementation of data platforms and pipelines that enable analytics, AI/ML, and BI at an organizational scale.
- Pioneer advanced and forward-looking solutions: Introduce novel approaches in real-time processing, hybrid/multi-cloud, and AI/ML integration to transform how data is processed and leveraged across the enterprise.
- Mentor and develop senior technical leaders: Influence Principal Engineers, Engineering Managers, and other Staff Engineers; create a culture of deep technical excellence and innovation.
- Establish cross-org technical standards: Define and enforce best practices for data modeling, pipeline architecture, governance, and compliance at scale.
- Solve the most complex, ambiguous challenges: Tackle systemic issues in data scalability, interoperability, and performance that impact multiple teams or the enterprise as a whole.
- Serve as a strategic advisor to executive leadership: Provide technical insights to senior executives on data strategy, emerging technologies, and long-term investments.
- Represent the organization as a thought leader: Speak at industry events/conferences, publish thought leadership, contribute to open source and standards bodies, and lead partnerships with external research or academic institutions.
Technical Skills:
- 15+ years of experience
- Mastery of data architecture and distributed systems at enterprise scale: Deep experience in GCP.
- Advanced programming and infrastructure capabilities: Expertise in SQL, Python, or Java, along with infrastructure-as-code tools like Terraform or Cloud Deployment Manager.
- Leadership in streaming and big data systems: Authority in tools such as BigQuery, Dataflow, Dataproc, and Pub/Sub for both batch and streaming workloads (a minimal streaming sketch follows this list).
- Enterprise-grade governance and compliance expertise: Design and implement standards for data quality, lineage, security, privacy (e.g., GDPR, HIPAA), and auditability across the organization.
- Strategic integration with AI/ML ecosystems: Architect platforms that serve advanced analytics and AI workloads (Vertex AI, TFX, MLflow).
- Exceptional ability to influence across all levels: Communicate technical vision to engineers, influence strategic direction with executives, and drive alignment across diverse stakeholders.
- Recognized industry leader: Demonstrated track record through conference presentations, publications, open-source contributions, or standards development.
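To make the batch-plus-streaming expectation concrete, here is a minimal Apache Beam (Dataflow) streaming sketch that reads JSON events from Pub/Sub and appends them to BigQuery. The project, topic, and table names are hypothetical placeholders, not part of this posting.

```python
# Minimal Dataflow streaming sketch: Pub/Sub -> BigQuery (Apache Beam, Python).
# All resource names below are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def run():
    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True  # unbounded source

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/member-events")  # hypothetical
            | "ParseJson" >> beam.Map(json.loads)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.member_events",  # hypothetical
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```

The same pipeline runs in batch mode by swapping the Pub/Sub source for a bounded one (e.g. files or a BigQuery read), which is what makes Beam a common choice for unified batch and streaming workloads.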
Must Have Skills:
- Deep expertise in data architecture, distributed systems, and GCP.
- Python or Java, infrastructure-as-code (e.g. Terraform)
- Big data tools: BigQuery (expert level, with experience in performance tuning and UDFs), Dataflow, Dataproc, Pub/Sub (batch + streaming)
- Data governance, privacy, and compliance (e.g. GDPR, HIPAA)
- Data modeling and architecture: expert level, with experience in hybrid architectures
- Expert-level SQL skills
- Deep understanding of BigQuery, including partitioning, clustering, and performance optimization (see the sketch after this list)
- Hands-on experience with Cloud Functions, Cloud Composer, Cloud Run, and Dataflow Flex Templates, including the ability to author them
- Thorough understanding of cloud architecture concepts.
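As a concrete illustration of the partitioning and clustering expertise asked for above, here is a minimal sketch using the google-cloud-bigquery Python client. The dataset, table, and column names are hypothetical placeholders.

```python
# Sketch: a date-partitioned, clustered BigQuery table plus a pruned query,
# via the google-cloud-bigquery client. Names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # assumes application-default credentials

ddl = """
CREATE TABLE IF NOT EXISTS analytics.sales (
  sale_id STRING,
  member_id STRING,
  warehouse_id STRING,
  amount NUMERIC,
  sale_date DATE
)
PARTITION BY sale_date              -- limits bytes scanned per query
CLUSTER BY warehouse_id, member_id  -- co-locates rows for selective filters
"""
client.query(ddl).result()

# Filtering on the partition column prunes the scan to the matching daily
# partitions, which is the core of BigQuery cost/performance tuning.
query = """
SELECT warehouse_id, SUM(amount) AS revenue
FROM analytics.sales
WHERE sale_date BETWEEN '2024-01-01' AND '2024-01-31'
GROUP BY warehouse_id
"""
for row in client.query(query).result():
    print(row.warehouse_id, row.revenue)
```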
Data Engineer L3 - Data Architecture [T500-20144]
Posted today
Job Description
About Costco Wholesale
Costco Wholesale is a multi-billion-dollar global retailer with warehouse club operations in eleven countries. It provides a wide selection of quality merchandise, plus the convenience of specialty departments and exclusive member services, all designed to make shopping a pleasurable experience for its members.
About Costco Wholesale India
At Costco Wholesale India, we foster a collaborative space, working to support Costco Wholesale in developing innovative solutions that improve members’ experiences and make employees’ jobs easier. Our employees play a key role in driving and delivering innovation to establish IT as a core competitive advantage for Costco Wholesale.
Position Title: Data Engineer L3
Job Description:
Roles & Responsibilities:
- Lead the design and implementation of enterprise data platforms: Architect and oversee the deployment of scalable, reliable, and secure data infrastructure for large organizations.
- Drive innovation and adoption of new technologies: Research and integrate cutting-edge tools and frameworks for data ingestion, processing, and governance.
- Mentor and guide junior and mid-level data engineers: Provide technical leadership, code reviews, and career development support.
- Collaborate with stakeholders across teams: Align data engineering initiatives with business objectives and ensure cross-functional alignment.
- Establish and enforce data engineering best practices: Define standards for pipeline architecture, data quality, security, and compliance across the organization.
- Present findings and recommendations to senior leadership: Communicate technical concepts and business impacts to executives and decision-makers.
Technical Skills:
- 8 – 12 years of experience
- Expert-level proficiency in programming, automation, and orchestration: Mastery of Python and workflow orchestration tools (a Cloud Composer sketch follows this list).
- Deep understanding of data storage, processing, and governance: Advanced knowledge of data warehousing, Lakehouse architectures, and real-time streaming.
- Proven ability to build and deploy scalable data systems: Design and implement robust, production-grade data platforms on GCP.
- Experience with big data technologies: Use Dataflow, Dataproc, Pub/Sub, or similar for large-scale data processing.
- Strong security and compliance expertise: Implement and enforce security controls, encryption, audit logging, and compliance standards for data systems.
- Excellent communication and presentation skills: Articulate technical concepts and business value to diverse audiences.
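As a hedged illustration of the orchestration requirement above, here is a minimal Cloud Composer (Airflow) DAG that schedules a daily BigQuery rollup. The DAG id, schedule, project, and SQL are hypothetical placeholders.

```python
# Sketch of a Cloud Composer (Airflow) DAG running a daily BigQuery rollup.
# DAG id, schedule, project, and SQL are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_sales_rollup",    # hypothetical
    schedule_interval="0 3 * * *",  # 03:00 UTC daily
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_sales",
        configuration={
            "query": {
                "query": """
                    SELECT sale_date, warehouse_id, SUM(amount) AS revenue
                    FROM analytics.sales
                    GROUP BY sale_date, warehouse_id
                """,
                "destinationTable": {
                    "projectId": "my-project",  # hypothetical
                    "datasetId": "analytics",
                    "tableId": "daily_revenue",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```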
Must Have Skills:
- Python, orchestration tools (e.g. Airflow, Cloud Composer)
- Data architecture: data lakes, warehouses, streaming (e.g. Pub/Sub, Dataflow, Dataproc)
- Experience with GCP and production-grade data platform deployment
- Data security, compliance, and governance standards
- Data modeling skills: Experience with different techniques (dimensional and relational modeling) and the ability to design scalable data models (see the star-schema sketch after this list)
- Expert-level SQL skills
- Deep understanding of BigQuery, including partitioning, clustering, and performance optimization
- Hands-on experience with Cloud Functions, Cloud Composer, Cloud Run, and Dataflow Flex Templates, including the ability to author them.
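To make the dimensional-modeling expectation concrete, here is a minimal star-schema sketch created through the BigQuery Python client. All table and column names are hypothetical placeholders.

```python
# Sketch of a minimal star schema (one fact table, one dimension) in BigQuery.
# Table and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

ddl_statements = [
    # Dimension: one row per member, surrogate key for joins.
    """
    CREATE TABLE IF NOT EXISTS analytics.dim_member (
      member_key INT64,
      member_id STRING,
      membership_tier STRING,
      join_date DATE
    )
    """,
    # Fact: one row per sale, foreign keys into the dimensions,
    # partitioned on the event date so queries can prune partitions.
    """
    CREATE TABLE IF NOT EXISTS analytics.fact_sales (
      member_key INT64,
      warehouse_key INT64,
      amount NUMERIC,
      sale_date DATE
    )
    PARTITION BY sale_date
    CLUSTER BY warehouse_key
    """,
]
for ddl in ddl_statements:
    client.query(ddl).result()
```

Facts stay narrow and additive while descriptive attributes live in the dimensions, which is what keeps such a model scalable as new attributes arrive.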
Big Data Engineer, Data Modeling
Posted today
Job Description
What can you tell your friends when they ask you what you do?
We're looking for an experienced Big Data Engineer who can create innovative new products in the analytics and data space. You will help develop the world's #1 mobile app analytics service. Together with the team, you will build out new product features and applications using agile methodologies and open-source technologies. You will work directly with Data Scientists, Data Engineers, Product Managers, and Software Architects, and will be on the front lines of coding new and exciting analytics and data mining products. You should be passionate about what you do and excited to join an entrepreneurial start-up.
To ensure we execute on our values, we are looking for someone who has a passion for the following.
As a Big Data Engineer, you will be in charge of model implementation and maintenance, building a clean, robust, and maintainable data processing program that can support these projects on huge amounts of data. This includes:
You should recognize yourself in the following…
This position is located in Hyderabad, India.
We are hiring for our engineering team at our data.ai India subsidiary entity, which is in the process of being established. While we await approval from the Indian government, new hires will be interim employees of Innova Solutions, our global Employer of Record.
Lead Data Engineer-Solution Architecture
Posted today
Job Description
About Chubb
Chubb is a world leader in insurance. With operations in 54 countries and territories, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance and life insurance to a diverse group of clients. The company is defined by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength and local operations globally. Parent company Chubb Limited is listed on the New York Stock Exchange (NYSE: CB) and is a component of the S&P 500 index. Chubb employs approximately 40,000 people worldwide. Additional information can be found at: .
About Chubb India
At Chubb India, we are on an exciting journey of digital transformation driven by a commitment to engineering excellence and analytics. We are proud to share that we have been officially certified as a Great Place to Work® for the third consecutive year, a reflection of the culture at Chubb, where we believe in fostering an environment where everyone can thrive, innovate, and grow.
With a team of over 2500 talented professionals, we encourage a start-up mindset that promotes collaboration, diverse perspectives, and a solution-driven attitude. We are dedicated to building expertise in engineering, analytics, and automation, empowering our teams to excel in a dynamic digital landscape.
We offer an environment where you will be part of an organization that is dedicated to solving real-world challenges in the insurance industry. Together, we will work to shape the future through innovation and continuous learning.
Position Details
- Job Title: Lead Data Engineer-Solution Architecture
- Function/Department: Technology
- Location: Hyderabad/Bangalore/Bhubaneswar
- Employment Type: Full Time
Role Overview
Qualifications:
- Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or a related field; Master’s degree preferred
- Minimum of 10 years' experience in data architecture or data engineering roles; a significant focus on the P&C insurance domain is preferred.
- Proven track record of successful implementation of data architecture within large-scale transformation programs or projects
- Comprehensive knowledge of data modelling techniques and methodologies, including data normalization and denormalization practices
- Hands-on expertise across a wide variety of database (Azure SQL, MongoDB, Cosmos), data transformation (Informatica IICS, Databricks), change data capture, and data streaming (Apache Kafka) technologies
- Proven Expertise with data warehousing concepts, ETL processes, and data integration tools (e.g., Informatica, Databricks, ADF)
- Experience with cloud-based data architectures and platforms (e.g. ADLS, Synapse, Snowflake, Azure SQL Database)
- Familiarity with .NET Core and Python FastAPI or similar; hands-on experience preferred (see the FastAPI sketch after this list).
- Expertise in ensuring data security patterns (e.g. tokenization, encryption, obfuscation)
- Familiarity with authentication and authorization methods and frameworks (e.g. OAuth 2.0).
- Knowledge of insurance policy operations, regulations, and compliance frameworks specific to Consumer lines
- Understanding of advanced analytics, AI, and machine learning concepts as they pertain to data architecture
- Skilled in asynchronous programming patterns.
- Familiarity with containerization and microservices frameworks, such as Docker and Kubernetes.
- Proficient in utilizing Azure or other cloud services, including AKS, Cosmos NoSQL, Cognitive Search, SQL Database, ADLS, App Insights, and API Management.
- Familiar with DevSecOps practices and CI/CD tools, including Git, Azure DevOps, and Jenkins.
- Familiar with Kafka or similar messaging technologies.
- Familiar with GIS / geospatial systems and terminology preferred.
- Strong analytical and problem-solving capabilities.
- Experienced in producing technical documentation to support system design.
- Excellent communication and collaboration skills, with the ability to work effectively in cross-functional teams.
- Familiarity with Agile methodologies and experience working in Agile project environments, including ceremonies and tools like JIRA.
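To ground the FastAPI, OAuth 2.0, and asynchronous-programming items above, here is a minimal sketch of an async FastAPI endpoint guarded by an OAuth 2.0 bearer token. The route, token URL, and stubbed validation are illustrative assumptions, not Chubb's actual API.

```python
# Sketch: async FastAPI endpoint behind an OAuth 2.0 bearer token.
# The token URL, route, and validation stub are hypothetical.
from fastapi import Depends, FastAPI, HTTPException, status
from fastapi.security import OAuth2PasswordBearer

app = FastAPI()
oauth2_scheme = OAuth2PasswordBearer(tokenUrl="token")  # hypothetical URL


async def current_user(token: str = Depends(oauth2_scheme)) -> str:
    # Stub: a real service would verify the token (e.g. a JWT) against
    # the issuer and extract the caller's identity and scopes.
    if not token:
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED, detail="Invalid token"
        )
    return "user-id-from-token"


@app.get("/policies/{policy_id}")
async def get_policy(policy_id: str, user: str = Depends(current_user)):
    # Async handlers free the event loop to serve other requests while
    # awaiting downstream I/O (database, message bus, etc.).
    return {"policy_id": policy_id, "requested_by": user}
```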
Why Join Us?
- Be at the forefront of digital transformation in the insurance industry.
- Lead impactful initiatives that simplify claims processing and enhance customer satisfaction.
- Work alongside experienced professionals in a collaborative, innovation-driven environment.
Why Chubb?
Join Chubb to be part of a leading global insurance company!
Our constant focus on employee experience along with a start-up-like culture empowers you to achieve impactful results.
- Industry leader: Chubb is a world leader in the insurance industry, powered by underwriting and engineering excellence
- A Great Place to Work: Chubb India has been recognized as a Great Place to Work® for three consecutive years
- Laser focus on excellence: At Chubb we pride ourselves on our culture of greatness, where excellence is a mindset and a way of being. We constantly seek new and innovative ways to excel at work and deliver outstanding results
- Start-Up Culture: Embracing the spirit of a start-up, our focus on speed and agility enables us to respond swiftly to market requirements, while a culture of ownership empowers employees to drive results that matter
- Growth and success : As we continue to grow, we are steadfast in our commitment to provide our employees with the best work experience, enabling them to advance their careers in a conducive environment
Employee Benefits
Our company offers a comprehensive benefits package designed to support our employees’ health, well-being, and professional growth. Employees enjoy flexible work options, generous paid time off, and robust health coverage, including treatment for dental and vision related requirements. We invest in the future of our employees through continuous learning opportunities and career advancement programs, while fostering a supportive and inclusive work environment. Our benefits include:
- Savings and Investment plans: We provide specialized benefits like Corporate NPS (National Pension Scheme), Employee Stock Purchase Plan (ESPP), Long-Term Incentive Plan (LTIP), Retiral Benefits and Car Lease that help employees optimally plan their finances
- Upskilling and career growth opportunities: With a focus on continuous learning, we offer customized programs that support upskilling like Education Reimbursement Programs, Certification programs and access to global learning programs.
- Health and Welfare Benefits: We care about our employees’ well-being in and out of work and have benefits like Hybrid Work Environment, Employee Assistance Program (EAP), Yearly Free Health campaigns and comprehensive Insurance benefits.
Application Process
Our recruitment process is designed to be transparent and inclusive.
- Step 1: Submit your application via the Chubb Careers Portal / LinkedIn.
- Step 2: Engage with our recruitment team for an initial discussion.
- Step 3: Participate in HackerRank assessments and technical/functional interviews (if applicable).
- Step 4: Final interaction with Chubb leadership.
Join Us
With you, Chubb is better. Whether you are solving challenges on a global stage or creating innovative solutions for local markets, your contributions will help shape the future. If you value integrity, innovation, and inclusion, and are ready to make a difference, we invite you to be part of Chubb India's journey.
Apply Now: