3,016 Data Pipelines jobs in India

Software Developer - Risk Data Pipelines

Bengaluru, Karnataka Squarepoint Capital

Posted today


Job Description

Department: Risk Technology

Position Overview:

Risk Technology develops core services and systems required by Squarepoint’s systematic and quantitative trading strategies, such as real-time risk controls, position/inventory/P&L monitoring, internal order routing, and various pre and post-trading services.

Risk Data Pipelines develops software on top of the core Risk Technology platform to handle market or asset-class specific processing, including:

  • Trade Data Ingress - Normalize and stream trade data to Squarepoint's systems from trading platforms such as Bloomberg, Fidessa, and SpiderRock (a minimal normalization sketch follows this list).
  • Trade Data Egress - Feeds to 3rd party platforms to ensure trade booking correctness and regulatory compliance.
  • Services and analytics used by investment and quant teams to understand Risk exposure, P&L and to improve capital efficiency.
  • Automation of trading operations to support the growth of the business, such as the management of options & future expiry and automated trade, position and P&L reconciliation.
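The Trade Data Ingress bullet above describes normalizing trade records from several platforms into one internal representation. Purely as a rough, hypothetical illustration of that kind of task (not Squarepoint's actual code, schemas, or field names), a minimal pandas sketch might look like this:

```python
# Illustrative only: hypothetical per-platform field names mapped to a common schema.
import pandas as pd

COLUMN_MAPS = {
    "bloomberg": {"Ticker": "symbol", "Qty": "quantity", "Px": "price", "Side": "side"},
    "fidessa":   {"instrument": "symbol", "size": "quantity", "exec_price": "price", "buy_sell": "side"},
}

def normalize_trades(raw: pd.DataFrame, platform: str) -> pd.DataFrame:
    """Rename platform-specific columns, coerce types, and standardize values."""
    mapping = COLUMN_MAPS[platform]
    trades = raw.rename(columns=mapping)[list(mapping.values())]
    trades["quantity"] = pd.to_numeric(trades["quantity"], errors="coerce")
    trades["price"] = pd.to_numeric(trades["price"], errors="coerce")
    trades["side"] = trades["side"].str.upper().map(
        {"B": "BUY", "BUY": "BUY", "S": "SELL", "SELL": "SELL"}
    )
    trades["source"] = platform
    return trades.dropna(subset=["quantity", "price"])

if __name__ == "__main__":
    raw = pd.DataFrame([{"Ticker": "INFY", "Qty": "100", "Px": "1450.5", "Side": "B"}])
    print(normalize_trades(raw, "bloomberg"))
```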
This role provides an opportunity to learn many aspects of how hedge funds operate through close collaboration with trading and quantitative teams. As a developer you will:

  • Design, develop and deliver high quality and maintainable business critical software.
  • Work closely with stakeholders and colleagues to capture requirements, define architecture and technologies, identify and resolve bottlenecks, and deliver functionality.
  • Lead and contribute to design discussions and mentor other team members.
  • Participate in level 2 support.
Required Qualifications:

  • Bachelor’s degree in computer science or related subject.
  • A minimum of 4 years' Python experience working in the financial industry.
  • Team player with excellent communication skills.
  • Experience with database management systems and related technologies such as SQL.
  • Knowledge of traded financial instruments (Equity, FX, Credit or Rates).
Nice to Have:

  • Experience with the FIX protocol
  • Experience in data engineering development (e.g., Python with pandas, R, or Julia)
  • Experience with KDB+/q

    Software Developer - Data Pipelines (Python)

    Bengaluru, Karnataka Squarepoint Capital

    Posted today


    Job Description

    Team: Development - Alpha Data

    Position Overview:

    We are seeking an experienced Python developer to join our Alpha Data team, responsible for delivering a vast quantity of data served to users worldwide. You will be a cornerstone of a growing Data team, becoming a technical subject matter expert and developing strong working relationships with quant researchers, traders, and fellow colleagues across our Technology organisation.

    Alpha Data teams are able to deploy valuable data to the rest of the Squarepoint business at speed. Ingestion pipelines and data transformation jobs are resilient and highly maintainable, while the data models are carefully designed in close collaboration with our researchers for efficient query construction and alpha generation.

    We achieve an economy of scale through building new frameworks, libraries, and services used to increase the team's quality of life, throughput, and code quality. Teamwork and collaboration are encouraged, excellence is rewarded and diversity of thought and creative solutions are valued. Our emphasis is on a culture of learning, development, and growth.

  • Take shared ownership of our ever-growing estate of data pipelines,
  • Propose and contribute to new abstractions and improvements - make a real positive impact across our team globally,
  • Design, implement, test, optimize and troubleshoot our data pipelines, frameworks, and services,
  • Collaborate with researchers to onboard new datasets,
  • Regularly take the lead on production support operations - during normal working hours only.
Required Qualifications:

  • 4+ years of experience coding to a high standard in Python,
  • Bachelor's degree in a STEM subject,
  • Experience with and knowledge of SQL, and one or more common RDBMS systems (we mostly use Postgres),
  • Practical knowledge of commonly used protocols and tools for transferring data (e.g. FTP, SFTP, HTTP APIs, AWS S3; see the sketch after this list),
  • Excellent communication skills.
Nice to Haves:

  • Experience with big data frameworks, databases, distributed systems, or Cloud development.
  • Experience with any of these: C++, kdb+/q, Rust.
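As a loose illustration of the transfer-and-load pattern referenced in the qualifications above, here is a minimal Python sketch assuming boto3 and psycopg2; the bucket, key, table, and connection details are hypothetical placeholders, not Squarepoint infrastructure:

```python
# Illustrative only: pull a CSV from S3 and bulk-load it into Postgres.
import boto3
import psycopg2

def ingest_csv_from_s3(bucket: str, key: str, table: str, dsn: str) -> None:
    """Download a CSV from S3 and COPY it into a Postgres table."""
    local_path = "/tmp/dataset.csv"
    boto3.client("s3").download_file(bucket, key, local_path)

    with psycopg2.connect(dsn) as conn, conn.cursor() as cur, open(local_path) as f:
        # COPY is far faster than row-by-row INSERTs for bulk loads.
        cur.copy_expert(f"COPY {table} FROM STDIN WITH CSV HEADER", f)

if __name__ == "__main__":
    ingest_csv_from_s3(
        bucket="example-vendor-drop",          # hypothetical
        key="daily/prices_2024-01-01.csv",     # hypothetical
        table="staging.vendor_prices",         # hypothetical
        dsn="dbname=alpha user=ingest host=localhost",
    )
```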

    Senior Database Infrastructure Engineer- Cassandra, DataStax, Big Data Pipelines

    Pune, Maharashtra HEROIC.com

    Posted today


    Job Description

    HEROIC Cybersecurity ( HEROIC ) is seeking a Senior Data Infrastructure Engineer with deep expertise in DataStax Enterprise (DSE) and Apache Cassandra to help architect, scale, and maintain the data infrastructure that powers our cybersecurity intelligence platforms.

    You will be responsible for designing and managing fully automated, big data pipelines that ingest, process, and serve hundreds of billions of breached and leaked records sourced from the surface, deep, and dark web. You'll work with DSE Cassandra, Solr, and Spark, helping us move toward a 99% automated pipeline for data ingestion, enrichment, deduplication, and indexing — all built for scale, speed, and reliability.

    This position is critical in ensuring our systems are fast, reliable, and resilient as we ingest thousands of unique datasets daily from global threat intelligence sources.

    What you will do: 

    • Design, deploy, and maintain high-performance Cassandra clusters using DataStax Enterprise (DSE)
    • Architect and optimize automated data pipelines to ingest, clean, enrich, and store billions of records daily
    • Configure and manage DSE Solr and Spark to support search and distributed processing at scale
    • Automate dataset ingestion workflows from unstructured surface, deep, and dark web sources
    • Manage clusters, replication strategy, capacity planning, and performance tuning
    • Ensure data integrity, availability, and security across all distributed systems
    • Write and manage ETL processes, scripts, and APIs to support data flow automation (see the sketch after this list)
    • Monitor systems for bottlenecks, optimize queries and indexes, and resolve production issues
    • Research and integrate third-party data tools or AI-based enhancements (e.g., smart data parsing, deduplication, ML-based classification)
    • Collaborate with engineering, data science, and product teams to support HEROIC’s AI-powered cybersecurity platform
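Purely as an illustration of the ingestion step referenced in the list above, here is a minimal sketch using the DataStax Python driver; the keyspace, table, fields, and contact point are hypothetical and not HEROIC's actual data model:

```python
# Illustrative only: minimal Cassandra/DSE ingestion sketch with a hypothetical schema.
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])  # hypothetical contact point
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS breach_data
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 3}
""")

# Partition by email, cluster by source: re-ingesting the same (email, source)
# pair upserts the same row, which acts as a cheap form of deduplication.
session.execute("""
    CREATE TABLE IF NOT EXISTS breach_data.records (
        email      text,
        source     text,
        leaked_at  timestamp,
        payload    text,
        PRIMARY KEY ((email), source)
    )
""")

insert = session.prepare(
    "INSERT INTO breach_data.records (email, source, leaked_at, payload) VALUES (?, ?, ?, ?)"
)

def ingest(record: dict) -> None:
    """Write one normalized record; duplicates overwrite the same row."""
    session.execute(insert, (record["email"], record["source"], record["leaked_at"], record["payload"]))
```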




    Requirements
    • Minimum 5 years' experience with Cassandra / DataStax Enterprise in production environments
    • Hands-on experience with DSE Cassandra, Solr, Apache Spark, CQL, and data modeling at scale
    • Strong understanding of NoSQL architecture, sharding, replication, and high availability
    • Advanced knowledge of Linux/Unix, shell scripting, and automation tools (e.g., Ansible, Terraform)
    • Proficient in at least one programming language: Python, Java, or Scala
    • Experience building large-scale automated data ingestion systems or ETL workflows
    • Solid grasp of AI-enhanced data processing, including smart cleaning, deduplication, and classification
    • Excellent written and spoken English communication skills
    • Prior experience with cybersecurity or dark web data (preferred but not required)



    Benefits
    • Position Type: Full-time
    • Location: Pune, India  (Remote – Work from anywhere)
    • Compensation: Competitive salary depending on experience
    • Benefits: Paid Time Off + Public Holidays
    • Professional Growth: Amazing upward mobility in a rapidly expanding company.
    • Innovative Culture: Fast-paced, innovative, and mission-driven. Be part of a team that leverages AI and cutting-edge technologies. 

       

    About Us: HEROIC Cybersecurity ( HEROIC ) is building the future of cybersecurity. Unlike traditional cybersecurity solutions, HEROIC takes a predictive and proactive approach to intelligently secure our users before an attack or threat occurs. Our work environment is fast-paced, challenging and exciting. At HEROIC, you’ll work with a team of passionate, engaged individuals dedicated to intelligently securing the technology of people all over the world.

    Position Keywords: DataStax Enterprise (DSE), Apache Cassandra, Apache Spark, Apache Solr, AWS, Jira, NoSQL, CQL (Cassandra Query Language), Data Modeling, Data Replication, ETL Pipelines, Data Deduplication, Data Lake, Linux/Unix Administration, Bash, Docker, Kubernetes, CI/CD, Python, Java, Distributed Systems, Cluster Management, Performance Tuning, High Availability, Disaster Recovery, AI-based Automation, Artificial Intelligence, Big Data, Dark Web Data




    Senior Group Data Engineering Manager(Data Pipelines, ADF, ADB, Python, SQL)

    Bengaluru, Karnataka AtkinsRéalis

    Posted today


    Job Description


    We’re AtkinsRéalis, a world-leading Design, Engineering and Project Management organization. Created by the integration of long-standing organizations dating back to 1911, we are a world-leading professional services and project management company dedicated to engineering a better future for our planet and its people. We create sustainable solutions that connect people, data and technology to transform the world's infrastructure and energy systems. We deploy global capabilities locally to our clients and deliver unique end-to-end services across the whole life cycle of an asset including consulting, advisory & environmental services, intelligent networks & cybersecurity, design & engineering, procurement, project & construction management, operations & maintenance, decommissioning and capital. The breadth and depth of our capabilities are delivered to clients in key strategic sectors such as Engineering Services, Nuclear, Operations & Maintenance and Capital.

    News and information are available at or follow us on LinkedIn.

    Our teams are proud to deliver on some of the most prestigious projects across the world. It's thanks to our talented people and their diverse thinking, expertise, and knowledge. Join us and you'll be part of our genuinely collaborative environment, where everyone is supported to make the most of their talents and expertise.
When it comes to work-life balance, AtkinsRéalis is a great place to be. So, let's discuss how our flexible and remote working policies can support your priorities. We're passionate about our work while valuing each other equally. So, ask us about some of our recent pledges for Women's Equality and being a 'Disability Confident' and 'Inclusive Employer'.

EAI-AtkinsRéalis is a vibrant and continuously growing team. It is an important part of GTC-AtkinsRéalis and is widely recognized for its high-quality project deliveries. This is a vital role in taking EAI a step forward in providing data solutions to our business and clients. The role involves working on multiple projects simultaneously and covers the planning, design, and delivery of data-driven projects. Effective communication and teamwork are important characteristics of this role.

    Key Activities for This Role

  • Technical guide for a team of Lead Data Engineers.
  • Develop, configure, deploy, and optimize Microsoft Azure based Data solutions.
  • Collaborate with other team members to develop and enhance deliverables.
  • Continuously improve team processes to ensure information is of the highest quality, contributing to the overall effectiveness of the team.
  • Stay abreast of industry changes, especially in the areas of cloud data and analytics technologies.
  • Ability to simultaneously work on and deliver more than one project in an individual contributor role.
  • Ability to work across multiple areas such as data pipeline ETL, data modelling & design, and writing complex SQL queries.
  • Capable of planning and executing on both short-term and long-term goals on your own and with the team.
  • Partner with other Data Engineers, Data architects, domain experts, data analysts and other teams to build foundational data sets that are trusted, well understood, aligned with business strategy and enable self-service.
  • Guide and mentor Data Engineers and Senior Data Engineers on data architecture, data models, implementation techniques, and technologies.
Experience & Skills Required:

  • 12+ years of experience designing, developing, architecting, and deploying data solutions using Power BI and the Azure platform.
  • Experience designing data pipelines (ETL/ELT), data warehouses, and data marts.
  • Hands-on expert with real-time data processing and analytics, data ingestion (batched and streamed), and data storage solutions.
  • Hands-on experience with Azure Analysis Services and Power BI; experience with other tools is a plus.
  • Hands-on experience with Data Factory, Data Lake Storage, Databricks, Data Explorer, Machine Learning, and Azure Synapse Analytics is good to have.
  • Expert at creating data dissemination diagrams, data flow diagrams, data lifecycle diagrams, data migration diagrams, data security diagrams, etc.
  • Hands-on experience with at least one data presentation tool such as Power BI or Tableau.
  • A proven expert in writing optimized SQL to deal with large data volumes.
  • Hands-on coding in Python along with its main data libraries such as pandas, NumPy, and Beautiful Soup (a minimal transformation sketch follows this list).
  • ML exposure is good to have.
  • Good to have AWS experience.
  • Good to have GCP experience.
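As a small, hypothetical illustration of the Python/pandas work the list above mentions (the source file, columns, and output path are invented for the example and are not AtkinsRéalis assets):

```python
# Illustrative only: a minimal pandas/NumPy transformation step.
import numpy as np
import pandas as pd

def transform(src_path: str, out_path: str) -> pd.DataFrame:
    """Clean raw readings and aggregate them to daily metrics per asset."""
    df = pd.read_csv(src_path, parse_dates=["reading_time"])
    df["value"] = pd.to_numeric(df["value"], errors="coerce")
    df["value"] = df["value"].replace([np.inf, -np.inf], np.nan)
    df = df.dropna(subset=["asset_id", "value"])

    daily = (
        df.assign(reading_date=df["reading_time"].dt.date)
          .groupby(["asset_id", "reading_date"], as_index=False)
          .agg(daily_mean=("value", "mean"), daily_max=("value", "max"))
    )
    daily.to_parquet(out_path, index=False)   # requires a parquet engine such as pyarrow
    return daily
```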
What We Can Offer You

  • Varied, interesting and meaningful work.
  • A hybrid working environment with flexibility and great opportunities.
  • Opportunities for training and, as the team grows, career progression or sideways moves.
  • An opportunity to work within a large global multi-disciplinary consultancy on a mission to change the ways we approach business as usual.
Why work for AtkinsRéalis?

We at AtkinsRéalis are committed to developing our people both personally and professionally. Our colleagues have the advantage of access to a wide-ranging training portfolio and development activities designed to help them make the best of their abilities and talents. We also actively support staff in achieving corporate membership of relevant institutions.

    Meeting Your Needs

    To help you get the most out of life in and outside of work, we offer employees ‘Total Reward’.
    Making sure you're supported is important to us. So, if you identify as having a disability, tell us ahead of your interview, and we’ll discuss any adjustments you might need.
    Additional Information
    We are an equal opportunity, drug-free employer committed to promoting a diverse and inclusive community - a place where we can all be ourselves, thrive and develop. To help embed inclusion for all, from day one, we offer a range of family friendly, inclusive employment policies, flexible working arrangements and employee networks to support staff from different backgrounds. As an Equal Opportunities Employer, we value applications from all backgrounds, cultures and ability.

    We care about your privacy and are committed to protecting your privacy. Please consult our Privacy Notice on our Careers site to know more about how we collect, use and transfer your Personal Data.

    Link: Equality, diversity & inclusion | Atkins India (atkinsrealis.com)

    Worker Type

    Employee

    Job Type

    Regular

    At AtkinsRéalis, we seek to hire individuals with diverse characteristics, backgrounds and perspectives. We strongly believe that world-class talent makes no distinctions based on gender, ethnic or national origin, sexual identity and orientation, age, religion or disability, but enriches itself through these differences.


    Data Scientist (Cloud Management, SQL, Building cloud data pipelines, Python, Power BI, GCP)

    Chennai, Tamil Nadu UPS India

    Posted today


    Job Description

    Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

    Job Description:

    Job Summary
    UPS Marketing team is looking for a talented and driven Data Scientist to drive its strategic objectives in the areas of pricing, revenue management, market analysis and evidence/data-based decision making. This role will work across multiple channels and teams to drive tangible results in the organization. You will focus on developing metrics for multiple channels and markets, applying advanced statistical modeling where appropriate and pioneering new analytical methods in a variety of fast paced and rapidly evolving consumer channels. This high visibility position will work with multiple levels of the organization, including senior leadership to bring analytical capabilities to the forefront of pricing, rate setting, and optimization of our go-to-market offers. You will contribute to rapidly evolving UPS Marketing analytical capabilities by working amongst a collaborative team of Data Scientists, Analysts and multiple business stakeholders.

    Responsibilities:

    • Become a subject matter expert on UPS business processes, data and analytical capabilities to help define and solve business needs using data and advanced statistical methods

    • Analyze and extract insights from large-scale structured and unstructured data utilizing multiple platforms and tools.

    • Understand and apply appropriate methods for cleaning and transforming data

• Work across multiple stakeholders to develop, maintain and improve models in production

    • Take the initiative to create and execute analyses in a proactive manner

• Deliver complex analyses and visualizations to broader audiences, including upper management and executives

    • Deliver analytics and insights to support strategic decision making

    • Understand the application of AI/ML when appropriate to solve complex business problems


    Qualifications

    • Expertise in R, SQL, Python.

    • Strong analytical skills and attention to detail. 

• Able to engage key business and executive-level stakeholders to translate business problems into a high-level analytics solution approach.

• Expertise with statistical techniques, machine learning or operations research and their application to business problems.

    • Deep understanding of data management pipelines and experience in launching moderate scale advanced analytics projects in production at scale.

• Proficient in Azure and Google Cloud environments

• Experience implementing open-source technologies and cloud services, with or without the use of enterprise data science platforms.

    • Solid oral and written communication skills, especially around analytical concepts and methods. 

• Ability to communicate data through a story framework to convey data-driven results to technical and non-technical audiences.

    • Master’s Degree in a quantitative field of mathematics, computer science, physics, economics, engineering, statistics (operations research, quantitative social science, etc.), international equivalent, or equivalent job experience.


    Bonus Qualifications

    • Experience with pricing methodologies and revenue management

• Experience using PySpark, Azure Databricks, Google BigQuery and Vertex AI (a minimal PySpark sketch follows this list)

    • Creating and implementing NLP/LLM projects

• Experience utilizing and applying neural networks and other AI methodologies

    • Familiarity with Data architecture and engineering
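As a loose sketch of the PySpark-style work referenced in the bonus qualifications (the dataset path and column names are invented placeholders, not UPS data structures):

```python
# Illustrative only: a minimal PySpark aggregation of the kind a pricing analysis might start from.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("pricing-metrics-sketch").getOrCreate()

shipments = spark.read.parquet("/data/shipments/")      # hypothetical path

revenue_by_lane = (
    shipments
    .filter(F.col("status") == "DELIVERED")
    .groupBy("origin_region", "destination_region", "service_level")
    .agg(
        F.count("*").alias("shipments"),
        F.sum("net_revenue").alias("total_revenue"),
        F.avg("net_revenue").alias("avg_revenue_per_shipment"),
    )
)

revenue_by_lane.write.mode("overwrite").parquet("/data/marts/revenue_by_lane/")
```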


    Employee Type:
     

    Permanent


    UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.


    Specialist, Data Architecture

    Noida, Uttar Pradesh Fiserv

    Posted today


    Job Description

    **Calling all innovators - find your future at Fiserv.**
    We're Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day - quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we're involved. If you want to make an impact on a global scale, come make a difference at Fiserv.
    **Job Title**
    Specialist, Data Architecture
+ 4-7 years of experience with SSIS/SQL; responsible for the development of ETL and Reporting solutions
    + Strong Knowledge of SSIS packages, design principles & best practices.
    + Experience with requirements gathering, technical analysis, and writing technical specifications
    + Candidate must have strong database fundamentals.
    + Must have good knowledge of Data Warehousing & Data Modelling Concepts.
    + Good communication skills are required.
    + Capability to work in a distributed team environment with minimal supervision is required for this profile.
+ The position doesn't require working in shifts; however, flexibility to overlap with US hours is required.
    + Should have good knowledge in writing SQL commands, queries and stored procedures
    + Good Knowledge of Snowflake would be preferred.
    + Good Knowledge of Python/Pyspark would be preferred
    Thank you for considering employment with Fiserv. Please:
    + Apply using your legal name
    + Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).
    **Our commitment to Diversity and Inclusion:**
    Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.
    **Note to agencies:**
    Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.
    **Warning about fake job posts:**
    Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.

    Advisor, Data Architecture

    Bengaluru, Karnataka Fiserv

    Posted 8 days ago


    Job Description

    **Calling all innovators - find your future at Fiserv.**
    We're Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day - quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we're involved. If you want to make an impact on a global scale, come make a difference at Fiserv.
    **Job Title**
    Advisor, Data Architecture
    **What does a successful Advisor do at Fiserv?**
    As a member of our Data Commerce Solutions group, you will build and take ownership of the design and development of data engineering projects within Fiserv's Enterprise Data Commerce Solutions division. You will apply your depth of knowledge and expertise to all aspects of the data engineering lifecycle, as well as partner continuously with your many stakeholders daily to stay focused on common goals.
You will lead large-scale data engineering, integration, and warehousing projects, build custom integrations between cloud-based systems using APIs, and write complex, efficient queries that transform raw data sources into easily accessible models, using data integration tooling and coding across several languages such as Java, Python, and SQL. Additional responsibilities include, but are not limited to, architecting, building, and launching new data models that provide intuitive analytics to the team, building data expertise, and owning data quality for the pipelines you create.
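As a rough, hypothetical illustration of the API-integration pattern described above (the endpoint, token, and table schema are placeholders and not a Fiserv API), a minimal Python sketch might look like this:

```python
# Illustrative only: pull records from a cloud system's REST API and land them in a queryable table.
import sqlite3
import requests

API_URL = "https://api.example-partner.com/v1/transactions"   # hypothetical endpoint

def sync_transactions(db_path: str, token: str) -> int:
    resp = requests.get(API_URL, headers={"Authorization": f"Bearer {token}"}, timeout=30)
    resp.raise_for_status()
    rows = [(t["id"], t["merchant_id"], t["amount"], t["currency"]) for t in resp.json()["data"]]

    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS transactions "
            "(id TEXT PRIMARY KEY, merchant_id TEXT, amount REAL, currency TEXT)"
        )
        # Upsert so re-running the sync is idempotent.
        conn.executemany(
            "INSERT INTO transactions VALUES (?, ?, ?, ?) "
            "ON CONFLICT(id) DO UPDATE SET amount = excluded.amount",
            rows,
        )
    return len(rows)
```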
    **What you will do:**
    + Provide strategic leadership and direction to the software development team, fostering a culture of innovation, collaboration, and continuous improvement.
    + Develop and implement a robust software development strategy aligned with the company's overall objectives and long-term vision.
    + Collaborate with product management to define software requirements, scope, and priorities, ensuring alignment with business goals.
    + Lead and guide the software development team in creating technical design specifications, architecture, and development plans for complex software projects.
    + Ensure adherence to industry best practices, coding standards, and software development methodologies to deliver high-quality and scalable software solutions.
    + Monitor and analyze software development metrics and key performance indicators (KPIs) to track team productivity, efficiency, and code quality.
    + Manage the software development budget and resource allocation, optimizing resource utilization and capacity planning.
    + Foster a culture of learning and development within the team, providing coaching, mentoring, and professional growth opportunities to team members.
    + Identify and mitigate potential risks and challenges in software development projects, developing contingency plans as needed.
    + Collaborate with other stakeholders to establish and maintain effective communication channels and project status updates.
    + Stay up to date with industry trends, emerging technologies, and best practices to drive continuous improvement and innovation in software development processes.
    + Build and maintain strong relationships with external partners, vendors, and third-party providers to enhance software development capabilities and delivery.
    **What you will need to have:**
    + Bachelor's or master's degree in computer science, Software Engineering, or a related field. An advanced degree is preferred.
    + Proven experience (minimum 7+ years) in a senior leadership role within software development or software engineering.
    + Demonstrated success in delivering complex software projects and products on time and within budget.
    + Extensive experience in software development methodologies, such as Agile, Scrum, or Kanban, and experience in transitioning teams to these methodologies.
    + Strong technical expertise in software architecture, design patterns, and modern software development languages and frameworks.
    + Excellent communication, interpersonal, and leadership skills, with the ability to influence and inspire cross-functional teams.
    + Exceptional problem-solving and decision-making abilities, with a keen attention to detail and a focus on delivering high-quality products.
    + Proven track record of building and managing high-performing software development teams.
    + Strong business acumen and the ability to align software development initiatives with broader business objectives.
    + A passion for innovation, technology, and keeping abreast of the latest developments in the software industry.
    + Proficiency with solutions for processing large volumes of data, using data processing tools and Big Data platforms.
+ Understanding of cluster and parallel architectures, as well as high-scale or distributed RDBMS; SQL experience
    + Hands-on experience in production rollout and infrastructure configuration
    + Demonstrable experience of successfully delivering big data projects using Kafka, Spark
    + Exposure working on NoSQL Databases such as Cassandra, HBase, DynamoDB, and Elastic Search
    + Experience working with PCI Data and working with data scientists is a plus.
    + In depth knowledge of design principles and patterns
    + Experience with cloud platforms and services such as AWS, Azure, or Google Cloud Platform, and knowledge of deploying and managing APIs in a cloud environment.
    + Knowledge of API gateway solutions and their implementation, such as Kong, Apigee, or AWS API Gateway.
    **What would be great to have:**
+ Exposure to Big Data tools and solutions is a strong plus.
+ Exposure to Relational Modeling, Dimensional Modeling, and Modeling of Unstructured Data.
+ Experience in design and architecture review, and in the banking and financial domain.
    Thank you for considering employment with Fiserv. Please:
    + Apply using your legal name
    + Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).
    **Our commitment to Diversity and Inclusion:**
    Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.
    **Note to agencies:**
    Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.
    **Warning about fake job posts:**
    Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.

    Specialist, Data Architecture

Noida, Uttar Pradesh ₹1500000 - ₹2500000 per year Fiserv

    Posted today


    Job Description

    Calling all innovators – find your future at Fiserv.

    We're Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we're involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

    Job Title

    Specialist, Data Architecture

• 4-7 years of experience with SSIS/SQL; responsible for the development of ETL and Reporting solutions
    • Strong Knowledge of SSIS packages, design principles & best practices.
    • Experience with requirements gathering, technical analysis, and writing technical specifications
    • Candidate must have strong database fundamentals.
    • Must have good knowledge of Data Warehousing & Data Modelling Concepts.
    • Good communication skills are required.
    • Capability to work in a distributed team environment with minimal supervision is required for this profile.
• The position doesn't require working in shifts; however, flexibility to overlap with US hours is required.
    • Should have good knowledge in writing SQL commands, queries and stored procedures
    • Good Knowledge of Snowflake would be preferred.
    • Good Knowledge of Python/Pyspark would be preferred

    Thank you for considering employment with Fiserv. Please:

    • Apply using your legal name
    • Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

    Our commitment to Diversity and Inclusion:

    Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

    Note to agencies:

    Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

    Warning about fake job posts:

    Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.


    Specialist, Data Architecture

Pune, Maharashtra ₹1200000 - ₹3600000 per year Fiserv

    Posted today


    Job Description

    Calling all innovators – find your future at Fiserv.
    We're Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we're involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

    Job Title
    Specialist, Data Architecture

    What does a great AI/ML Engineer do?
    We are seeking a highly skilled Generative AI Specialist with 6-8 years of extensive experience in AI and machine learning, focusing on generative models and prompt engineering. The successful candidate will work collaboratively within our team to design, implement, and optimize generative AI solutions that enhance our product offerings and provide value to our clients.

    What You Will Do

• Develop, implement, and optimize generative models (e.g., GANs, VAEs) for various applications (a minimal VAE sketch follows this list).
    • Conduct research and stay updated with the latest advancements in generative AI and deep learning.
    • Collaborate with cross-functional teams to identify opportunities for leveraging generative AI in our products.
    • Design and optimize prompts used in AI models to improve output quality and relevance.
    • Analyze model performance and user data to refine prompt strategies.
    • Analyze and preprocess large datasets to train and validate models effectively.
    • Create and maintain documentation for algorithms, methodologies, and processes.
    • Provide training and support to internal teams on generative AI technologies.
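As a compact, purely illustrative example of the generative-model work this list refers to (layer sizes and data are arbitrary; this is not Fiserv code), a minimal variational autoencoder skeleton in PyTorch:

```python
# Illustrative only: a tiny VAE with the standard reparameterization trick.
import torch
from torch import nn

class TinyVAE(nn.Module):
    def __init__(self, in_dim: int = 784, latent_dim: int = 16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU())
        self.to_mu = nn.Linear(256, latent_dim)
        self.to_logvar = nn.Linear(256, latent_dim)
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                                     nn.Linear(256, in_dim), nn.Sigmoid())

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # reparameterization trick
        return self.decoder(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    """Reconstruction term plus KL divergence to a standard normal prior."""
    recon_loss = nn.functional.binary_cross_entropy(recon, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon_loss + kl

if __name__ == "__main__":
    model = TinyVAE()
    x = torch.rand(8, 784)          # dummy batch of values in [0, 1]
    recon, mu, logvar = model(x)
    print(vae_loss(recon, x, mu, logvar).item())
```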

    What You Will Need To Have

    • Bachelor's or Master's degree in Computer Science, Data Science, AI, or a related field.
    • 6-8 years of experience in AI/ML, with a strong focus on generative models and prompt engineering.
    • Proficiency in programming languages such as Python or R.
    • Experience with deep learning frameworks (e.g., TensorFlow, PyTorch) and libraries.
    • Strong understanding of machine learning algorithms and statistical analysis.
    • Excellent problem-solving skills and ability to work in a fast-paced environment.
    • Strong communication skills to articulate complex ideas to non-technical stakeholders.

    What Would Be Great To Have

    • Experience in the financial services industry is a plus.
    • Familiarity with natural language processing (NLP) or computer vision (CV) applications.
    • Contributions to open source projects or published research in generative AI.

    Thank You For Considering Employment With Fiserv. Please

    • Apply using your legal name
    • Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

    Our Commitment To Diversity And Inclusion
    Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

    Note To Agencies
    Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

    Warning About Fake Job Posts
    Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.

