21,546 Reporting Engineer jobs in India
BI/Reporting Engineer

Posted 2 days ago
Job Description
BI/Reporting Engineer
**Job Description:**
**What we're looking for:**
- Overall 10 years of industry experience, including 6+ years of experience as a developer using the Databricks/Spark ecosystem.
- Hands-on experience with Unified Data Analytics on Databricks, the Databricks Workspace user interface, managing Databricks notebooks, and Delta Lake with Python and Spark SQL.
- Good understanding of Spark architecture on Databricks and Structured Streaming; setting up Microsoft Azure with Databricks, using the Databricks workspace for business analytics, managing clusters in Databricks, and managing the machine learning lifecycle.
- Hands-on experience with data extraction (schemas, corrupt-record handling, and parallelized code), transformations and loads (user-defined functions, join optimizations), and production (optimizing and automating Extract, Transform, and Load); see the sketch after this list.
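A minimal PySpark sketch of the corrupt-record handling and user-defined functions named above; the input path, schema, and column names are illustrative assumptions standing in for the kind of Databricks job this role describes.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, LongType

spark = SparkSession.builder.appName("extract-transform-sketch").getOrCreate()

# Explicit schema, including a column to capture malformed rows
schema = StructType([
    StructField("event_id", StringType()),
    StructField("customer_id", StringType()),
    StructField("country", StringType()),
    StructField("duration_sec", LongType()),
    StructField("_corrupt_record", StringType()),
])

raw = (
    spark.read
    .schema(schema)
    .option("mode", "PERMISSIVE")
    .option("columnNameOfCorruptRecord", "_corrupt_record")
    .json("/mnt/raw/events/")  # hypothetical landing path
    .cache()  # sidesteps Spark's restriction on queries that touch only the corrupt-record column
)

clean = raw.filter(F.col("_corrupt_record").isNull()).drop("_corrupt_record")
quarantine = raw.filter(F.col("_corrupt_record").isNotNull())

# A small user-defined function used in the transformation step
@F.udf(returnType=StringType())
def normalize_country(code):
    return (code or "").strip().upper()

clean = clean.withColumn("country", normalize_country(F.col("country")))

# Load: cleaned rows go to the curated zone, malformed rows to quarantine
clean.write.format("delta").mode("append").save("/mnt/curated/events/")
quarantine.write.format("json").mode("append").save("/mnt/quarantine/events/")
```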
**TECHNICAL SKILLS**
+ Spark DataFrame API
+ Python for Data Science
+ Spark Programming
+ SQL for Data Analysis
+ Simplifying Data Analysis with Python
+ Managing Clusters in Databricks
+ Databricks Administration
+ Data Extraction, Transformation, and Load
+ Implementing Partitioning and Programming with MapReduce
+ Setting up an Azure Databricks Account
+ Linux Commands
**What you'll be doing:**
+ Experience developing Spark applications using Spark SQL in **Databricks** for data extraction, transformation, and aggregation from multiple file formats, analyzing and transforming the data to uncover insights into customer usage patterns.
+ Extract, transform, and load data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, Spark SQL, and U-SQL (Azure Data Lake Analytics); ingest data into one or more Azure **services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW)** and process the data in **Azure Databricks**.
+ Develop Spark applications using PySpark and Spark SQL for data extraction, transformation, and aggregation from multiple file formats, analyzing and transforming the data to uncover insights into customer usage patterns (see the PySpark sketch at the end of this list).
+ Hands-on experience developing SQL scripts for automation.
+ Responsible for estimating cluster size, and for monitoring and troubleshooting the Spark Databricks cluster.
+ Ability to apply the Spark **DataFrame API** to complete data manipulation within a Spark session.
+ Good understanding of Spark architecture, including Spark Core, **Spark SQL**, DataFrames, Spark Streaming, driver and worker nodes, stages, executors and tasks, deployment modes, the execution hierarchy, fault tolerance, and collection.
+ Collaborate with delivery leadership to deliver projects on time while adhering to quality standards.
+ Contribute to the growth of the Microsoft Azure practice by helping with solutioning for prospects
+ Problem-solving skills along with good interpersonal & communication skills
+ Self-starter who can pick up any other relevant Azure Services in the Analytics space
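A minimal PySpark sketch of the kind of multi-format extraction, aggregation, and Delta load described above; the source paths, column names, and target table are illustrative assumptions.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("usage-insights-sketch").getOrCreate()

# Two hypothetical sources in different file formats
orders = spark.read.parquet("/mnt/raw/orders/")
sessions = spark.read.option("header", "true").csv("/mnt/raw/sessions/")

# Aggregate usage per customer
usage = (
    sessions.groupBy("customer_id")
    .agg(
        F.count("*").alias("session_count"),
        F.sum(F.col("duration_sec").cast("long")).alias("total_duration_sec"),
    )
)

# Join spend and usage into one reporting table
summary = (
    orders.groupBy("customer_id")
    .agg(F.sum("order_amount").alias("total_spend"))
    .join(usage, on="customer_id", how="left")
)

# Persist as a Delta table for downstream dashboards
summary.write.format("delta").mode("overwrite").saveAsTable("analytics.customer_usage")
```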
**Location:**
IN-KA-Bangalore, India (SKAV Seethalakshmi) GESC
**Time Type:**
Full time
**Job Category:**
Information Technology
Arrow Electronics, Inc.'s policy is to provide equal employment opportunities to all qualified employees and applicants without regard to race, color, religion, age, sex, marital status, gender identity or expression, sexual orientation, national origin, disability, citizenship, veteran status, genetic information, or any other characteristics protected by applicable state, federal or local laws. Our policy of equal employment opportunity and affirmative action applies to all employment decisions, personnel policies and practices, and programs.
BI/Reporting Engineer I

Posted 2 days ago
Job Description
BI/Reporting Engineer I
**Job Description:**
**BI Reporting Engineer:**
+ The BI Reporting Engineer will be a key contributor to the Business Intelligence team, responsible for leading the analysis, visualization, and interpretation of complex data to provide actionable insights that drive strategic decision-making.
**What You'll Be Doing:**
+ **Data Modeling and Warehousing:**
+ Apply a strong understanding of data warehousing concepts (e.g., ETL/ELT, dimensional modeling, schema design) to optimize data structures for reporting and analysis.
+ Collaborate with data engineers to ensure data quality, integrity, and accessibility within the data warehouse environment.
+ Design and implement data models in Power BI/OBIEE that are efficient, scalable, and aligned with business needs.
+ **Advanced Data Analysis and Reporting:**
+ Conduct in-depth statistical analysis, identify key trends, and develop predictive insights from complex datasets.
+ Design, develop, and maintain sophisticated interactive dashboards and reports in Power BI/OBIEE to visualize data and communicate findings effectively to various stakeholders.
+ Utilize DAX (Data Analysis Expressions) in Power BI to create complex calculations, measures, and KPIs.
+ Perform data validation and quality checks to ensure the accuracy and reliability of reports and analyses (see the sketch following this section).
+ **Business Intelligence Solutions and Strategy:**
+ Collaborate with business stakeholders to understand their analytical requirements and translate them into effective BI solutions.
+ Proactively identify opportunities to leverage data to improve business processes and performance.
+ Contribute to the development and implementation of the data intelligence strategy and best practices.
+ **Mentorship and Collaboration:**
+ Mentor and provide guidance to junior data analysts on data analysis techniques, Power BI best practices, and data warehousing concepts.
+ Collaborate effectively with cross-functional teams, including IT, business units, and leadership.
+ Present findings and recommendations to both technical and non-technical audiences.
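A minimal Python sketch of the kind of automated data validation and quality checks mentioned above; the file name, key column, and specific checks are illustrative assumptions rather than a prescribed toolset.

```python
import pandas as pd

def run_quality_checks(df, key_column, required_columns):
    """Return a list of human-readable data-quality findings (empty list means all checks passed)."""
    findings = []

    # Completeness: required columns must not contain nulls
    for col in required_columns:
        null_count = int(df[col].isna().sum())
        if null_count:
            findings.append(f"{null_count} null values in required column '{col}'")

    # Uniqueness: the business key must not be duplicated
    dup_count = int(df.duplicated(subset=[key_column]).sum())
    if dup_count:
        findings.append(f"{dup_count} duplicate rows for key '{key_column}'")

    # Sanity: the extract should not be empty
    if df.empty:
        findings.append("dataset is empty")

    return findings

# Hypothetical warehouse extract feeding a Power BI report
revenue = pd.read_csv("revenue_extract.csv")
for issue in run_quality_checks(revenue, "invoice_id", ["invoice_id", "amount", "invoice_date"]):
    print("DATA QUALITY:", issue)
```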
**What We Are Looking For:**
+ 4+ years of relevant experience.
+ Strong understanding of data warehousing principles, advanced proficiency in Power BI and SQL, and proven experience in transforming data into impactful business intelligence solutions.
+ The ideal candidate will be a proactive problem-solver with excellent communication skills and the ability to work independently and collaboratively.
+ Having OBIEE & BI Publisher knowledge would be an added advantage.
**About Arrow**
**Arrow Electronics, Inc. (NYSE: ARW)** is an award-winning Fortune 133 company and one of Fortune Magazine's Most Admired Companies. Arrow guides innovation forward for over 220,000 leading technology manufacturers and service providers. With 2024 sales of USD 27.9 billion, Arrow develops technology solutions that improve business and daily life. Our broad portfolio, which spans the entire technology landscape, helps customers create, make, and manage forward-thinking products that make the benefits of technology accessible to as many people as possible. Learn more at .
Our strategic direction of guiding innovation forward is expressed as Five Years Out, a way of thinking about the tangible future to bridge the gap between what's possible and the practical technologies to make it happen. Learn more at .
For more job opportunities, please visit .
**Location:**
IN-KA-Bangalore, India (SKAV Seethalakshmi) GESC
**Time Type:**
Full time
**Job Category:**
Information Technology
Arrow Electronics, Inc.'s policy is to provide equal employment opportunities to all qualified employees and applicants without regard to race, color, religion, age, sex, marital status, gender identity or expression, sexual orientation, national origin, disability, citizenship, veteran status, genetic information, or any other characteristics protected by applicable state, federal or local laws. Our policy of equal employment opportunity and affirmative action applies to all employment decisions, personnel policies and practices, and programs.
Lead Data Engineer - Reporting
Posted today
Job Description
We are seeking a highly skilled and experienced Senior Power BI Developer to join our dynamic data and analytics team. The ideal candidate has strong technical skills in Power BI and will be responsible for designing, developing, and implementing robust and insightful business intelligence solutions. This role requires a deep understanding of BI concepts, strong SQL skills, and extensive experience with AWS services such as Redshift and RDS PostgreSQL. You will play a crucial role in translating complex business requirements into clear, interactive, and high-performance dashboards and reports, driving data-driven decision-making across the organization.
Responsibilities
- **Power BI Development & Design:**
- Lead the design, development, and implementation of complex, interactive, and user-friendly dashboards and reports using Power BI.
- Translate diverse business requirements into technical specifications and impactful data visualizations.
- Develop and optimize Power BI datasets, analyses, and dashboards for performance, scalability, and maintainability.
- Implement advanced Power BI features such as RBAC, parameters, calculated fields, custom visuals, and dynamic filtering.
- Ensure data accuracy, consistency, and integrity within all Power BI reports and dashboards.
- **Performance Optimization & Governance:**
- Identify and address performance bottlenecks in Power BI dashboards and underlying data sources.
- Implement best practices for Power BI security, user access, and data governance.
- Monitor Power BI usage and performance, recommending improvements as needed.
- Ensure compliance with data security policies and governance guidelines when handling sensitive data within Power BI.
- **Continuous Improvement:**
- Stay up-to-date with the latest features, releases, and best practices in Power BI.
- Proactively identify opportunities for process improvement and automation in BI development workflows.
- Work in an Agile/Scrum environment, actively participating in sprint planning, reviews, and retrospectives.
- Bachelor's degree in Computer Science, Information Technology, Data Analytics, or a related field.
- 8-10 years of overall IT experience, with 4-5 years of hands-on experience designing and developing complex dashboards and reports using Power BI.
- Strong proficiency in SQL (writing complex queries, stored procedures, functions, DDL).
- In-depth understanding of BI concepts.
- Extensive experience with various AWS services relevant to data analytics, including:
- Redshift (data warehouse)
- RDS (relational databases)
- Proven ability to translate business requirements into technical solutions and effective data visualizations.
- Excellent analytical, problem-solving, and critical thinking skills.
- Strong communication and interpersonal skills, with the ability to effectively collaborate with technical and non-technical stakeholders.
- Experience working in an Agile development methodology.
- Ability to work independently, manage multiple priorities, and meet tight deadlines.
- Experience with other BI tools (e.g., Tableau) is a plus, demonstrating a broad understanding of the BI landscape.
- Proficiency in Python or other scripting languages for data manipulation and automation (see the sketch at the end of this listing).
Skills Required
Agile Development, Power BI, Data Visualization, SQL
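A minimal Python sketch of the kind of SQL-driven reporting automation this role calls for, assuming a Redshift or RDS PostgreSQL source reachable via psycopg2; the connection details, schema, and query are illustrative assumptions.

```python
import csv
import os
import psycopg2  # also works against Amazon Redshift, which speaks the PostgreSQL wire protocol

# Hypothetical connection details; in practice these would come from a secrets store
conn = psycopg2.connect(
    host="reporting-cluster.example.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="report_bot",
    password=os.environ["REPORTING_DB_PASSWORD"],
)

# A scheduled extract that could feed a Power BI dataset refresh
QUERY = """
    SELECT region, product_line, SUM(net_sales) AS net_sales
    FROM reporting.daily_sales
    WHERE sale_date >= CURRENT_DATE - 30
    GROUP BY region, product_line
    ORDER BY net_sales DESC;
"""

with conn, conn.cursor() as cur:
    cur.execute(QUERY)
    rows = cur.fetchall()
    headers = [desc[0] for desc in cur.description]

# Write the extract to CSV for downstream consumption
with open("sales_last_30_days.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(headers)
    writer.writerows(rows)
```

Scheduling a script like this with cron, Airflow, or a CI job is one common way such reporting extracts get automated.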
Functional Engineer Reporting
Posted today
Job Description
About AkzoNobel
Since , we’ve been supplying the innovative paints and coatings that help to color people’s lives and protect what matters most. Our world class portfolio of brands – including Dulux, International, Sikkens and Interpon – is trusted by customers around the globe. We’re active in more than countries and use our expertise to sustain and enhance the fabric of everyday life. Because we believe every surface is an opportunity. It’s what you’d expect from a pioneering and long-established paints company that’s dedicated to providing sustainable solutions and preserving the best of what we have today – while creating an even better tomorrow. Let’s paint the future together.
© Akzo Nobel N.V. All rights reserved.
Job Purpose
We are seeking a highly skilled Functional Engineer Reporting with deep experience in building scalable data solutions on the Azure cloud platform. The ideal candidate will have strong architecture-level knowledge, hands-on expertise with Databricks, and proven skills in optimizing large-scale data processing systems. This role will be critical in enabling data-driven decisions within our commercial operations.
Key Activities
• Design, build, and maintain scalable data pipelines using Azure Data Factory and Databricks (a sketch follows this list).
• Define and implement data architecture and modeling strategies to support reporting, analytics, and advanced use cases.
• Build and manage data warehousing solutions on Azure.
• Apply optimization techniques to improve pipeline performance, reduce latency, and handle large data volumes efficiently.
• Integrate and transform data from SAP and other enterprise systems.
• Ensure alignment with data governance, quality, and security standards.
• Proactively troubleshoot and resolve data issues and performance bottlenecks.
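A minimal Databricks (PySpark) sketch of the kind of pipeline described above, assuming SAP extracts landed in Azure Data Lake Storage by Azure Data Factory; the storage path, SAP field names, and target table are illustrative assumptions.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sap-sales-ingest-sketch").getOrCreate()

# Hypothetical landing zone where Azure Data Factory drops SAP extracts
source_path = "abfss://landing@companydatalake.dfs.core.windows.net/sap/sales_orders/"

orders = (
    spark.read.parquet(source_path)
    # Light conformance: map assumed SAP field names to reporting-friendly names
    .withColumnRenamed("VBELN", "sales_document")
    .withColumnRenamed("NETWR", "net_value")
    .withColumn("net_value", F.col("net_value").cast("decimal(18,2)"))
    .withColumn("order_date", F.to_date(F.col("AUDAT"), "yyyyMMdd"))
)

# Partition by order date so downstream reporting queries can prune files
(
    orders.write.format("delta")
    .mode("append")
    .partitionBy("order_date")
    .saveAsTable("commercial.sales_orders")
)

# Databricks Delta maintenance to compact the small files written by frequent loads
spark.sql("OPTIMIZE commercial.sales_orders")
```

Partitioning plus periodic compaction is one common way to keep pipeline latency down as data volumes grow.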
Experience
• 5-9 years of hands-on experience in data engineering.
• Deep knowledge and hands-on experience with Databricks for data processing and transformation.
• Strong architecture-level understanding of cloud-based data platforms and solution design.
• Proficiency in SQL and Python for data manipulation and automation.
• Experience integrating data from SAP systems.
• Strong grasp of performance tuning and optimization techniques for data pipelines.
Preferred Qualifications:
• Experience with CI/CD and DevOps practices in data engineering.
• Familiarity with Big Data technologies (e.g., Hadoop, Spark).
• Exposure to Azure Streaming Services or real-time data processing.
• Understanding of commercial data and related business metrics.
At AkzoNobel we are highly committed to ensuring an inclusive and respectful workplace where all employees can be their best selves. We strive to embrace diversity in a context of tolerance. Our talent acquisition process plays an integral part in this journey, as it sets the foundations for a diverse environment. For this reason, we train our talent acquisition teams and hiring managers on the implications of unconscious bias so they can be mindful of it and take corrective action when applicable. In our organization, all qualified applicants receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age or disability.
Requisition ID:
Data Engineer (AWS Analytics & Reporting)
Posted 3 days ago
Job Description
About Us
We are a tech-first company solving real-world problems with clean code, thoughtful design, and fast iterations. Our team is lean, product-driven, and obsessed with delivering value through scalable data platforms and applications.
What You'll Be Doing
- Build and maintain data pipelines connecting Aurora MySQL → Redshift Serverless → S3/QuickSight
- Design data models, materialized views, and APIs to enable fast reporting and analytics
- Automate scheduled reporting workflows and exports for clients and admins
- Collaborate with engineers, product, and design to deliver insights quickly
- Ensure performance, reliability, and scalability of our analytics infrastructure
Must-Have Skills
- Strong experience with the AWS data stack (Aurora, Redshift, S3, Lambda/EventBridge)
- Solid SQL and data modeling (fact/dimension schemas, materialized views)
- Familiarity with BI tools like QuickSight, Metabase, or Superset
- Good understanding of IAM, security, and data governance principles
Good to Have
- Experience with Python and libraries like awswrangler for data automation (a sketch follows this listing)
- Exposure to ETL/ELT frameworks (Glue, Airflow, dbt)
- Knowledge of data lake best practices (Parquet, partitioning)
- Comfort with infrastructure-as-code (Terraform/CDK)
- Understanding of multi-tenant SaaS reporting architectures
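A minimal Python sketch of the kind of awswrangler-based automation mentioned under Good to Have, moving an aggregated Redshift result set into partitioned Parquet on S3 for QuickSight/Athena; the Glue connection name, schema, table, bucket, and Glue database are illustrative assumptions.

```python
import awswrangler as wr

# Hypothetical Glue connection registered for the Redshift Serverless workgroup
con = wr.redshift.connect("reporting-redshift-connection")

# Pull an aggregated reporting dataset out of Redshift
df = wr.redshift.read_sql_query(
    """
    SELECT tenant_id, report_date, SUM(amount) AS total_amount
    FROM analytics.daily_usage
    GROUP BY tenant_id, report_date
    """,
    con=con,
)
con.close()

# Land it in the data lake as partitioned Parquet and register it in the Glue
# catalog so QuickSight (via Athena) can query it directly
wr.s3.to_parquet(
    df=df,
    path="s3://example-analytics-lake/reports/daily_usage/",
    dataset=True,
    database="analytics_lake",
    table="daily_usage_report",
    partition_cols=["report_date"],
    mode="overwrite_partitions",
)
```

Writing with dataset=True and a database/table keeps the Glue catalog in sync with each run, which generally avoids a separate crawler step.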