IN_Senior Associate_Azure Data Engineer_Data and Analytics_Advisory_Pune

PwC India


Date: 2 days ago
City: Pune, Maharashtra
Contract type: Full time
Line of Service

Advisory

Industry/Sector

Not Applicable

Specialism

Data, Analytics & AI

Management Level

Senior Associate

Job Description & Summary

At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth.

In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC

At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary:

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities:

  • Design, develop, and optimize data pipelines and ETL processes using PySpark or Scala to extract, transform, and load large volumes of structured and unstructured data from diverse sources.
  • Implement data ingestion, processing, and storage solutions on the Azure cloud platform, leveraging services such as Azure Databricks, Azure Data Lake Storage, and Azure Synapse Analytics.
  • Develop and maintain data models, schemas, and metadata to support efficient data access, query performance, and analytics requirements.
  • Monitor pipeline performance, troubleshoot issues, and optimize data processing workflows for scalability, reliability, and cost-effectiveness.
  • Implement data security and compliance measures to protect sensitive information and ensure regulatory compliance.


Requirements

  • Proven experience as a Data Engineer, with expertise in building and optimizing data pipelines using PySpark, Scala, and Apache Spark.
  • Hands-on experience with cloud platforms, particularly Azure, and proficiency in Azure services such as Azure Databricks, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database.
  • Strong programming skills in Python and Scala, with experience in software development, version control, and CI/CD practices.
  • Familiarity with data warehousing concepts, dimensional modeling, and relational databases (e.g., SQL Server, PostgreSQL, MySQL).
  • Experience with big data technologies and frameworks (e.g., Hadoop, Hive, HBase) is a plus.


Mandatory Skill Sets:

Databricks, PySpark, Azure

Preferred Skill Sets:

Databricks, PySpark, Azure

Years Of Experience Required:

4 - 8

Education Qualification:

B.Tech / M.Tech / MBA / MCA

Education (if blank, degree and/or field of study not specified)

Degrees/Field of Study required: Master of Engineering, Bachelor of Technology, Bachelor of Engineering

Degrees/Field of Study Preferred:

Certifications (if blank, certifications not specified)

Required Skills

Microsoft Azure, PySpark

Optional Skills

Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline, Data Quality, Data Transformation, Data Validation {+ 18 more}

Desired Languages (If blank, desired languages not specified)

Travel Requirements

Available for Work Visa Sponsorship?

Government Clearance Required?

Job Posting End Date

