Data Engineer-Data Platforms
IBM
Date: 3 weeks ago
City: Pune, Maharashtra
Contract type: Full time
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
Your Role And Responsibilities
- As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating source-to-target pipelines/workflows and implementing solutions that address the client's needs (a minimal pipeline sketch follows this list).
Your primary responsibilities include:
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing, data-driven organization.
- Coordinate data access and security so that data scientists and analysts can easily access data whenever they need it.
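The source-to-target pipeline work described above can be pictured with a short PySpark sketch. The paths, column names, and partitioning scheme below are illustrative assumptions, not details taken from the job description.

```python
# Minimal source-to-target ETL sketch in PySpark (hypothetical paths and columns).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("source-to-target-etl").getOrCreate()

# Extract: read raw source data from an assumed landing zone.
raw = spark.read.option("header", True).csv("s3a://landing-zone/orders/")

# Transform: basic cleansing and a derived business measure.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn(
           "net_amount",
           F.col("gross_amount").cast("double") - F.col("discount").cast("double"),
       )
       .filter(F.col("net_amount") >= 0)
)

# Load: write the curated target dataset, partitioned for downstream consumers.
(cleaned.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3a://curated-zone/orders/"))
```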
Required Education: Master's Degree
Required Technical And Professional Expertise
- Must have 5+ years of experience in Big Data: Hadoop, Spark, Scala, Python, HBase, and Hive.
- Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, and Git.
- Experience developing Python and PySpark programs for data analysis, including using Python to build custom frameworks for generating rules (similar to a rules engine).
- Experience developing Python code to gather data from HBase and designing solutions implemented with PySpark, using Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects for read/write operations (see the Spark/Hive sketch after this list).
- Understanding of DevOps.
- Experience in building scalable end-to-end data ingestion and processing solutions.
- Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala.
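The Spark/Hive pattern named in the list above (DataFrame transformations plus Hive read/write) roughly looks like the following sketch. Table names, column names, and the aggregation are assumptions for illustration; the HBase ingestion step, which would normally go through an HBase-Spark connector, is omitted here.

```python
# Sketch: read a Hive table, apply a business transformation with the DataFrame
# API, and write the result back to Hive. Names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("hive-transform")
    .enableHiveSupport()   # modern replacement for the older HiveContext entry point
    .getOrCreate()
)

# Read from an assumed Hive source table.
events = spark.table("raw_db.customer_events")

# Business transformation: aggregate purchase events per customer.
summary = (
    events.filter(F.col("event_type") == "purchase")
          .groupBy("customer_id")
          .agg(
              F.count("*").alias("purchase_count"),
              F.sum("amount").alias("total_spend"),
          )
)

# Write back to Hive for analysts and downstream jobs.
summary.write.mode("overwrite").saveAsTable("curated_db.customer_purchase_summary")
```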