Senior Data Engineer IRC262295
GlobalLogic
Date: 3 weeks ago
City: Bengaluru, Karnataka
Contract type: Full time
Description
Job Description
7-12 years of experience with Big Data, PySpark, and Databricks, including ETL/ELT.
Bachelor's or Master's degree in Computer Science and Engineering.
Must be able to work independently with big data analytics engines such as PySpark.
Good experience with Python and microservices.
Excellent knowledge of MySQL/Redis.
Excellent working knowledge of public clouds such as AWS/GCP/Azure.
Hands-on experience with Grafana, ELK, Loki, and Prometheus for monitoring and metrics.
Hands-on experience with infrastructure-as-code tools such as Terraform and Helm.
Good experience with container and orchestration tools such as Docker and Kubernetes.
Job Responsibilities
Work with client teams to design and implement modern, scalable data solutions using a range of new and emerging technologies with Cloud Platforms.
Design and build data pipelines to schedule and orchestrate tasks such as extracting, cleansing, transforming, enriching, and loading data per business needs.
Apply Agile and DevOps techniques and implementation approaches throughout delivery.
Build and deliver data solutions using AWS products and offerings.
Collaborate with engineering, product, and analyst teams across tech sites.
Influence infrastructure architecture by sharing your expertise.
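The extract/cleanse/transform/enrich/load flow described above can be sketched in plain Python. This is an illustrative outline only, not part of the role's codebase; stage names, sample records, and the FX rate are hypothetical, and a production pipeline at this level would typically use PySpark/Databricks with an orchestrator rather than in-memory lists.

```python
# Minimal ETL/ELT sketch: each stage is a function over a list of dict
# records. All stage names and sample data below are illustrative.

def extract():
    # In practice this would read from a source system (e.g. S3, MySQL).
    return [
        {"id": 1, "name": " Alice ", "amount": "100"},
        {"id": 2, "name": "Bob", "amount": None},  # dirty record
    ]

def cleanse(records):
    # Drop records with missing amounts and trim whitespace.
    return [
        {**r, "name": r["name"].strip()}
        for r in records
        if r["amount"] is not None
    ]

def transform(records):
    # Cast amounts from strings to numeric values.
    return [{**r, "amount": float(r["amount"])} for r in records]

def enrich(records, fx_rate=83.0):
    # Add a derived column (hypothetical USD-to-INR conversion).
    return [{**r, "amount_inr": r["amount"] * fx_rate} for r in records]

def load(records):
    # In practice this would write to a warehouse table.
    return records

pipeline = load(enrich(transform(cleanse(extract()))))
```

In a scheduler such as Airflow or Databricks Workflows, each stage would map to a task in a DAG rather than a nested function call.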
Skill Category
Python
Keyskills – Must Have
Python, REST, Django Framework, GitHub, AWS, Docker, Microservices
What we offer
Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders.
Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.
Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.
Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!
High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.
About GlobalLogic
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world’s largest and most forward-thinking companies. Since 2000, we’ve been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.