Job Detail

Job Description

At least 6 years of experience in data development and solutions in complex data environments with large data volumes.

At least 5 years of experience developing complex ETL pipelines.

At least 5 years of SQL/PL/SQL experience developing complex stored procedures and triggers, with the ability to write ad hoc and complex queries for data analysis.

An understanding of E-R data models (conceptual, logical, and physical).

Strong understanding of advanced data warehouse concepts.

Experience scripting in Python/PySpark.

Strong knowledge of and experience with AWS services such as AWS Glue, Amazon S3, AWS Lambda, and Amazon Redshift, as well as Delta Lake.

Good knowledge of version control / source code management tools such as Git.

Experience with Python APIs is a plus.

Experience with Snowflake, Redshift, or BigQuery.

Experience with or knowledge of dbt is a plus.

Experience with Airflow DAGs is a plus.

Experience with data ingestion into Hadoop is a plus.

Strong analytical skills, including the ability to interpret customer business requirements and translate them into technical designs and solutions.

Persuasive communication skills, both verbal and written. Capable of collaborating effectively with a variety of IT and business groups across regions and roles, and of interacting effectively at all levels.

Self-starter with a proven ability to manage multiple concurrent projects with minimal supervision.

Able to manage a complex, ever-changing priority list and resolve conflicts between competing priorities.

Strong problem-solving skills, with the ability to identify where focus is needed and bring clarity to business goals, requirements, and priorities.

Responsibilities

Work on enterprise data platforms and implement data lakes, both on-premises and in the cloud.

Design and implement data solutions using a variety of big data technologies.

Work extensively on major cloud platforms: AWS, Azure, and GCP.

Follow modern engineering practices using agile methodologies.

Collaborate with global clients and gain client handling experience.

How To Apply

Interested candidates can apply for this job by sending a resume to jyoti@goganalytics.com or kajal@goganalytics.com.

About Company

Company name not disclosed. Location: Hyderabad.
