Gurgaon
Bangalore
Full-Time
Mid-Level: 4 to 6 years
₹15L - ₹24L (per year)
Posted on May 06, 2024

About the Job

Skills

BigQuery
Snowflake
Airflow
Stitch
Fivetran
AWS
GCP
ETL

Job Description

Location: Gurugram/Bangalore


MANDATORY SKILLSET: Team Management, Expert SQL, Distributed ETL, Data Pipeline Building, Data Modeling, KPIs, Knowledge of Snowflake


As part of the team, you will be responsible for building and running the data pipelines and services required to support business functions, reports, and dashboards. We rely heavily on BigQuery/Snowflake, Airflow, Stitch/Fivetran, dbt, and Tableau/Looker for our business intelligence, and we embrace AWS with some GCP.
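
For illustration only, here is a minimal sketch of how these pieces typically fit together, assuming Airflow 2.x and a dbt project already configured in the environment; the DAG ID, schedule, and paths are hypothetical and not taken from this posting. Stitch/Fivetran lands raw data in the warehouse on its own schedule, and Airflow then orchestrates the dbt transformations that Tableau/Looker reads.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # Hypothetical daily ELT orchestration: Fivetran/Stitch has already
    # landed raw data in BigQuery/Snowflake; this DAG runs the dbt
    # transformations and then dbt's tests against the fresh models.
    with DAG(
        dag_id="daily_elt",                # illustrative name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                 # Airflow 2.4+ argument
        catchup=False,
    ) as dag:
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt",   # assumed project path
        )
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="dbt test --project-dir /opt/dbt",
        )
        dbt_run >> dbt_test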


As a Data Engineer, you will:

  1. Develop end-to-end ETL/ELT pipelines, working with the Data Analysts of each business function.
  2. Design, develop, and implement scalable, automated processes for data extraction, processing, and analysis in a Data Mesh architecture.
  3. Mentor other junior engineers on the team.
  4. Be a “go-to” expert for data technologies and solutions.
  5. Provide on-the-ground troubleshooting and diagnosis for architecture and design challenges.
  6. Troubleshoot and resolve technical issues as they arise.
  7. Look for ways to improve both what data pipelines the department delivers and how it delivers them.
  8. Translate business requirements into technical requirements, such as the entities that need to be modelled, the dbt models that need to be built, timings, tests, and reports.
  9. Own the delivery of data models and reports end to end.
  10. Perform exploratory data analysis to identify data quality issues early in the process, and implement tests to prevent them in the future.
  11. Work with Data Analysts to ensure that all data feeds are optimised and available at the required times; this can include change data capture (CDC) and other “delta loading” approaches (see the sketch after this list).
  12. Discover, transform, test, deploy, and document data sources.
  13. Apply, help define, and champion data warehouse governance: data quality, testing, coding best practices, and peer review.
  14. Build Looker dashboards for use cases where required.
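
To make item 11 concrete, below is a minimal, hedged sketch of a high-watermark “delta loading” extraction in Python; the table name, the updated_at column, and the watermark value are hypothetical, chosen only to illustrate pulling just the rows changed since the last successful load rather than reloading the whole table.

    from datetime import datetime, timezone

    def build_delta_query(table: str, watermark: datetime) -> str:
        """Build a SQL statement that selects only rows changed since the
        last successful load, using an updated_at high-watermark column
        (hypothetical schema; a real feed might read CDC logs instead)."""
        return (
            f"SELECT * FROM {table} "
            f"WHERE updated_at > '{watermark.isoformat()}' "
            f"ORDER BY updated_at"
        )

    # Example: fetch everything modified since the previous run's watermark.
    last_loaded = datetime(2024, 5, 1, tzinfo=timezone.utc)
    print(build_delta_query("raw.orders", last_loaded))

In a setup like this, the new watermark (for example, the maximum updated_at seen) would be persisted after each run so the next load starts where this one left off.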


Qualifications

  1. 5+ years of extensive development experience with Snowflake or a similar data warehouse technology.
  2. Working experience with dbt and other technologies of the modern data stack, such as Snowflake, Apache Airflow, Fivetran, AWS, Git, and Looker.
  3. Experience with agile processes such as Scrum.
  4. Extensive experience writing advanced SQL statements and tuning their performance.
  5. Experience with data ingestion techniques using custom or SaaS tools such as Fivetran.
  6. Experience in data modelling and the ability to optimise existing and new data models.
  7. Experience in data mining, data warehouse solutions, and ETL, and in using databases in a business environment with large-scale, complex datasets.

Additional Information

Required Qualification: Bachelor of Computer Science (B.Sc. (Computer Science))


About the company

HyrEzy Talent Solutions delivers exceptional service to both clients and candidates. We have a proven track record and are renowned for our high level of success. Because of our outstanding performance, we are the exclusive recruitment company used by a number of our clients. We keep abreast of the latest technology and adapt to the ever-changing needs of the marketplace. Specializes in permanent ...

Company Size

11-50 Employees

Headquarters

Delhi
