Currently accepting applications
Rishabh Software

Big Data Lead

Senior (7 to 10 years)
Vadodara
Rs. 10,00,000 - Rs. 20,00,000 (per annum)

Technical Skills

Mandatory (minimum 4 years of working experience)

Proven hands-on experience with Hadoop, Sqoop, Hive, YARN, Pig, and Impala

Experienced in working with NoSQL databases such as HBase, Cassandra, and MongoDB

Experienced in working with streaming data using technologies like Kafka and Spark

Experience working with a variety of structured and unstructured data formats (Parquet, Delta Lake, Avro, XML, JSON, YAML, CSV, Zip, Xlsx, text)

Solid programming experience in Java, Python, or Scala

Strong analytical skills for working with structured and unstructured datasets

Experienced in working with distributed (multi-tiered) and real-time systems

Experience architecting big data pipelines in the cloud (AWS, Azure, or GCP)

Ability to design and tune architectures that scale to 10x the data volume while remaining highly available and cost-effective

Experienced in data modelling, architecture, and data governance on Hadoop clusters

Expertise in ETL and a good understanding of joins, partitions, and query optimization

Well versed in SDLC methodologies and practices, including Agile

Good To Have (1+ years of working experience)

Experience working with workflow managers like Airflow, Prefect, Luigi, and Oozie

Experience working with data governance tools like Apache Atlas, Apache Sentry, and Apache Ranger

Knowledge of Docker, Kubernetes

Experience using Elasticsearch / Apache Solr, and AWS Redshift / Google BigQuery

Soft Skills

Good verbal and written communication skills

Ability to collaborate and work effectively in a team

Excellent analytical and logical skills

Education

Preferred: Graduate or Post Graduate degree with a specialization in Computer Science or Information Technology

Relevant Experience: 6+ years

Expertia AI Technologies Pvt. Ltd.,
Sector 1, HSR Layout, Bangalore 560102


© 2021 Expertia AI. All Rights Reserved.