
Python Developer with PySpark Expertise

Location: Thiruvananthapuram (Remote)
Experience: Mid-Level, 4 to 6 years
Posted on: Feb 19, 2025
Status: Not Accepting Applications
About the Job
Skills: Python, PySpark, SQL, DataFrames, Hadoop, HDFS, Hive
Job Title: Python Developer with PySpark Expertise
Location: Remote
Experience: 4-6 Years
Employment Type: Full-time
Job Description:
We are looking for a skilled Python Developer with PySpark expertise to join our team. The ideal candidate has a strong foundation in Python, big data processing, and Spark (PySpark), and is experienced in handling large-scale data processing workflows.
Key Responsibilities:
- Develop, optimize, and maintain data processing pipelines using PySpark and Python (a minimal sketch follows this list).
- Work with large datasets and ensure efficient data transformation, cleaning, and processing.
- Design and implement ETL workflows for structured and unstructured data.
- Collaborate with data engineers, data scientists, and cloud architects to build scalable data solutions.
- Optimize Spark jobs for performance and scalability on distributed systems.
- Integrate with cloud platforms like AWS, Azure, or GCP for data storage and processing.
- Troubleshoot and resolve issues related to Spark job execution and cluster performance.
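To give a feel for the pipeline work described above, below is a minimal, hypothetical PySpark batch ETL sketch. The paths, table, and column names (e.g. s3://example-bucket/raw/orders/, order_id, amount) are illustrative assumptions, not details taken from this posting.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical example: a minimal batch ETL pipeline in PySpark.
# Paths, column names, and business rules are illustrative only.
spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw, structured CSV input.
raw = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")

# Transform: clean, cast, and aggregate.
cleaned = (
    raw.dropna(subset=["order_id", "amount"])
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_date"))
)
daily_totals = (
    cleaned.groupBy("order_date")
           .agg(F.sum("amount").alias("total_amount"),
                F.countDistinct("order_id").alias("order_count"))
)

# Load: write partitioned Parquet to a data lake location.
daily_totals.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/daily_order_totals/"
)

spark.stop()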
Required Skills & Qualifications:
- 4-6 years of experience in Python development with expertise in PySpark.
- Strong understanding of Spark architecture, RDDs, DataFrames, and Spark SQL (a brief DataFrame/Spark SQL illustration follows this list).
- Hands-on experience with big data technologies such as Hadoop, HDFS, Hive, Kafka, etc.
- Experience in writing optimized and efficient PySpark scripts for batch and real-time data processing.
- Strong knowledge of SQL and database technologies (PostgreSQL, MySQL, NoSQL, etc.).
- Experience with cloud services (AWS, Azure, or GCP) and data lake/data warehouse concepts.
- Familiarity with CI/CD pipelines and version control (Git, Bitbucket, etc.).
- Knowledge of containerization tools like Docker/Kubernetes is a plus.
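As a brief illustration of the DataFrame and Spark SQL skills listed above, the sketch below caches a DataFrame, registers it as a temporary view, and queries it with Spark SQL. The dataset path, view name, columns, and threshold are hypothetical.

from pyspark.sql import SparkSession

# Hypothetical sketch: combining the DataFrame API with Spark SQL.
# The dataset path, columns, and filter values are illustrative only.
spark = SparkSession.builder.appName("spark-sql-example").getOrCreate()

events = spark.read.parquet("s3://example-bucket/curated/events/")

# Cache a frequently reused DataFrame to avoid recomputation across queries.
events.cache()

# Expose the DataFrame to Spark SQL as a temporary view.
events.createOrReplaceTempView("events")

top_users = spark.sql("""
    SELECT user_id, COUNT(*) AS event_count
    FROM events
    WHERE event_type = 'purchase'
    GROUP BY user_id
    HAVING COUNT(*) > 10
    ORDER BY event_count DESC
    LIMIT 20
""")

top_users.show()
spark.stop()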
Preferred Qualifications:
- Experience with workflow orchestration tools like Apache Airflow.
- Exposure to ML/AI pipelines using big data processing.
- Strong problem-solving and analytical skills.
- Ability to work in a collaborative and fast-paced environment.
About the company
We are a unique evaluation partner, offering interview and assessment services to our clients. Our platform combines technology, expert panels, and operational resources to assess candidates' skills, conduct assessments and interviews, advertise jobs, screen candidate profiles, hold group discussions, and, in effect, automate the entire hiring process. We also carry out walk-in interviews, campus recruitment, and specialty hiring for specialized roles.
Industry: Human Resources Services
Company Size: 11-50 Employees
Headquarters: Trivandrum
