
Big Data Engineer (Freelance Interviewer)

Thiruvananthapuram
Freelance
Mid-Level: 8 to 12 years
Posted on Oct 23 2025

About the Job

Skills

DevOps and CI/CD
Hadoop
Kubernetes
OpenShift
Spark
ETL

Big Data Engineer (Hadoop/DevOps) Freelance Interviewer

Experience: 8 to 12 years


About the company

We are an HR Tech company based in Trivandrum, offering hiring support to MNCs across India through interview, assessment, and recruitment services. We have a network of 4,000+ experienced professionals who conduct interviews in their available time slots.

We’re looking for experienced professionals across various domains who can take up freelance interviews for our clients. Interviews are conducted remotely, and schedules are flexible based on your availability. We are currently looking for interview panelists for the role described below.


Skills Required


DevOps and CI/CD: Design, implement, and manage CI/CD pipelines using tools and practices like Jenkins and GitOps to automate and streamline the software development lifecycle.


Containerization and Orchestration: Deploy and manage containerized applications using Kubernetes and OpenShift, ensuring high availability and scalability.


Infrastructure Management: Develop and maintain infrastructure as code (IaC) using tools like Terraform or Ansible.


Big Data Solutions: Architect and implement big data solutions using technologies such as Hadoop, Spark, and Kafka.


Distributed Systems: Design and manage distributed data architectures to ensure efficient data processing and storage.


Collaboration: Work closely with development, operations, and data teams to understand requirements and deliver robust solutions.


Monitoring and Optimization: Implement monitoring solutions and optimize system performance, reliability, and scalability.


Security and Compliance: Ensure infrastructure and data solutions adhere to security best practices and regulatory requirements.


Qualifications:


Education: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.


Experience: Minimum of 5 years of experience in big data engineering or a related role.


Technical Skills:


Proficiency in CI/CD tools and practices such as Jenkins and GitOps.


Strong experience with containerization and orchestration tools like Kubernetes and OpenShift.


Knowledge of big data technologies such as Hadoop, Spark, and ETL frameworks.


Proficiency in scripting languages such as Python, Bash, or Groovy.


Familiarity with infrastructure as code (IaC) tools like Terraform or Ansible.


Soft Skills:


Excellent problem-solving and analytical skills.


Strong communication and collaboration abilities.


Ability to work in a fast-paced, dynamic environment.



About the company

We are a unique evaluation partner, offering interview and assessment services to our clients. Our platform combines technology, panel, and operations resources to assess candidates' skills, conduct interviews and assessments, advertise jobs, screen candidate profiles, hold group discussions, and automate much of the hiring process. We carry out walk-in interviews, campus recruitment, and specialty …

Industry

Human Resources Services

Company Size

11-50 Employees

Headquarter

Trivandrum
