Job Type: Full Time
Job Category: IT

Job Description

Job Title: Hadoop Developer

Location: Charlotte, North Carolina (Onsite)

Employment Type: Full-Time


Job Summary


We are seeking a skilled Hadoop Developer to join our data engineering team. The ideal candidate will design, develop, and maintain scalable big data solutions using the Hadoop ecosystem. You will work closely with data engineers, analysts, and business stakeholders to process large datasets and build reliable data pipelines that support analytics and business intelligence.


Key Responsibilities

·         Design, develop, and maintain big data applications using the Hadoop ecosystem.

·         Build and optimize data pipelines for ingesting, transforming, and processing large-scale datasets.

·         Develop solutions using tools such as Hive, Spark, MapReduce, HDFS, and Kafka.

·         Write efficient SQL queries and scripts for data extraction, transformation, and loading (ETL).

·         Collaborate with data scientists, analysts, and engineering teams to deliver data-driven solutions.

·         Monitor and troubleshoot data processing workflows and cluster performance.

·         Ensure adherence to data quality, data governance, and security best practices.

·         Optimize Hadoop jobs for performance and scalability.

·         Document technical designs, workflows, and development processes.


Key Skills

·         Hadoop Ecosystem

·         Apache Spark

·         Hive & HDFS

·         Kafka

·         Python / Java / Scala

·         ETL & Data Pipelines

·         SQL & Data Warehousing

·         Linux / Shell Scripting


Required Qualifications

·         6+ years of experience in Hadoop or big data development.

·         Strong experience with Hadoop ecosystem tools (HDFS, Hive, Spark, MapReduce, Sqoop, Kafka).

·         Proficiency in Python, Java, or Scala.

·         Experience with ETL tools and data pipeline development.

·         Strong knowledge of SQL and data warehousing concepts.

·         Familiarity with Linux/Unix environments.

·         Experience with distributed data processing and big data architecture.

·         Experience with cloud platforms such as AWS, Azure, or Google Cloud.

·         Familiarity with workflow orchestration tools (Airflow, Oozie).

·         Knowledge of data lakes, data governance, and data security practices.

·         Experience in financial services or large enterprise environments.

Required Skills

·         Cloud Development

·         SQL

·         Application Development

