Job Type: Contract
Job Category: IT
Job Description
Job Summary:
We are looking for an experienced Hadoop Developer / Administrator to design, implement, and maintain scalable and secure big data solutions. This hybrid role requires hands-on expertise in both developing data processing workflows and administering Hadoop ecosystems in production. The ideal candidate will ensure high availability, performance, and security of our Hadoop clusters while enabling data-driven solutions across the enterprise.
Key Responsibilities:
Development Duties:
- Design, develop, and optimize data pipelines using Hadoop ecosystem tools (Hive, Pig, HDFS, MapReduce, Spark, etc.).
- Work with data scientists and analysts to transform raw data into usable formats.
- Write complex HiveQL or Spark SQL queries for data extraction and reporting.
- Develop data ingestion workflows using tools like Kafka.
- Ensure data quality, validation, and governance processes are applied to all data pipelines.
Administration Duties:
- Install, configure, and maintain Hadoop clusters (Cloudera, Hortonworks, or Apache distributions).
- Monitor cluster performance and perform tuning to ensure system efficiency and uptime.
- Implement security policies, including Kerberos authentication and Ranger- or Sentry-based authorization.
Required Skills:
Cloud Developer