Role: Hadoop Engineer
Location: O'Fallon, Missouri (3 days onsite per week)
Years of experience: 10+
Role & Responsibilities:
Support a big data ETL platform built on top of Hadoop.
Required Skills:
1. Hadoop (HDFS/Ozone, Hive), Spark (Python/Scala/Java), Spark UI, and Unix shell scripting.
2. Understanding of data warehouse, data lake, and lakehouse ETL/ELT concepts, including data quality, governance, and performance tuning (a brief illustrative sketch follows this list).
3. Strong analytical and problem-solving skills.
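For illustration only, a minimal PySpark sketch of the kind of Hive-backed ETL with a basic data-quality check described above; database, table, and column names are hypothetical, not part of the role description.

# Minimal sketch, assuming a Hive-enabled Spark cluster on Hadoop/YARN
# with HDFS or Ozone storage. All object names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("etl-quality-check-sketch")
    .enableHiveSupport()
    .getOrCreate()
)

# Read a raw staging table from Hive (hypothetical name).
raw = spark.table("staging_db.transactions_raw")

# Simple data-quality rule: drop rows missing a key or with non-positive amounts.
clean = (
    raw.filter(F.col("txn_id").isNotNull())
       .filter(F.col("amount") > 0)
       .withColumn("load_date", F.current_date())
)

# Write back as a partitioned Hive table for downstream consumers.
(
    clean.write
         .mode("overwrite")
         .partitionBy("load_date")
         .saveAsTable("curated_db.transactions_clean")
)

spark.stop()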
Desired Skills:
1. Familiarity with ITSM tools such as Remedy and JIRA; understanding of work order (WO), incident (INC), problem (PBI), and change (CRQ) management.
2. Knowledge of Python, Jupyter Notebooks, NiFi, NiFi Registry, Oracle (SQL, PL/SQL), C, DMX/Syncsort, CI/CD (Git, Jenkins/Chef), Airflow, and Kafka/Axon streaming.
3. Exposure to cloud platforms (Azure, AWS, or GCP) and tools such as Databricks is a plus.