Job Type: Contract
Job Category: IT
Job Description
Job Title: Databricks Engineer with AWS
Location: Boston, MA
Coding Test – Python & Scala
Databricks Engineer: 12+ years of total experience, with 8 years of relevant experience in the mandatory skills.
- Mandatory Skills: Databricks, Hadoop, AWS, Python, Spark, Spark SQL, PySpark, Airflow, and IBM StreamSets
Required Skills & Experience:
- Develop data engineering and ML pipelines in Databricks and across AWS services, including S3, EC2, API, RDS, Kinesis/Kafka, and Lambda, to build serverless applications (see the brief sketch after this list)
- Solid understanding of Databricks fundamentals and architecture, with hands-on experience setting up Databricks clusters and working in the Databricks modules (Data Engineering, ML, and SQL Warehouse).
- Knowledge of the medallion architecture, Delta Live Tables (DLT), and Unity Catalog within Databricks.
- Experience in migrating data from on-prem Hadoop to Databricks/AWS
- Understanding of core AWS services, uses, and AWS architecture best practices
- Hands-on experience across domains such as database architecture, business intelligence, machine learning, advanced analytics, and big data
- Solid knowledge of Airflow
- Solid knowledge of CI/CD pipelines built on AWS technologies
- Experience with application migration: RDBMS, Java/Python applications, model code, Elastic, etc.
- Solid programming background in Scala, Python, and SQL
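For context on the pipeline work described above, here is a minimal PySpark sketch of a bronze-layer ingest on Databricks following the medallion pattern; the S3 path and Unity Catalog table name are hypothetical placeholders, not part of this role's actual environment.

```python
# Minimal sketch: land raw JSON from S3 into a bronze Delta table (medallion pattern).
# The bucket path and catalog/schema/table names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On a Databricks cluster a SparkSession is already provided; getOrCreate() reuses it.
spark = SparkSession.builder.getOrCreate()

# Read raw event files from an S3 landing zone.
raw = spark.read.format("json").load("s3://example-bucket/raw/events/")

# Add an ingestion timestamp before persisting to the bronze layer.
bronze = raw.withColumn("ingest_ts", F.current_timestamp())

# Append into a Delta table registered in Unity Catalog.
(
    bronze.write.format("delta")
    .mode("append")
    .saveAsTable("example_catalog.bronze.events")
)
```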
#Hashtags:
#DatabricksEngineer #AWSCloud #BigDataEngineering #ApacheSpark #DeltaLake #PySpark #CloudDataPlatform #DataPipeline #DataEngineeringJobs #DatabricksOnAWS #CI_CD #ETLDevelopment #AWSGlue #S3Integration #MachineLearningOps #SparkJobs