Job Type: Full Time
Job Category: IT

Job Description

Job Title: Sr. Qlik Data Engineer
Location: Raleigh, NC
Employment Type: FTE

Responsibilities:

 

• Data Architecture and Strategy - Design and implement scalable, efficient data architectures. Lead the development of a data strategy aligned with business objectives. Evaluate and integrate new technologies to enhance data capabilities.

• Hands-on Data Pipeline Development - Implement complex data pipelines for real-time and batch processing. Optimize data flows for high-volume, high-velocity data environments. Develop advanced ETL processes for diverse data sources.

• Data Governance and Quality Management - Establish and enforce data governance policies and best practices. Implement data quality frameworks and monitoring systems. Ensure compliance with data regulations and standards.

• Performance Optimization and Troubleshooting - Analyze and optimize system performance for large-scale data operations. Troubleshoot complex data issues and implement robust solutions.

• Mentorship and Knowledge Sharing - Mentor junior data engineers and provide technical guidance. Contribute to the development of best practices and standards. Collaborate with cross-functional teams to drive data literacy.

• Testing & Automation - Write unit test cases, validate data integrity and consistency requirements, and build automated data pipelines using GitLab, GitHub, and CI/CD tools.

• Code Deployment & Release Management - Adopt release management processes to promote code deployments to various environments, including production and disaster recovery, and support release activities.

 

Technical/Business Skills:

• Strong hands-on experience building robust, metadata-driven, automated data pipeline solutions leveraging modern cloud-based data technologies and tools for large data platforms.

• Strong experience applying data security and governance methodologies to meet data compliance requirements.

• Strong hands-on experience building medallion-architecture automated ELT data pipelines and Snowpipe frameworks leveraging Qlik Replicate, DBT Cloud, and Snowflake with CI/CD.

• Strong hands-on experience building data pipelines and data integrity solutions across multiple data sources and targets such as SQL Server, Oracle, mainframe DB2, flat files, and Snowflake.

• Experience working with various structured and semi-structured data files - CSV, fixed-width, JSON, XML, Excel, and mainframe VSAM.

• Experience using AWS services such as S3, Lambda, SQS, SNS, Glue, and RDS.

• Excellent proficiency in Python, PySpark, and advanced SQL for ingestion frameworks and automation.

• Hands-on data orchestration experience using DBT Cloud and Astronomer Airflow.

• Experience implementing logging, monitoring, alerting, observability, and performance-tuning techniques.

• Experience implementing and maintaining sensitive data protection strategies - tokenization, Snowflake data masking policies, dynamic and conditional masking, and role-based masking rules.

• Strong experience designing and implementing RBAC and data access controls, and adopting governance standards across Snowflake and supporting systems.

• Strong experience adopting release management guidelines, deploying code to various environments, implementing disaster recovery strategies, and leading production activities.

• Experience implementing schema drift detection and schema evolution patterns.

• Experience in financial services/banking is a plus.

• Must have one or more certifications in relevant technology fields.


