Role: Teradata Developer
Location: Charlotte, NC / Plano, TX (Day 1 onsite)
FTE only
Job Description
Must Have Technical/Functional Skills
• 7+ years’ experience with end-to-end ETL and Analytics application development on Teradata-based data-warehouse and analytical platforms
• Extensive experience developing Teradata SQL-based ETL and analytic workflows using native utilities (BTEQ, TPT, FastExport)
• Very good knowledge of Unix/Linux shell scripting and job scheduling tools (e.g., Autosys)
• Knowledge of and experience with CI/CD-based development and deployment using tools such as JIRA and Bitbucket
• Experience with Big Data technologies and toolsets such as Hadoop, Hive, Sqoop, Impala, Kafka, and Python/Spark/PySpark workloads is a plus
• Excellent written communication and diagramming skills
• Strong analytical and problem-solving abilities
• Strong speaking and presentation skills in professional settings
• Excellent interpersonal skills; a team player able to work closely with global teams and business partners
• Positive attitude and flexibility
• Willingness to learn new skills and adapt to changes.
Roles & Responsibilities
• Design, build, and optimize end-to-end ETL/ELT data pipelines using Teradata SQL, BTEQ, TPT, and FastExport.
• Translate business requirements into robust technical specifications, data mappings, and transformation logic.
• Implement incremental loads, SCD strategies, and reconciliation checks to ensure data completeness and accuracy.
• Develop reusable modules, parameterized scripts, and standards for code consistency.
• Model data structures (staging, ODS, dimensional models) aligned to Teradata best practices.