Role: Snowflake Data Engineer
Location: RTP, NC - Onsite
Please include the candidate's visa status, rate, and current location.
JD:
• Design, develop, and maintain data pipelines and ELT workflows in Snowflake using DBT, Python, and other tools.
• Implement and optimize data models, Snowflake schemas, and SQL transformations for scalable analytics solutions.
• Develop and manage user-defined functions (UDFs), stored procedures, and SnowSQL scripts for automation and advanced data processing.
• Integrate data from multiple sources (e.g., Oracle, Teradata, APIs, streaming platforms) into Snowflake with high performance and reliability.
• Optimize query performance, storage usage, and compute resources in Snowflake.
• Implement data quality, monitoring, and governance best practices.
• Collaborate with cross-functional teams including Data Analysts, Architects, and BI developers to deliver robust end-to-end data solutions.
Required Skills:
• 4–10 years of overall experience in Data Engineering.
• Strong hands-on experience in Snowflake — including data load, schema design, performance tuning, and SnowSQL scripting.
• Strong programming experience in Python for data ingestion, transformation, and automation.
• Proficiency in DBT (Data Build Tool) for modeling, transformations, and workflow orchestration.
• Excellent command over SQL (complex queries, optimization, window functions, etc.).
• Experience working with large-scale data sets and performance optimization techniques.
Good to Have:
• Exposure to real-time data ingestion frameworks (Kafka, Kinesis, Spark Streaming, etc.) and streaming analytics.
• Experience with ETL/ELT tools (Informatica, Talend, Airflow, etc.).
• Understanding of data warehousing best practices and cloud platforms (AWS, Azure, GCP).
• Knowledge of JavaScript for Snowflake stored procedures.
Soft Skills:
• Strong problem-solving and analytical mindset.
• Excellent communication and collaboration skills.
• Ability to work in a fast-paced, customer-focused environment.