Hi,
My name is Rohit Chauhan, and I am a Staffing Specialist at Novia Infotech LLC. I am reaching out to you about an exciting job opportunity with one of our clients.
Job Title: Lead Data Engineer
(Snowflake, dbt, Airflow)
Location: Raleigh, NC / Dallas, TX / Phoenix, AZ
Job Summary
We are seeking a highly skilled Lead Data Engineer to design, build, and operate secure and scalable data pipelines on Snowflake. The ideal candidate will have strong experience in modern data engineering practices including Data Vault 2.0 modeling, orchestration, infrastructure as code, and data governance.
This role requires hands-on experience building end-to-end data pipelines from ingestion to consumption layers using tools such as dbt Cloud, Apache Airflow, and Qlik Sense, while ensuring compliance with regulatory and audit requirements.
The role is onsite and available in:
- Raleigh
- Dallas
- Phoenix
Key Responsibilities
Data Engineering & Pipeline Development
- Design and implement scalable data ingestion frameworks.
- Build end-to-end data pipelines from raw ingestion to consumption layers.
- Implement Data Vault 2.0 models (Hubs, Links, Satellites).
- Develop and optimize Snowflake database objects.
- Ensure high-performance and scalable data pipelines.
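The Data Vault 2.0 work above revolves around hash-keyed Hubs, Links, and Satellites. As a minimal, illustrative Python sketch of the standard hash-key and hash-diff computations (entity names and the MD5 choice are our assumptions, not requirements from this posting):

```python
import hashlib

def hash_key(*business_keys: str) -> str:
    """Data Vault 2.0 hash key: normalize each business key,
    join with a delimiter, then hash (MD5 shown; SHA-256 is
    also common in practice)."""
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

def hash_diff(attributes: dict) -> str:
    """Satellite hash diff: hash all descriptive attributes in a
    fixed column order so change detection is deterministic."""
    payload = "||".join(str(attributes[c]).strip() for c in sorted(attributes))
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

# Hub row keyed on a customer business key (illustrative values)
hub_customer_hk = hash_key("CUST-001")
# Link row joining customer and order hubs
link_order_hk = hash_key("CUST-001", "ORD-9001")
# Same attributes in any dict order yield the same hash diff
assert hash_diff({"name": "Ada", "tier": "gold"}) == hash_diff({"tier": "gold", "name": "Ada"})
```

Normalizing before hashing (trim, uppercase) is what keeps the same business key loaded from two sources from producing two different Hub rows.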
Snowflake Development
- Build and maintain data pipelines on Snowflake.
- Create and optimize tables, streams, and tasks.
- Implement clustering and micro-partition optimization.
- Configure Snowpipe and external tables.
- Optimize warehouse performance and costs.
Data Modeling & Transformation
- Implement data models using dbt Cloud.
- Build dimensional and Data Vault models.
- Develop reusable dbt macros and packages.
- Implement testing and documentation.
- Manage deployment environments.
Orchestration
- Build and manage workflows using Apache Airflow.
- Develop and maintain Airflow DAGs.
- Implement SLAs and monitoring.
- Manage retries and backfills.
- Integrate Airflow with dbt workflows.
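Airflow applies retry policy declared on each task (e.g. `retries`, `retry_delay`, exponential backoff); the scheduler, not user code, runs the loop. As a language-agnostic illustration of what those settings do, here is a small pure-Python sketch (function and parameter names are ours, not Airflow's API):

```python
import time

def run_with_retries(task, retries=3, retry_delay=1.0, backoff=2.0, sleep=time.sleep):
    """Run `task`; on failure, wait retry_delay * backoff**attempt
    and try again, up to `retries` extra attempts, then re-raise."""
    attempt = 0
    while True:
        try:
            return task()
        except Exception:
            if attempt >= retries:
                raise
            sleep(retry_delay * backoff ** attempt)
            attempt += 1

# A flaky task that succeeds on its third call
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

result = run_with_retries(flaky, retries=3, retry_delay=0, sleep=lambda s: None)
```

Backfills follow the same idea at the DAG-run level: re-execute a failed or missing logical date, with idempotent tasks so a rerun is safe.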
Security & Governance
- Implement role-based access controls.
- Configure masking and row access policies.
- Implement data classification and tagging.
- Maintain audit-ready controls.
- Enforce least-privilege access.
Infrastructure & DevOps
- Implement infrastructure as code using Terraform.
- Build CI/CD pipelines.
- Manage environment deployments.
- Implement Git-based workflows.
Data Quality & Observability
- Implement data quality checks.
- Build automated testing frameworks.
- Monitor data pipelines.
- Implement anomaly detection.
- Track lineage and data freshness.
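Freshness and quality checks like those above usually reduce to small predicates evaluated on a schedule. A minimal sketch, with illustrative thresholds of our choosing:

```python
from datetime import datetime, timedelta, timezone

def is_fresh(last_loaded_at, max_lag, now=None):
    """Freshness check: the most recent load must fall within
    `max_lag` of the current time."""
    now = now or datetime.now(timezone.utc)
    return now - last_loaded_at <= max_lag

def null_rate(values):
    """Simple quality metric: fraction of missing values."""
    values = list(values)
    return sum(v is None for v in values) / len(values) if values else 0.0

# Example: a table loaded 1 hour ago passes a 2-hour SLA
now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
fresh_ok = is_fresh(now - timedelta(hours=1), timedelta(hours=2), now=now)
stale = not is_fresh(now - timedelta(hours=3), timedelta(hours=2), now=now)
```

In practice these checks live in dbt tests or Airflow sensors; the logic is the same either way.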
Cost & Performance Optimization
- Optimize Snowflake warehouse sizing.
- Configure auto-suspend and auto-resume.
- Implement resource monitoring.
- Optimize query performance.
- Manage compute costs.
Compliance & Audit
- Implement audit controls.
- Support regulatory compliance.
- Maintain audit logs.
- Document processes and controls.
- Support compliance frameworks such as SOX and PCI.
Required Skills
- 6–8 years of data engineering experience.
- Strong hands-on experience with Snowflake.
- Experience with dbt Cloud.
- Experience with Apache Airflow.
- Experience with Qlik Sense.
- Strong SQL and Python skills.
- Experience with Data Vault 2.0 modeling.
- Experience with ETL frameworks.
- Experience with Terraform.
- Experience with Git workflows.
- Experience with AWS fundamentals.
Preferred Skills
- Experience with data governance tools.
- Experience with Kafka-based ingestion.
- Experience with data observability tools.
- Experience with BI tools.
- Experience with semantic modeling.
- Experience with cost optimization.
Rohit Chauhan
IT Recruiter
4421 Avenida Ln, McKinney, TX 75070
You received this message because you are subscribed to the Google Groups "NoviaJobs" group.
To unsubscribe from this group and stop receiving emails from it, send an email to noviajobs+unsubscribe@googlegroups.com.
To view this discussion visit https://groups.google.com/d/msgid/noviajobs/CAJ0-OE_M9Vv3001Qn--dp95h_PaCzF5nDBRfWq-9kJkB5n_yww%40mail.gmail.com.