
Lead Data Engineer in Raleigh

Energy Jobline is the largest and fastest growing global Energy Job Board and Energy Hub. We have an audience reach of over 7 million energy professionals, advertise 400,000+ global energy and engineering jobs each month, and work with the leading energy companies worldwide.

We focus on the Oil & Gas, Renewables, Engineering, Power, and Nuclear markets as well as emerging technologies in EV, Battery, and Fusion. We are committed to ensuring that we offer the most exciting career opportunities from around the world for our jobseekers.

Job Description

Jersey Hired is scouting for a heavy-hitting Senior Data Engineer to join a global consulting powerhouse. This isn't just about moving data from Point A to Point B; it's about architecting a fortress.

 

We need a pro who can design, build, and operate secure, audited, and cost-efficient pipelines on Snowflake. You’ll be the master of the journey: taking raw ingestion through Data Vault 2.0 models and delivering it into high-impact consumption layers. If you’re a Terraform-wielding, dbt-loving, Airflow-orchestrating engineer who treats "audit-ready" as a lifestyle, we want to talk.

 

The Mission

 

  • Architect & Build: Design scalable ingestion frameworks using Qlik, AWS Glue, and custom ETL processes.
  • Model at Scale: Implement Raw → DV 2.0 (Hubs/Links/Sats) → Consumption patterns in dbt Cloud with obsessive testing (uniqueness, relationships, freshness).
  • Snowflake Mastery: Build performant objects (tables, streams, tasks) and fine-tune clustering and micro-partitioning for peak efficiency.
  • Orchestrate Excellence: Author Airflow (MWAA) DAGs and dbt Cloud jobs that are idempotent, rerunnable, and strictly tracked against SLAs.
  • Secure the Perimeter: Enforce RBAC/ABAC, masking, and row-access policies. You’ll operationalize controls that make auditors smile—think change management, separation of duties, and evidence capture.
  • Ops & Observability: Bake tests into dbt, monitor via ACCOUNT_USAGE, and forward metrics to Splunk/Datadog.
  • FinOps: Right-size warehouses and manage multi-cluster concurrency to keep performance high and costs low.
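To give a flavor of the dbt testing discipline the role calls for, here is a minimal, purely illustrative `schema.yml` sketch; all model, column, and source names (`hub_customer`, `customer_hk`, `raw_crm`) are hypothetical, not taken from the actual engagement.

```yaml
# Illustrative dbt schema.yml sketch: uniqueness, relationship, and
# freshness tests of the kind described above. Names are hypothetical.
version: 2

models:
  - name: hub_customer
    columns:
      - name: customer_hk
        tests:
          - unique
          - not_null
  - name: sat_customer_details
    columns:
      - name: customer_hk
        tests:
          - relationships:
              to: ref('hub_customer')
              field: customer_hk

sources:
  - name: raw_crm
    freshness:
      warn_after: {count: 12, period: hour}
      error_after: {count: 24, period: hour}
    tables:
      - name: customers
        loaded_at_field: _loaded_at
```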

What You Bring to the Table

 

The Basics

 

  • Bachelor’s Degree + 6 years of advanced data engineering/enterprise architecture experience.
  • OR a High School Diploma/GED + 10 years of the same high-level experience.

 

Technical Must-Haves

 

  • Snowflake Power User: Deep experience in secure account setup, storage integrations, Snowpipe, and cross-region replication. You understand the networking "under the hood" (AWS PrivateLink, VPC/DNS flows).
  • dbt Cloud Specialist: You know Dimensional and Data Vault 2.0 modeling, Jinja/macros, and the discipline of a DEV/QA/UAT/PROD promotion flow.
  • Airflow (MWAA) Expert: You’ve built modular DAGs, handled backfills, and know exactly when to use Airflow vs. dbt’s orchestration.
  • The Compliance Mindset: You’ve worked in regulated environments (SOX, GLBA, FFIEC, or PCI) and understand runbooks, PIR/RCAs, and audit log immutability.
  • Coding/Cloud: Advanced SQL, Python (ETL/Airflow), and AWS fundamentals (S3, IAM, CloudWatch).
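As context for the masking and row-access requirements above, a minimal Snowflake sketch follows; the policy, table, and role names are illustrative assumptions, not specifics of this role.

```sql
-- Illustrative Snowflake sketch: a column masking policy plus a row
-- access policy. All object and role names are hypothetical.
CREATE MASKING POLICY pii_email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
    ELSE '***MASKED***'
  END;

ALTER TABLE customers MODIFY COLUMN email
  SET MASKING POLICY pii_email_mask;

CREATE ROW ACCESS POLICY region_filter AS (region STRING) RETURNS BOOLEAN ->
  EXISTS (
    SELECT 1 FROM region_entitlements e
    WHERE e.role_name = CURRENT_ROLE() AND e.region = region
  );

ALTER TABLE customers ADD ROW ACCESS POLICY region_filter ON (region);
```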

 

Bonus Points (The "Cherry on Top")

 

  • Experience with Snowflake Governance (Universal Search, masking automation).
  • Familiarity with Iceberg/External Tables or Kafka-driven ingestion.
  • Data quality and observability tools like Great Expectations or Monte Carlo, or governance catalogs like Collibra.
  • Platform Engineering: Reusable Terraform modules, FinOps charge-back utilities, and service-account hardening.
  • BI/Semantic Layer: Designing metric layers for ThoughtSpot, Looker, or Power BI.

 

Does your code survive audits? Do your pipelines never miss an SLA? Apply now and let's get to work.

If you are interested in applying for this job please press the Apply Button and follow the application process. Energy Jobline wishes you the very best of luck in your next career move.


Raleigh, NC
Full time

Published on 03/05/2026
