Lead Data Engineer - Qlik / Snowflake / dbt

Remote, USA Full-time
Overview

This is a remote role that may only be hired in the following locations: NC, TX, AZ.

You will design, build, and operate secure, audited, and cost-efficient data pipelines on Snowflake, from raw ingestion to Data Vault 2.0 models and onward to business-friendly consumption layers (mart/semantic). You'll use Qlik/Glue/ETL tooling for ingestion, dbt Cloud for modeling and testing, MWAA/Airflow and/or dbt Cloud's orchestration for scheduling, and Terraform (with HashiCorp practices) for infrastructure as code. The ideal candidate has hands-on experience with data ingestion frameworks and with Snowflake database/schema design, security, networking, and governance that satisfies regulatory and compliance audit requirements.

Responsibilities

Modeling & Warehousing
• Design and implement scalable data ingestion frameworks.
• Implement Raw → DV 2.0 (Hubs/Links/Satellites) → Consumption patterns in dbt Cloud with robust tests (unique/not null/relationships/freshness).
• Build performant Snowflake objects (tables, streams, tasks, materialized views) and optimize clustering/micro-partitioning.

Orchestration
• Author and operate Airflow (MWAA) DAGs and/or dbt Cloud jobs; design idempotent, rerunnable, lineage-tracked workflows with SLAs/SLOs.

Security & Governance
• Enforce RBAC/ABAC, network policies/rules, masking/row access policies, tags, data classification, and least-privilege role hierarchies.
• Operationalize audit-ready controls (change management, approvals, runbooks, separation of duties, evidence capture).

IaC & DevOps
• Use CI/CD flows, Terraform, and Git branching for code promotion.

Data Quality & Observability
• Bake tests into dbt; implement contract checks, reconciliations, and anomaly alerts.
• Monitor with Snowflake ACCOUNT_USAGE/INFORMATION_SCHEMA and event tables, and forward logs/metrics to SIEM/APM (e.g., Splunk, Datadog).

Cost & Performance
• Right-size warehouses; configure auto-suspend/auto-resume, multi-cluster warehouses for concurrency, and resource monitors; optimize queries.

Compliance
• Build controls and evidence to satisfy internal audit and SOX/GLBA/FFIEC/PCI-like expectations.

Qualifications

• Bachelor's degree and 6 years of experience in advanced data engineering, enterprise architecture, and project leadership, OR
• High school diploma or GED and 10 years of experience in advanced data engineering, enterprise architecture, and project leadership.

Preferred:

Snowflake Platform (hands-on, production):
• Secure account setup: databases/schemas/stages, RBAC/ABAC role design, grants, network policies/rules, storage integrations.
• Data protection: Dynamic Data Masking, Row Access Policies, tag-based masking, PII classification/lineage tagging.
• Workloads & features: Streams/Tasks, Snowpipe, external tables, file formats, copy options, retries & dedupe patterns.
• Operations: warehouse sizing, multi-cluster, resource monitors, Time Travel & Fail-safe, cross-region/account replication.
• Networking concepts: AWS PrivateLink/S3 access patterns, external stages, at least high-level familiarity with VPC/DNS/endpoint flows.

dbt Cloud:
• Dimensional + Data Vault 2.0 modeling in dbt (Hubs/Links/Satellites), snapshots, seeds, exposures, Jinja/macros, packages, artifacts.
• Testing and documentation discipline; deployment environments (DEV/QA/UAT/PROD) and job orchestration.

Orchestration:
• Airflow (MWAA): Operators/Sensors (dbt, Snowflake, S3), XComs, SLAs, retries, backfills, alerting, and modular DAG design (a minimal DAG sketch follows this list).
• Experience deciding when to run in dbt Cloud orchestration vs. Airflow, and integrating both cleanly.
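For illustration only, here is a minimal sketch of the kind of daily MWAA/Airflow DAG this role describes: a scheduled, retryable task with an SLA that triggers a dbt Cloud job over dbt Cloud's v2 "trigger job run" API. The account ID, job ID, the "dbt_cloud_api_token" Airflow Variable, and the DAG/task names are hypothetical placeholders, not details from this posting.

```python
# Sketch only: a daily MWAA/Airflow DAG that triggers a dbt Cloud job.
# DBT_ACCOUNT_ID, DBT_JOB_ID, the "dbt_cloud_api_token" Variable, and all
# DAG/task names are hypothetical placeholders.
from datetime import datetime, timedelta

import requests
from airflow import DAG
from airflow.models import Variable
from airflow.operators.python import PythonOperator

DBT_ACCOUNT_ID = 12345  # hypothetical
DBT_JOB_ID = 67890      # hypothetical


def trigger_dbt_cloud_job(**context):
    # POST to dbt Cloud's v2 "trigger job run" endpoint and return the run id
    # so a downstream sensor could poll it to completion.
    token = Variable.get("dbt_cloud_api_token")
    resp = requests.post(
        f"https://cloud.getdbt.com/api/v2/accounts/{DBT_ACCOUNT_ID}/jobs/{DBT_JOB_ID}/run/",
        headers={"Authorization": f"Token {token}"},
        json={"cause": f"airflow raw_to_dv2_daily {context['ds']}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["data"]["id"]


with DAG(
    dag_id="raw_to_dv2_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
    default_args={
        "retries": 2,
        "retry_delay": timedelta(minutes=5),
        "sla": timedelta(hours=2),  # misses surface in Airflow's SLA report
    },
) as dag:
    PythonOperator(
        task_id="trigger_dbt_cloud_job",
        python_callable=trigger_dbt_cloud_job,
    )
```

Keeping the Airflow task a thin trigger and letting dbt Cloud own the run state is one way to keep DAGs rerunnable; in practice a sensor would poll the returned run ID so Airflow's retry and SLA semantics cover the full dbt run.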
Data Quality & Observability:
• Contract tests, reconciliations, freshness SLAs, anomaly detection; surfacing lineage and test results to stakeholders.
• Query tuning (profiling, pruning, statistics awareness, result caching).

Audit & Controls:
• Change control with approvals/evidence, break-glass procedures, production access separation, audit log retention/immutability.
• Runbooks, PIR/RCAs, control mapping (e.g., to SOX/GLBA/PCI-like controls where relevant).

Programming & Cloud:
• Python (ETL utilities, Airflow tasks), SQL (advanced), and AWS basics (S3, IAM, CloudWatch, MWAA fundamentals); a minimal Python sketch combining these follows this list.
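To make the freshness-SLA expectation concrete, here is a minimal Python sketch of a check against Snowflake's ACCOUNT_USAGE layer. The role and warehouse names, environment variables, schema, and 24-hour threshold are hypothetical; note that ACCOUNT_USAGE views lag real time (roughly 45 minutes to a few hours depending on the view), so INFORMATION_SCHEMA may suit tighter SLAs.

```python
# Sketch only: flag tables that have not been modified within a freshness SLA.
# Connection parameters, MONITOR_ROLE, MONITOR_WH_XS, and the MART schema are
# hypothetical placeholders, not details from this posting.
import os

import snowflake.connector

FRESHNESS_SQL = """
    select table_name, last_altered
    from snowflake.account_usage.tables
    where table_schema = %(schema)s
      and deleted is null
      and last_altered < dateadd('hour', -24, current_timestamp())
"""


def stale_tables(schema: str) -> list:
    # Return (table_name, last_altered) rows violating the 24h freshness SLA.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role="MONITOR_ROLE",        # hypothetical least-privilege role
        warehouse="MONITOR_WH_XS",  # hypothetical right-sized warehouse
    )
    try:
        with conn.cursor() as cur:
            cur.execute(FRESHNESS_SQL, {"schema": schema})
            return cur.fetchall()
    finally:
        conn.close()


if __name__ == "__main__":
    for name, last_altered in stale_tables("MART"):
        # In production, forward these to SIEM/APM instead of printing.
        print(f"STALE: {name} last altered {last_altered}")
```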
Bonus Skills:
• Snowflake governance: data classification at scale, Universal Search, tags + masking automation.
• Iceberg/external table strategies; Kafka or event-driven ingestion patterns.
• Great Expectations, Monte Carlo/Anomalo/Atlan/Collibra/BigID integrations.
• dbt: advanced macros, dbt Mesh, custom materializations, Slim CI, state comparison, deferral, exposures to BI lineage.
• BI/Semantic: ThoughtSpot/Looker/Power BI metric-layer design; semantic modeling concepts.
• Packaging & distribution: internal dbt packages, reusable Terraform modules, cookie-cutter project templates.
• Platform engineering: FinOps for Snowflake, cost charge-back/show-back, warehouse auto-tuning utilities.
• Security engineering: SCIM/SSO (Okta), MFA patterns, service-account hardening, ephemeral credentials.
• SRE practices: SLIs/SLOs, on-call runbooks, incident management.

#LI-XG1

Benefits are an integral part of total rewards, and First Citizens Bank is committed to providing a competitive, thoughtfully designed, and quality benefits program to meet the needs of our associates. More information can be found at