
Databricks Architect

Alexandria, VA

Databricks Architect – Cloud Data & AI Solutions
Location: Hybrid (Remote + Onsite as needed)
Clearance: Must be eligible for Public Trust

We’re looking for a highly skilled Databricks Architect to lead the design and implementation of scalable, secure, and high-performance data platforms. This role is ideal for a seasoned data engineering expert with deep experience in Databricks, cloud platforms, and advanced analytics who is ready to drive innovation and transformation across mission-critical environments.

As a trusted technical leader, you’ll collaborate with engineering, data science, and business teams to deliver modern data architectures and AI-powered solutions that meet the highest standards of performance, compliance, and impact.

Key Responsibilities:

  • Architect and implement scalable data platforms and pipelines using Databricks (Spark, Delta Lake, MLflow, etc.)
  • Lead the migration from legacy systems to modern Lakehouse architectures
  • Design secure, high-performing data ingestion, transformation, and consumption workflows
  • Define and enforce data governance, quality, and security standards in collaboration with InfoSec and Compliance teams
  • Mentor and guide data engineers on Databricks best practices and performance optimization
  • Integrate Databricks with enterprise tools such as Power BI, Tableau, Kafka, and Snowflake
  • Evaluate and implement AI/ML workflows within the Databricks ecosystem
  • Partner with cloud architects to ensure robust infrastructure on AWS, Azure, or GCP
  • Create and maintain technical documentation, architecture diagrams, and data models

Required Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
  • 8+ years of experience in data engineering or architecture roles
  • 3+ years of hands-on experience architecting solutions with Databricks
  • Deep expertise in Apache Spark, Delta Lake, MLflow, and Lakehouse architecture
  • Proficiency in SQL, Python, and/or Scala
  • Strong experience with cloud platforms (AWS, Azure, or GCP)
  • Solid understanding of distributed computing, performance tuning, and data security
  • Familiarity with CI/CD pipelines, DevOps, and orchestration tools (e.g., Airflow, Data Factory)
  • Excellent communication and stakeholder management skills
  • Must be eligible to obtain and maintain a Public Trust clearance

Preferred Qualifications:

  • Databricks Certified Data Engineer or Solutions Architect
  • Experience with MLOps and deploying ML models in production
  • Familiarity with data mesh, data fabric, or other modern enterprise data architectures
  • Background in public sector, healthcare, or financial services is a plus
