Project Delivery Specialist - Lead Databricks Engineer - Data Lakes - Remote
Deloitte LLP
Hybrid, United States
Posted 3 hours ago

Job Details

Lead Databricks Engineer - Data Lake - Project Delivery Specialist

Are you an experienced, passionate pioneer in technology who wants to work in a collaborative environment? If so, consider an opportunity with Deloitte under our Project Delivery Talent Model. As a Lead Databricks Engineer - Data Lake, you will be able to share new ideas and collaborate on projects as a consultant without the extensive demands of travel. The Project Delivery Model (PDM) is tailored specifically for long-term, onsite client service delivery.

Work you'll do/Responsibilities
  • Lead end-to-end delivery of data lake/lakehouse workloads on Databricks: ingestion, processing, curation, and consumption layers.
  • Architect and implement pipelines using Spark (PySpark/Scala), Delta Lake, and orchestration tooling (e.g., Databricks Workflows, ADF/Airflow, as applicable).
  • Define data modeling patterns (bronze/silver/gold, dimensional/denormalized, CDC handling) and optimize for analytics use cases (see the illustrative sketch after this list).
  • Establish engineering standards: repo structure, coding conventions, branching strategy, documentation, and reusable frameworks.
  • Implement data quality and validation controls (reconciliation, anomaly detection, schema enforcement, expectations).
  • Drive performance tuning: partitioning, file sizing, Z-ORDER, caching, cluster sizing, job parallelism, and query optimization.
  • Partner with governance/security to implement access controls (e.g., Unity Catalog, RBAC/ABAC), PII handling, encryption, and audit logging.
  • Build and maintain CI/CD (continuous integration/continuous delivery) for notebooks and jobs (Databricks Repos, Git integration; IaC where applicable).
  • Monitor and improve operational reliability: observability dashboards, alerting, runbooks, incident response support.
  • Manage dependencies across teams (source system owners, platform/infrastructure, BI, data science) and provide delivery estimates and plans.
  • Mentor engineers, conduct code reviews, and lead technical design sessions.
  • Communicate regularly with Engagement Managers (Directors), project team members, and representatives from various functional and/or technical teams, including escalating any matters that require additional attention and consideration from engagement management.
  • Independently and collaboratively lead client engagement workstreams focused on improvement, optimization, and transformation of processes, including implementing leading-practice workflows, addressing deficits in quality, and driving operational outcomes.
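To make the bronze/silver/gold and performance-tuning bullets above concrete, here is a minimal PySpark sketch of the kind of Delta Lake work described. It is an illustration only, not part of the role description; the table names (bronze.events, silver.events), columns, dedup key, and Z-ORDER column are hypothetical placeholders that would depend on the client workload.

    # Minimal sketch only: a bronze-to-silver Delta Lake step with basic quality
    # checks, partitioning, and Z-ORDER compaction. All names are placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Bronze: raw ingested events landed as a Delta table.
    bronze = spark.read.table("bronze.events")

    # Silver: enforce simple expectations, normalize types, deduplicate on a business key.
    silver = (
        bronze
        .filter(F.col("event_id").isNotNull())              # basic data quality expectation
        .withColumn("event_ts", F.to_timestamp("event_ts"))
        .withColumn("event_date", F.to_date("event_ts"))    # partition column
        .dropDuplicates(["event_id"])                        # simple CDC-style dedup
    )

    (silver.write
        .format("delta")
        .mode("overwrite")
        .partitionBy("event_date")                           # partition pruning for analytics reads
        .saveAsTable("silver.events"))

    # Performance tuning: compact small files and co-locate rows on a common filter column.
    spark.sql("OPTIMIZE silver.events ZORDER BY (customer_id)")

In practice, the expectation rules, deduplication key, partitioning scheme, and Z-ORDER column would be chosen per workload as part of the standards and tuning work this role owns.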

The Team

Our Deloitte Customer team empowers organizations to build deeper relationships with customers through innovative strategies, advanced analytics, Generative AI, transformative technologies, and creative design. We can enhance customer experiences and drive sustained growth and customer value creation and capture through customer and commercial strategies, digital products and innovation, marketing, commerce, sales, and service. We are a team of strategists, data scientists, operators, creatives, designers, engineers, and architects. Our team balances business strategy, technology, creativity, and ongoing managed services to solve the biggest problems that affect customers, partners, constituents, and the workforce.

Our Hybrid Cloud & Infrastructure Engineering teams work with the Customer group to bring a flexible capability and fluid capacity model to the delivery of small technology projects and enhancements.

Qualifications

Required:
  • 7+ years of experience with Apache Spark, Databricks SQL, Python, ETL, Lakeflow, and cloud orchestration
  • 7+ years in data engineering with strong hands-on Databricks experience in production environments.
  • Advanced skills in Spark and PySpark/Scala, with strong SQL capabilities.
  • Deep experience with Delta Lake and lakehouse/data lake design patterns.
  • Experience building batch and/or streaming pipelines; familiarity with CDC concepts.
  • Proven technical leadership (leading squads, setting standards, design ownership).
  • Experience with cloud storage and security fundamentals (object storage, networking basics, IAM).
  • Bachelor's degree, preferably in Computer Science, Information Technology, Computer Engineering, or a related IT discipline; or equivalent experience
  • Limited immigration sponsorship may be available
  • Ability to travel 10%, on average, based on the work you do and the clients and industries/sectors you serve

Information for applicants with a need for accommodation: https://www2.deloitte.com/us/en/pages/careers/articles/join-deloitte-assistance-for-disabled-applicants.html
Company Details
Deloitte LLP
 New York City, NY, United States

Don't imagine what's next. Discover it. We provide industry-leading audit & assurance services, consulting, tax and advisory services to many of...
