Databricks Data Engineer - Senior Consultant
Onsite
United States
Job Details
Job Title: Senior Consultant - Databricks Senior Data Engineer
Job Summary
As a Senior Data Engineer, you will oversee the end-to-end design, deployment, and optimization of enterprise-scale data engineering solutions using Databricks on any major cloud platform (AWS, Azure, or GCP). This highly strategic role focuses on leading innovation in big data architecture and analytics, shaping best practices, advising senior stakeholders, and ensuring that data solutions align with business objectives and drive measurable results.
Recruiting for this role ends on 2/13/2026.
Work you'll do
Key Responsibilities
- Architect and Deliver Solutions: Lead the development, implementation, and scaling of advanced data engineering solutions using Databricks across AWS, Azure, or GCP environments.
- Champion Best Practices: Establish, document, and promote best-in-class approaches for data architecture, integration, and modeling.
- Pipeline Ownership: Oversee the design, development, and maintenance of robust data pipelines and data architectures that support large-scale, enterprise data needs (an illustrative sketch follows this list).
- Drive Excellence: Initiate and manage efforts to improve data quality, operational efficiency, and process scalability.
- Technology Leadership: Evaluate, pilot, and integrate new big data and analytics technologies, ensuring the organization remains at the cutting edge.
- Strategic Data Governance: Consult on, design, and implement governance, security, and compliance strategies tailored to modern cloud data ecosystems.
- Team Leadership and Mentoring: Lead, coach, and develop teams of data engineers and architects, fostering technical growth and effective project delivery.
- Stakeholder Engagement: Communicate technical concepts and business value to diverse stakeholders, including executives, business leads, and technology teams.
- DevOps and Automation: Oversee the implementation of CI/CD practices with tools such as Azure DevOps, AWS CodePipeline, Jenkins, TFS, or PowerShell for streamlined deployments and operations.
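For context on the kind of pipeline work described above, here is a minimal PySpark sketch of a bronze-to-silver refinement step on Delta Lake. It is illustrative only; the table and column names (bronze.raw_events, silver.events, event_id, event_ts) are hypothetical and not part of this posting.

```python
# Minimal bronze-to-silver refinement step on Delta Lake (illustrative sketch only).
# Assumes execution as a Databricks notebook or job task; all table/column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # already provided on Databricks

# Read the raw (bronze) Delta table.
bronze = spark.read.table("bronze.raw_events")

# Basic cleansing: drop duplicate events, standardize the timestamp, filter out bad records.
silver = (
    bronze.dropDuplicates(["event_id"])
          .withColumn("event_ts", F.to_timestamp("event_ts"))
          .filter(F.col("event_id").isNotNull())
)

# Publish the refined (silver) table as a managed Delta table.
silver.write.format("delta").mode("overwrite").saveAsTable("silver.events")
```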
Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field (Master's preferred).
- 5+ years of hands-on experience in data engineering with a strong focus on Databricks, deployed on any major cloud (AWS, Azure, GCP).
- Technical Proficiency (minimum of 5 years), including:
  - Expertise with cloud-native databases, storage solutions, and distributed compute platforms.
  - Deep understanding of Lakehouse architecture, Apache Spark, Delta Lake, and related big data technologies.
  - Advanced skills in data warehousing, 3NF, dimensional modeling, and enterprise-level data lakes.
  - Experience with Databricks components including Delta Live Tables, Auto Loader, Structured Streaming, Databricks Workflows, and orchestration tools (e.g., Apache Airflow); see the ingestion sketch after this list.
  - Expertise in designing and supporting incremental data loads and building metadata-driven ingestion/data quality frameworks using PySpark.
  - Hands-on experience with Databricks Unity Catalog and implementing fine-grained security and access control (see the access-control sketch after this list).
  - Proven track record of deploying code and solutions via automated CI/CD pipelines.
- A minimum of 1 year of leadership experience managing complex, cross-functional data projects and technical teams.
- Experience with performance optimization of data engineering pipelines, code, and compute resources.
- Ability to travel up to 50%, on average, based on the work you do and the clients and industries/sectors you serve.
- Limited immigration sponsorship may be available.
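To make the Auto Loader and incremental-load expectations above concrete, the following is a hedged sketch of an Auto Loader (cloudFiles) stream that incrementally ingests newly arrived JSON files into a Delta table. The source path, schema/checkpoint locations, and target table name are placeholders invented for illustration.

```python
# Incremental file ingestion with Databricks Auto Loader + Structured Streaming (illustrative sketch).
# The source path, schema/checkpoint locations, and target table are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided by the Databricks runtime

raw_path = "s3://example-landing-bucket/events/"  # hypothetical cloud storage location

stream = (
    spark.readStream.format("cloudFiles")                                     # Auto Loader source
    .option("cloudFiles.format", "json")                                      # format of incoming files
    .option("cloudFiles.schemaLocation", "/Volumes/main/ops/schemas/events")  # schema inference/tracking
    .load(raw_path)
)

(
    stream.writeStream
    .option("checkpointLocation", "/Volumes/main/ops/checkpoints/events")     # exactly-once progress tracking
    .trigger(availableNow=True)                                               # process the backlog, then stop
    .toTable("bronze.raw_events")                                             # append into a Delta table
)
```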
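Similarly, a minimal Unity Catalog access-control sketch, assuming a hypothetical main.sales schema, an orders table, and analysts/finance account groups; real grants would follow the client's governance model.

```python
# Unity Catalog fine-grained access control via SQL (illustrative; objects and groups are hypothetical).
# `spark` is the active SparkSession in a Databricks notebook or job.

# Coarse grants: let the `analysts` group browse the catalog/schema and read one table.
spark.sql("GRANT USE CATALOG ON CATALOG main TO `analysts`")
spark.sql("GRANT USE SCHEMA ON SCHEMA main.sales TO `analysts`")
spark.sql("GRANT SELECT ON TABLE main.sales.orders TO `analysts`")

# Finer-grained control: a dynamic view that masks a sensitive column for non-finance users.
spark.sql("""
    CREATE OR REPLACE VIEW main.sales.orders_masked AS
    SELECT
        order_id,
        CASE WHEN is_account_group_member('finance') THEN amount ELSE NULL END AS amount
    FROM main.sales.orders
""")
spark.sql("GRANT SELECT ON TABLE main.sales.orders_masked TO `analysts`")  # views are securable as tables
```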
Preferred:
- Comprehensive knowledge of the AWS, Azure, and GCP cloud ecosystems and associated big data stacks is strongly preferred.
- Demonstrated skill in performance tuning and optimization within Databricks/Apache Spark environments (a brief sketch follows this list).
- Stays current with the latest Databricks feature releases and platform enhancements.
- Exceptional communication and stakeholder management abilities, including comfort interfacing with executive leadership.
- Experience with Databricks Lakeflow is a plus.
- Experience in AI/ML is a plus.
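For the performance-tuning item above, a small hedged sketch of two routine Databricks/Spark optimizations: Delta file compaction with Z-ordering, and a broadcast join hint to avoid a shuffle. Table and column names are placeholders.

```python
# Two routine Spark/Delta performance optimizations (illustrative; names are hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided on Databricks

# 1) Compact small files and co-locate rows on a frequently filtered column.
spark.sql("OPTIMIZE silver.events ZORDER BY (customer_id)")

# 2) Broadcast a small dimension table into the join to avoid shuffling the large fact table.
events = spark.read.table("silver.events")
customers = spark.read.table("silver.customers")
enriched = events.join(F.broadcast(customers), "customer_id")
enriched.write.format("delta").mode("overwrite").saveAsTable("gold.events_enriched")
```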
The wage range for this role takes into account the wide range of factors that are considered in making compensation decisions including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The disclosed range estimate has not been adjusted for the applicable geographic differential associated with the location at which the position may be filled. At Deloitte, it is not typical for an individual to be hired at or near the top of the range for their role and compensation decisions are dependent on the facts and circumstances of each case. A reasonable estimate of the current range is $136,700 - $188,900.
You may also be eligible to participate in a discretionary annual incentive program, subject to the rules governing the program, whereby an award, if any, depends on various factors, including, without limitation, individual and organizational performance.
Information for applicants with a need for accommodation: https://www2.deloitte.com/us/en/pages/careers/articles/join-deloitte-assistance-for-disabled-applicants.html
#LI-WM1