PowerToFly
Senior Data Engineer - Project Delivery Specialist II

Deloitte LLP

Onsite United States
Posted an hour ago

Job Details

Are you an experienced, passionate pioneer in technology who wants to work in a collaborative environment? If so, consider an opportunity with Deloitte under our Project Delivery Talent Model. As an experienced Senior Data Engineer, you will be able to share new ideas and collaborate on projects as a consultant without the extensive demands of travel. The Project Delivery Model (PDM) is a talent model tailored specifically for long-term, onsite client service delivery.

Work you'll do/Responsibilities

You will support the client in person, onsite four working days per week. The role calls for specialized expertise to design, build, and maintain data pipelines and infrastructure, ensuring reliable access to high-quality data for business operations and analytics.

You will accelerate the development and stabilization of critical data teams. This will improve data quality, ensure timely insights, optimize system performance, and directly enable business teams to make better, data-driven decisions. Ultimately, the role is key to achieving smoother operations, regulatory compliance, and unlocking greater value from data assets.
  • Resolve pipeline failures, perform root cause analysis, and apply fixes promptly.
  • Build and maintain scalable data pipelines for ingesting, transforming, and delivering data using Databricks (PySpark) and Azure Data Factory.
  • Connect multiple data sources (e.g., databases, APIs, Snowflake, Salesforce, and SAP) and ensure reliable, automated data flows.
  • Implement data validation, error handling, and monitoring to ensure data accuracy and completeness.
  • Tune Spark jobs and data storage (e.g., Delta Lake) for cost-effective, efficient performance.
  • Communicate regularly with Engagement Managers (Directors), project team members, and representatives from various functional and/or technical teams, escalating any matters that require additional attention and consideration from engagement management.
  • Independently and collaboratively lead client engagement workstreams focused on improving, optimizing, and transforming processes, including implementing leading-practice workflows, addressing quality deficits, and driving operational outcomes.
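The validation and error-handling responsibility above follows a common pattern: check each incoming record against simple rules and quarantine failures so bad rows never reach downstream tables. A minimal sketch in plain Python (the role's actual stack is Databricks/PySpark and Azure Data Factory; the record schema used here — id, amount, source — is purely hypothetical):

```python
# Minimal sketch of the validate-and-quarantine pattern described above.
# Plain Python for illustration only; in practice this logic would run
# inside a Databricks (PySpark) pipeline. The field names are hypothetical.

def validate_record(record):
    """Return a list of validation errors (empty list means the record is valid)."""
    errors = []
    if not record.get("id"):
        errors.append("missing id")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        errors.append("amount must be a non-negative number")
    if record.get("source") not in {"api", "database", "file"}:
        errors.append("unknown source")
    return errors

def partition_records(records):
    """Split a batch into (valid, rejected) so bad rows can be quarantined."""
    valid, rejected = [], []
    for record in records:
        errors = validate_record(record)
        if errors:
            rejected.append({"record": record, "errors": errors})
        else:
            valid.append(record)
    return valid, rejected

batch = [
    {"id": "a1", "amount": 10.5, "source": "api"},
    {"id": "", "amount": -3, "source": "ftp"},
]
good, bad = partition_records(batch)
```

The rejected list retains both the original record and the reasons it failed, which supports the monitoring side of the responsibility (e.g., writing quarantined rows to an error table for root cause analysis).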

The Team

AI & Engineering leverages cutting-edge engineering capabilities to build, deploy, and operate integrated/verticalized sector solutions in software, data, AI, network, and hybrid cloud infrastructure. These solutions are powered by engineering for business advantage, transforming mission-critical operations. We enable clients to stay ahead with the latest advancements by transforming engineering teams and modernizing technology & data platforms. Our delivery models are tailored to meet each client's unique requirements.

Our AI & Data practice offers comprehensive solutions for designing, developing, and operating advanced Data and AI platforms, products, insights, and services. We help clients innovate, enhance, and manage their data, AI, and analytics capabilities, ensuring they can grow and scale effectively.

Qualifications

Required
  • 7+ years of experience with Azure Platform components: Data Factory, Databricks, Synapse Analytics, Data Lake Storage, Event Hubs.
  • 7+ years of hands-on Databricks experience: Spark (PySpark/Scala), Delta Lake, notebook orchestration, job clustering.
  • 5+ years of experience with data pipeline orchestration using Azure Data Factory or Synapse Pipelines.
  • 5+ years of experience using SQL and advanced data querying/analysis techniques.
  • 5+ years of experience building CI/CD pipelines for data workloads using Azure DevOps or GitHub Actions.
  • 5+ years of experience with Power BI or other visualization tools for data validation.
  • 5+ years of experience with scripting languages: Python (preferred), Scala, SQL.
  • 2+ years of experience with containerization (Docker/Kubernetes) and microservices concepts as applied to data solutions.
  • Bachelor's degree, preferably in Computer Science, Information Technology, Computer Engineering, or a related IT discipline; or equivalent experience.
  • Limited immigration sponsorship may be available.
  • Ability to travel 10%, on average, based on the work you do and the clients and industries/sectors you serve.
  • Position requires 4 days onsite at the client in St. Louis.
  • Must be able to start by March 16th, 2026.

Preferred
  • Analytical/decision-making responsibilities
  • Analytical ability to manage multiple projects and prioritize tasks into manageable work products
  • Ability to operate independently or with minimal supervision
  • Excellent written and verbal communication skills
  • Ability to deliver technical demonstrations
Company Details
Deloitte LLP
New York City, NY, United States
Work at Deloitte LLP

Don't imagine what's next. Discover it. We provide industry-leading audit & assurance services, consulting, tax and advisory services to many of...
