Job Details
Data Engineer – Dublin, Hybrid

At UnitedHealth Group and Optum, we want to make healthcare work better for everyone. This depends on hiring the best and brightest. With a thriving ecosystem of investment and innovation, our business in Ireland is constantly growing to support the healthcare needs of the future. Our teams are at the forefront of building and adapting the latest technologies to propel healthcare forward in a way that better serves everyone. With our hands at work across all aspects of health, we use the most advanced development tools, AI, data science and innovative approaches to make the healthcare system work better for everyone.

As a Data Engineer, you will streamline the flow of information and deliver the insights that drive our consumer analytics. The Data Engineer is essential to providing robust data solutions and contributes to shaping our data strategy. Working alongside the Consumer Analytics team, you’ll have the opportunity to help make decisions easier for members and help them save on healthcare costs. In addition to having an impact on a great team, you'll also discover the career opportunities you'd expect from an industry leader.

Careers with Optum offer flexible work arrangements, and individuals who live and work in the Republic of Ireland will have the opportunity to split their monthly work hours between our Dublin office and telecommuting from a home-based office in a hybrid work model.

Primary Responsibilities of the Data Engineer:
- Developing and maintaining data pipelines that extract, transform and load (ETL) data from various sources into a centralised data storage system, such as a data warehouse or data lake
- Ensuring smooth flow of data from source systems to destination systems while adhering to data quality and integrity standards
- Integrating data from multiple sources and systems, including databases, APIs, log files, streaming platforms and external providers
- Applying data processing techniques to handle complex data structures, handle missing or incomplete data and prepare the data for analysis, reporting or machine learning tasks
- Contributing to common frameworks and best practices in code development, deployment and automation/orchestration of data pipelines
- Implementing data governance in line with company standards
- Monitoring data pipelines and data systems to detect and resolve issues promptly
- Developing monitoring tools, alerts and automated error-handling mechanisms to ensure data integrity and system reliability
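The extract-transform-load cycle described in the responsibilities above can be sketched in miniature. The following is an illustrative Python example only, not code from this role: the source records, the `claims` table, and the quality rule (dropping rows with missing amounts) are all hypothetical, with SQLite standing in for a warehouse or data lake.

```python
# Minimal ETL sketch (illustrative, hypothetical data): extract records from
# a stand-in source, apply a data-quality rule, and load into a local SQLite
# table standing in for a centralised warehouse.
import sqlite3

def extract():
    # Stand-in for pulling from a real source system (database, API, log file).
    return [
        {"member_id": 1, "claim_amount": "120.50"},
        {"member_id": 2, "claim_amount": None},   # incomplete record
        {"member_id": 3, "claim_amount": "88.00"},
    ]

def transform(rows):
    # Enforce a simple quality rule: skip rows missing required fields,
    # and cast amounts to float so they are ready for analysis/reporting.
    clean = []
    for row in rows:
        if row["claim_amount"] is None:
            continue  # a production pipeline would log or alert on this
        clean.append((row["member_id"], float(row["claim_amount"])))
    return clean

def load(rows, conn):
    # Write the cleaned rows into the destination store.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS claims (member_id INTEGER, claim_amount REAL)"
    )
    conn.executemany("INSERT INTO claims VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT COUNT(*), SUM(claim_amount) FROM claims").fetchone()
```

Here the incomplete record is filtered out, so `total` reflects two loaded rows; real pipelines would route such rows to quarantine and monitoring rather than silently dropping them.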
Qualifications:
- Bachelor’s degree in a relevant field, or equivalent experience
- Demonstrable experience with data mining
- Hands-on experience developing data pipelines that demonstrate a strong understanding of software engineering principles
- Proficiency in SQL and experience with an object-oriented or functional scripting language such as Python, Scala or Java
- Understanding of DevOps tools, Git workflow and building CI/CD pipelines
- Demonstrated effective project ownership, management, drive & delivery
- Experience with Snowflake or Databricks
- Proven hands-on experience developing big data processes using tools such as Spark, Hadoop and Kafka
- Experience with ETL/ELT tools
- Experience with orchestration tools such as ADF or Airflow
- Experience building data pipelines on AWS, Azure or GCP, following best practices in cloud deployments
- Experience working in projects with agile/scrum methodologies
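Orchestration tools like ADF and Airflow, mentioned in the qualifications above, run pipeline tasks in dependency order with retries and alerting. A minimal sketch of that idea in plain Python follows; the task names, dependency graph and retry count are assumptions for illustration, not any real scheduler's API.

```python
# Minimal orchestration sketch (illustrative only): execute pipeline tasks in
# dependency order with a simple retry, approximating what tools such as
# Airflow or ADF manage at scale.
from graphlib import TopologicalSorter

def run_with_retry(task, retries=2):
    # Re-run a flaky task a fixed number of times before giving up.
    for attempt in range(retries + 1):
        try:
            return task()
        except Exception:
            if attempt == retries:
                raise  # a real scheduler would also fire an alert here

results = []
tasks = {
    "extract":   lambda: results.append("extract"),
    "transform": lambda: results.append("transform"),
    "load":      lambda: results.append("load"),
}
# Each task maps to the set of tasks it depends on.
deps = {"extract": set(), "transform": {"extract"}, "load": {"transform"}}

for name in TopologicalSorter(deps).static_order():
    run_with_retry(tasks[name])
```

Because `load` depends on `transform`, which depends on `extract`, the tasks execute in that order regardless of how the dictionary is written; this dependency-driven ordering is the core idea behind DAG-based orchestrators.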
About the Company
UnitedHealth Group
Minnetonka, MN, United States
UnitedHealth Group is a health care and well-being company that’s dedicated to improving the health outcomes of millions worldwide. We are...