Job Details
Experience Level: Experienced Hire
Categories:
- Engineering & Technology
- Remote - United States
Responsibilities:
- Design, develop, and maintain scalable data pipelines using Snowflake, Apache Airflow, dbt (SQL), and Python on AWS.
- Support and contribute to continuous improvement of the data platform, including tuning, application and infrastructure development, process controls, and upgrades.
- Provide guidance, hands-on development, and operational support for the deployment of database scripts and changes across multiple environments
- Collaborate with Moody's technical teams and business owners as needed during the design and implementation
- Manage individual time and tasks effectively.
- Collaborate with cross-functional teams to understand data requirements and implement effective solutions.
Qualifications:
- Bachelor's degree or equivalent experience; a Master's degree is a plus
- 3+ years of experience contributing to, and providing technical leadership in, a data engineering or software development team
- Hands-on experience designing and developing data integration/ETL pipelines across various data sources and data formats
- Knowledge of best practices for Airflow DAG management, scheduling, and monitoring.
- Experience with implementing data quality checks, error handling, and retries in Airflow workflows.
- Hands-on experience designing, developing, and maintaining data pipelines using Apache Airflow, including creating custom operators, sensors, and plugins.
- Familiarity with Airflow Executors for task parallelism and resource management.
- Experience across all phases of software development, working with Agile teams, product owners, and external stakeholders
- Experience driving technical ideas and communicating clearly to both technical and non-technical audiences at all levels of the organization
- Strong development, testing, debugging skills at all levels (unit, system, integration, and performance testing) along with detail-oriented documentation skills
- Effective communication and problem-solving skills.
- Strong database engineering skills with emphasis on PostgreSQL, DynamoDB, and Snowflake
- Hands-on experience implementing, managing, supporting, and developing with AWS database technologies, including AWS RDS (Microsoft SQL Server, PostgreSQL) and AWS DynamoDB
- Strong understanding of data quality best practices and data governance principles.
- Strong understanding of object-oriented programming, design, and architectural patterns
- Experience with development technologies: Git (Bitbucket or GitHub), Jira, Rally, Asana, Jenkins, Cypress, Postman
- Experience with AWS cloud technologies: ECS/Fargate, ECR (Elastic Container Registry), Lambda, DynamoDB, Aurora PostgreSQL, API (Application Programming Interface) Gateway, VPC (Virtual Private Cloud), ALB, NLB, Neptune
- Experience working with big data technologies: Apache Airflow, Apache Spark, Apache Kafka, AWS Kinesis, AWS Redshift
- Experience with development languages: Java (Spring Boot), Scala, PySpark
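The Airflow qualifications above emphasize data-quality checks, error handling, and retries. As a minimal sketch of what that pattern looks like in plain Python (standalone, with hypothetical function names — the kind of callable a candidate might wire into an Airflow PythonOperator, where `retries` and `retry_delay` are normally handled by task arguments):

```python
import time


def check_row_count(fetch_count, min_rows=1):
    """Data-quality check: fail loudly if a source returns too few rows."""
    n = fetch_count()
    if n < min_rows:
        raise ValueError(f"quality check failed: {n} rows < {min_rows}")
    return n


def run_with_retries(task, retries=3, delay_seconds=0.0):
    """Retry wrapper mimicking Airflow's retries/retry_delay task behavior."""
    for attempt in range(1, retries + 1):
        try:
            return task()
        except Exception:
            if attempt == retries:
                raise  # out of retries: surface the error to the scheduler
            time.sleep(delay_seconds)


# Hypothetical flaky source that succeeds on its second attempt.
calls = {"n": 0}


def flaky_count():
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("transient connection error")
    return 42


rows = run_with_retries(lambda: check_row_count(flaky_count, min_rows=10))
print(rows)  # → 42 (succeeded on attempt 2)
```

In a real DAG the retry policy would live in the task definition rather than application code; the sketch just makes the control flow explicit.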
About the Company
Moody's
New York City, NY, United States
In a world shaped by increasingly interconnected risks, Moody's helps customers develop a holistic view of these risks to advance their business...