Data Engineer

RESPONSIBILITIES
- Collaborate with internal and external stakeholders to understand requirements, develop acceptance criteria, and help build and drive our data strategy
- Maintain our Customer Data Platform (Tealium or a similar product)
- Work with the data engineering team to design, maintain and execute on our vision for uShip’s Data Lake, Data Warehouse, and Data Exchange
- Partner with the team to improve existing processes and develop new ways to increase efficiency, building out automation whenever possible
- Participate in developing effective ways to both enhance data quality and reliability
- Resolve data-related technical issues and support data infrastructure needs for internal or external stakeholders
- Assist in building out the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources into our Data Warehouse, using technologies such as AWS, orchestration and replication tooling, dbt, Snowflake, etc.
- Maintain existing legacy transformation processes in Pentaho, SSMS and SSIS.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics to support our reporting and analytics team
- Source, collect and prepare data for prescriptive or predictive modeling
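To make the ELT responsibilities above concrete: the pattern behind warehouse tooling like dbt is to land raw source data first and then transform it with SQL inside the warehouse. The sketch below illustrates that pattern only; it uses SQLite as a stand-in for Snowflake, and every table and column name is hypothetical, not uShip's actual schema:

```python
import sqlite3

# SQLite stands in for Snowflake here; all names are invented for illustration.
conn = sqlite3.connect(":memory:")

# Extract + Load: land raw source rows untouched in a staging table.
raw_shipments = [
    (1, "austin", "2024-01-05", 120.0),
    (2, "dallas", "2024-01-06", 95.5),
    (3, "austin", "2024-01-07", 210.0),
]
conn.execute(
    "CREATE TABLE raw_shipments (id INT, origin TEXT, ship_date TEXT, price REAL)"
)
conn.executemany("INSERT INTO raw_shipments VALUES (?, ?, ?, ?)", raw_shipments)

# Transform: build an analytics-ready table inside the warehouse with SQL --
# the step a dbt model would own in a real pipeline.
conn.execute("""
    CREATE TABLE shipments_by_origin AS
    SELECT origin,
           COUNT(*) AS shipment_count,
           ROUND(SUM(price), 2) AS total_revenue
    FROM raw_shipments
    GROUP BY origin
""")

for row in conn.execute("SELECT * FROM shipments_by_origin ORDER BY origin"):
    print(row)
```

The point of the pattern is that raw data stays queryable and transformations are versioned SQL rather than opaque pipeline code.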
REQUIREMENTS
- 3+ years of experience with the following database technologies: SQL Server, DynamoDB, Snowflake, MongoDB, and PostgreSQL
- 2+ years of hands-on experience with Tealium or similar Customer Data Platform.
- 3+ years of experience with Python and T-SQL
- 2+ years with data warehouse modeling approaches such as Star or Snowflake schema design, and denormalizing highly transactional datasets.
- 2+ years of experience with ELT/ETL tools such as FiveTran, Stitch, AWS Glue, Pentaho, dbt etc. as well as data mining, and segmentation techniques.
- Experience with Node.js and an understanding of JSON objects; C# or Go is nice to have but not required
- Experience supporting and working with cross-functional teams in a dynamic environment
- Strong organizational skills
- Understanding of the basics of distributed systems
- Knowledge of algorithms and data structures
- Awareness of data governance and data security principles
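The star-schema modeling and denormalization called out above can be sketched in a few lines: descriptive attributes live in dimension tables, measures and foreign keys live in a fact table, and denormalized reporting views join them back together. SQLite again stands in for a real warehouse, and every table and column name here is hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

conn.executescript("""
    -- Dimension tables hold descriptive attributes.
    CREATE TABLE dim_customer (customer_key INT PRIMARY KEY, name TEXT, segment TEXT);
    CREATE TABLE dim_date (date_key INT PRIMARY KEY, full_date TEXT, month TEXT);
    -- The fact table holds measures plus foreign keys into each dimension.
    CREATE TABLE fact_orders (
        order_id INT PRIMARY KEY,
        customer_key INT REFERENCES dim_customer(customer_key),
        date_key INT REFERENCES dim_date(date_key),
        amount REAL
    );
    INSERT INTO dim_customer VALUES (1, 'Acme', 'enterprise'), (2, 'Bob', 'consumer');
    INSERT INTO dim_date VALUES (20240105, '2024-01-05', '2024-01');
    INSERT INTO fact_orders VALUES (100, 1, 20240105, 500.0), (101, 2, 20240105, 40.0);
""")

# Denormalize: join facts back to their dimensions into one wide, analyst-friendly result.
rows = conn.execute("""
    SELECT c.name, c.segment, d.month, f.amount
    FROM fact_orders f
    JOIN dim_customer c USING (customer_key)
    JOIN dim_date d USING (date_key)
    ORDER BY f.order_id
""").fetchall()
print(rows)
```

The trade-off being tested for in the requirement: normalized transactional schemas minimize redundancy, while star schemas deliberately accept some redundancy to keep analytical queries simple and fast.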
PREFERRED EXPERIENCE
- Data Science or Python Certification(s)
- AWS Certified Database – Specialty certification
- Understanding of agile development processes, preferably with a background in an agile development organization
- Familiarity with BI tools such as Tableau, Power BI, etc.
- Experience building AI/ML models
LOCATION: Remote or hybrid (unfortunately, we cannot employ remote workers in CA, MA, MT, AK, HI, or NYC). Office is in Austin, TX 78701
SALARY: $110k+ base salary; estimated $125k+ total compensation including bonus, 401(k) matching, paid insurance, etc.