Remote
What S&P Global Has to Offer:

We met with women at S&P Global to hear about the teams they're leading, the products they're building and how they integrate work with life.

Hear directly from Irina, Megan, Sameena and Meredith.

Job Details

Position Summary

We are looking for an adept, action-oriented Principal Data Engineer to design and build out a multi-tenant data mesh that will power our soon-to-be-launched digital transformation product, which uses advanced NLP, knowledge engineering, and ML to accelerate innovation in engineering, manufacturing, and scientific operations. The ideal candidate will have strong data infrastructure and data architecture skills, a proven track record of collaboratively and iteratively delivering data-intensive solutions, the operational skills to drive efficiency and speed, strong project leadership, and a clear vision for how data engineering can proactively create positive impact for companies. You will be part of an early-stage team. You will educate stakeholders, mentor team members, and have a significant stake in defining the future of the Data Engineering function for the product.

Job Responsibilities
  • Design, build, and maintain a multi-tenant Data Mesh within the AWS cloud, comprising Data Lakes, Warehouses, Streaming, Graphs, and analytical NoSQL stores
  • Drive adoption and standardization of data governance, lineage, cataloging, and stewardship practices across teams
  • Work closely with data scientists, micro-service developers, and security experts to build out a big data platform incrementally and securely
  • Work closely with the product management and development teams to rapidly translate the understanding of customer data and requirements to product and solutions
  • Maintain an excellent understanding of the business's long-term goals and strategy, and ensure that the design and architecture are aligned with them
  • Define and manage SLAs for data sets and processes running in production
  • Design for disaster recovery balancing availability and consistency in multi-region scenarios
  • Research and experiment with emerging technologies and tools related to big data
  • Establish and reinforce disciplined software engineering processes and best practices
Ideal Qualifications
  • Comfort with, and ideally substantial experience in, operating big data infrastructure in a cloud-based ecosystem (AWS preferred)
  • Deep understanding of the theoretical and practical tradeoffs of various data formats in object/file stores (Parquet, Avro, JSON, etc.) in combination with a variety of ETL tools (Spark, Presto, etc.)
  • Deep understanding of the theoretical and practical tradeoffs of various NoSQL stores (Cassandra, Elasticsearch, DynamoDB, etc.) with respect to different read/write patterns and availability/consistency requirements
  • Mastery of operating and designing stream-based data systems (Kafka, AWS Kinesis, GCP Pub/Sub, etc.), particularly under varying load
  • Proficiency in modern big data architectural approaches (Kappa/Lambda architectures, Data Lake Zones, etc.)
  • Experience with data governance, lineage, cataloging tooling (Apache Atlas, Apache Ranger, AWS Glue Catalog, etc.)
  • Experience with data pipeline and workflow management tools (AWS Data Pipeline, Apache Airflow, Argo, etc.)
  • Experience with stream-processing systems (ksqlDB, Spark Streaming, Apache Beam/Flink, etc.)
  • Experience with software engineering standard methodologies (unit testing, code reviews, design documents, continuous delivery)
  • Experience developing and deploying production-grade services, SDKs, and data infrastructure with an emphasis on performance, scalability, and self-service
  • Ability to conceptualize and articulate ideas clearly and concisely
  • Entrepreneurial or intrapreneurial experience where you helped lead the creation of a new product & organization
Nice to Haves
  • Strong algorithms, data structures, and coding background with either Java, Python or Scala programming experience
  • Experience working with knowledge graph stores (Stardog, TigerGraph, Ontotext GraphDB, Neo4j) and surrounding semantic technology (OWL, RDF, SWRL, SPARQL, JSON-LD)
  • Experience working with Snowflake data warehouses and dimensional modeling practices
  • BA/BS or Master's in Computer Science, Math, Physics, or other technical fields
  • Experience with datasets of 10+ terabytes, ideally up to multiple petabytes
What We Offer
  • Competitive base salary and bonus
  • A comprehensive benefits package that includes medical, dental, vision and life insurance plans, paid time off, a generous 401k match with no vesting period, parental leave and 3 volunteering days each year. For more information on benefits, please access the benefits page on our careers site: https://careers.ihsmarkit.com/benefits.php.
  • For work locations in the state of Colorado, the anticipated base salary range for this role is $140,000 - $210,000. Compensation will be determined by the education, experience, knowledge, and abilities of the applicant.

We’re building a software solution that connects data in revolutionary ways, illuminating answers that were previously impossible to find and empowering our clients to envision the future so they can determine the best course of action in the present. Join us!

-----------------------------------------------

Equal Opportunity Employer:

S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.

If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only:

The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law.


-----------------------------------------------------------------------

PowerToFly's EEOC Statement:

Diversity, Equity, Inclusion and Belonging (DEIB) are the cornerstone of everything we do at PowerToFly. Together, we build a culture of belonging that guides our mission, values, services, products, employees, clients, and community. This active and intentional foundation fuels our innovation and impact to advance economic equality in underrepresented communities. We are a global team committed to mitigating bias and lifting barriers in our recruitment and hiring processes.

PowerToFly is an equal opportunity employer and ensures individuals seeking employment are considered without regard to race, color, ancestry, age, sex, sexual orientation, religion, physical or mental disability, ethnicity, national origin, citizenship, veteran status, marital status, pregnancy, or any other qualities protected by applicable federal, state, or local laws.



Req ID R31684-2

About the Company

At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening up possibilities. We Accelerate Progress in the world.

Mission
We're connecting diverse talent to big career moves. Meeting people who boost your career is hard - yet networking is key to growth and economic empowerment. We’re here to support you - within your current workplace or somewhere new. Upskill, join daily virtual events, apply to roles (it’s free!).