Job Details
We are looking for a Data Engineer to build and maintain scalable data pipelines and ingestion frameworks using a mix of open-source and custom-built tooling.
You will design and operate reliable data systems that ingest, validate, and process data from APIs, databases, and event streams—ensuring high data quality, performance, and availability.
Key Responsibilities
Build and maintain scalable data pipelines and ingestion frameworks
Design robust data fetching systems for APIs, databases, and streaming sources
Ensure data quality, validation, and consistency across all pipelines
Optimize systems for performance, scalability, and cost efficiency
Implement observability, monitoring, and fault-tolerant architectures
Manage batch and real-time data processing workflows
Collaborate with platform, analytics, and product teams to deliver trusted datasets
Contribute to data architecture, standards, and best practices
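To give a flavor of the ingestion work described above, here is a minimal sketch of a fault-tolerant fetch step with retries and record validation. All names, the schema, and the flaky source are hypothetical illustrations (standard-library Python only), not SoftwareOne's actual tooling:

```python
import time
from typing import Callable, Iterable

# Hypothetical required schema for incoming records
REQUIRED_FIELDS = {"id", "timestamp", "value"}

def fetch_with_retries(fetch: Callable[[], list],
                       attempts: int = 3,
                       backoff_s: float = 0.01) -> list:
    """Call a fetch function, retrying with exponential backoff on failure."""
    for attempt in range(attempts):
        try:
            return fetch()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries, surface the error
            time.sleep(backoff_s * 2 ** attempt)

def validate(records: Iterable[dict]) -> tuple[list, list]:
    """Split records into valid and rejected based on required fields."""
    valid, rejected = [], []
    for r in records:
        (valid if REQUIRED_FIELDS <= r.keys() else rejected).append(r)
    return valid, rejected

# Demo: a flaky in-memory source stands in for a real API call.
calls = {"n": 0}
def flaky_source() -> list:
    calls["n"] += 1
    if calls["n"] < 2:
        raise ConnectionError("transient failure")
    return [
        {"id": 1, "timestamp": "2024-01-01T00:00:00Z", "value": 42},
        {"id": 2},  # missing fields, will be rejected
    ]

records = fetch_with_retries(flaky_source)
good, bad = validate(records)
print(f"valid={len(good)} rejected={len(bad)}")  # valid=1 rejected=1
```

In practice this retry-and-quarantine pattern would sit inside an orchestrated pipeline (for example in Databricks or Azure Data Factory, per the skills list), with rejected records routed to a dead-letter store for inspection rather than dropped.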
Core Skills & Experience
Strong experience in data engineering and distributed systems
Proficiency with Databricks, Azure Data Factory, Python, SQL, Terraform, and REST APIs
Experience with data pipeline tools
Hands-on experience with streaming technologies
Experience with cloud platforms
Knowledge of data storage systems
Understanding of data modeling, partitioning, and indexing strategies
Experience with API integrations and event-driven architectures
Familiarity with CI/CD and Infrastructure as Code
SoftwareOne is a leading global software and cloud solutions provider that is redefining how organizations build, buy and manage everything in the...