Senior Data Engineer in City of London


Energy Jobline is the largest and fastest growing global Energy Job Board and Energy Hub. We have an audience reach of over 7 million energy professionals, 400,000+ monthly advertised global energy and engineering jobs, and work with the leading energy companies worldwide.
We focus on the Oil & Gas, Renewables, Engineering, Power, and Nuclear markets as well as emerging technologies in EV, Battery, and Fusion. We are committed to ensuring that we offer the most exciting career opportunities from around the world for our jobseekers.
Job Description
The Client
This firm is a highly respected, technology-centric investment business operating across a broad range of asset classes. Their success is built on a mix of quantitative research, cutting-edge engineering and scalable data infrastructure. Engineers here play a central role: they design, build and maintain the platforms that underpin research, trading and large-scale data analysis.
It’s a collaborative environment where technical ownership is encouraged, engineering craft is valued, and impactful work directly supports sophisticated investment strategies.
What You'll Get
Work on the design and build of fast, scalable market-data systems used across trading and research groups.
Contribute to a modern engineering ecosystem: Python, cloud tooling, containerisation, and large-scale data lake technologies.
Partner closely with exceptional quantitative researchers, data engineers and traders.
Influence architectural decisions and continuously refine pipeline performance.
Join a culture that values rigour, curiosity and continual improvement.
Benefit from strong compensation and long-term career growth within a high-performing engineering organisation.
Role Overview
Design, implement, and maintain high-throughput, low-latency pipelines for ingesting and processing tick-level market data at scale.
Operate and optimise timeseries databases (KDB, OneTick) to efficiently store, query, and manage granular datasets.
Architect cloud-based solutions for scalable compute, storage, and data processing, leveraging AWS, GCP, or Azure.
Develop and maintain Parquet-based data layers; contribute to evolving the data lake architecture and metadata management.
Implement dataset versioning and management using Apache Iceberg.
Collaborate closely with trading and quant teams to translate data requirements into robust, production-grade pipelines.
Implement monitoring, validation, and automated error-handling to ensure data integrity and pipeline reliability.
Continuously profile and optimise pipeline throughput, latency, and resource utilisation, particularly in latency-sensitive or HFT-like environments.
Maintain clear, precise documentation of data pipelines, architecture diagrams, and operational procedures.
What You Bring
3+ years of software engineering experience, preferably focused on market-data infrastructure or quantitative trading systems.
Strong Python expertise with a solid grasp of performance optimisation and concurrency.
Proven experience designing, building, and tuning tick-data pipelines for high-volume environments.
Hands-on experience with Parquet storage; experience with Apache Iceberg or similar table formats is a plus.
Practical experience with containerisation (Docker) and orchestration platforms (Kubernetes).
Strong background in profiling, debugging, and optimising complex data workflows.
Experience with timeseries databases (KDB, OneTick) and/or performance-critical C++ components.
Deep understanding of financial markets, trading data, and quantitative workflows.
Excellent communication skills with the ability to articulate technical solutions to engineers and non-engineers alike.
If you are interested in applying for this job please press the Apply Button and follow the application process. Energy Jobline wishes you the very best of luck in your next career move.
Location:
City Of London
Job Type:
Full-Time
