If you’re a fan of movies like The Matrix and Source Code, then the next LSEG Tech Talks is for you. The upcoming session of the popular tech webinar series dives into a concept inspired by science fiction. To quote LSEG Senior Software Architect Rajith Priyanga, “It’s almost like when Neo learns martial arts.” That’s how he describes snapshotable memory, a concept so new it doesn’t yet have a dictionary definition.
An introduction to snapshotable memory
In the world of financial markets, speed is essential. A few milliseconds could be the difference between a loss and a profit worth millions. This is the high-stakes world where LSEG operates. When performing business processing operations at high rates and low latencies, the time taken for each operation is crucial. Collectively, these tiny operations add up to the total latency of a business operation.
“Therefore, working with data residing on secondary storage such as databases or external files wasn’t an option for us. Even in-memory databases weren’t fast enough to meet our needs, as they work with data structures that prioritise efficient storage over speed,” explains Rajith. Programs in financial markets use custom data structures, optimised to process data at the lowest possible latencies by reducing time and space complexity. Their unique nature means these data structures can’t be persisted in a conventional database or file the way other programs’ data can.
This is where snapshotable memory comes into play, inspired by sci-fi stories like The Matrix and Source Code. It was recognised that the custom data structures LSEG created to improve performance were also the most efficient way of storing data. Further, they live in computer memory, the fastest form of storage, rather than on slower hard drives. Memory snapshot technology therefore makes it possible to share these custom data structures, in memory, across multiple machines and programs. As Rajith explains, “It’s simply cloning a memory area of a running process, without disturbing it, and implanting it into another running process.”
Snapshotable memory in action
Already, the technology is in use at Europe’s largest clearing system, the London Clearing House. In this system, the client position structures used for centralised financial risk management are stored in snapshotable memory, with millions of nodes connected as a graph to further speed up processing. The end result? Loading this data into memory from a conventional industry-grade database would take more than 15 minutes; loading a memory snapshot of the custom data structure takes mere seconds.
“Thanks to memory snapshots taken periodically, we now have the ability to rapidly recover from a disaster. It only takes a few seconds to bring back another instance of the same process on a different machine by loading a memory snapshot. This also allows us to perform batch processing at exceptional speeds, by reusing memory snapshots of real-time processing components for faster in-memory operations at batch processors,” states Rajith.
Of course, there were challenges along the way in creating such a novel solution. During the upcoming session of the LSEG Tech Talks, Rajith will offer a detailed introduction to snapshotable memory, the challenges the company overcame to develop it, and its potential applications. If you’re curious to learn more, click here to register now.