Alex is Lead Architect at RBC Capital Markets' technology division. "My role is about finding out what business problems within the bank need solving, and using technology to do so," he says. "As an architect, I think about what systems are going to provide that solution and what hardware those systems are going to run on." The same way an architect working on a building project will visit the site regularly to ensure that what construction workers are building fits the blueprints, an infrastructure architect will manage a technology project from inception through to deployment.
That means Alex needs a good understanding of what's going on at the bank from a business perspective, as well as of its technology projects and how they interact with each other. "Technology teams often work in isolation, so my job is to act as a broker between them and make sure everything is running efficiently," explains Alex. "I act as the glue that sits between lots of different groups."
Building big data solutions
The banking industry as a whole is looking at how to improve its risk-management processes as ever higher volumes of data pass through banks every day. Banks have been seeking new technology to cut the time it takes to produce risk reports, which involve running "what if" scenarios on the data to see what could potentially happen if certain market conditions were to occur. These reports are invaluable, as most of the decisions traders make are based on the risk calculations in them.
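The idea behind a "what if" scenario is straightforward: revalue the bank's positions under a hypothetical market move and compare the result with today's value. A loose illustration in Python (the positions and the single-shock model are invented for this sketch, not taken from RBC's system):

```python
# Hypothetical book: (instrument, quantity, current price)
positions = [("bond_a", 100, 98.0), ("bond_b", 50, 101.0)]

def portfolio_value(positions, shock=0.0):
    """Revalue the book under a hypothetical market shock.

    shock = -0.05 models every price dropping 5%; a real risk
    engine would apply instrument-specific shocks instead.
    """
    return sum(qty * price * (1 + shock) for _, qty, price in positions)

base = portfolio_value(positions)                 # today's value
stressed = portfolio_value(positions, shock=-0.05)  # "what if prices fall 5%?"
print(base, stressed)
```

A report is essentially many such revaluations, one per scenario, run over the whole book.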
Historically, banks have generated these reports using systems that took all the data at the end of the day to produce a report the following morning. "This was somewhat problematic, because if you wanted to look into something new during the day, it didn't give you the scope to alter the model and run a new scenario," explains Alex.
The solution Alex and his team developed was a big data platform based on "MapReduce" - a programming model originally developed by Google to process the enormous data sets behind its web search. This approach opens up huge benefits for reporting. "Now, if we want to run a scenario on some new conditions, we can enter those parameters into the big data platform and generate a report in near real time," explains Alex. "This allows us to be increasingly responsive." The technology is also highly transferable.
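MapReduce boils down to two user-supplied functions: a mapper that turns each record into key/value pairs, and a reducer that combines all values sharing a key. A minimal single-process sketch (the trade records and desk names are invented; a real deployment would distribute the map and reduce phases across a cluster):

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Tiny in-memory MapReduce: map each record to (key, value)
    pairs, group the values by key, then reduce each group."""
    groups = defaultdict(list)
    for record in records:
        for key, value in mapper(record):
            groups[key].append(value)
    return {key: reducer(key, values) for key, values in groups.items()}

# Toy trade records: (desk, exposure)
trades = [("rates", 3), ("fx", 1), ("rates", 2), ("fx", 4)]

def mapper(trade):
    desk, exposure = trade
    yield desk, exposure          # emit one pair per trade

def reducer(desk, exposures):
    return sum(exposures)         # total exposure per desk

result = map_reduce(trades, mapper, reducer)
print(result)
```

Because each mapper call and each reducer call is independent, the same program can be spread over many machines, which is what makes the pattern suit the scale-out model described below.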
Scaling out rather than scaling up
From an infrastructure perspective, one of the interesting things about this big data technology is how it scales. "Traditionally, if you needed more power from a platform, you'd need to move it to a faster machine with more processing power," explains Alex. This meant investing large sums of money to buy and maintain those machines.
The new system runs on lots of low-end commodity servers. "This means that when the system needs more power, we can scale it out, rather than scale it up, simply by adding more servers to the cluster," explains Alex. This is more cost-effective and easier than replacing a high-end machine. It also means that the systems can be spread across the globe.
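One common way a cluster spreads work over interchangeable servers is hash partitioning: each task is assigned to a node by hashing its identifier, so adding servers automatically thins the load on every node. A toy sketch (the `scenario-N` task names are invented):

```python
import zlib
from collections import defaultdict

def assign(tasks, num_nodes):
    """Hash-partition task ids across a cluster of identical nodes.

    zlib.crc32 is used as a stable hash so the assignment is
    deterministic across runs.
    """
    nodes = defaultdict(list)
    for task in tasks:
        nodes[zlib.crc32(task.encode()) % num_nodes].append(task)
    return nodes

tasks = [f"scenario-{i}" for i in range(1000)]
small = assign(tasks, 4)   # 4 commodity servers
large = assign(tasks, 8)   # scale out: add 4 more servers

# The busiest node carries less work once the cluster grows
print(max(len(t) for t in small.values()))
print(max(len(t) for t in large.values()))
```

Scaling up would mean replacing one machine with a faster one; here, capacity grows just by raising `num_nodes`, with no single server needing to be any more powerful.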