Credit Valuation Adjustment: Solve Computational Challenges and Reduce Grid TCO


Financial institutions have long used credit valuation adjustment (CVA) to monitor and manage counterparty credit risk, meet regulatory and reporting requirements, and even price and hedge CVA itself. Since the 2008 credit crisis, however, this evaluation activity has steadily grown in both demand and complexity, creating tough computational challenges for firms. Where large banks once computed CVA monthly, they now perform these calculations at daily to real-time (sub-second) intervals. The active, continuous management that counterparty credit risk (CCR) demands raises two computational problems in particular: latency sensitivity and vast scale. Either alone would challenge traditional scale-out grid infrastructures. …
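To get a feel for why these calculations are so demanding, unilateral CVA is commonly approximated as (1 − recovery) times a sum over time buckets of discounted expected positive exposure weighted by the marginal default probability in each bucket. The sketch below is a toy Monte Carlo version under loudly stated assumptions: a driftless Gaussian mark-to-market process stands in for full portfolio revaluation, and a flat hazard-rate curve stands in for a real credit curve. The function name and all parameters are illustrative, not from the post.

```python
import math
import random

def cva_monte_carlo(hazard_rate=0.02, recovery=0.4, r=0.01,
                    horizon=5.0, steps=20, paths=10_000, vol=0.15, seed=7):
    """Toy unilateral CVA for a single netting set (illustrative only).

    Exposure is a driftless Gaussian mark-to-market path (a stand-in for
    a real portfolio simulation); default probabilities come from a
    constant-hazard-rate credit curve.
    """
    rng = random.Random(seed)
    dt = horizon / steps

    # Expected positive exposure (EPE) profile, one value per time step.
    epe = [0.0] * (steps + 1)
    for _ in range(paths):
        mtm = 0.0  # the trade is at-the-money today
        for i in range(1, steps + 1):
            mtm += vol * math.sqrt(dt) * rng.gauss(0.0, 1.0)
            epe[i] += max(mtm, 0.0) / paths

    # CVA = (1 - R) * sum_i  DF(t_i) * EPE(t_i) * P(default in bucket i)
    cva = 0.0
    for i in range(1, steps + 1):
        t0, t1 = (i - 1) * dt, i * dt
        pd = math.exp(-hazard_rate * t0) - math.exp(-hazard_rate * t1)
        disc = math.exp(-r * t1)
        cva += (1.0 - recovery) * disc * epe[i] * pd
    return cva
```

Even this toy version is paths × steps work per counterparty; a real desk revalues every trade in every netting set on every path and time step, across thousands of counterparties, which is where the latency and scale problems described above come from.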

[Infographic] Risk Management & HPC


In the post-2008 credit crisis world, risk evaluation by financial institutions has steadily increased in demand and complexity. Where large banks once performed risk analysis on a monthly basis, they’re now performing these calculations at least daily, if not in near-real-time. Cray makes this possible with the embedded Aries network, which eliminates the performance and latency issues of more traditional “fat tree” approaches, and with the DataWarp™ applications I/O accelerator, which effectively removes data-access bottlenecks. See the infographic below for a look at Cray’s approach compared to legacy approaches, and to understand the difference in value. …

Top 5 Blog Posts From 2015


Happy New Year! Check out some of the top blog posts from 2015. Why Do Better Forecasts Matter? Meteorologists have been striving to increase the accuracy of their work since the first weather forecasts were issued in the mid-19th century by pioneers such as Robert FitzRoy. In the 1950s and 1960s, the development of computers and observational technologies enabled forecast generation using the numerical weather prediction theories of Vilhelm Bjerknes and Lewis Fry Richardson. Today the numerical models that support our weather and climate forecasts are true “grand challenge” problems, requiring on the order of 10,000 trillion (10^16) mathematical operations and the exchange of 150 trillion (1.5 × 10^14) bytes of information to generate a …

Algorithmic Trading: Faster Execution or Smarter Strategies?


The short answer is: you need both. Since the advent of the first high-frequency trading (HFT) firms, the quest for low-latency trading has been paramount, and strategies that were profitable before HFT are now obsolete. Among the strategies with questionable profitability today:

- Arbitrage: Markets move too quickly to leave time for arbitrage.
- Market making: HFT imposes excessive risk on those traders.
- Event trading: Competing with HFT on speed of response to scheduled economic reports and conventional news is impossible, since HFT systems process and react to the information faster.

Faster execution is necessary to take advantage of short-term opportunities. Profitability is directly correlated to volume …
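To see why latency alone can kill classic arbitrage, consider a toy model (not from the post) in which a cross-venue price gap decays exponentially as faster participants trade it away. The function and its parameters are hypothetical, chosen only to make the point concrete.

```python
def surviving_edge_bps(initial_gap_bps: float, half_life_ms: float,
                       latency_ms: float) -> float:
    """Basis points of a cross-venue price gap still available when our
    order arrives, assuming the gap halves every `half_life_ms`
    milliseconds as faster traders close it (a toy decay model)."""
    return initial_gap_bps * 0.5 ** (latency_ms / half_life_ms)

# A 10 bps gap that halves every 5 ms:
fast = surviving_edge_bps(10.0, 5.0, 1.0)   # a 1 ms trader keeps most of it
slow = surviving_edge_bps(10.0, 5.0, 50.0)  # at 50 ms almost nothing is left
```

Under these assumed numbers, the 1 ms trader still sees roughly 8.7 bps of edge, while the 50 ms trader sees about 0.01 bps, which is below any realistic fee, so the opportunity is effectively gone before the slower order arrives.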

The Best Practices are No Longer the “Best” Practices – Part II


Earlier this week, I shared my views on best practices and cybersecurity. Now I want to move beyond best practices as your sole defense. The traditional cybersecurity mindset is one of prevention, a belief that threats cannot penetrate, and this is why security plans fail. It’s easy to assume your defenses have defeated an insidious threat: metrics will show an effective compliance program, intrusion detection and access denial. Yet taking for granted that the threat is gone, rather than that it has simply moved to another path within your network, is foolhardy. Assuming that numerous threats are coming at your security measures in a dynamic, continuous fashion may seem paranoid, but just because you’re paranoid doesn’t mean …