Stock Market Surge

August 11, 2011 by USA Post 

Since Standard & Poor's downgraded the U.S. government's credit rating last week, global financial markets have been on a wild ride.

In the midst of a week in which the Dow Jones industrials fell 630 points on Monday, gained 420 points on Tuesday, then dropped 520 points on Wednesday, financial media and technology company Bloomberg LP has been in the eye of the storm.

An unprecedented amount of data on securities trades, stock quotes and other financial news and information has flowed through the Bloomberg Professional service, a platform of trading tools and investment and financial information used by more than 300,000 finance professionals worldwide.

Bloomberg processed 41 billion ticks – a tick is a change in a security's trade, bid or ask price – on Friday, August 5, an increase of 33 percent over the last major market peak in March 2011 (the Japanese earthquake and tsunami). That compares with 20 billion ticks during the 2008 financial crisis, 27.5 billion ticks during the May 2010 flash crash and 30 billion ticks after the Japanese tsunami.
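As a quick sanity check on the figures above, the jumps between events can be computed directly. This minimal Python sketch uses only the tick counts reported in the article; note that the raw ratios come out slightly higher than the rounded percentages Bloomberg quoted (e.g. roughly 37 percent rather than 33 percent for August 5 over the tsunami peak):

```python
# Peak tick counts (in billions) for each market event, as
# reported in the article.
EVENTS = [
    ("2008 financial crisis", 20.0),
    ("May 2010 flash crash", 27.5),
    ("March 2011 Japan tsunami", 30.0),
    ("August 5, 2011", 41.0),
]

def pct_increase(old: float, new: float) -> float:
    """Percentage increase from old to new."""
    return (new - old) / old * 100

# Compare each event with the one before it.
for (prev_name, prev), (name, curr) in zip(EVENTS, EVENTS[1:]):
    print(f"{name}: {curr}B ticks, "
          f"{pct_increase(prev, curr):.0f}% over {prev_name}")
```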

All of this added up to a stressful week for Bloomberg CIO Vipul Nagrath, whose months of testing and preparation for heavy data traffic were put to the ultimate test.

Last Friday, Bloomberg processed twice as many ticks as it did during the 2008 financial crisis.

Nagrath spoke with senior editor Shane O'Neill about how his IT staff has been handling one of the most volatile trading weeks in stock market history.

Is this the most trading data [tick] volume you've ever seen? Even more than the stock market crash of 2008?

Since last Thursday, the volume of trading data has been unprecedented. Both the data rate and the aggregate daily data volumes are the highest we've ever seen. They compare dramatically with recent events, including the 2008 crash, the tsunami in Japan and the May 2010 flash crash. The tsunami in Japan pushed tick volume up 8 percent from where it was after the flash crash. But this week we're up 33 percent from where we were after the tsunami. It has been spectacular.

What kind of adjustments must be made to Bloomberg data systems to accommodate the recent volatility?

We had been anticipating volatility and did a lot of testing to prepare for this type of trading volume. But that does not mean we slept easy. We did not know how high it would go, so we were on alert to make sure nothing went wrong.

There is a sustained data rate throughout the day, and the concern was that the per-second rate could spike faster than our systems could handle. But we were able to handle both the per-second influx rate and the aggregate data volume for the day.
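The distinction Nagrath draws between sustained throughput and a short per-second spike can be sketched with a toy queueing model. Everything here is illustrative: the function, the rates and the buffer size are hypothetical, not a description of Bloomberg's actual systems.

```python
# Toy model: a system sized for a sustained rate buffers short
# bursts in a queue; it only fails if the backlog outgrows the
# buffer. All numbers are illustrative.

def can_absorb(tick_rates_per_sec, capacity_per_sec, buffer_capacity):
    """Return True if a queue with the given buffer can absorb
    bursts above capacity without dropping ticks."""
    backlog = 0
    for rate in tick_rates_per_sec:
        # Excess ticks queue up; spare capacity drains the backlog.
        backlog = max(0, backlog + rate - capacity_per_sec)
        if backlog > buffer_capacity:
            return False  # buffer overflows: ticks would be dropped
    return True

# Sustained load of 400k ticks/sec with a brief spike to 900k/sec,
# against a system that processes 500k ticks/sec:
load = [400_000] * 5 + [900_000] * 3 + [400_000] * 5
print(can_absorb(load, capacity_per_sec=500_000,
                 buffer_capacity=2_000_000))  # the spike fits in the buffer
```

The design point: peak capacity planning is about the product of spike height and spike duration relative to buffering, not just the average rate, which is why testing focused on both the per-second rate and the daily aggregate.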

Along the way, we had to make sure we were covered with the right people, that someone was always watching our systems. In reality we did not have to do any coding or system tuning, because we had spent the last two months preparing for just such events. And that preparation paid off.


Disclaimer: The views expressed on this site are that of the authors and not necessarily that of U.S.S.POST.
