Would Cell be good for financial modeling?

blakjedi

Veteran
I was taking a tour of the Chicago Mercantile Exchange (futures and options markets), and due to the sheer volume of interactive trades they find it difficult to monitor trades in real time using handheld devices (iPaq, etc.).

Does anyone think that a Cell-based device or desktop would be more capable of keeping up with the volume of multiple thousands of transactions per second, and then visualizing them?
 
I'm actually working on something similar right now, using a dual P4 machine, grabbing feeds from Bloomberg and building models for the traders to view (nice pretty colours!).

The main bottleneck is getting the dataset from the network, not the speed of the machine. There isn't much point caching locally for anything other than historic data, so no speed-up that way.

Going from a P4 3.0 to dual 3.2s netted me an improvement of exactly 0%. Running the same code on the gigabit port of the switch netted an 8% improvement. I think this is down to latency rather than bandwidth, though.

Ali
 
So exactly what kinds of maths are involved in this kind of modeling? It sounds like you are doing "spreadsheet graphing", as opposed to modeling expected trends and comparing with actual data.
 
As long as you have lots of code that can be easily parallelized and isn't branch-heavy, then it will run great on a Cell-like architecture.
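To make "parallelizable and not branch-heavy" concrete, here's a minimal sketch (my own illustration, not anything from the thread): a call-option payoff computed with pure arithmetic instead of a per-element if/else, so the same instruction stream applies to every element. That's the kind of loop SIMD hardware like Cell's SPEs chews through.

```python
# Branch-free, data-parallel style: identical arithmetic on every element,
# no per-element conditional -- the shape of code that vectorizes well.

def call_payoffs_branchless(prices, strike):
    # max(x, 0) computed without a branch, since (x + |x|) / 2 == max(x, 0)
    out = []
    for p in prices:
        x = p - strike
        out.append((x + abs(x)) / 2)
    return out

payoffs = call_payoffs_branchless([95.0, 100.0, 105.0, 110.0], 100.0)
# payoffs == [0.0, 0.0, 5.0, 10.0]
```

Each element is independent of the others, so the work splits trivially across SPEs (or cores, or SIMD lanes) with no synchronization.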
 
Ali said:
I'm actually working on something similar right now, using a dual P4 machine, grabbing feeds from Bloomberg and building models for the traders to view (nice pretty colours!).
Interesting stuff. As you've seen, there's nothing really in the computation that's problematic for even mid-level CPUs. It all depends on what you're doing, how large a data set, and how real-time you want the output. E.g. a real-time, garden-variety Black-Scholes test on the entire live feed of the Chicago Board of Trade may be an exercise in managing I/O more than computational intricacy.
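For reference on why it's I/O-bound: the per-option arithmetic in a garden-variety Black-Scholes evaluation is a handful of logs, exps and a normal CDF, which any modern CPU does in well under a microsecond. A self-contained sketch of the standard European call formula (my own illustration, stdlib only):

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(s, k, t, r, sigma):
    # European call: S*N(d1) - K*exp(-r*T)*N(d2)
    d1 = (log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    return s * norm_cdf(d1) - k * exp(-r * t) * norm_cdf(d2)

# Spot 100, strike 100, 1 year, 5% rate, 20% vol -> roughly 10.45
price = black_scholes_call(100.0, 100.0, 1.0, 0.05, 0.2)
```

Pricing even tens of thousands of these per second is trivial compute; shipping the live quotes to the box fast enough is the hard part, which matches Ali's gigabit-port observation above.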
 