High Frequency Trading: a Big Data application
a) High Frequency Trading reviewed
High Frequency Trading (HFT) is currently available on forty electronic markets across six continents. HFT is used for trading and arbitrage in the execution of a wide range of strategies, across a broad range of instruments including equities, foreign exchange and derivatives. A recent study by Morgan Stanley and Oliver Wyman indicates that approximately 70% of global equities trading is executed by machines without human intervention. In 2010/11 the market share of HFT in the US was 36% across all instrument types (55% for equities) (FSOC 2012); in Europe it was estimated at 30-50% (FTSE Global Markets 2011); and in Australia it was recently reported (ASIC 2013) to account for 22% of total equity market turnover. ASIC also reported that dark trading represented 25-30% of trading, with 7% of that total in HFT form.
HFT has different definitions depending on market, instruments or strategy, so care is needed when dealing with trend data. As a consequence, ASIC in REP 331 (2013) defined in excess of 100 terms as applied to its research into this form of trading. There is often confusion between electronic trading, algorithmic trading and high frequency trading (which combines elements of both).
The most generally accepted description of this form of trading is given by IOSCO, which characterises it as follows:
- HFT is a type of algorithmic trading but not all forms of algorithmic trading are high frequency
- sophisticated technological tools are required
- it is highly quantitative
- high daily portfolio turnover and a high order-to-trade ratio
- usually involves flat positions at the end of the trading day (although long positions are not unknown)
- it is latency sensitive (the time taken from order placement to response is reportedly as fast as 34 microseconds), driving requirements such as direct electronic access and co-location. Current systems are claimed to process 100,000 orders per second, or 500 million per day. Advancing speeds to nanoseconds (billionths of a second), the present target, is estimated to be capable of generating an extra $100 million per year in earnings on global trades.
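The throughput and latency figures quoted above imply a very tight per-order processing budget. A short worked calculation, using only the numbers reported in the text, makes the scale concrete:

```python
# Worked arithmetic on the throughput figures quoted above.
ORDERS_PER_SECOND = 100_000        # claimed system throughput
ORDERS_PER_DAY = 500_000_000       # claimed daily order volume
FASTEST_ROUND_TRIP_US = 34         # reported fastest order-to-response time

# Sustaining 100,000 orders/second leaves 10 microseconds per order.
budget_us = 1_000_000 / ORDERS_PER_SECOND
print(f"Per-order budget: {budget_us:.0f} microseconds")  # → 10

# A single serial pipeline at 34 µs per round trip could handle at most:
max_serial_rate = 1_000_000 / FASTEST_ROUND_TRIP_US
print(f"Max serial orders/sec at 34 µs each: {max_serial_rate:.0f}")

# Time at peak rate needed to reach the daily figure:
hours = ORDERS_PER_DAY / ORDERS_PER_SECOND / 3600
print(f"Hours at peak rate for 500M orders: {hours:.2f}")
```

The gap between the 10 µs per-order budget and the 34 µs round trip shows why such systems must process many orders concurrently rather than serially, and why shaving microseconds (and eventually nanoseconds) of latency is commercially significant.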
In practice HFT strategies typically cluster around three specific objectives: low latency, i.e. the critical speed factor; market making, which exploits price discrepancies and is rewarded for posting liquidity; and statistical arbitrage (FTSE Global Markets 2011). Serious investments of various kinds are required to pursue a chosen strategy. Stock exchanges must invest in data storage, processing and communications to address the issues of latency and equality of access. The Gore Hill facility cost the ASX $37 million, with cabinet rentals costing client traders $2,500 per month. There are already plans to extend the use of these facilities into cloud computing as part of a network of global data centres.
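Of the three objectives above, statistical arbitrage is the most amenable to a compact illustration. A common textbook form trades the spread between two historically related instruments when it deviates from its recent average. The sketch below is a minimal, illustrative version of that idea; the window size, thresholds and function names are assumptions for exposition, not parameters drawn from any system described in the text:

```python
# Minimal sketch of one statistical-arbitrage signal: the z-score of the
# price spread between two related instruments A and B. Window size and
# entry threshold are illustrative assumptions.
from statistics import mean, stdev

def spread_zscore(prices_a, prices_b, window=20):
    """Z-score of the latest A-B spread against its recent history."""
    spreads = [a - b for a, b in zip(prices_a, prices_b)][-window:]
    mu, sigma = mean(spreads), stdev(spreads)
    return (spreads[-1] - mu) / sigma if sigma else 0.0

def signal(z, entry=2.0):
    """Trade when the spread deviates by more than `entry` std devs."""
    if z > entry:
        return "sell A / buy B"   # spread unusually wide: bet on reversion
    if z < -entry:
        return "buy A / sell B"   # spread unusually narrow
    return "flat"
```

A real deployment would of course differ substantially (cointegration testing, transaction costs, risk limits), but the structure — a quantitative signal driving automated order flow — is what makes such strategies both highly quantitative and latency sensitive, as the IOSCO characteristics note.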
Traders need to develop algorithms capable not only of executing commercially beneficial patterns of trade but also of responding to changing patterns of information while minimising risks. The life span of an algorithm is reported to be as short as 14 days, given competitive countervailing measures. Individual algorithms can cost from $10,000 to $1 million to develop; forecast expenditure on algorithms for 2013 is $51 billion (Hughes-Liley, 2012). Costs will rise if further development results in software being replaced by a hardware medium, since the medium affects both speed and cost. Algorithms are already being customised to specific buyers and to market conditions, including the current difficulties of low volumes. The next generation of trading algorithms will be made up of trading envelopes, execution constraints, and liquidity-seeking tactics. Data for these advanced algorithms is already available (see below).
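To show how the three named components might fit together, here is a hedged sketch of a child-order slicer in which a trading envelope bounds acceptable prices, execution constraints cap order size and market participation, and a liquidity-seeking tactic sizes each slice against displayed liquidity. All class names, fields and thresholds are illustrative assumptions, not a published design:

```python
# Illustrative sketch: how trading envelopes, execution constraints and
# liquidity-seeking tactics might compose in a child-order slicer.
# Names and numbers are assumptions for exposition only.
from dataclasses import dataclass

@dataclass
class Envelope:
    """Trading envelope: hard price bounds the algorithm may not cross."""
    min_price: float
    max_price: float
    def allows(self, price: float) -> bool:
        return self.min_price <= price <= self.max_price

@dataclass
class Constraints:
    """Execution constraints: caps on slice size and participation."""
    max_child_qty: int
    max_participation: float  # fraction of displayed liquidity to take

def next_child_order(side, target_qty, quote_price, displayed_qty,
                     env: Envelope, con: Constraints):
    """Liquidity-seeking tactic: size the next slice against what the
    market currently displays, staying inside the envelope."""
    if not env.allows(quote_price):
        return None  # price outside envelope: do not trade
    qty = min(target_qty,
              con.max_child_qty,
              int(displayed_qty * con.max_participation))
    return (side, qty, quote_price) if qty > 0 else None
```

The separation of concerns is the point: the envelope and constraints act as risk controls that cannot be overridden by the tactic, which matches the risk-minimisation requirement described above.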
Back office routines and systems will require upgrades. Issues...