What Is Historical Data Analysis?
In the arena of active trading, market participants dedicate substantial time and effort to understanding how a market's past behaviour relates to its future. The acquisition of timely market data and relevant news commands large capital allocations, with firms around the world spending nearly US$27 billion annually on market-related information.[1] Whether one's approach to the marketplace is rooted in fundamental or technical analysis, profitability depends on recognising future opportunities and eliminating past mistakes.
Historical data analysis is the study of market behaviour over a given period of time. The phrase "market behaviour" refers to the many different facets of the market and their interactions. Recorded market-related data such as price, volatility and volume can be quantified and studied over a defined period.
Through detailed examination of a market’s past behaviour, traders and investors can gain perspective on the inner workings of that market. The information obtained over the course of the process may prove useful in developing a viable trading plan or improving an existing methodology.
Historical data analysis pertaining to an individual security or market can be useful in several ways:
- Market insight: Extensive study of the past behaviour of a financial instrument or market can provide the trader with an idea of which exhibited characteristics are normal and which are extraordinary.
- System development: Clear definitions of when, what and how to trade a given market are the starting points for the creation of a trading system. Through historical data analysis, a statistical "edge" may be identified and developed into an active trading strategy.
- Consistency: The selection of trades with a predefined expectation can give the trader confidence in the potential outcome. Through understanding how a given trade has performed over time, unexpected results can be reduced.
It has been said that those who do not understand history are doomed to repeat it. The discipline of historical data analysis aspires not only to avoid the mistakes of the past, but also to establish a working advantage moving into the future.
Financial Data Mining
Data mining is the process of analysing large, and sometimes unrelated, data sets for useful information. As technology has evolved, the ability to conduct a data mining operation has become readily accessible to anyone with computing power and a database. The ability to quickly sift through large amounts of information in an attempt to identify relationships and patterns hidden within the data is extremely valuable in the financial markets.
Historical data analysis is essentially a data mining project that focuses on data sets related to the past behaviour of a specific market or financial instrument. Recorded market-related statistics such as price, volume, open interest and assorted volatility measures are a few types of market data that can provide cause and context for seemingly erratic market moves.
In order to conduct a data mining operation with focus upon a specific market or security, the following inputs are required:
- Computing power: Access to a personal computer with an adequate processor, hard-drive space and RAM is required. For instance, trading platforms such as Metatrader4 and Trading Station require a minimum of a 300 MHz processor, 256 MB of RAM and 60 MB of available hard-drive space. Hardware requirements vary depending upon the trading software package, but as a general rule, the more power the better.
- Data set: Selection of a specific time period, or quantity of data to be analysed, is a key element of a useful study. Many broker-provided software packages furnish complimentary market data to the user, in addition to the ability to purchase specialised data sets.
- Query: Basic questions, typically in the form of customised algorithms, are necessary to begin deciphering data.
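As a sketch of what such a query might look like in practice, the following Python snippet asks a basic question of a small daily data set: how often did the market close above its open? The prices are invented for illustration and are not real market data.

```python
# A minimal "query" over a small, made-up set of daily bars:
# how often does the market close higher than it opened?
# (Illustrative values only, not real market data.)

daily_bars = [
    # (open, close)
    (100.0, 101.5),
    (101.5, 100.8),
    (100.8, 102.2),
    (102.2, 103.0),
    (103.0, 102.1),
]

def up_day_ratio(bars):
    """Fraction of periods that closed above their open."""
    up_days = sum(1 for open_, close in bars if close > open_)
    return up_days / len(bars)

print(up_day_ratio(daily_bars))  # 3 of 5 bars closed higher -> 0.6
```

Real-world queries are typically far more elaborate, but they follow the same pattern: a precise, testable question posed against a recorded data set.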
A study of historical data pertaining to a security or market may prove to have predictive value. Concealed patterns, relationships and tendencies within the data may be identified and capitalised upon by future trading activities.
Market Data: Price
Market-relevant data comes in many different varieties. As mentioned earlier, volatility measures, volume and open interest are all examples of market data. However, the most referenced form of any market-related information is pricing data.
Pricing data, or simply price, is the exact value at which both the buyer and seller of a security agree to conduct an exchange. By law, pricing data must be factual and independently verifiable.[2] Because traders and investors are largely concerned with pricing fluctuations as they pertain to a specific market or security, historical pricing data is meticulously inspected for information useful in the prediction of future price variances.
There are two major classifications of pricing data:
- End-of-day (EOD) data: This data is gathered and reported at the trading session’s end. It is used by long-term investors, swing traders and true day traders to gain perspective on a trading session’s action. EOD data can be grouped in terms of weeks, months and years.
- Intraday data: The traded prices of a security over the course of a trading session are known as intraday data. It focuses on the pricing fluctuations occurring within a single trading session. It may be obtained in real-time, or in historical context using time-based increments or tick-by-tick format (known as tick data). Typically, intraday data is more costly than EOD data, and its availability varies depending upon the instrument or market desired.
For chart-based technical analysts and traders, pricing data is deciphered through the use of automated charting software applications. No matter which classification of pricing data one selects, the software tasked with processing the data will use predefined parameters to sort and compile the data set. Each desired parameter, delineated in terms of days, minutes or number of ticks, will represent a unique period.
For each period, there are four key aspects of price that prove valuable in the analysis of historical data:
- Open: The open is the first price traded at the beginning of a given period.
- Close: The close is the last price traded at the end of a given period.
- High: The high is the highest price traded during a given period.
- Low: The low is the lowest price traded during a given period.
The open, close, high and low price values often play an important role in chart construction and analysis, and serve as the basis for many trading strategies.
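These four values can be derived mechanically from any stream of traded prices. The following Python sketch, using made-up tick values, shows how a single period's ticks collapse into one OHLC bar:

```python
def ticks_to_ohlc(ticks):
    """Collapse a sequence of traded prices (one period) into
    open/high/low/close values."""
    return {
        "open": ticks[0],    # first price traded in the period
        "high": max(ticks),  # highest price traded
        "low": min(ticks),   # lowest price traded
        "close": ticks[-1],  # last price traded
    }

# Hypothetical intraday ticks for one period (illustrative values only)
session = [1.1005, 1.1012, 1.0998, 1.1020, 1.1011]
bar = ticks_to_ohlc(session)
print(bar)  # {'open': 1.1005, 'high': 1.102, 'low': 1.0998, 'close': 1.1011}
```

Charting packages perform exactly this aggregation behind the scenes, once per period, whether the period is defined in minutes, days or ticks.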
It is important to remember that any historical data study needs to have a defined time horizon. The trading approach itself has great bearing upon which time parameters are most relevant to the data analysis.
For instance, if one is looking to invest in blue-chip stocks for retirement, then a 20-year study of S&P 500 daily closing prices based upon EOD data may be the most appropriate. Likewise, if one is involved in the scalping of currencies on the forex, study of a currency's intraday price action in increments of 5, 15 and 30 minutes will prove much more useful than its weekly closing prices.
In the current electronic marketplace, the availability of historical market data has improved greatly. Trading service companies and brokerage firms offer different types of market data at varying costs to the trader. FXCM currently offers up to 10 years of complimentary historical data, in addition to premium data services compatible with Metatrader4, NinjaTrader and other platforms.
Backtesting
Perhaps the most commonly implemented form of historical data analysis is backtesting: the application of a trading method or strategy to a selected historical data set. Automated trading systems, algorithmic trading and more traditional trading approaches often rely upon statistical data compiled through an extensive backtesting study.
In order to conduct a backtest, one must have a defined trading strategy and access to a relevant data set. After both are in place, the strategy is overlaid upon the data and a simulation of the strategy's performance is conducted. Backtesting studies can be simple or intricate, and their complexity largely depends upon the sophistication of the trading approach.
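As an illustration only (not any firm's actual methodology), the Python sketch below overlays one very simple rule, "hold a long position whenever price is above its simple moving average", on an invented price series and records the hypothetical per-period returns:

```python
def backtest_sma(prices, window):
    """Toy backtest: hold a long position for the next period whenever
    the current price sits above its simple moving average.
    Returns the list of per-period returns earned while in the market."""
    returns = []
    for i in range(window, len(prices) - 1):
        sma = sum(prices[i - window:i]) / window
        if prices[i] > sma:
            # In the market: record the return over the following period
            returns.append(prices[i + 1] / prices[i] - 1.0)
    return returns

# Invented prices for demonstration purposes
prices = [100, 101, 102, 101, 103, 104, 102, 105]
trade_returns = backtest_sma(prices, window=3)
print(trade_returns)  # two in-market periods: one gain, one loss
```

A production backtest would also model transaction costs, slippage and position sizing, but the structure is the same: replay the rule over the data and record what it would have done.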
Upon completion of the testing, performance metrics can be applied to the results and used to determine the viability of the strategy. Several key statistics are quantified through a comprehensive backtesting study:
- Number of opportunities: The extent and frequency of trade setups created by a strategy over a specified period of time is a crucial piece of information.
- Success rate: A strategy’s win/loss percentage, or probability of success, can be useful in determining whether it is a suitable means of trade for a given product or market. It can also shed some light upon the optimal time and product to engage.
- Risk vs reward: A backtesting study can determine the necessary amount of capital needed to properly execute a trading approach upon a market or product. The diagnosis of a market’s inherent volatility can be useful in identifying the degree of risk facing the trading strategy.
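The statistics above can be computed directly from a backtest's list of trade outcomes. The following Python sketch, using hypothetical per-trade profit/loss figures, summarises opportunity count, success rate and a simple reward-to-risk ratio:

```python
def summarise_trades(trade_results):
    """Summarise a backtest from a list of per-trade profit/loss figures."""
    wins = [r for r in trade_results if r > 0]
    losses = [r for r in trade_results if r <= 0]
    stats = {
        "opportunities": len(trade_results),          # number of trade setups
        "success_rate": len(wins) / len(trade_results),  # win/loss percentage
        "avg_win": sum(wins) / len(wins) if wins else 0.0,
        "avg_loss": sum(losses) / len(losses) if losses else 0.0,
    }
    # Reward vs risk: average win relative to average loss
    stats["reward_risk"] = (
        abs(stats["avg_win"] / stats["avg_loss"])
        if stats["avg_loss"] else float("inf")
    )
    return stats

# Hypothetical P/L per trade, for illustration only
results = [120.0, -50.0, 80.0, -40.0, 60.0]
print(summarise_trades(results))
```

Here the strategy wins 60% of the time and its average win is roughly 1.9 times its average loss, the kind of figures a trader would weigh when judging viability.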
In earlier days, backtesting was an arduous task performed manually with pencil and paper. Fortunately for modern-day traders, automation has streamlined the procedure, exponentially improving efficiency. Trading platforms provide software functionality capable of executing detailed strategy backtesting operations.
Challenges And Pitfalls
Although historical data analysis is a powerful tool in both system development and strategic fine-tuning, there are also a few pitfalls of which to be aware:
- Hindsight bias: Hindsight bias can be a major problem affecting the accuracy of a backtesting study. Also known as the "I knew it all along" bias, it is the tendency for individuals to assume that unpredictable events could have been forecast ahead of time. Hindsight bias is severely detrimental to historical data analysis because certain results may be perceived as avoidable and disregarded. It actively compromises the objectivity of the study, thereby producing skewed results.
- Data omissions and errors: The physical accuracy of the historical data set is of paramount importance to the backtesting study. Even a relatively small number of data errors can impact a study’s results greatly over time. This factor is especially important in the examination of intraday data. When considering small time frames or tick-by-tick intervals, precision in the recording of pricing data can be elusive. The quality of the historical data set is crucial to the accuracy of the backtest, and small mistakes can compromise the integrity of study results.
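Some data errors can be caught mechanically before a study begins. The Python sketch below runs basic sanity checks on a set of OHLC bars (values invented for illustration): within any bar, the high must be the largest of the four prices and the low the smallest.

```python
def find_bad_bars(bars):
    """Return indices of OHLC bars that fail basic sanity checks:
    the high must not be below the open/close, and the low must
    not be above them."""
    bad = []
    for i, (o, h, l, c) in enumerate(bars):
        if h < max(o, c) or l > min(o, c) or h < l:
            bad.append(i)
    return bad

# Illustrative bars as (open, high, low, close); two contain errors
bars = [
    (1.10, 1.12, 1.09, 1.11),    # consistent
    (1.11, 1.10, 1.09, 1.12),    # high below the close: suspect
    (1.12, 1.13, 1.125, 1.128),  # low above the open: suspect
]
print(find_bad_bars(bars))  # [1, 2]
```

Checks like these cannot prove a data set is accurate, but they cheaply flag records that are certainly wrong, which matters most for noisy tick-level data.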
- Software performance: A software "glitch" can destroy the credibility of test results. Strategy testing software is the filter through which market data is sifted. If there is any discrepancy between the software's intended function and its actual function, the results of the backtest are inaccurate. Software errors can be extremely difficult to spot, so both manual checks and automated diagnostics are needed to ensure accuracy.
- Underestimation of randomness: Random chance plays an important role in the marketplace. A trading strategy may produce outstanding results during a backtest, yet struggle in live market conditions. Factors such as slippage, enhanced volatility and periodic fundamental changes in market structure can be impossible to account for, serving to compromise the viability of a trading strategy.
Human psychology and technological failure can affect the relevance of any backtest or study of market history. Ultimately, it serves the trader well to remember the old axiom: "past performance does not guarantee future results."
Historical data analysis is a common method of placing the sometimes “irrational” behaviour exhibited by markets into context. Through an extensive review of the past, traders and investors alike can eliminate many mistakes while preserving future opportunities.
However, it is important to be cognisant of the quality, sources and reliability of the historical market data itself. Errors are sometimes unavoidable, but with proper due diligence, exercises such as financial data mining and backtesting can provide invaluable information to the trader.
As with most aspects of trading, historical data analysis can contribute to a trader’s long-term success when used in concert with other analytical tools and proper risk-management principles.
Any opinions, news, research, analyses, prices, other information, or links to third-party sites are provided as general market commentary and do not constitute investment advice. FXCM will not accept liability for any loss or damage including, without limitation, to any loss of profit which may arise directly or indirectly from use of or reliance on such information.
References
1. Retrieved 3 January 2017: http://www.prnewswire.com/news-releases/global-spend-on-financial-market-data–news-showed-strong-growth-in-2015before-the-impact-of-currency–burton-taylor-report-300233317.html
2. Retrieved 4 January 2016: https://definitions.uslegal.com/c/cost-or-pricing-data/