
Holes in MetaQuotes historical data (via Alpari UK History Center) - DO NOT USE



Lately I have been finding big holes in historical data that I've downloaded from Alpari UK via the History Center window in MetaTrader.

 

To find the holes, I used a script called "history_data_analysis.mq4" which you can download from:

 

http://codebase.mql4.com/3598

 

I have just downloaded the latest EURUSD data and run the script to look for holes greater than one day. Here is the report:

 

__________________________________________________________________________________________________________________________
  #  |  Gap (from - to)                           |  Missing bars  |  Duration        |  Gap (pips)  |  Note
--------------------------------------------------------------------------------------------------------------------------

  1  |  24.12.2007 [19:00] - 26.12.2007 [09:10]   |      2290      |  1 day   14:10   |     110      |

  2  |  31.12.2007 [18:01] - 02.01.2008 [09:01]   |      2340      |  1 day   15:00   |     510      |  New Year

  3  |  24.12.2008 [19:58] - 26.12.2008 [08:03]   |      2165      |  1 day   12:05   |     560      |

  4  |  31.12.2008 [20:01] - 02.01.2009 [06:01]   |      2040      |  1 day   10:00   |     680      |  New Year

  5  |  24.12.2009 [19:00] - 28.12.2009 [00:00]   |      1679      |  1 day   03:59   |     170      |  (spans weekend)

  6  |  31.12.2009 [19:00] - 04.01.2010 [00:00]   |      1679      |  1 day   03:59   |      50      |  New Year (spans weekend)

  7  |  12.03.2010 [23:00] - 25.03.2010 [22:14]   |     12792      |  8 days  21:12   |    4920      |  (spans weekend)

  8  |  08.06.2010 [23:06] - 14.06.2010 [10:14]   |      4927      |  3 days  10:07   |    2050      |  (spans weekend)

  9  |  09.07.2010 [23:00] - 15.07.2010 [12:12]   |      5051      |  3 days  12:11   |    1490      |  (spans weekend)

__________________________________________________________________________________________________________________________
(The Duration column counts missing M1 bars, i.e. trading time only, so weekends inside a gap are not included.)

 

You can see that there was a huge hole of over 8 days and 492 pips from 2010-03-12 23:00 to 2010-03-25 22:14.
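For anyone who wants to run a similar check outside MetaTrader, here is a minimal Python sketch of the kind of scan the MQL4 script performs. The function name and the sample bar times are made up for illustration; a real check would also need to ignore ordinary weekend closes.

```python
from datetime import datetime, timedelta

def find_gaps(bar_times, threshold=timedelta(days=1)):
    """Scan a sorted list of bar open times and report every gap
    larger than `threshold`. (Filtering out normal weekend closes
    is left out here for brevity.)"""
    gaps = []
    for prev, curr in zip(bar_times, bar_times[1:]):
        if curr - prev > threshold:
            gaps.append((prev, curr, curr - prev))
    return gaps

# Hypothetical M1 bar times around the 2010-03-12 .. 2010-03-25 hole:
bars = [
    datetime(2010, 3, 12, 22, 58),
    datetime(2010, 3, 12, 22, 59),
    datetime(2010, 3, 12, 23, 0),
    datetime(2010, 3, 25, 22, 14),   # first bar after the hole
    datetime(2010, 3, 25, 22, 15),
]
for start, end, span in find_gaps(bars):
    print(f"gap: {start} -> {end} ({span})")
```

In practice you would read the bar times out of the broker's .hst file or a CSV export rather than hard-coding them.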

 

You can see the gap in History Center:

 

http://stashbox.org/1009698/Alpari%20UK_History%20Center%20EURUSD%2CM1_2010-03-12to2010-03-25_Hole.gif

 

and also on the chart:

 

http://stashbox.org/1009699/alpari%20uk_eurusd%2Cm1_2010-03-12to2010-03-26_hole.gif

 

This kind of gap would obviously be problematic in backtesting - e.g. if a buy trade is open before the hole, there would be a sudden huge loss in the trade.

 

Other people have also reported the same problem:

Missing 2 Weeks Data in March 2010: http://forum.mql4.com/31767

verify history center data: http://forum.mql4.com/31357

 

Please check your own historical data to see if you have the same holes.

 

So if you choose to use data obtained via the MetaTrader History Center, do not use an Alpari UK terminal, or a terminal from any broker that serves MetaQuotes data through the History Center. You can tell when the data comes from MetaQuotes because a message box saying so appears when you press the Download button in the History Center:

 

http://stashbox.org/1009704/Alpari%20UK_History%20Center_Download%20Warning.gif

 

Most brokers only serve MetaQuotes data when you download from the History Center in their terminal. Even Dukascopy, for example, gives you MetaQuotes data rather than their own trade server data, despite providing their own high-quality tick and candle data on their web site:

 

http://stashbox.org/1009711/Dukascopy_History%20Center_Download%20Warning.gif

 

There are some brokers that do provide their own data when you download from the History Center in their terminal, such as Alpari US (NZ). Notice that their download warning says the data comes from Alpari NZ, not MetaQuotes:

 

http://stashbox.org/1009705/Alpari%20US_History%20Center_Download%20Warning.gif

 

Alpari US data does not have such large holes as the Alpari UK / MetaQuotes data (as of my last check), so Alpari US History Center data may be a better choice (but verify it yourself).

 

GOMarkets also provides their own data:

 

http://stashbox.org/1009708/GOMarkets_History%20Center_Download%20Warning.gif

 

However, I have heard from others that it also has a lot of holes. Please check for yourself (and let us know here).

 

 

I have just checked a number of other brokers to see their History Center Download Warning message box, and all of the following state that the data comes from MetaQuotes:

CMS / BeamFX

EXNESS

FXDD

FOREX.com

FXClearing

FXCM

FXOpen

FXPro

Gallant FX

IKON

Interbank FX

MB Trading

LiteForex

JadeFX

ODL

NordFX

MIG Bank

NordMarkets

PFG Best

Tadawul FX

 

So I think at the moment, if you want to stick with History Center data (because it is the easiest option), the best platform to use may be either Alpari US or GOMarkets (provided you find no large holes).

 

One problem with using an Alpari US terminal for backtesting is that the leverage will be much lower (soon to be 50:1), so testing strategies that require high leverage (such as martingale / grid) may produce poor results. A solution is simply to use an Alpari US terminal only for downloading data, then copy the .hst files into a terminal from a broker that provides high leverage (e.g. Alpari UK).
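The copy step is just moving files between the two terminals' history folders. A minimal Python sketch, using temporary folders to stand in for the two (installation-specific) history directories - the real paths live under each terminal's history/&lt;server-name&gt;/ folder and vary per install:

```python
import shutil
import tempfile
from pathlib import Path

# Temp dirs stand in for the two terminals' history folders.
# In a real setup, src would be e.g. the Alpari US terminal's
# history/<server> folder and dst the high-leverage broker's.
src = Path(tempfile.mkdtemp())
dst = Path(tempfile.mkdtemp())

# Dummy files standing in for downloaded EURUSD history:
(src / "EURUSD1.hst").write_bytes(b"")
(src / "EURUSD60.hst").write_bytes(b"")

# Close the destination terminal first, then copy the .hst files over:
for hst in src.glob("EURUSD*.hst"):
    shutil.copy2(hst, dst / hst.name)

print(sorted(p.name for p in dst.glob("*.hst")))
```

After copying, reopen the destination terminal so it picks up the new history files.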

 

It would be good if other people here can check other brokers to see where the data comes from when they try to download from History Center. Maybe we could find other sources of better quality historical data than MetaQuotes.

 

Another option is to use FXDD's trade server data that you can download from http://www.fxdd.com/en/mt1m-data.html and import into your backtesting terminal. But be aware that the UTC offset is 3 (I think).
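If the UTC+3 guess is right, aligning FXDD's timestamps with data in another time zone is a one-line conversion. A Python sketch, with the offset explicitly flagged as an assumption:

```python
from datetime import datetime, timedelta, timezone

# Assumption: FXDD's M1 files are stamped in server time at UTC+3.
# Verify the offset yourself before relying on it.
SERVER_TZ = timezone(timedelta(hours=3))

def server_to_utc(ts: datetime) -> datetime:
    """Interpret a naive server timestamp as UTC+3 and convert to UTC."""
    return ts.replace(tzinfo=SERVER_TZ).astimezone(timezone.utc)

print(server_to_utc(datetime(2010, 3, 12, 23, 0)))
# prints: 2010-03-12 20:00:00+00:00
```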

 

Here is a good post that covers a number of historical data options: http://forum.mql4.com/30244#279698

 

Personally, I use real tick data from Dukascopy to do my backtests. I wrote my own programs in C# to process the data to create HST and FXT files. There is also birt's method: http://indo-investasi.com/showthread.php/3294-99-modeling-quality-when-backtesting-with-MT4-build-225.

 

Using Dukascopy tick data with variable spread (real Ask prices) and commission is currently the most accurate backtesting method available; however, it takes considerably more work and a more technical mindset, so it is not suitable for most people.


Yeap, I can confirm those gaps. As far as I know, it started in May this year. I first noticed it when I downloaded history data and some files were considerably smaller (50%+) than other months. I do use tick data for backtesting, but it is a bit more troublesome.

 

You might work around the gaps:

1.) Refresh/recalculate the data for the relevant pair in the History Center

2.) Open the respective pair/timeframe chart, right-click in it, and select Refresh. You can monitor the progress in the Journal, including how many bars were imported.

The bad news is that the Tester will then show quite a few mismatched-data errors and low or n/a data quality (but I didn't notice any difference in test results compared with tick data...).

You need to repeat these steps, closing and reopening your MT4 each time....

 

Cheers


I can confirm the gaps too. I use birt's 99% data for any serious backtesting.

 

Otherwise I have a tester with no holes. But I haven't updated any data for a couple of months for fear of obtaining those holes.

"It is inconceivable that anyone will divulge a truly effective get-rich scheme for the price of a book."

Victor Niederhoffer (1943–), US hedge fund manager and statistician


The best way to resolve this issue for people who want to use the Dukascopy data but lack the technical expertise would be for someone who has already done it to save the .hst data files (in an archive and post the link) for all the important pairs, and then for the original .hst files to be written over. I've tried this on the multiple machines I run backtests on, and it worked great without having to repeat the import process on every machine.

 

Cheers


save the .hst data files (in an archive and post the link) for all the important pairs and then for the original .hst files to be written over.

This is a good idea, as the process of downloading and processing the Dukascopy data is very time-consuming. The final data per pair is very large, so maybe torrents could be used for distribution? Then whoever has already downloaded the data can leave their PC on to share the load.


1.) Refresh/recalculate the data for the relevant pair in the History Center
I tried that at least five times for the current holes and they were never filled. I think a refresh/recalculation would only work if MetaQuotes first filled the holes on their own server.

 

 

2.) Open the respective pair/timeframe chart, right-click in it, and select Refresh. You can monitor the progress in the Journal, including how many bars were imported.
This gets the data from the broker's trade server, not from MetaQuotes. Brokers only keep a limited amount of data on their servers, e.g. a few months or a fixed number of bars. Your data would also end up mixed with the MetaQuotes data you downloaded via the History Center, which is bad if the characteristics of the broker's data differ from the MetaQuotes data (e.g. the broker may quote a very low spread in return for a trading commission).

 

 

You need to repeat these steps, closing and reopening your MT4 each time....
This step is quite important after a data import; otherwise you won't see the data when you look at the available offline data (File, Open Offline). I only found out about this requirement from experience (and I presume you did too). After closing the terminal, wait a few seconds for processing to finish (the hard disk drive LED lights up; wait for it to turn off) before reopening the terminal. Edited by hyperdimension

Otherwise I have a tester with no holes.
That is technically impossible: ticks are discrete, not a continuous stream, so there will always be gaps of time between ticks. The problem is when you see very large gaps between two ticks, such as a day or a week, at a time of week when you would expect to see ticks.

 

I have seen holes in Dukascopy's tick data too; some last for hours. The program I wrote to convert Dukascopy's tick data into HST and FXT files checks the time interval between every pair of consecutive ticks and writes a line to a text file whenever it finds an interval larger than a specified threshold (e.g. 3600 s). I see many entries in that file.
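As a rough illustration (not the author's actual C# code), the interval check could look like this; the function name and the sample timestamps are invented:

```python
from datetime import datetime, timedelta

def log_tick_gaps(tick_times, max_interval=timedelta(seconds=3600)):
    """Yield one log line for every pair of consecutive ticks whose
    time interval exceeds max_interval."""
    for prev, curr in zip(tick_times, tick_times[1:]):
        if curr - prev > max_interval:
            yield f"{prev} -> {curr}: {curr - prev}"

# Hypothetical tick stream with ~4.5 hours of silence in the middle:
ticks = [
    datetime(2010, 5, 3, 9, 0, 0),
    datetime(2010, 5, 3, 9, 0, 2),
    datetime(2010, 5, 3, 13, 30, 0),
]
for line in log_tick_gaps(ticks):
    print(line)
```

Writing the yielded lines to a text file instead of printing them reproduces the behaviour described above.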

 

Though relative to MetaQuotes data, Dukascopy's tick data is high quality.


save the .hst data files (in an archive and post the link) for all the important pairs and then for the original .hst files to be written over.
Note that this method would not be the same as testing on tick data: you would only be using Dukascopy's 1-minute bars, and MetaTrader would then generate artificial ticks from those bars during the backtest. That is OK as long as the intention was not to backtest on real tick data. It would then be very similar to using FXDD's data, since they provide 1-minute bar data on their web site.

 

The problem then would be finding someone who would be dedicated enough to do the work of keeping the HST data up-to-date for others to regularly download.

