OK. This one isn't exactly about algo trading, but it is relevant in my view:

Tino, P. and G. Dorffner, 2001,

*Machine Learning*, 45(2)

**Predicting the Future of Discrete Sequences from Fractal Representations of the Past**

http://www.cs.bham.ac.uk/~pxt/PAPERS/pfm.ml.ps.gz

Abstract

This article compares the predictive ability of a Fractal Prediction Machine (FPM) with that of a classical Markov model (MM) and a variable memory length Markov model (VLMM). The comparison shows that different parameter selections lead to very different learning scenarios. The authors present FPMs as a more intuitive and practical alternative to VLMMs, although FPMs lack the theoretical grounding that VLMMs have.

Five experiments were carried out, each on a very different data set: DNA coding and non-coding regions, language data from the Bible, sequences of quantized activity changes of a laser in a chaotic regime, a Feigenbaum binary sequence, and a set of quantized daily volatility changes of the Dow Jones Industrial Average. All experiments used an FPM with contraction coefficient k = 1/2; other parameters, such as memory depth and the number of dimensions of the geometric representation, were varied between experiments.
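For concreteness, here is a minimal sketch (my own illustration, not the authors' code) of the chaos-game style encoding that underlies an FPM: each symbol applies an affine contraction with ratio k toward that symbol's vertex of the unit hypercube, so histories that share a long recent suffix land close together regardless of the distant past.

```python
def fractal_encode(seq, alphabet, k=0.5):
    """Chaos-game encoding: each prefix of `seq` becomes a point in the
    unit hypercube whose vertices are assigned to the symbols.
    With contraction k < 1, histories sharing a recent suffix map to
    nearby points (the distant past is contracted away)."""
    # assign each symbol a distinct binary vertex of [0,1]^d
    d = max(1, (len(alphabet) - 1).bit_length())
    vertex = {s: [float(b) for b in format(i, f'0{d}b')]
              for i, s in enumerate(alphabet)}
    x = [0.5] * d                      # start at the hypercube centre
    points = []
    for s in seq:
        x = [k * xi + (1 - k) * vi for xi, vi in zip(x, vertex[s])]
        points.append(tuple(x))
    return points

pts = fractal_encode("abbaabba", "ab", k=0.5)
# pts[3] and pts[7] both follow the suffix "abba", so they are close
```

The memory depth mentioned above corresponds to how many recent symbols still visibly influence a point before the contraction makes their effect negligible.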

The article showed that the FPM outperformed the classical MM and was generally also better than the VLMM, except on one data set: the DNA sequences, whose uniform distribution favours the MM. According to the authors, one of the main advantages of FPMs is the self-organising way in which the fractal-based predictive models are constructed. Another notable aspect of this work is that two of the data sets were related to natural language: the words of the Bible and the topological structure of the Feigenbaum sequence (which is a restricted indexed context-free grammar).
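As a hedged illustration of that self-organising construction (the function names and the fixed binning are my own simplification; the paper clusters the encoded points, e.g. by vector quantization): encode the sequence into the hypercube, group the points into regions, and read a next-symbol distribution off each region.

```python
from collections import Counter, defaultdict

def fractal_encode(seq, symbols, k=0.5):
    # 1-D chaos game for a two-symbol alphabet (symbol -> vertex 0.0 / 1.0)
    vertex = {s: float(i) for i, s in enumerate(symbols)}
    x, pts = 0.5, []
    for s in seq:
        x = k * x + (1 - k) * vertex[s]
        pts.append(x)
    return pts

def fpm_predict(seq, symbols, n_bins=8, k=0.5):
    """Sketch of the self-organising step: partition the encoded points
    into regions (fixed bins here, standing in for clustering) and
    estimate a next-symbol distribution per region."""
    pts = fractal_encode(seq, symbols, k)
    counts = defaultdict(Counter)
    for p, nxt in zip(pts, seq[1:]):       # each point encodes the history so far
        counts[int(p * n_bins)][nxt] += 1
    # predict from the region containing the current history's point
    region = int(pts[-1] * n_bins)
    return counts[region].most_common(1)[0][0] if counts[region] else symbols[0]

print(fpm_predict("abababababab", "ab"))   # alternating data: predicts 'a' after 'b'
```

No suffix tree is built by hand, which is the sense in which the model "self-organises": the geometry of the encoded points, not an explicit context hierarchy, determines which histories share statistics.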