Prediction and chaos in the markets

It’s been a while.


The May 6th, 2019 UK edition of Metro published an article entitled “Can we trust machines to predict the stock market with 100% accuracy?”, by Sonya Barlow.

The piece, which is more in-depth and better-researched than one might expect, included a single sentence from me, which was a portion of my response to this question from Sonya:  “The more AI [is] used in predictions, the less that AI can predict – Can that really be the case?”

Here is my response in full:

It’s not in general true that the more AI is used in predictions, the less it can predict, if the AI is not a part of the system it is predicting (predicting sunspot activity, for example).  But in the kinds of cases many people are interested in, such as AI for financial prediction, it’s a different matter, because in finance, the AI is typically part of the system it is trying to predict.  That is, the predictions the AI makes are used to take action (buy, sell, short, whatever) in that system.  And the presence of predictors (machine or human) in a system, taking action in that same system on the basis of their predictions, makes the system more difficult to predict (by machines or humans).
Why is this so?  To see why, consider a relatively simple, one-on-one system with two members: you and your opponent.  The best way to predict what your opponent is going to do is to model them: figure out what their strategy is, and predict that they will do whatever that strategy recommends in the current situation.  You then choose your best action given what you predict they will do.  But if they are also a predictor like you, then you both have a problem.  Even if you know what your opponent’s strategy is — it’s to predict what you are going to do, and act appropriately — predicting what they will do depends on what they predict you will do, which in turn depends on your prediction of what they are going to do, which is back where we started.  Thus, the mutual prediction becomes an infinite regress, and the behaviour of the system an unstable, chaotic loop.
This doesn’t mean that we’ll stop using AIs to predict — on the contrary, they will become (even more) obligatory, just to stay in the predictive arms race.  To fail to use them would make you more easily predictable, and thus at a relative disadvantage.
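The regress described in my response can be made concrete with a toy simulation.  The sketch below (illustrative only; the game and all names are my own invention, not anything from the article) pits two predictors against each other in matching pennies: the “matcher” wins when the two moves are equal, the “mismatcher” wins when they differ.  Each side predicts that the opponent will repeat its last move and then plays its best response to that prediction.  Neither side ever settles down — the joint behaviour cycles forever, which is the unstable loop in miniature.

```python
def best_response_cycle(rounds=8):
    """Two mutually-predicting agents in matching pennies.

    Moves are 0 (heads) or 1 (tails).  Each round, each agent predicts
    the opponent will repeat its previous move, then best-responds:
    the matcher copies the predicted move, the mismatcher plays its
    opposite.  Returns the sequence of (matcher, mismatcher) moves.
    """
    a, b = 0, 0  # previous moves of matcher (a) and mismatcher (b)
    history = []
    for _ in range(rounds):
        a_next = b        # matcher: predict b repeats, so copy b's last move
        b_next = 1 - a    # mismatcher: predict a repeats, so play the opposite
        a, b = a_next, b_next
        history.append((a, b))
    return history

# The joint moves cycle with period 4 and never converge:
# (0,1) -> (1,1) -> (1,0) -> (0,0) -> (0,1) -> ...
```

Making either predictor smarter does not help: if the mismatcher anticipates the matcher’s reasoning one level deeper, the matcher can go one level deeper still, and the cycle simply reappears at the new level — there is no fixed point for either side to converge on.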