
Deep learning and stock trading

March 16, 2017

In their study, researchers at the School of Business and Economics of FAU have shown that algorithms based on artificial intelligence can make profitable investment decisions. When applied to the S&P 500 constituents from 1992 to 2015, their stock selections generated annual returns in the double digits, with the highest profits made at times of financial turmoil. The findings have recently been published in the European Journal of Operational Research (EJOR), a leading outlet in the field of operational research and decision making.

In March 2016, Lee Sedol -- one of the best human players of the Asian board game Go -- lost against AlphaGo, a software developed by Google DeepMind. Compared to chess, Go is far more complex and has long been considered an Everest for artificial intelligence research. The driving force behind such achievements is computer programs that are loosely inspired by how biological brains work, i.e. that learn from examples and independently extract relationships from millions of data points. 'Artificial neural networks are primarily applied to problems where solutions cannot be formulated with explicit rules,' explains Dr. Christopher Krauss of the Chair for Statistics and Econometrics at FAU. 'Image and speech recognition, such as Apple's Siri, are typical fields of application. But deep learning is also becoming increasingly relevant in other domains, such as weather forecasting or the prediction of economic developments.'

Analysing capital market data

The international team around Dr. Christopher Krauss -- consisting of Xuan Anh Do (FAU) and Nicolas Huck (ICN Business School, France) -- was the first to apply a selection of state-of-the-art techniques from artificial intelligence research to a large-scale set of capital market data. 'Equity markets exhibit complex, often non-linear dependencies,' says Krauss. 'However, when it comes to selecting stocks, established methods mainly model simple relationships. For example, the momentum effect only considers a stock's return over the past months and assumes a continuation of that performance in the months to come. We saw potential for improvements.' To find out whether automated learning approaches perform better than a naïve buy-and-hold strategy, the researchers studied the S&P 500 Index, which consists of the 500 leading US stocks. For the period from 1992 to 2015, they generated predictions for each individual stock for every single trading day, leveraging deep learning, gradient boosting, and random forests.
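The article does not spell out the study's exact features, labels or model configurations, but the workflow Krauss describes, daily per-stock predictions from price-based features, can be sketched roughly as follows with one of the three methods (a random forest). Everything in the snippet, the synthetic data, the lookback horizons, the median-beating label and the top-10 portfolio, is an illustrative assumption, not the paper's specification.

# Illustrative sketch only: daily cross-sectional stock selection with a
# random forest. Synthetic data, feature horizons and model settings are
# placeholder assumptions for demonstration, not the study's setup.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic daily returns for a small universe (stand-in for S&P 500 constituents).
n_days, n_stocks = 800, 50
rets = rng.normal(0.0, 0.01, size=(n_days, n_stocks))

lookbacks = [1, 5, 20, 60]   # assumed price-based features: past cumulative returns
train_end = 600              # first part of the sample for training, rest for testing

def features(day):
    """One row per stock: cumulative return over each lookback horizon up to 'day'."""
    return np.column_stack([rets[day - h:day].sum(axis=0) for h in lookbacks])

def labels(day):
    """1 if the stock beats the cross-sectional median return on 'day', else 0."""
    today = rets[day]
    return (today > np.median(today)).astype(int)

# Training set: one example per (trading day, stock).
days = range(max(lookbacks), train_end)
X_train = np.vstack([features(d) for d in days])
y_train = np.concatenate([labels(d) for d in days])

model = RandomForestClassifier(n_estimators=200, max_depth=8, random_state=0)
model.fit(X_train, y_train)

# Out of sample: each day, rank stocks by the predicted probability of
# outperforming the median and hold the top 10 as an equal-weighted portfolio.
daily_returns = []
for d in range(train_end, n_days):
    prob_up = model.predict_proba(features(d))[:, 1]
    top = np.argsort(prob_up)[-10:]
    daily_returns.append(rets[d, top].mean())

print(f"Mean daily portfolio return: {np.mean(daily_returns):.5f}")

On random data the selection adds nothing, of course; the point is only the structure of the exercise: build price-based features per stock and day, train a classifier, then rank and select stocks each trading day out of sample.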

Outperformance with machine learning

Each of these methods was trained with approximately 180 million data points. In the course of this training, the models learned a complex function describing the relationship between price-based features and a stock's future performance. The results were astonishing: 'Since the year 2000, we observed statistically and economically significant returns of more than 30% per annum. In the nineties, results were even higher, reflecting a time when our machine learning approaches had not yet been invented,' adds Krauss. These results pose a serious challenge to the efficient-market hypothesis. Returns were particularly high during times of financial turmoil, e.g., the collapse of the dot-com bubble around the year 2000 or the global financial crisis in 2008/2009. Dr. Krauss: 'Our quantitative algorithms have turned out to be particularly effective at such times of high volatility, when emotions are dominating the markets.'
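To put the per-annum figure into perspective: for a strategy that trades daily, the annual return follows from compounding the average daily return over roughly 252 trading days. A back-of-the-envelope sketch, using a hypothetical daily figure rather than a number from the study:

# Rough arithmetic only: how a mean daily return compounds to an annual figure,
# assuming ~252 trading days per year. The daily value is hypothetical.
mean_daily_return = 0.00105                            # about 0.1% per day (assumed)
annualized = (1 + mean_daily_return) ** 252 - 1
print(f"Approximate annual return: {annualized:.1%}")  # roughly 30%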

Deep learning still has potential

However, Dr. Krauss lowers expectations: 'During the last years of our sample period, profitability decreased and even became negative at times. We assume that this decline was driven by the rising influence of artificial intelligence in modern trading, enabled by increasing computing power as well as by the popularization of machine learning.' Nevertheless, the researchers agree that deep learning still has significant potential: 'Currently, we are working on very promising follow-up projects with far larger data sets and very deep network architectures which have been specifically designed for identifying temporal dependencies,' says Krauss. 'First results already show significant improvements in prediction accuracy, also in recent years.'
The results of the study were published under the title "Deep neural networks, gradient-boosted trees, random forests: Statistical arbitrage on the S&P 500" in the European Journal of Operational Research.

University of Erlangen-Nuremberg
