Predicting the Future with RNNs

Written by TKS New York student Jack McDonald

The Problem

A friend of mine recently started investing in public securities. While he understood the risk involved, he believed his general knowledge of the markets would allow him to succeed despite his risky decisions. As the weeks went by, I watched him grow increasingly frustrated as his portfolio shrank. The decline became so drastic that he eventually sold all of his positions just to secure what little money remained.

Millions of people across the world have fallen into similar financial dilemmas as a result of a number of factors.

How could these struggles be avoided?

Technology has been able to improve practically every aspect of our lives, but has it been integrated into the very foundation of our society: finance?


These are the questions many people ask when looking at the struggles, and oftentimes the downfalls, that financial immaturity brings. The answer to both questions is yes.

While many find it hard to believe that a computer can predict and act on the intuition of the human mind, the relatively new wave of machine learning development has opened an abundance of doors that make this ability more feasible.

Within this specific domain, an architecture known as a recurrent neural network (RNN) has made leaps and bounds in the development of an automated securities trader.

In an extremely basic overview, a neural network learns by taking in input, putting that input into the proper form for analysis, and trying to answer a question as accurately as possible using the results of that analysis.
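That input → transform → answer pipeline can be sketched in a few lines of plain NumPy. This is a single dense layer with untrained, purely illustrative weights, not a real trading model:

```python
import numpy as np

def forward(x, W, b):
    """One dense layer: transform the input, then squash it into a 0-1 'answer'."""
    z = W @ x + b                     # linear transformation of the input
    return 1.0 / (1.0 + np.exp(-z))   # sigmoid turns the result into a probability

# Illustrative (untrained) weights for a 3-feature input and one output.
W = np.array([[0.5, -0.2, 0.1]])
b = np.array([0.0])
x = np.array([1.0, 2.0, 0.5])

answer = forward(x, W, b)
print(answer)  # a single probability between 0 and 1
```

Training would then nudge `W` and `b` so that this answer matches reality more often.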

Functional Makeup of An RNN

RNNs are neural networks that contain recurrent layers to sequentially process data. The input to these layers is always a sequence, since an RNN deals with sequential data. Each recurrent layer is composed of a memory cell. This memory cell is used repeatedly to compute the output of each time step. Part of this output is fed into the cell at the next time step, creating a recurrent pattern that is efficient for handling and analyzing larger pools of data.
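The reuse of one memory cell across time steps can be shown in a short NumPy sketch. The tanh cell and random weights below are illustrative stand-ins for a trained recurrent layer:

```python
import numpy as np

rng = np.random.default_rng(0)

# One set of weights, shared by every time step: this is the "memory cell".
Wx = rng.normal(size=4) * 0.5        # input -> hidden state
Wh = rng.normal(size=(4, 4)) * 0.5   # previous hidden state -> hidden state
b = np.zeros(4)

sequence = [0.1, 0.4, -0.2, 0.3]     # a toy input sequence
h = np.zeros(4)                      # hidden state, empty before the first step

for x_t in sequence:
    # Part of the previous output (h) is fed back into the same cell each step.
    h = np.tanh(Wx * x_t + Wh @ h + b)

print(h)  # the final hidden state summarizes the whole sequence
```

The same `Wx`, `Wh`, and `b` are applied at every time step; only the hidden state `h` changes, which is what lets one small cell process a sequence of any length.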

Time Series

The most frequent input to an RNN is data referred to as a time series. A time series, in the specific context of securities trading, might be the price of a public stock over a thirty-day period. In each time series, the algorithm tries to decipher whether the price is experiencing a trend (an upward or downward movement), seasonality (meaning the price's rise or decline is driven by recurring external factors, so prices fluctuate heavily), or both. After analyzing a given time series, the algorithm predicts the next block in the existing chart. Once those thirty days have passed, the algorithm can compare its prediction to what actually occurred. With these differences analyzed, the algorithm, like other forms of neural network, performs back-propagation and improves its analysis techniques according to those differences.
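Slicing a price history into fixed-length blocks, each paired with the value the model should predict next, only takes a small helper. The 30-day window and the toy prices here are illustrative:

```python
def make_windows(prices, window=30):
    """Pair each `window`-day slice of prices with the price that follows it."""
    X, y = [], []
    for i in range(len(prices) - window):
        X.append(prices[i:i + window])   # the thirty days the model sees
        y.append(prices[i + window])     # the next value it must predict
    return X, y

prices = list(range(100, 140))           # 40 days of toy prices: 100, 101, ..., 139
X, y = make_windows(prices, window=30)
print(len(X), len(X[0]), y[0])           # 10 windows of 30 days; first target is day 31
```

After the real price arrives, the difference between it and the predicted `y` value is exactly the error that back-propagation pushes through the network.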

LSTM (Long Short-Term Memory) Cells

While some accuracy can be gained from analyzing a single time block, the ability to analyze and compare findings across all of the time series blocks is where the true value lies, and where the computer can continuously build on its understanding of how a set of data moves.

LSTM cells are memory cells within the recurrent layers of an RNN. While the cells themselves are made of much the same components as normal recurrent cells, their additional components allow the algorithm to jointly analyze sequences of 100 or more time steps.

These tools of network creation have allowed for algorithms to utilize long-term knowledge for application in areas that rely heavily upon time series forecasting.

My Project

In an effort to test my knowledge of the subject, I built an RNN to assess cryptocurrency prices. The input to my model was a set of columns containing the starting and closing values of the four most heavily traded tokens: LiteCoin, Bitcoin, Ethereum, and Bitcash.

After merging the scraped files into one dataframe, the system could analyze the patterns in the input very easily, since the input itself was already in numeric form.
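A merge of this kind might look like the following pandas sketch. The column names and the tiny in-memory tables are hypothetical stand-ins for the real scraped files:

```python
import pandas as pd

# Hypothetical per-token price tables; real scraped CSVs would replace these.
ltc = pd.DataFrame({"time": [1, 2, 3], "LTC_close": [50.1, 50.4, 49.9]})
btc = pd.DataFrame({"time": [1, 2, 3], "BTC_close": [6400.0, 6420.5, 6390.2]})
eth = pd.DataFrame({"time": [1, 2, 3], "ETH_close": [210.3, 211.0, 209.8]})

# Join every token's prices on the shared timestamp column,
# giving one row per time step with all currencies side by side.
merged = ltc.merge(btc, on="time").merge(eth, on="time")
print(merged.columns.tolist())
```

Because every column is already numeric, the merged frame can be fed to the model with no further encoding.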

Although the imported data included prices for four currencies, the goal of the model was to predict the price movements of just one of them (in this case, LiteCoin), using not only that currency's past movements but also those of the other three.

The structure of the model itself consisted of three practically identical LSTM layers, each with a 20% dropout, batch normalization, 128 input nodes, and a 'relu' activation function. The last layer of the model was a dense layer with 32 input nodes and a 'relu' activation.
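Under my reading of that description, the stack could be written as the Keras sketch below. The input shape (30 time steps × 8 price columns) and the final single-unit output layer are assumptions I have added for a one-price prediction; the three LSTM blocks and the dense layer follow the description above:

```python
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout, BatchNormalization, Dense

model = Sequential([
    Input(shape=(30, 8)),  # assumed: 30-day windows, 8 price columns
    # Three practically identical LSTM blocks: 128 units, relu,
    # 20% dropout, batch normalization.
    LSTM(128, activation="relu", return_sequences=True),
    Dropout(0.2),
    BatchNormalization(),
    LSTM(128, activation="relu", return_sequences=True),
    Dropout(0.2),
    BatchNormalization(),
    LSTM(128, activation="relu"),
    Dropout(0.2),
    BatchNormalization(),
    Dense(32, activation="relu"),
    Dense(1),  # assumed output layer: one predicted price
])

model.compile(optimizer="adam", loss="mse")
model.summary()
```

The first two LSTM layers return full sequences so the next LSTM still receives step-by-step data; only the third collapses the sequence into a single vector for the dense layers.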


  • Recurrent neural networks are frameworks for analyzing pools of sequential data
  • The input of these networks is a time block or time series
  • LSTM cells allow for the broader analysis of sequential data through the analysis of multiple time blocks.