How Neural Networks Work
We will use the following data in this exercise:
| Date       | Open | High | Low  | Close | Adj Close | Volume (M) |
|------------|------|------|------|-------|-----------|------------|
| 2015-01-02 | 35.2 | 35.3 | 34.4 | 34.8  | 30.8      | 9.7        |
| 2015-01-05 | 34.9 | 35.2 | 34.0 | 34.3  | 30.4      | 16.2       |
| 2015-01-06 | 34.4 | 35.2 | 34.0 | 34.8  | 30.8      | 17.7       |
| 2015-01-07 | 35.2 | 35.9 | 35.0 | 35.8  | 31.7      | 19.3       |
Single Perceptron
Consider a single perceptron with the following properties:
- two inputs, High and Low, and one output, Volume,
- activation function is the rectified linear (ReLU) function, f(x)=max(0,x),
- the weights are initially 0.1 and 0.2.
Calculate, either by hand or using Python:
- the prediction for 2015-01-02
- the squared error
- the updates to the weights if the learning rate = 0.0001
- the prediction for 2015-01-02 using the updated weights
- repeat for 2015-01-05
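The steps above can be sketched in Python. This is one possible reading of the exercise, assuming no bias term, the weight 0.1 on High and 0.2 on Low, and plain gradient descent on the squared error E = (ŷ − y)²:

```python
def relu(x):
    return max(0.0, x)

def predict(weights, inputs):
    # Weighted sum of the inputs passed through the ReLU activation
    z = sum(w * x for w, x in zip(weights, inputs))
    return relu(z)

def sgd_step(weights, inputs, target, lr):
    y_hat = predict(weights, inputs)
    # dE/dw_i = 2 * (y_hat - y) * relu'(z) * x_i, and relu'(z) = 1 when z > 0
    grad_out = 2.0 * (y_hat - target) if y_hat > 0 else 0.0
    return [w - lr * grad_out * x for w, x in zip(weights, inputs)]

# 2015-01-02 row: High = 35.3, Low = 34.4, Volume = 9.7
weights = [0.1, 0.2]
inputs, target = [35.3, 34.4], 9.7

pred = predict(weights, inputs)        # 10.41
error = (pred - target) ** 2           # ~0.5041
weights = sgd_step(weights, inputs, target, lr=0.0001)
new_pred = predict(weights, inputs)    # ~10.065, slightly closer to 9.7
```

Repeating for 2015-01-05 is just another call to `sgd_step` with `[35.2, 34.0]` and target `16.2`, carrying the updated weights forward.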
Two-Layer Neural Network
Consider a two-layer fully connected network for the same prediction task,
where the first layer consists of two perceptrons whose outputs feed into
a single perceptron in the second layer. Initialize the
four weights in the first layer to (0.1, 0.01, 0.2, 0.02) and the
two weights in the second layer to (0.1, 0.2).
- Calculate the prediction for 2015-01-02
- Calculate the squared error
- Calculate the updates to the weights in the second layer if the learning rate = 0.001
- Use backpropagation to update the weights in the first layer
- Use the updated weights to recalculate the prediction for 2015-01-02
- Repeat for 2015-01-05
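One forward/backward pass through this network can be sketched as follows. This is a sketch under stated assumptions: the four first-layer weights group as (0.1, 0.01) for the first hidden perceptron and (0.2, 0.02) for the second (the exercise leaves the grouping open), there are no bias terms, and both layers use ReLU:

```python
def relu(x):
    return max(0.0, x)

def forward(W1, W2, x):
    # Hidden layer: two ReLU perceptrons over the same two inputs (High, Low)
    z = [sum(w * xi for w, xi in zip(row, x)) for row in W1]
    h = [relu(zi) for zi in z]
    # Output layer: one ReLU perceptron over the hidden activations
    y = relu(sum(w * hi for w, hi in zip(W2, h)))
    return z, h, y

def backprop_step(W1, W2, x, target, lr):
    z, h, y = forward(W1, W2, x)
    # Output error term; relu'(.) = 1 whenever the pre-activation is positive
    delta_out = 2.0 * (y - target) * (1.0 if y > 0 else 0.0)
    # Second-layer updates use the current hidden activations
    W2_new = [w - lr * delta_out * hi for w, hi in zip(W2, h)]
    # Backpropagate through the ORIGINAL second-layer weights
    deltas = [delta_out * w * (1.0 if zi > 0 else 0.0) for w, zi in zip(W2, z)]
    W1_new = [[w - lr * d * xi for w, xi in zip(row, x)]
              for row, d in zip(W1, deltas)]
    return W1_new, W2_new

W1 = [[0.1, 0.01], [0.2, 0.02]]
W2 = [0.1, 0.2]
x, target = [35.3, 34.4], 9.7          # 2015-01-02 row

_, _, pred = forward(W1, W2, x)        # 1.937
error = (pred - target) ** 2           # ~60.264
W1, W2 = backprop_step(W1, W2, x, target, lr=0.001)
_, _, new_pred = forward(W1, W2, x)    # moves toward the target of 9.7
```

With a different grouping of the first-layer weights the intermediate numbers change, but the chain-rule structure of `backprop_step` stays the same; the 2015-01-05 repetition reuses it with `[35.2, 34.0]` and target `16.2`.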