Using Keras (now part of TensorFlow) is really easy. The complexity comes when you deal with large amounts of data and have to figure out the topology of your neural network, and with the topology comes hyperparameter tuning and all the rest. It’s a bit like painting: it’s easy to hold a brush, but it takes years to paint something worth looking at.

Gist

    import tensorflow as tf
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Dense


    import numpy as np
    # a single training example with five input features, and its target value
    x_input = np.array([[1,2,3,4,5]])
    y_input = np.array([[10]])


    model = Sequential()
    # hidden layer: 32 tanh units, taking the 5 input features
    model.add(Dense(units=32, activation="tanh", input_dim=x_input.shape[1], kernel_initializer='random_normal'))
    # output layer: a single linear unit producing the regression value
    model.add(Dense(units=1, kernel_initializer='random_normal'))


    # note: 'accuracy' is not a meaningful metric for a regression loss like MSE, but it is what produces the 'acc' column in the log below
    model.compile(loss='mse', optimizer='sgd', metrics=['accuracy'])


    model.summary()

_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
dense_10 (Dense)             (None, 32)                192
_________________________________________________________________
dense_11 (Dense)             (None, 1)                 33
=================================================================
Total params: 225
Trainable params: 225
Non-trainable params: 0
_________________________________________________________________
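
The parameter counts follow directly from the layer sizes: the hidden layer has 5 inputs × 32 units plus 32 biases = 192 parameters, and the output layer has 32 weights plus 1 bias = 33, for a total of 225.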



    # fit on the single example; with one sample, every epoch is a single step
    history = model.fit(x_input, y_input, epochs=10, batch_size=32)

Epoch 1/10
1/1 [==============================] - 0s 163ms/step - loss: 36.1381 - acc: 0.0000e+00
Epoch 2/10
1/1 [==============================] - 0s 1ms/step - loss: 0.0645 - acc: 1.0000
Epoch 3/10
1/1 [==============================] - 0s 2ms/step - loss: 0.0075 - acc: 1.0000
Epoch 4/10
1/1 [==============================] - 0s 961us/step - loss: 8.8381e-04 - acc: 1.0000
Epoch 5/10
1/1 [==============================] - 0s 1ms/step - loss: 1.0349e-04 - acc: 1.0000
Epoch 6/10
1/1 [==============================] - 0s 1ms/step - loss: 1.2137e-05 - acc: 1.0000
Epoch 7/10
1/1 [==============================] - 0s 882us/step - loss: 1.4188e-06 - acc: 1.0000
Epoch 8/10
1/1 [==============================] - 0s 2ms/step - loss: 1.6660e-07 - acc: 1.0000
Epoch 9/10
1/1 [==============================] - 0s 1ms/step - loss: 1.8859e-08 - acc: 1.0000
Epoch 10/10
1/1 [==============================] - 0s 1ms/step - loss: 2.2737e-09 - acc: 1.0000
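
The History object returned by fit keeps these per-epoch numbers, so there is no need to scrape them from the console output. A minimal sketch of reading them back, assuming the compile settings above (the exact name of the accuracy key varies between Keras versions):

    # history.history maps each metric name to a list with one value per epoch
    for epoch, loss in enumerate(history.history['loss'], start=1):
        print(f"epoch {epoch}: loss={loss:.6f}")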



    model.predict(x_input, batch_size=128)

array([[10.000018]], dtype=float32)

    model.predict(np.array([[1,2,5,4,5]]))

array([[7.1825438]], dtype=float32)

With only a single training example to learn from, an input the network has never seen yields a more or less arbitrary output.