How to run TensorBoard in Jupyter Notebook

TensorBoard is a great tool for visualizing the many metrics needed to evaluate TensorFlow model training. It used to be difficult to bring up, especially in hosted Jupyter Notebook environments such as Google Colab, Kaggle Notebooks, and Coursera's notebooks. In this tutorial, I will show you how seamless it is to run and view TensorBoard right inside a hosted or local Jupyter notebook with the latest TensorFlow 2.0.

You can run this Colab Notebook while reading this post.

Start by installing TF 2.0 and loading the TensorBoard notebook extension:

!pip install -q tf-nightly-2.0-preview
# Load the TensorBoard notebook extension
%load_ext tensorboard

Alternatively, to run a local notebook, you can create a conda virtual environment and install TensorFlow 2.0.

conda create -n tf2 python=3.6
conda activate tf2
pip install tf-nightly-gpu-2.0-preview
conda install jupyter

Then you can start TensorBoard within the notebook using magics, even before training begins, so you can monitor progress as it happens:

import tensorflow as tf
import datetime, os

logs_base_dir = "./logs"
os.makedirs(logs_base_dir, exist_ok=True)
%tensorboard --logdir {logs_base_dir}

At this point you will see an empty TensorBoard view with the message "No dashboards are active for the current data set". This is because the log directory is still empty.
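If you want to confirm why the dashboard is empty, a quick check of the log directory shows there are no event files yet. This is a minimal sketch using only the standard library:

```python
import os

logs_base_dir = "./logs"
os.makedirs(logs_base_dir, exist_ok=True)

# TensorBoard shows "No dashboards are active" when it finds no event files.
# Until a training run writes summaries here, the directory stays empty.
event_files = [f for f in os.listdir(logs_base_dir) if "tfevents" in f]
print(len(event_files))
```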

Let's create, train, and log some data with a very simple Keras model.

def create_model():
  return tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(512, activation='relu'),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10, activation='softmax')
  ])

def train_model():
  # Load and normalize MNIST (28x28 grayscale digits, 10 classes)
  (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
  x_train, x_test = x_train / 255.0, x_test / 255.0

  model = create_model()
  model.compile(optimizer='adam',
                loss='sparse_categorical_crossentropy',
                metrics=['accuracy'])

  logdir = os.path.join(logs_base_dir, datetime.datetime.now().strftime("%Y%m%d-%H%M%S"))
  tensorboard_callback = tf.keras.callbacks.TensorBoard(logdir, histogram_freq=1)

  model.fit(x=x_train, 
            y=y_train, 
            epochs=5, 
            validation_data=(x_test, y_test), 
            callbacks=[tensorboard_callback])

train_model()
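Note how train_model() writes each run to a timestamped subdirectory under logs_base_dir. This is what lets TensorBoard display multiple runs side by side. The naming scheme itself needs only the standard library (the example path in the comment is illustrative):

```python
import datetime
import os

logs_base_dir = "./logs"

# Each run gets a unique, lexicographically sortable name such as
# "20190614-103042", so newer runs sort after older ones.
run_name = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
logdir = os.path.join(logs_base_dir, run_name)
print(logdir)
```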

Now go back to the earlier TensorBoard output, refresh it with the button in the top right, and watch the view update.

Issuing the same command reuses the same TensorBoard backend. Choosing a different log directory would open a new instance of TensorBoard; ports are managed automatically.
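In practice, re-running the magic in a notebook behaves like this (the second directory name is just an example; this fragment only runs inside Jupyter):

```python
# Same logdir: reuses the TensorBoard instance already running.
%tensorboard --logdir {logs_base_dir}

# Different logdir: starts a fresh instance on a new, automatically chosen port.
%tensorboard --logdir ./other_logs
```

The tensorboard.notebook helper module (notebook.list() and notebook.display()) can also inspect and re-display running instances.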

One new feature worth mentioning is the "conceptual graph". To see it, select the "keras" tag. For this example, you'll see a collapsed Sequential node. Double-click the node to see the model's structure:

Conclusion and further reading

In this quick tutorial, we walked through how to fire up and view a fully working TensorBoard right inside a Jupyter notebook. For instructions on how to leverage other new TensorBoard features in TensorFlow 2.0, be sure to check out these resources:

TensorBoard Scalars: Logging training metrics in Keras

Hyperparameter Tuning with the HParams Dashboard

Model Understanding with the What-If Tool Dashboard
