CS 5043: HW3

Assignment notes:

Introduction

In this assignment, we are going to implement a cross-validation procedure that examines FVAF performance as a function of both the number of training folds and a regularization parameter. We will focus on predicting column 1 of the torque matrix, which corresponds to elbow torque in N-m.

Part 1: Model Testing Function

Implement a Python function that:
  1. Takes as input: training, validation, and test sets, and a training hyper-parameter.

    Each data set is a pair of Numpy matrices.

    For our experiment, the training hyper-parameter is the parameter for L2 regularization.

  2. Creates a Keras network with at least one hidden layer that predicts the specified outputs given the provided inputs. This network should not be huge.

  3. This network must use the L2 regularization parameter. This can be done by setting the kernel_regularizer and bias_regularizer parameters when you create your Dense layers, e.g.:
    kernel_regularizer=keras.regularizers.l2(lambda_reg)
    (Note that lambda is a reserved word in Python, so store the regularization parameter under a different name, such as lambda_reg.)
    

  4. Trains the network. Use early stopping based on the validation performance, with a patience of 40 epochs.

    Here is an example of setting up and using the callback:

    early_stopping = keras.callbacks.EarlyStopping(monitor='val_loss', 
           min_delta=0, patience=params['patience'], 
           verbose=0, mode='auto', 
           baseline=None, 
           restore_best_weights=True)
           
    model.fit(x=ins_training, y=outs_training,
              validation_data = (ins_val, outs_val),
              epochs=epochs, callbacks=[early_stopping])
    
    

  5. Computes and returns the FVAF for each of the training, validation, and test sets.

    Here is an example function that will do this (assuming the model was compiled with loss='mse'):

    def compute_fvaf(model, ins, outs):
        # Mean squared error over the whole set, evaluated in a single batch
        y_mse = model.evaluate(x=ins, y=outs, batch_size=outs.shape[0], verbose=0)
        # FVAF = 1 - MSE / variance of the true outputs
        fvaf = 1 - y_mse / np.var(outs)
        return fvaf
    
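The same quantity can be sanity-checked without Keras by computing FVAF directly from a set of predictions with NumPy (a minimal sketch; fvaf_from_predictions is a hypothetical helper, and preds stands in for the output of model.predict):

```python
import numpy as np

def fvaf_from_predictions(preds, outs):
    """Fraction of variance accounted for: 1 - MSE / Var(outs)."""
    mse = np.mean((preds - outs) ** 2)
    return 1.0 - mse / np.var(outs)

# A perfect predictor gives FVAF = 1; always predicting the mean gives FVAF = 0.
outs = np.array([[1.0], [2.0], [3.0], [4.0]])
print(fvaf_from_predictions(outs, outs))                             # 1.0
print(fvaf_from_predictions(np.full_like(outs, outs.mean()), outs))  # 0.0
```

This also makes the scale of FVAF clear: it is bounded above by 1, and a model can score below 0 if it does worse than predicting the mean.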

Part 2: Train and Evaluate Models for all Rotations

Create a function that tests all possible rotations of the data set for a given hyper-parameter choice:
  1. Takes as input the MI and torque lists that were loaded in our earlier examples (and that you used for HW 2). Each of these lists has 20 elements, one for each fold. Each element is a Numpy array.

    This function also takes as input the regularization parameter and the number of folds to use for training (Ntraining).

  2. Loops over all possible rotations of our data (20 in total), assembling the training/validation/test sets for each rotation and evaluating them with your Part 1 function.
  3. Computes and returns the mean FVAF across the rotations for the training/validation/testing sets. I also find it useful to return the raw per-rotation FVAFs.
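One way to organize the rotations is to first compute which folds belong to each set for a given rotation index. A sketch, assuming 20 folds with one validation fold and one test fold per rotation (rotation_indices is a hypothetical helper; adjust the convention to match your own fold assignment):

```python
def rotation_indices(rotation, Ntraining, Nfolds=20):
    """Return (train, val, test) fold indices for one rotation.

    Folds are taken in circular order starting at `rotation`: the first
    Ntraining folds are for training, the next fold is for validation,
    and the last fold in the cycle is the test fold.
    """
    order = [(rotation + i) % Nfolds for i in range(Nfolds)]
    train = order[:Ntraining]
    val = [order[Ntraining]]
    test = [order[-1]]
    return train, val, test

print(rotation_indices(rotation=3, Ntraining=2))  # ([3, 4], [5], [2])
```

The actual data sets can then be built by concatenating the corresponding elements of the MI and torque lists, e.g. with np.concatenate over the training indices.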

Part 3: Cross-Validation

Implement a function that tries all combinations of regularization parameters and numbers of training folds:
  1. Takes as input the MI and torque lists, a list of regularization values, and a list of numbers of folds to be used for training.

  2. We will be logging the mean FVAF for the training/validation/testing data sets for each combination of regularization/ntraining folds. I preallocate Numpy matrices to store these metrics in.

  3. Loops over all combinations of regularization values and numbers of training folds, calling your Part 2 function for each combination.

  4. Returns the performance matrices.
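The bookkeeping in steps 2-4 can be sketched as follows. Here evaluate_rotations is a hypothetical stand-in for your Part 2 function, returning the mean training/validation/test FVAF for one hyper-parameter combination:

```python
import numpy as np

def cross_validate(lambdas, ntraining_list, evaluate_rotations):
    """Fill (len(lambdas), len(ntraining_list)) matrices of mean FVAF."""
    shape = (len(lambdas), len(ntraining_list))
    fvaf_train = np.zeros(shape)
    fvaf_val = np.zeros(shape)
    fvaf_test = np.zeros(shape)
    for i, lam in enumerate(lambdas):
        for j, ntrain in enumerate(ntraining_list):
            fvaf_train[i, j], fvaf_val[i, j], fvaf_test[i, j] = \
                evaluate_rotations(lam, ntrain)
    return fvaf_train, fvaf_val, fvaf_test

# Stub evaluator: pretend FVAF grows with the training set size
# and shrinks with the regularization strength.
stub = lambda lam, n: (n / 20 - lam,) * 3
tr, va, te = cross_validate([0, 0.01], [1, 5], stub)
print(tr.shape)  # (2, 2)
```

Row i, column j of each matrix then holds the mean FVAF for the i-th regularization value and the j-th training set size, which is the layout the Part 4 plots need.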

Part 4: Report Performance

  1. Possible regularization values: [0, .0001, .001, .01] (changed: 2019-02-18)

  2. Possible training set sizes: [1, 2, 5, 10, 18]

  3. Cross-validation rotations: only do every 4th rotation (so, a total of 5 models per hyperparameter set). (changed: 2019-02-18)

  4. Create a plot of training set performance as a function of the number of folds used for training.

  5. Create a plot of test set performance as a function of the number of folds used for training.

  6. Create a plot of the highest performing regularization parameter (relative to the validation data set) as a function of the number of training folds.

    np.argmax() can be useful here.

  7. Create a plot of the test set performance for the highest performing regularization parameter (relative to the validation data set) as a function of the number of training folds.
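For plots 6 and 7, the best regularization parameter for each training set size can be selected from the validation matrix with np.argmax. A sketch with made-up FVAF values (rows indexed by regularization value, columns by number of training folds, matching the matrix layout from Part 3):

```python
import numpy as np

lambdas = np.array([0, 0.001, 0.01])

# Hypothetical mean-FVAF matrices: rows = lambdas, cols = training set sizes
fvaf_val = np.array([[0.20, 0.50, 0.60],
                     [0.30, 0.55, 0.62],
                     [0.25, 0.50, 0.61]])
fvaf_test = fvaf_val - 0.05  # stand-in test performance

# Best regularization index per column, chosen on the validation set
best = np.argmax(fvaf_val, axis=0)
best_lambda = lambdas[best]                                 # y-values for plot 6
best_test = fvaf_test[best, np.arange(fvaf_val.shape[1])]   # y-values for plot 7
print(best_lambda, best_test)
```

Selecting on the validation matrix and then reading off the test matrix (rather than taking argmax over the test matrix directly) keeps the test set out of the model-selection step.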


Submission


Hints


andrewhfagg -- gmail.com

Last modified: Mon Feb 18 23:23:33 2019