Undergrad Research Project - Computer Vision Research

Spring 2017

Xiang Xu
Kris Kitani
Project description

Parameter reduction in deep learning models has been widely studied in recent years. Most methods focus on deleting unnecessary connections or forcing weight sharing using k-NN, hashing, or other tricks. These algorithms usually require fine-tuning of the network after it has been trained, which is inefficient, so other methods try to reduce the parameter count directly during training, for example by using binary weights throughout training. A third approach is to use reinforcement learning (RL) to find an optimal network structure; such methods have been used successfully to build a model from the ground up.

For my research, I will focus on this third approach, searching for new network structures that perform well with as few parameters as possible. I plan to first try RL for deleting connections and unnecessary modules as the network is trained. Then I plan to use this approach to train the Recurrent Memory-Cache Neural Network (a project that I have been working on) from the ground up.
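To make the connection-deletion baseline concrete, here is a minimal sketch of magnitude-based pruning, the simplest form of the "delete unnecessary connections" family mentioned above. The helper `magnitude_prune` and the NumPy weight matrix are illustrative assumptions, not a specific published method; in practice the pruned network would then be fine-tuned.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights.

    Hypothetical helper for illustration: keeps the largest
    (1 - sparsity) fraction of entries and returns the pruned
    matrix along with the boolean keep-mask.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)          # number of weights to drop
    if k == 0:
        return weights.copy(), np.ones(weights.shape, dtype=bool)
    # k-th smallest magnitude serves as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask, mask

# Example: prune half the connections of a toy 4x4 weight matrix
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
pruned, mask = magnitude_prune(w, 0.5)
```

After pruning, the mask can be held fixed while the surviving weights are fine-tuned, which is exactly the retraining overhead that during-training approaches try to avoid.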
