Borrowing Weights from a Pretrained Network
Domenic Curro edited this page Feb 23, 2016
Weights are the learnable parameters of a network, tuned during the back-propagation phase. Caffe generates the weights when the network is constructed.
To borrow the weights of an already trained model, we need to do two things:
- Rename our layer to match the name of the original model's layer. Weights are assigned by layer name, so by reusing the original network's layer name we inherit its weights. For example, if the original model had a layer named `ip1`, then we should also name our layer `ip1`:
      layer {
        name: "ip1"
        type: "InnerProduct"
        bottom: "pool2"
        top: "ip1"
        param {
          lr_mult: 1
        }
        param {
          lr_mult: 2
        }
        inner_product_param {
          num_output: 500
          weight_filler {
            type: "xavier"
          }
          bias_filler {
            type: "constant"
          }
        }
      }
- Train our new hybrid model, declaring the location of the weights:

      caffe train --solver ourSolver.prototxt --weights theirModel.caffemodel
The other layers of our network, whose names do not match anything in the pretrained model, will be initialized just like any brand-new layer, according to their filler definitions (random values for a "xavier" filler, zeros for a "constant" filler).
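The name-matching rule above can be sketched in plain Python. This is an illustrative model of Caffe's behaviour, not Caffe code: the layer names and shapes below are made up, and the fresh-init draw stands in for whatever filler the layer declares.

```python
import numpy as np

def borrow_weights(new_layers, pretrained, rng=None):
    """Sketch of Caffe's rule: a layer in the new network receives the
    pretrained blob only if a layer with the same name (and a compatible
    shape) exists in the .caffemodel; every other layer keeps its fresh,
    filler-generated initialization."""
    rng = rng or np.random.default_rng(0)
    params = {}
    for name, shape in new_layers.items():
        if name in pretrained and pretrained[name].shape == tuple(shape):
            params[name] = pretrained[name]            # borrowed by name
        else:
            params[name] = rng.normal(0.0, 0.01, shape)  # fresh init
    return params

# Hypothetical example: 'ip1' matches the pretrained model, 'ip2_new' does not.
pretrained = {"ip1": np.ones((500, 800))}
params = borrow_weights({"ip1": (500, 800), "ip2_new": (10, 500)}, pretrained)
```

Here `params["ip1"]` is taken verbatim from the pretrained model, while `params["ip2_new"]` is freshly initialized, mirroring what `caffe train --weights` does for matching and non-matching layer names.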