CenterNet

Posted by SNC_official


Network Architecture : Main

Type             Value
Output           3,006,793
CostParameter    15,379,272
CostAdd          1,126,720
CostMultiply     945,696
CostMultiplyAdd  1,597,149,184
CostDivision     3,140
CostExp          29,792
CostIf           1,306,768

Network Architecture : Runtime

Type             Value
Output           2,900,160
CostParameter    15,379,272
CostAdd          1,095,360
CostMultiply     893,952
CostMultiplyAdd  1,597,149,184
CostDivision     3,136
CostExp          3,136
CostIf           1,299,712
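
These Cost figures count elementary operations in the computation graph: CostMultiplyAdd is the number of multiply-accumulate (MAC) operations and CostParameter the number of learnable parameters. As a rough illustration of how such a MAC count arises, the sketch below computes the MACs of a single convolution layer; the layer shape used is hypothetical, not taken from this project.

    # Illustrative only: MAC count of one convolution layer.
    # The layer shape below is hypothetical, not taken from this project.
    def conv_macs(out_h, out_w, out_c, in_c, k_h, k_w):
        # One multiply-add per output position, output channel,
        # input channel, and kernel tap.
        return out_h * out_w * out_c * in_c * k_h * k_w

    # e.g. a 3x3 convolution, 64 -> 64 channels, on a 128x128 feature map:
    print(conv_macs(128, 128, 64, 64, 3, 3))  # 603,979,776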

Training Procedure : Optimizer

Optimize network “Main” using the “Training” dataset; a code sketch of these settings follows the list.

  • Batch size : 256
  • Solver : Adam
    • Learning rate(Alpha) : 0.001
      • decayed every 1 iteration using an exponential rate of 0.5.
    • Beta1 : 0.9
    • Beta2 : 0.999
    • Epsilon : 1e-08
  • Weight decay : 0.001
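
Below is a minimal sketch of these solver settings written against Neural Network Libraries (nnabla, referenced below). The toy parameter and quadratic loss are placeholders standing in for the “Main” network, which is defined in the project file rather than here.

    # Minimal sketch: Adam with the settings above, in nnabla.
    # The parameter "w" and its quadratic loss are placeholders for "Main".
    import numpy as np
    import nnabla as nn
    import nnabla.functions as F
    import nnabla.solvers as S
    from nnabla.parameter import get_parameter_or_create

    w = get_parameter_or_create("w", (10,), np.random.randn(10).astype(np.float32))
    loss = F.sum(F.pow_scalar(w, 2.0))  # stand-in training loss

    alpha0 = 1e-3
    solver = S.Adam(alpha=alpha0, beta1=0.9, beta2=0.999, eps=1e-8)
    solver.set_parameters(nn.get_parameters())

    for it in range(10):
        # Learning rate decayed every 1 iteration with exponential rate 0.5.
        solver.set_learning_rate(alpha0 * (0.5 ** it))
        loss.forward()
        solver.zero_grad()
        loss.backward()
        solver.weight_decay(1e-3)  # weight decay 0.001, applied to gradients
        solver.update()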

Experimental Result : Learning Curve

References

  • Sony Corporation. Neural Network Console: Not just train and evaluate; you can design neural networks with a fast and intuitive GUI. https://dl.sony.com/
  • Sony Corporation. Neural Network Libraries: An open-source software library that makes research, development, and implementation of neural networks more efficient. https://nnabla.org/
  • Convolution – Chen et al., DeepLab: Semantic Image Segmentation with Deep Convolutional Nets, Atrous Convolution, and Fully Connected CRFs. https://arxiv.org/abs/1606.00915; Yu et al., Multi-Scale Context Aggregation by Dilated Convolutions. https://arxiv.org/abs/1511.07122
  • BatchNormalization – Ioffe and Szegedy, Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. https://arxiv.org/abs/1502.03167
  • ReLU – Nair and Hinton, Rectified Linear Units Improve Restricted Boltzmann Machines. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.165.6419&rep=rep1&type=pdf
  • Adam – Kingma and Ba, Adam: A Method for Stochastic Optimization. https://arxiv.org/abs/1412.6980