How do layers work in deep learning?

1 view (last 30 days)
voxey on 7 Jan 2020
Answered: Sanyam on 4 Jul 2022
How do the following layers and concepts in deep learning work?
  • ReLU
  • Pooling
  • Convolution
  • Inception
  • Dropout
  • Weight: what is the purpose of a weight?
  • How can I reduce training time?

Answers (1)

Sanyam on 4 Jul 2022
Hey @voxey
To understand these concepts in depth, I would suggest you have a look at the deep learning and image processing courses provided by MathWorks.
Still, here is a brief overview of the concepts you asked about:
1) ReLU: An activation function that introduces non-linearity into the network, helping it learn non-linear decision boundaries.
2) Pooling: An operation used in CNNs to reduce the size of feature maps. It also makes the network more robust to small rotational/translational changes in the input.
3) Convolution: The core operation of CNNs. Its main purpose is to extract features from the image.
4) Inception: A module architecture used in GoogLeNet. Refer to this link.
5) Dropout: A regularization technique used to prevent the neural network from overfitting.
6) Weight: A learnable parameter. The network adjusts its weights during training to perform the task it is trained for.
7) Reducing training time: You can explore options such as transfer learning, training on a GPU, or reducing the number of epochs.
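To make (1) concrete, here is a minimal plain-Python sketch of ReLU; the function name is my own, not from any toolbox:

```python
def relu(x):
    """Element-wise ReLU: max(0, v) for each value, clamping negatives to zero."""
    return [max(0.0, v) for v in x]

# Negative inputs are zeroed; positive inputs pass through unchanged.
print(relu([-2.0, -0.5, 0.0, 1.5, 3.0]))  # → [0.0, 0.0, 0.0, 1.5, 3.0]
```

Because the function is non-linear at zero, stacking layers with ReLU between them lets the network represent non-linear decision boundaries.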
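The pooling in (2) can be sketched as a 2x2 max pool with stride 2, a common configuration (the helper name is illustrative):

```python
def max_pool_2x2(fmap):
    """2x2 max pooling with stride 2 over a 2D feature map (list of lists).

    Each output value is the max of a non-overlapping 2x2 window, so the
    feature map shrinks by half in each dimension."""
    h, w = len(fmap), len(fmap[0])
    return [
        [max(fmap[i][j], fmap[i][j + 1], fmap[i + 1][j], fmap[i + 1][j + 1])
         for j in range(0, w - 1, 2)]
        for i in range(0, h - 1, 2)
    ]

fmap = [
    [1, 3, 2, 4],
    [5, 6, 1, 0],
    [1, 2, 9, 8],
    [0, 3, 4, 7],
]
print(max_pool_2x2(fmap))  # → [[6, 4], [3, 9]]
```

Keeping only the strongest response per window is what makes the output less sensitive to small shifts of the input.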
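The convolution in (3) can be sketched as a "valid" 2D cross-correlation (what CNN layers actually compute); the vertical-edge kernel below is a hand-picked example to show feature extraction:

```python
def conv2d_valid(img, k):
    """'Valid' 2D cross-correlation of img with kernel k (both lists of lists)."""
    kh, kw = len(k), len(k[0])
    out_h = len(img) - kh + 1
    out_w = len(img[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            # Slide the kernel over the image and sum the element-wise products.
            row.append(sum(img[i + di][j + dj] * k[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

# A tiny image with a hard vertical edge between columns 1 and 2.
img = [[0, 0, 1, 1]] * 4
# A vertical-edge detector: responds only where intensity jumps left-to-right.
k = [[-1, 1], [-1, 1]]
print(conv2d_valid(img, k))  # → [[0, 2, 0], [0, 2, 0], [0, 2, 0]]
```

The output is large exactly at the edge and zero elsewhere; in a trained CNN, the kernel values are learned weights rather than hand-picked ones.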
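The dropout in (5) can be sketched as "inverted dropout", the variant most frameworks use; a minimal version in plain Python:

```python
import random

def dropout(x, p, training=True):
    """Inverted dropout: during training, zero each unit with probability p
    and scale survivors by 1/(1-p) so the expected activation is unchanged.
    At inference time the input passes through untouched."""
    if not training or p == 0.0:
        return list(x)
    keep = 1.0 - p
    return [0.0 if random.random() < p else v / keep for v in x]

random.seed(0)
out = dropout([1.0, 1.0, 1.0, 1.0], p=0.5)
# Each unit is either zeroed or scaled to 1 / (1 - 0.5) = 2.0.
print(out)
```

Randomly silencing units forces the network not to rely on any single activation, which is why it reduces overfitting.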
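For (6), "the network learns its weights during training" means gradient descent nudges them to reduce a loss. A minimal sketch with one weight and one bias fitting y = 2x (the setup is illustrative, not from the thread):

```python
def train(xs, ys, lr=0.1, epochs=100):
    """Fit y ≈ w*x + b by gradient descent on mean squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of MSE with respect to w and b.
        dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        # The "learning" step: move each parameter against its gradient.
        w -= lr * dw
        b -= lr * db
    return w, b

w, b = train([0.0, 1.0, 2.0, 3.0], [0.0, 2.0, 4.0, 6.0])
print(w, b)  # w approaches 2, b approaches 0
```

A deep network does the same thing with millions of weights, using backpropagation to compute all the gradients at once.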

Version

R2013b
