
Effects of Enabling or Disabling Custom Deep Learning Processor Layer Modules

Analyze how the deep learning processor layer modules affect custom bitstream generation, and identify when to enable or disable modules to reduce resource utilization and speed up custom bitstream generation.

This table lists the deep learning processor configuration modules and when you should enable or disable these modules to optimize custom bitstream generation.

| Deep Learning Processor Parameter | Deep Learning Processor Module | Reason to Enable Module | Reason to Disable Module |
| --- | --- | --- | --- |
| ModuleGeneration | conv | The network contains convolution or pooling layers. | Reduce FPGA resource utilization when the network does not contain any convolution or pooling layers. |
| LRNBlockGeneration | conv | The network uses one or more cross-channel normalization layers. | The network does not use cross-channel normalization layers. |
| SegmentationBlockGeneration | conv | The network contains one or more max unpooling layers. | Reduce bitstream resource utilization when the network does not contain any max unpooling layers. |
| ModuleGeneration | fc | The network has one or more fully connected layers. | The network has no fully connected layers. |
| SoftmaxBlockGeneration | fc | Implement the softmax layer in hardware. | Implement the softmax layer in software. |
| ModuleGeneration | custom | The network has custom layers or any layers supported by the custom processor module. | The network does not contain any custom layers or layers supported by the custom processor module. |
| Addition | custom | The network has layers that perform addition operations. | The network has no layers that perform addition operations. |
| MishLayer | custom | The network uses a mish activation layer. | The network does not use a mish activation layer. |
| Multiplication | custom | The network has layers that perform multiplication operations. | The network has no layers that perform multiplication operations. |
| Resize2D | custom | The network has a resize layer. | The network does not have a resize layer. |
| Sigmoid | custom | The network has a sigmoid layer. | The network does not have sigmoid layers. |
| SwishLayer | custom | The network has a swish activation layer. | The network does not have a swish activation layer. |
| TanhLayer | custom | The network has a hyperbolic tangent activation layer. | The network does not have a hyperbolic tangent activation layer. |
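The table above maps to properties of the `dlhdl.ProcessorConfig` object, which you can set with its `setModuleProperty` method before generating a custom bitstream. This is a minimal sketch; the particular on/off choices are illustrative and should match the layers in your own network.

```matlab
% Create a default processor configuration object.
hPC = dlhdl.ProcessorConfig;

% Example network has no fully connected layers: disable the fc module
% to reduce FPGA resource utilization.
hPC.setModuleProperty('fc', 'ModuleGeneration', 'off');

% The network uses no cross-channel normalization layers, so turn off
% the LRN block inside the conv module.
hPC.setModuleProperty('conv', 'LRNBlockGeneration', 'off');

% The network has sigmoid layers: keep the custom module enabled and
% turn on only the blocks the network needs.
hPC.setModuleProperty('custom', 'ModuleGeneration', 'on');
hPC.setModuleProperty('custom', 'Sigmoid', 'on');
hPC.setModuleProperty('custom', 'Addition', 'off');

% Display the resulting configuration before building the bitstream.
hPC
```

After verifying the configuration, pass `hPC` to `dlhdl.buildProcessor` to generate the custom bitstream.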
