FLOPs of DAG neural network
DL
25 May 2021
Commented: David Willingham on 26 May 2021
Hi, everyone. Is there any way to measure the FLOPs or computational complexity of a DAG neural network or of individual functions? I tried to measure the execution time with the profiler and estimate the FLOPs roughly from it, but I believe DAGNetwork inference is implemented in C++ or otherwise accelerated, so that FLOP estimate is not trustworthy. Any suggestions?
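Roughly, what I tried looks like the sketch below (assuming a trained DAGNetwork called net and one sample input X; the names are just placeholders):

% Warm-up call so library initialization does not distort the timing
predict(net, X);
% Average the inference time over repeated calls
t = timeit(@() predict(net, X));    % seconds per prediction
fprintf('%.4f s per prediction\n', t);
% I then estimated FLOPs from t and the nominal FLOP/s of the hardware,
% which seems unreliable because the heavy layers run in optimized code.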
0 comments
Accepted Answer
David Willingham
26 May 2021
Hi Dianxin,
FLOPs is a performance measure that's not typically used for deep learning. Performance can be measured in many ways; here are some, with a rough measurement sketch after the list:
Throughput - E.g. Predictions per sec
Training Time - E.g. Time to reach x% of validation accuracy
Memory - E.g. How many MB the network occupies, based on its weights
Power - E.g. For embedded devices, how much energy is required to make a prediction
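As a rough illustration of the throughput and memory items above, here is a minimal sketch (the input size, batch size, and variable names are assumptions for a trained image network net):

% Throughput: time a batch of predictions and report predictions per second
XBatch = rand(224, 224, 3, 128, 'single');    % assumed input size and batch size
predict(net, XBatch);                         % warm-up run
tic;
predict(net, XBatch, 'MiniBatchSize', 128);
elapsed = toc;
fprintf('Throughput: %.1f predictions/sec\n', 128/elapsed);

% Memory: approximate size of the network object (weights) in MB
info = whos('net');
fprintf('Network size: %.1f MB\n', info.bytes/1e6);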
Regards,
4 comments
David Willingham
26 May 2021
Hi, by default optimizations are not enabled for inference. You can download this support package to enable them for the predict function.
If you want more optimizations, these are built into our Coder products, which automatically generate native code for the target environment (see the sketch after this list):
MATLAB Coder (C & C++) - including Intel MKL-DNN for Intel processors and ARM Compute Library for ARM Cortex processors
GPU Coder (CUDA) - NVIDIA® CUDA libraries, including TensorRT™, cuDNN, cuFFT, cuSolver, and cuBLAS
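For reference, a sketch of what the MATLAB Coder route can look like (the entry-point function, MAT-file name, and input size below are hypothetical; the exact configuration depends on your target):

% Entry-point function (hypothetical), saved as myPredict.m
function out = myPredict(in)
    persistent mynet
    if isempty(mynet)
        mynet = coder.loadDeepLearningNetwork('trainedNet.mat');  % hypothetical MAT-file
    end
    out = predict(mynet, in);
end

% Generate MEX code that calls the Intel MKL-DNN library
cfg = coder.config('mex');
cfg.TargetLang = 'C++';
cfg.DeepLearningConfig = coder.DeepLearningConfig('mkldnn');
codegen -config cfg myPredict -args {ones(224,224,3,'single')}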