How To Load And Run An Intel-TensorFlow Model On ML.NET
Solution 1:
oneDNN support in ML.NET depends on how ML.NET integrates TensorFlow: if oneDNN is enabled in the TensorFlow C++ API that ML.NET binds to, ML.NET could pick up the oneDNN optimizations.
You can try installing stock TensorFlow 2.5 in your ML.NET environment with Intel oneDNN enabled. The stock TensorFlow wheel can be downloaded from this link: https://pypi.org/project/tensorflow/#files
To install the wheel file: pip install __.whl
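For example, assuming a hypothetical 64-bit Windows wheel filename for Python 3.8 (substitute the actual filename you downloaded), the command would look like: pip install tensorflow-2.5.0-cp38-cp38-win_amd64.whl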
To enable oneDNN optimizations, please set the environment variable TF_ENABLE_ONEDNN_OPTS:
set TF_ENABLE_ONEDNN_OPTS=1
To display the oneDNN verbose log, set the environment variable DNNL_VERBOSE: set DNNL_VERBOSE=1
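As a quick sanity check, the sketch below (assuming stock TensorFlow 2.5+ is installed in a Python environment) sets both environment variables programmatically before importing TensorFlow and runs a small matrix multiplication; if oneDNN is active, verbose lines (prefixed with onednn_verbose or dnnl_verbose, depending on the oneDNN version) should appear in the console output. The variable names and shapes are illustrative only.

    # Minimal sketch: the environment variables must be set before
    # TensorFlow is imported so the runtime picks them up.
    import os

    os.environ["TF_ENABLE_ONEDNN_OPTS"] = "1"   # enable oneDNN optimizations
    os.environ["DNNL_VERBOSE"] = "1"            # print oneDNN verbose log to stderr

    import tensorflow as tf

    # Run a small matmul; oneDNN verbose lines should be printed if oneDNN
    # primitives are used for this operation.
    a = tf.random.uniform((256, 256))
    b = tf.random.uniform((256, 256))
    print(tf.linalg.matmul(a, b).shape)
    print("TensorFlow version:", tf.__version__)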
For more information on oneDNN verbose mode, please refer to: https://oneapi-src.github.io/oneDNN/dev_guide_verbose.html
For more information on Intel Optimization for TensorFlow, please refer to: https://software.intel.com/content/www/us/en/develop/articles/intel-optimization-for-tensorflow-installation-guide.html