AI-and-Analytics/Getting-Started-Samples/IntelTensorFlow_GettingStarted/README.md
This sample code shows how to get started with TensorFlow*.

| Optimized for | Description
|:--- |:---
| OS | Ubuntu* 22.04 and newer
| Hardware | Intel® Xeon® Scalable processor family
| Software | TensorFlow
The sample includes one Python file: TensorFlow_HelloWorld.py.

```
y_batch = y_data[step*N:(step+1)*N, :, :, :]
s.run(train, feed_dict={x: x_batch, y: y_batch})
```
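For context, these two lines slice one batch of data per step and feed it to the session. A minimal, self-contained sketch of the kind of loop they come from is shown below; the tensor shapes, batch size `N`, and the single convolution layer are illustrative assumptions, not the exact contents of TensorFlow_HelloWorld.py.

```
import numpy as np
import tensorflow as tf

# Illustrative sketch only: shapes, batch size, and the layer are assumptions;
# see TensorFlow_HelloWorld.py for the actual network.
tf.compat.v1.disable_eager_execution()

N = 4                                                    # assumed batch size
x_data = np.random.rand(16, 8, 8, 1).astype(np.float32)  # assumed input data
y_data = np.random.rand(16, 8, 8, 1).astype(np.float32)  # assumed target data

x = tf.compat.v1.placeholder(tf.float32, [N, 8, 8, 1])
y = tf.compat.v1.placeholder(tf.float32, [N, 8, 8, 1])

conv = tf.compat.v1.layers.conv2d(x, filters=1, kernel_size=3, padding="same")
loss = tf.reduce_mean(tf.square(conv - y))
train = tf.compat.v1.train.GradientDescentOptimizer(0.1).minimize(loss)

with tf.compat.v1.Session() as s:
    s.run(tf.compat.v1.global_variables_initializer())
    for step in range(len(x_data) // N):
        # Slice one batch per step and run a single optimization step on it.
        x_batch = x_data[step*N:(step+1)*N, :, :, :]
        y_batch = y_data[step*N:(step+1)*N, :, :, :]
        s.run(train, feed_dict={x: x_batch, y: y_batch})
```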
To display the hardware information and the deep learning primitives trace during execution, set the environment variable `ONEDNN_VERBOSE=1` (for example, with `export ONEDNN_VERBOSE=1` in your shell).

>**Note**: For convenience, the code line `os.environ["ONEDNN_VERBOSE"] = "1"` has been added in the body of the script as an alternative way of setting this variable.

Runtime settings for `ONEDNN_VERBOSE`, `KMP_AFFINITY`, and `Inter/Intra-op` Threads are set within the script. You can read more about these settings in this dedicated document: *[Maximize TensorFlow* Performance on CPU: Considerations and Recommendations for Inference Workloads](https://software.intel.com/en-us/articles/maximize-tensorflow-performance-on-cpu-considerations-and-recommendations-for-inference)*.
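As a rough illustration of how such runtime controls are typically applied in Python, a sketch is shown below; it is not the exact code used in this sample, and the thread counts and affinity string are placeholder values you should tune for your own machine.

```
import os

# oneDNN primitive trace; equivalent to `export ONEDNN_VERBOSE=1` in the shell.
os.environ["ONEDNN_VERBOSE"] = "1"
# Example OpenMP affinity setting (placeholder value).
os.environ["KMP_AFFINITY"] = "granularity=fine,verbose,compact,1,0"

import tensorflow as tf

# Inter/intra-op thread pools; placeholder values, set them to match your core count.
tf.config.threading.set_intra_op_parallelism_threads(4)
tf.config.threading.set_inter_op_parallelism_threads(1)
```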
You will need to download and install the following toolkits, tools, and components.

**1. Get Intel® AI Tools**

Required AI Tools: Intel® Extension for TensorFlow* - CPU
<br>If you have not already done so, select and install these tools via the [AI Tools Selector](https://www.intel.com/content/www/us/en/developer/tools/oneapi/ai-tools-selector.html). AI and Analytics samples are validated on the AI Tools Offline Installer. It is recommended to select the Offline Installer option in the AI Tools Selector.<br>
Please see the [supported versions](https://www.intel.com/content/www/us/en/developer/tools/oneapi/ai-tools-selector.html).
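After installation, a quick sanity check such as the following confirms that the environment can import the installed packages. This is an optional sketch; the `intel_extension_for_tensorflow` import is only relevant if you installed the extension, and the package name is taken from the Intel® Extension for TensorFlow* project.

```
import tensorflow as tf

print("TensorFlow version:", tf.__version__)

try:
    # Package name as published by the Intel® Extension for TensorFlow* project.
    import intel_extension_for_tensorflow as itex
    print("Intel Extension for TensorFlow version:", itex.__version__)
except ImportError:
    print("Intel Extension for TensorFlow is not installed in this environment.")
```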
>**Note**: If the Docker option is chosen in the AI Tools Selector, refer to [Working with Preset Containers](https://github.com/intel/ai-containers/tree/main/preset) to learn how to run the Docker containers and samples.
**2. (Offline Installer) Activate the AI Tools bundle base environment**

If the default path is used during the installation of AI Tools:

```
cd oneAPI-samples/AI-and-Analytics/Getting-Started-Samples/IntelTensorFlow_GettingStarted
```
## Run the Sample

>**Note**: Before running the sample, make sure Environment Setup is completed.

Go to the section which corresponds to the installation method chosen in the [AI Tools Selector](https://www.intel.com/content/www/us/en/developer/tools/oneapi/ai-tools-selector.html) to see the relevant instructions:

### AI Tools Offline Installer (Validated)/Conda/PIP

```
python TensorFlow_HelloWorld.py
```
### Docker

AI Tools Docker images already have the Get Started samples pre-installed. Refer to [Working with Preset Containers](https://github.com/intel/ai-containers/tree/main/preset) to learn how to run the Docker containers and samples.
## Example Output
1. With the initial run, you should see results similar to the following:

3. Run the sample again. You should see verbose results similar to the following:

```
2024-03-12 16:01:59.784340: I tensorflow/core/grappler/optimizers/custom_graph_optimizer_registry.cc:117] Plugin optimizer for device_type CPU is enabled.
```

>**Note**: See the *[oneAPI Deep Neural Network Library Developer Guide and Reference](https://oneapi-src.github.io/oneDNN/dev_guide_verbose.html)* for more details on the verbose log.
4. Troubleshooting
If you receive an error message, troubleshoot the problem using the **Diagnostics Utility for Intel® oneAPI Toolkits**. The diagnostic utility provides configuration and system checks to help find missing dependencies, permissions errors, and other issues. See the *[Diagnostics Utility for Intel® oneAPI Toolkits User Guide](https://www.intel.com/content/www/us/en/develop/documentation/diagnostic-utility-user-guide/top.html)* for more information on using the utility.

You can also ask for support at https://github.com/intel/intel-extension-for-tensorflow.
## Related Samples
* [Intel Extension For TensorFlow Getting Started Sample](https://github.com/oneapi-src/oneAPI-samples/blob/development/AI-and-Analytics/Getting-Started-Samples/Intel_Extension_For_TensorFlow_GettingStarted/README.md)