Sunday, March 6, 2022

Build TensorFlow 2.7 Python wheel for JetPack 4.6 (Xavier NX)

In the previous few articles, I covered how to build TensorFlow directly on a Raspberry Pi and how to cross-build TensorFlow Lite on a host computer.


In this article, I will finally look at how to build TensorFlow on the Jetson series.

The Jetson development environment used in this article is as follows (a quick way to check these versions on the device is shown just after the list).

  • JetPack : 4.6 (TensorRT 8.0.1, CUDA 10.2, cuDNN 8.2.1)
  • Python : 3.8 (Anaconda virtual environment)
  • Numpy : 1.21.5
  • gcc : 7.5.0 (CUDA does not compile with gcc 8 or higher)
  • bazel : 4.2.1
  • Jetson : Jetson Xavier NX
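
You can quickly confirm most of these versions on the device itself. A simple check (nvcc may not be on the PATH by default, so the full CUDA path is used here):

gcc --version
python --version                      # run inside the py_38 virtual environment
/usr/local/cuda/bin/nvcc --version    # CUDA toolkit version
bazel --version                       # after bazel is installed (see below)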


To be honest, I had several failed attempts before this build succeeded. The biggest reason was memory. Because the build takes so long, it is not easy to be watching the system (CPU, memory, processes) at the exact moment an error occurs. After monitoring several builds with htop, I found that insufficient memory was the cause of most of the errors.

The Xavier NX has 8GB of memory, but that is still not enough to build TensorFlow.

Therefore, you should secure extra virtual memory using a swap file and zram, and reduce peak memory usage by adjusting the bazel build parameters.
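
Before changing anything, it is worth checking how much physical memory and swap the board currently has. These are standard Linux commands, shown here for reference:

free -m            # physical memory and current swap usage
cat /proc/swaps    # swap devices currently in use (zram devices appear here)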


Prerequisite


Creating a Python 3.8 Anaconda Virtual Environment


As you can see from the table below, TensorFlow 2.7 or higher requires Python 3.7 or higher. Therefore, prepare Python 3.8 as an Anaconda (miniconda) virtual environment.
For information on installing Anaconda on the Jetson Xavier NX and preparing the virtual environment, refer to the following article.


<TensorFlow build environment>


All of the work from this point on is done in the Anaconda Python 3.8 virtual environment.



spypiggy@spypiggy-desktop:~/src$ conda create --name py_38 python=3.8
spypiggy@spypiggy-desktop:~/src$ conda activate py_38
(py_38) spypiggy@spypiggy-desktop:~/src$ 


nvpmodel

Power modes of the Xavier NX range from 0 to 8. For more information on the Xavier NX power modes, refer to the following article.


Building TensorFlow takes a lot of time: about 10 hours on the Xavier NX and more than 24 hours on the Nano. Therefore, it is advantageous to use the CPU as much as possible. If nvpmodel is set to 8, the system's performance is maximized because all 6 CPU cores run at 20W.
However, even with the jobs=4 parameter set during the bazel build, at some point all 6 CPU cores were running at 100% and memory usage also went outside the specified range. So in my case I set nvpmodel to 7, which uses four high-performance CPU cores at 20W. I rarely use bazel, so I lack a deep understanding of it; it might also work with nvpmodel 8 if you fine-tune your bazel build settings, but to be safe I use nvpmodel 7.


spypiggy@spypiggy-desktop:~/src$ sudo nvpmodel -m 7
NVPM WARN: patching tpc_pg_mask: (0x1:0x4)
NVPM WARN: patched tpc_pg_mask: 0x4
spypiggy@spypiggy-desktop:~/src$ sudo nvpmodel -q
NV Fan Mode:quiet
NV Power Mode: MODE_20W_4CORE
7


tmux 

tmux is an open-source terminal multiplexer for Unix-like operating systems. It allows multiple terminal sessions to be accessed simultaneously in a single window. It is useful for running more than one command-line program at the same time. It can also be used to detach processes from their controlling terminals, allowing remote sessions to remain active without being visible.
<From Wikipedia>

Installing tmux is not strictly required. However, since the build keeps a remote ssh console busy for more than 10 hours, it is convenient to be able to disconnect in the middle and reconnect later, even after shutting down the host, to check on the progress. If you build directly from the ssh console, the entire build may be aborted if the console is closed or a network failure occurs, and 10 hours of work can be wasted. To prevent this unfortunate situation, tmux lets you keep the session alive even when the ssh connection is terminated.
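
For reference, the whole tmux workflow used later in this article boils down to a few commands (tmux new -s tf is one common way to create a named session; tf is just the name I use below):

tmux new -s tf       # create a session named tf
# Ctrl+b, then d     # detach; everything inside keeps running on the device
tmux ls              # list sessions after reconnecting over ssh
tmux attach -t tf    # re-attach to the session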


Install required packages to build tensorflow 

Packages that are available from conda should always be installed with conda in preference to pip.


(py_38) spypiggy@spypiggy-desktop:~/src$ conda install Pillow matplotlib pandas mock scipy portpicker \
    scikit-learn pybind11 h5py==3.1.0 six wheel enum34

Packages that cannot be installed with conda can be installed with pip instead. You do not need to use the pip3 command inside the virtual environment.


(py_38) spypiggy@spypiggy-desktop:~/src$ pip install keras_applications==1.0.8 --no-deps 
(py_38) spypiggy@spypiggy-desktop:~/src$ pip install keras_preprocessing==1.1.2 --no-deps 
(py_38) spypiggy@spypiggy-desktop:~/src$ pip install gdown


Finally, install the required system packages with apt-get.

(py_38) spypiggy@spypiggy-desktop:~/src$  sudo apt-get update
(py_38) spypiggy@spypiggy-desktop:~/src$  sudo apt-get install -y  \
        build-essential gfortran  curl git  libcurl3-dev  libfreetype6-dev \
        libhdf5-serial-dev libhdf5-dev libc-ares-dev libeigen3-dev \
        libatlas-base-dev libopenblas-dev libblas-dev \
        liblapack-dev libzmq3-dev  pkg-config  rsync  software-properties-common \
        swig  unzip  python3-h5py zip  zlib1g-dev


Install bazel 

TensorFlow is built with the Bazel build system, so we install Bazel. First install OpenJDK 11, which is needed to build Bazel 4.2.1.


(py_38) spypiggy@spypiggy-desktop:~/src$  sudo apt-get install -y  openjdk-11-jdk \
        openjdk-11-jre-headless 


Then download the Bazel source code and build it.

(py_38) spypiggy@spypiggy-desktop:~/src$ mkdir bazel
(py_38) spypiggy@spypiggy-desktop:~/src$ cd bazel
(py_38) spypiggy@spypiggy-desktop:~/src/bazel$ curl -fSsL -O https://github.com/bazelbuild/bazel/releases/download/4.2.1/bazel-4.2.1-dist.zip
(py_38) spypiggy@spypiggy-desktop:~/src/bazel$ unzip bazel-4.2.1-dist.zip
(py_38) spypiggy@spypiggy-desktop:~/src/bazel$ bash ./compile.sh
(py_38) spypiggy@spypiggy-desktop:~/src/bazel$ sudo cp output/bazel /usr/local/bin/


If the Bazel build finished without any issues, you can verify it as follows.


(py_38) spypiggy@spypiggy-desktop:~/src$ bazel --version
bazel 4.2.1- (@non-git)


Increase memory using ZRAM

The 8GB of memory on the Xavier NX is two or four times that of the Nano, but if you start the build without any preparation, the probability of an error is very high. Free up as much memory as possible with the following steps, and when the work is finished, adjust the swap memory back again.


Install zram tool

Since a well-made installation script already exists, download it from GitHub.

$ git clone https://github.com/StuartIanNaylor/zram-swap-config \
&& cd zram-swap-config
$ sudo ./install.sh


The following is an example configuration on the Xavier NX. zram is applied to 40% of the memory.

(py_38) spypiggy@spypiggy-desktop:~/src$ cat /etc/zram-swap-config.conf
MEM_FACTOR=40
DRIVE_FACTOR=300
COMP_ALG=lz4
SWAP_DEVICES=1
SWAP_PRI=75
PAGE_CLUSTER=0
SWAPPINESS=90


Install swap tool

When you build a large software package like OpenCV, you may run out of memory. Increasing the swap file size can prevent this.



git clone https://github.com/JetsonHacksNano/installSwapfile
cd installSwapfile
./installSwapfile.sh


The above script creates a 6GB swap file. You can change the swap file size by modifying the script. If you want to remove the swap setting later, open /etc/fstab, delete the swap file line, and reboot (a sketch of the commands follows).
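
For reference, removing the swap file later typically looks like this (the path /mnt/swapfile is the one created by the script, as seen in /proc/swaps below):

sudo swapoff /mnt/swapfile    # stop using the swap file
sudo rm /mnt/swapfile         # delete the file itself
sudo nano /etc/fstab          # remove the /mnt/swapfile line, then reboot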

After all of this, my Xavier NX's memory looks like this.

(py_38) spypiggy@spypiggy-desktop:~/src$ sudo cat /proc/swaps
Filename                                Type            Size    Used    Priority
/mnt/swapfile                           file            6291452 412     -1
/dev/zram0                              partition       9551992 485960  75
(py_38) spypiggy@spypiggy-desktop:~/src$ free -m
              total        used        free      shared  buff/cache   available
Mem:           7773         923        3188           2        3660        6632
Swap:         15472         474       14997

In total, 7.7 GB of physical memory and about 15 GB of swap memory (zram + swap file) were prepared.


Building the tensorflow python wheel

Now, download the TensorFlow source code and start building.

First, download the source code and run the configure script.


cd src
git clone -b v2.7.0 https://github.com/tensorflow/tensorflow.git
cd tensorflow
./configure

The configure step is important because it prepares the bazel build environment.





The Python environment values shown by configure point to the Anaconda (miniconda) virtual environment; just press Enter to accept them. The "compute capability" value corresponds to the NVIDIA GPU model (7.2 for the Xavier NX). These values are listed at https://en.wikipedia.org/wiki/CUDA.
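
If you prefer not to answer the configure prompts interactively, my understanding is that the configure script also picks these settings up from environment variables; a sketch, with values mirroring what ends up in the .tf_configure.bazelrc shown in the build log later (anything missing will simply be asked for):

export PYTHON_BIN_PATH=$(which python3)
export TF_NEED_CUDA=1
export TF_NEED_TENSORRT=1
export TF_CUDA_COMPUTE_CAPABILITIES=7.2                          # Xavier NX (Volta)
export CUDA_TOOLKIT_PATH=/usr/local/cuda-10.2
export GCC_HOST_COMPILER_PATH=/usr/bin/aarch64-linux-gnu-gcc-7
./configure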

NUMA is not supported on the ARM64 CPU, so remove that option from the .bashrc file.

<modify the .bashrc file>

The config=v1 option will be replaced by v2 in a later build step.


Next, register the CUDA library path with the dynamic linker.


sudo sh -c "echo '/usr/local/cuda/lib64' >> /etc/ld.so.conf.d/nvidia-tegra.conf"
sudo ldconfig
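
To confirm that the dynamic linker can now find the CUDA libraries, a quick check:

ldconfig -p | grep libcudart    # should list libcudart.so from /usr/local/cuda/lib64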


Now, before starting the build, create a tmux session and work inside it. This step can be omitted when working directly on the Xavier NX. However, if you are working over remote ssh and skip it, the ssh program must not exit until the build is complete. Note that tmux starts a fresh shell, which resets the Python virtual environment, so activate the virtual environment again.


(py_38) spypiggy@spypiggy-desktop:~/src/tensorflow$ tmux new tf
(base) spypiggy@spypiggy-desktop:~/src/tensorflow$ conda activate py_38
(py_38) spypiggy@spypiggy-desktop:~/src/tensorflow$


The appearance of the ssh remote console will change slightly. If you see a green area at the bottom of the screen, this is normal.

Finally, it's build time.


(py_38) spypiggy@spypiggy-desktop:~/src/tensorflow$ sudo bazel  \
  --host_jvm_args=-Xmx7g  \
  build \
  --discard_analysis_cache --notrack_incremental_state --nokeep_state_after_build \
  --config=monolithic \
  --config=noaws \
  --config=nohdfs \
  --config=nonccl \
  --config=v2 \
  --define=tflite_pip_with_flex=true \
  --define=tflite_with_xnnpack=true \
  --jobs=4 \
  --local_ram_resources=HOST_RAM*.5 \
  //tensorflow/tools/pip_package:build_pip_package


This command takes about 10 hours on the Xavier NX if it succeeds. If you are using tmux, you can detach from the session with "Ctrl+b, d". To completely delete the session, type exit inside the tmux session (do this after the build has finished).

Then you can close the ssh connection. It's a good window of time to watch the Netflix series Squid Game.
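
While the build is running detached, you can keep an eye on memory pressure from a second ssh session or tmux window, since running out of memory was the main cause of my failed builds. htop works, or simply:

watch -n 5 free -m    # refresh memory and swap usage every 5 seconds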

To reconnect later, do the following:


(py_38) spypiggy@spypiggy-desktop:~/src/tensorflow$ tmux ls
tf-0: 1 windows (created Sat Mar  5 21:23:25 2022) [84x41] (group tf)
(py_38) spypiggy@spypiggy-desktop:~/src/tensorflow$ tmux attach -t tf


Finally, after about 10 hours, the build work was finished.


INFO: Reading rc options for 'build' from /home/spypiggy/src/tensorflow/.bazelrc:
  'build' options: --define framework_shared_object=true --java_toolchain=@tf_toolchains//toolchains/java:tf_java_toolchain --host_java_toolchain=@tf_toolchains//toolchains/java:tf_java_toolchain --define=use_fast_cpp_protos=true --define=allow_oversize_protos=true --spawn_strategy=standalone -c opt --announce_rc --define=grpc_no_ares=true --noincompatible_remove_legacy_whole_archive --enable_platform_specific_config --define=with_xla_support=true --config=short_logs --config=v2 --define=no_aws_support=true --define=no_hdfs_support=true
INFO: Reading rc options for 'build' from /home/spypiggy/src/tensorflow/.tf_configure.bazelrc:
  'build' options: --action_env PYTHON_BIN_PATH=/home/spypiggy/miniconda3/envs/py_38/bin/python3 --action_env PYTHON_LIB_PATH=/home/spypiggy/miniconda3/envs/py_38/lib/python3.8/site-packages --python_path=/home/spypiggy/miniconda3/envs/py_38/bin/python3 --config=tensorrt --action_env CUDA_TOOLKIT_PATH=/usr/local/cuda-10.2 --action_env TF_CUDA_COMPUTE_CAPABILITIES=7.2 --action_env GCC_HOST_COMPILER_PATH=/usr/bin/aarch64-linux-gnu-gcc-7 --config=cuda
INFO: Reading rc options for 'build' from /home/spypiggy/src/tensorflow/.bazelrc:
  'build' options: --deleted_packages=tensorflow/compiler/mlir/tfrt,tensorflow/compiler/mlir/tfrt/benchmarks,tensorflow/compiler/mlir/tfrt/jit/python_binding,tensorflow/compiler/mlir/tfrt/jit/transforms,tensorflow/compiler/mlir/tfrt/python_tests,tensorflow/compiler/mlir/tfrt/tests,tensorflow/compiler/mlir/tfrt/tests/saved_model,tensorflow/compiler/mlir/tfrt/transforms/lhlo_gpu_to_tfrt_gpu,tensorflow/core/runtime_fallback,tensorflow/core/runtime_fallback/conversion,tensorflow/core/runtime_fallback/kernel,tensorflow/core/runtime_fallback/opdefs,tensorflow/core/runtime_fallback/runtime,tensorflow/core/runtime_fallback/util,tensorflow/core/tfrt/common,tensorflow/core/tfrt/eager,tensorflow/core/tfrt/eager/backends/cpu,tensorflow/core/tfrt/eager/backends/gpu,tensorflow/core/tfrt/eager/core_runtime,tensorflow/core/tfrt/eager/cpp_tests/core_runtime,tensorflow/core/tfrt/fallback,tensorflow/core/tfrt/gpu,tensorflow/core/tfrt/run_handler_thread_pool,tensorflow/core/tfrt/runtime,tensorflow/core/tfrt/saved_model,tensorflow/core/tfrt/saved_model/tests,tensorflow/core/tfrt/tpu,tensorflow/core/tfrt/utils
INFO: Found applicable config definition build:short_logs in file /home/spypiggy/src/tensorflow/.bazelrc: --output_filter=DONT_MATCH_ANYTHING
INFO: Found applicable config definition build:v2 in file /home/spypiggy/src/tensorflow/.bazelrc: --define=tf_api_version=2 --action_env=TF2_BEHAVIOR=1
INFO: Found applicable config definition build:tensorrt in file /home/spypiggy/src/tensorflow/.bazelrc: --repo_env TF_NEED_TENSORRT=1
INFO: Found applicable config definition build:cuda in file /home/spypiggy/src/tensorflow/.bazelrc: --repo_env TF_NEED_CUDA=1 --crosstool_top=@local_config_cuda//crosstool:toolchain --@local_config_cuda//:enable_cuda
INFO: Found applicable config definition build:monolithic in file /home/spypiggy/src/tensorflow/.bazelrc: --define framework_shared_object=false
INFO: Found applicable config definition build:noaws in file /home/spypiggy/src/tensorflow/.bazelrc: --define=no_aws_support=true
INFO: Found applicable config definition build:nohdfs in file /home/spypiggy/src/tensorflow/.bazelrc: --define=no_hdfs_support=true
INFO: Found applicable config definition build:nonccl in file /home/spypiggy/src/tensorflow/.bazelrc: --define=no_nccl_support=true
INFO: Found applicable config definition build:v2 in file /home/spypiggy/src/tensorflow/.bazelrc: --define=tf_api_version=2 --action_env=TF2_BEHAVIOR=1
INFO: Found applicable config definition build:linux in file /home/spypiggy/src/tensorflow/.bazelrc: --copt=-w --host_copt=-w --define=PREFIX=/usr --define=LIBDIR=$(PREFIX)/lib --define=INCLUDEDIR=$(PREFIX)/include --define=PROTOBUF_INCLUDE_PATH=$(PREFIX)/include --cxxopt=-std=c++14 --host_cxxopt=-std=c++14 --config=dynamic_kernels --distinct_host_configuration=false --experimental_guard_against_concurrent_changes
INFO: Found applicable config definition build:dynamic_kernels in file /home/spypiggy/src/tensorflow/.bazelrc: --define=dynamic_loaded_kernels=true --copt=-DAUTOLOAD_DYNAMIC_KERNELS
INFO: Analyzed target //tensorflow/tools/pip_package:build_pip_package (455 packages loaded, 29475 targets configured).
INFO: Found 1 target...
Target //tensorflow/tools/pip_package:build_pip_package up-to-date:
  bazel-bin/tensorflow/tools/pip_package/build_pip_package
INFO: Elapsed time: 32535.484s, Critical Path: 1596.45s
INFO: 6481 processes: 753 internal, 5728 local.
INFO: Build completed successfully, 6481 total actions


Now build the wheel like this:


(py_38) spypiggy@spypiggy-desktop:~/src/tensorflow$ ./tensorflow/tools/pip_package/build_pip_package.sh /tmp/tensorflow_pkg
2022. 03. 06. (일) 19:43:08 KST : === Preparing sources in dir: /tmp/tmp.cfIiYxCqoF
~/src/tensorflow ~/src/tensorflow
~/src/tensorflow
~/src/tensorflow/bazel-bin/tensorflow/tools/pip_package/build_pip_package.runfiles/org_tensorflow ~/src/tensorflow
~/src/tensorflow
/tmp/tmp.cfIiYxCqoF/tensorflow/include ~/src/tensorflow
~/src/tensorflow
2022. 03. 06. (일) 19:44:03 KST : === Building wheel
warning: no files found matching 'README'
warning: no files found matching '*.pyd' under directory '*'
warning: no files found matching '*.pyi' under directory '*'
warning: no files found matching '*.pd' under directory '*'
warning: no files found matching '*.so.[0-9]' under directory '*'
warning: no files found matching '*.dylib' under directory '*'
warning: no files found matching '*.dll' under directory '*'
warning: no files found matching '*.lib' under directory '*'
warning: no files found matching '*.csv' under directory '*'
warning: no files found matching '*.h' under directory 'tensorflow/include/tensorflow'
warning: no files found matching '*.proto' under directory 'tensorflow/include/tensorflow'
warning: no files found matching '*' under directory 'tensorflow/include/third_party'
2022. 03. 06. (일) 19:46:11 KST : === Output wheel file is in: /tmp/tensorflow_pkg

(py_38) spypiggy@spypiggy-desktop:~/src/tensorflow$ ls -al /tmp/tensorflow_pkg/
total 394860
drwxrwxr-x  2 spypiggy spypiggy      4096  3월  6 19:46 .
drwxrwxrwt 17 root     root         36864  3월  6 19:48 ..
-rw-rw-r--  1 spypiggy spypiggy 404288101  3월  6 19:46 tensorflow-2.7.0-cp38-cp38-linux_aarch64.whl


Finally, I succeeded in building TensorFlow 2.7. Let's verify it by installing the wheel in the Anaconda Python 3.8 virtual environment. Since we built it for Python 3.8, it must of course be installed in a virtual environment that has Python 3.8.

 


(py_38) spypiggy@spypiggy-desktop:/tmp/tensorflow_pkg$ pip install tensorflow-2.7.0-cp38-cp38-linux_aarch64.whl 
Processing ./tensorflow-2.7.0-cp38-cp38-linux_aarch64.whl
Collecting tensorflow-io-gcs-filesystem>=0.21.0
  Downloading tensorflow_io_gcs_filesystem-0.24.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (2.8 MB)
     |████████████████████████████████| 2.8 MB 45 kB/s 
Collecting termcolor>=1.1.0
  Downloading termcolor-1.1.0.tar.gz (3.9 kB)
Collecting keras<2.8,>=2.7.0rc0
  Downloading keras-2.7.0-py2.py3-none-any.whl (1.3 MB)
     |████████████████████████████████| 1.3 MB 42 kB/s 
Collecting astunparse>=1.6.0
  Downloading astunparse-1.6.3-py2.py3-none-any.whl (12 kB)
Collecting opt-einsum>=2.3.2
  Downloading opt_einsum-3.3.0-py3-none-any.whl (65 kB)
     |████████████████████████████████| 65 kB 38 kB/s 
Collecting typing-extensions>=3.6.6
  Downloading typing_extensions-4.1.1-py3-none-any.whl (26 kB)
Requirement already satisfied: keras-preprocessing>=1.1.1 in /home/spypiggy/miniconda3/envs/py_38/lib/python3.8/site-packages (from tensorflow==2.7.0) (1.1.2)
Collecting grpcio<2.0,>=1.24.3
  Downloading grpcio-1.44.0-cp38-cp38-manylinux_2_17_aarch64.whl (54.8 MB)
     |████████████████████████████████| 54.8 MB 42 kB/s 
Collecting google-pasta>=0.1.1
  Downloading google_pasta-0.2.0-py3-none-any.whl (57 kB)
     |████████████████████████████████| 57 kB 35 kB/s 
Collecting absl-py>=0.4.0
  Downloading absl_py-1.0.0-py3-none-any.whl (126 kB)
     |████████████████████████████████| 126 kB 45 kB/s 
Collecting libclang>=9.0.1
  Downloading libclang-13.0.0-py2.py3-none-manylinux2014_aarch64.whl (26.0 MB)
     |████████████████████████████████| 26.0 MB 45 kB/s 
Collecting flatbuffers<3.0,>=1.12
  Downloading flatbuffers-2.0-py2.py3-none-any.whl (26 kB)
Collecting gast<0.5.0,>=0.2.1
  Downloading gast-0.4.0-py3-none-any.whl (9.8 kB)
Requirement already satisfied: wheel<1.0,>=0.32.0 in /home/spypiggy/miniconda3/envs/py_38/lib/python3.8/site-packages (from tensorflow==2.7.0) (0.37.1)
Requirement already satisfied: numpy>=1.14.5 in /home/spypiggy/miniconda3/envs/py_38/lib/python3.8/site-packages (from tensorflow==2.7.0) (1.21.5)
Requirement already satisfied: six>=1.12.0 in /home/spypiggy/miniconda3/envs/py_38/lib/python3.8/site-packages (from tensorflow==2.7.0) (1.16.0)
Collecting wrapt>=1.11.0
  Downloading wrapt-1.13.3.tar.gz (48 kB)
     |████████████████████████████████| 48 kB 45 kB/s 
Requirement already satisfied: h5py>=2.9.0 in /home/spypiggy/miniconda3/envs/py_38/lib/python3.8/site-packages (from tensorflow==2.7.0) (3.1.0)
Collecting tensorflow-estimator<2.8,~=2.7.0rc0
  Downloading tensorflow_estimator-2.7.0-py2.py3-none-any.whl (463 kB)
     |████████████████████████████████| 463 kB 55 kB/s 
Collecting protobuf>=3.9.2
  Downloading protobuf-3.19.4-cp38-cp38-manylinux2014_aarch64.whl (913 kB)
     |████████████████████████████████| 913 kB 36 kB/s 
Collecting tensorboard~=2.6
  Downloading tensorboard-2.8.0-py3-none-any.whl (5.8 MB)
     |████████████████████████████████| 5.8 MB 32 kB/s 
Collecting google-auth-oauthlib<0.5,>=0.4.1
  Downloading google_auth_oauthlib-0.4.6-py2.py3-none-any.whl (18 kB)
Collecting werkzeug>=0.11.15
  Downloading Werkzeug-2.0.3-py3-none-any.whl (289 kB)
     |████████████████████████████████| 289 kB 87 kB/s 
Collecting tensorboard-plugin-wit>=1.6.0
  Downloading tensorboard_plugin_wit-1.8.1-py3-none-any.whl (781 kB)
     |████████████████████████████████| 781 kB 46 kB/s 
Collecting markdown>=2.6.8
  Downloading Markdown-3.3.6-py3-none-any.whl (97 kB)
     |████████████████████████████████| 97 kB 61 kB/s 
Requirement already satisfied: requests<3,>=2.21.0 in /home/spypiggy/miniconda3/envs/py_38/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow==2.7.0) (2.27.1)
Collecting tensorboard-data-server<0.7.0,>=0.6.0
  Downloading tensorboard_data_server-0.6.1-py3-none-any.whl (2.4 kB)
Collecting google-auth<3,>=1.6.3
  Downloading google_auth-2.6.0-py2.py3-none-any.whl (156 kB)
     |████████████████████████████████| 156 kB 21 kB/s 
Requirement already satisfied: setuptools>=41.0.0 in /home/spypiggy/miniconda3/envs/py_38/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow==2.7.0) (58.0.4)
Collecting rsa<5,>=3.1.4
  Downloading rsa-4.8-py3-none-any.whl (39 kB)
Collecting cachetools<6.0,>=2.0.0
  Downloading cachetools-5.0.0-py3-none-any.whl (9.1 kB)
Collecting pyasn1-modules>=0.2.1
  Downloading pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
     |████████████████████████████████| 155 kB 45 kB/s 
Collecting requests-oauthlib>=0.7.0
  Downloading requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting importlib-metadata>=4.4
  Downloading importlib_metadata-4.11.2-py3-none-any.whl (17 kB)
Collecting zipp>=0.5
  Downloading zipp-3.7.0-py3-none-any.whl (5.3 kB)
Collecting pyasn1<0.5.0,>=0.4.6
  Downloading pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
     |████████████████████████████████| 77 kB 36 kB/s 
Requirement already satisfied: idna<4,>=2.5 in /home/spypiggy/miniconda3/envs/py_38/lib/python3.8/site-packages (from requests<3,>=2.21.0->tensorboard~=2.6->tensorflow==2.7.0) (3.3)
Requirement already satisfied: certifi>=2017.4.17 in /home/spypiggy/miniconda3/envs/py_38/lib/python3.8/site-packages (from requests<3,>=2.21.0->tensorboard~=2.6->tensorflow==2.7.0) (2021.10.8)
Requirement already satisfied: charset-normalizer~=2.0.0 in /home/spypiggy/miniconda3/envs/py_38/lib/python3.8/site-packages (from requests<3,>=2.21.0->tensorboard~=2.6->tensorflow==2.7.0) (2.0.12)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /home/spypiggy/miniconda3/envs/py_38/lib/python3.8/site-packages (from requests<3,>=2.21.0->tensorboard~=2.6->tensorflow==2.7.0) (1.26.8)
Collecting oauthlib>=3.0.0
  Downloading oauthlib-3.2.0-py3-none-any.whl (151 kB)
     |████████████████████████████████| 151 kB 32 kB/s 
Building wheels for collected packages: termcolor, wrapt
  Building wheel for termcolor (setup.py) ... done
  Created wheel for termcolor: filename=termcolor-1.1.0-py3-none-any.whl size=4848 sha256=53a718cbd76bfb53bb1a186a9613af9bc21ac6ff050e264c576bf7185c53359e
  Stored in directory: /home/spypiggy/.cache/pip/wheels/a0/16/9c/5473df82468f958445479c59e784896fa24f4a5fc024b0f501
  Building wheel for wrapt (setup.py) ... done
  Created wheel for wrapt: filename=wrapt-1.13.3-cp38-cp38-linux_aarch64.whl size=81199 sha256=c9a1fdd72b276337b7e2ea8ad397ece37ad2828c307a6e3544959407c546d012
  Stored in directory: /home/spypiggy/.cache/pip/wheels/bb/05/57/f0c531fdf04b11be18b21ab4d1ec5586a6897caa6710c2a1a5
Successfully built termcolor wrapt
Installing collected packages: pyasn1, zipp, rsa, pyasn1-modules, oauthlib, cachetools, requests-oauthlib, importlib-metadata, google-auth, werkzeug, tensorboard-plugin-wit, tensorboard-data-server, protobuf, markdown, grpcio, google-auth-oauthlib, absl-py, wrapt, typing-extensions, termcolor, tensorflow-io-gcs-filesystem, tensorflow-estimator, tensorboard, opt-einsum, libclang, keras, google-pasta, gast, flatbuffers, astunparse, tensorflow
Successfully installed absl-py-1.0.0 astunparse-1.6.3 cachetools-5.0.0 flatbuffers-2.0 gast-0.4.0 google-auth-2.6.0 google-auth-oauthlib-0.4.6 google-pasta-0.2.0 grpcio-1.44.0 importlib-metadata-4.11.2 keras-2.7.0 libclang-13.0.0 markdown-3.3.6 oauthlib-3.2.0 opt-einsum-3.3.0 protobuf-3.19.4 pyasn1-0.4.8 pyasn1-modules-0.2.8 requests-oauthlib-1.3.1 rsa-4.8 tensorboard-2.8.0 tensorboard-data-server-0.6.1 tensorboard-plugin-wit-1.8.1 tensorflow-2.7.0 tensorflow-estimator-2.7.0 tensorflow-io-gcs-filesystem-0.24.0 termcolor-1.1.0 typing-extensions-4.1.1 werkzeug-2.0.3 wrapt-1.13.3 zipp-3.7.0


Installation is complete. Now let's check whether the tensorflow package works properly and whether the GPU is recognized.

(py_38) spypiggy@spypiggy-desktop:/tmp/tensorflow_pkg$ python
Python 3.8.12 (default, Nov  5 2021, 09:55:51) 
[GCC 10.2.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import tensorflow as tf
>>> tf.__version__
'2.7.0'
>>> print(tf.config.list_physical_devices('GPU'))
[PhysicalDevice(name='/physical_device:GPU:0', device_type='GPU')]
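
As an extra sanity check, you can run a small computation and let TensorFlow log which device it is placed on (just an optional quick test):

(py_38) spypiggy@spypiggy-desktop:/tmp/tensorflow_pkg$ python -c "import tensorflow as tf; tf.debugging.set_log_device_placement(True); print(tf.reduce_sum(tf.random.normal([1000, 1000])))"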


Wrapping up

Many developers have already released TensorFlow wheel packages for Jetson on GitHub, and NVIDIA also releases packages such as TensorFlow and PyTorch for the Jetson series. NVIDIA's download page is as follows.


<NVidia tensorflow download page>


Downloading a wheel from this page is the most common approach. However, those wheels target JetPack's default Python (3.6 on JetPack 4.6), so if you need TensorFlow for a newer Python version (we used 3.8), you have to build it yourself using the method introduced above.

