NVIDIA® Jetson Nano™ Developer Kit is a small, powerful single-board computer designed to make AI accessible to makers, learners, and embedded developers. It lets you run multiple neural networks in parallel for applications like image classification, object detection, segmentation, and speech processing.
This Getting Started guide walks you through setting up your Jetson Nano and configuring it for AI image processing with the Pi Camera Module V2 using Python and C++.
The Jetson Nano uses a microSD card for its operating system and storage. Use a high-performance card with a large enough capacity for your project. This guide used a Samsung Evo Plus 64GB UHS-1 rated at Speed Class 10 (C10) and UHS Speed Class 3 (U3).
Note: the official Jetson Nano guide recommends a minimum 16GB capacity card; we had problems creating the image on a card of this size.
There are three options for powering the Jetson Nano: the micro-USB port (5V 2A), the DC barrel jack (5V 4A, enabled by fitting the J48 jumper), or the GPIO header pins.
This guide used a 5V / 4A (20W) desktop power supply with 2.1 mm inner diameter and 5.5 mm outer diameter, centre positive plug via the Barrel Jack.
Prepare the microSD card system image using a Windows, macOS or Linux PC with an SD card writer or adapter. balenaEtcher (https://www.balena.io/etcher/) was used on a Windows 10 PC to flash the image; the steps are similar for macOS and Linux.
Using large SD cards will generate warnings in Etcher. Be careful to select the correct drive!
Set up the board and peripherals, including the Pi Camera module V2 for first boot.
The CSI interface only works with the Pi Camera Module V2; the stock image does not include a driver for the V1 module's sensor.
The Jetson Nano basic setup is now complete.
JetsonHacks.com has provided useful scripts to test the camera operation and ensure the Python and C++ environments are set up ready for AI development.
git clone https://github.com/JetsonHacksNano/CSI-Camera.git
cd CSI-Camera
gst-launch-1.0 nvarguscamerasrc sensor_id=0 ! \
'video/x-raw(memory:NVMM),width=3280, height=2464, framerate=21/1, format=NV12' ! \
nvvidconv flip-method=2 ! 'video/x-raw, width=816, height=616' ! \
nvvidconv ! nvegltransform ! nveglglessink -e
Depending on the orientation of your camera, you may need to adjust the flip-method index in the above command line. Here are the different settings:
flip-method: video flip methods
Default: 0, “none”
(0): none – Identity (no rotation)
(1): counterclockwise – Rotate counter-clockwise 90 degrees
(2): rotate-180 – Rotate 180 degrees
(3): clockwise – Rotate clockwise 90 degrees
(4): horizontal-flip – Flip horizontally
(5): upper-right-diagonal – Flip across upper right/lower left diagonal
(6): vertical-flip – Flip vertically
(7): upper-left-diagonal – Flip across upper left/lower right diagonal
The camera should display a video image in a new window.
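If you prefer to drive the same pipeline from Python rather than gst-launch, the sketch below shows the general pattern. It is a minimal sketch, not the JetsonHacks code itself, and it assumes the stock JetPack OpenCV build (which includes GStreamer support); the capture settings mirror the command above.

import cv2

# Build a GStreamer pipeline string matching the gst-launch test above.
# flip_method=2 rotates the image 180 degrees; change it to suit your camera mounting.
def gstreamer_pipeline(capture_width=3280, capture_height=2464,
                       display_width=816, display_height=616,
                       framerate=21, flip_method=2):
    return (
        "nvarguscamerasrc sensor_id=0 ! "
        "video/x-raw(memory:NVMM), width=%d, height=%d, framerate=%d/1, format=NV12 ! "
        "nvvidconv flip-method=%d ! "
        "video/x-raw, width=%d, height=%d, format=BGRx ! "
        "videoconvert ! video/x-raw, format=BGR ! appsink"
        % (capture_width, capture_height, framerate, flip_method,
           display_width, display_height)
    )

cap = cv2.VideoCapture(gstreamer_pipeline(), cv2.CAP_GSTREAMER)
if not cap.isOpened():
    raise RuntimeError("Could not open the CSI camera")
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("CSI Camera", frame)
    if (cv2.waitKey(1) & 0xFF) == 27:  # press Esc to quit
        break
cap.release()
cv2.destroyAllWindows()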
Face Detection Test
This AI test uses Python 3 libraries and a pre-trained Haar cascade classifier to detect faces in the video image.
sudo apt install python3-numpy
sudo apt install libcanberra-gtk-module
python3 face_detect.py
You may need to edit the flip_method value in the definition of gstreamer_pipeline in face_detect.py:
def gstreamer_pipeline(
…
framerate=21,
flip_method=2,
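For reference, the core of the face detection test follows the pattern sketched below. This is a simplified outline rather than the exact JetsonHacks script, and it assumes the Haar cascade files live in the stock OpenCV 4 location on the JetPack image; adjust the path if yours differs.

import cv2

# Assumed location of OpenCV's pre-trained frontal face Haar cascade on JetPack;
# edit this path if the cascades are installed elsewhere on your image.
FACE_CASCADE_PATH = "/usr/share/opencv4/haarcascades/haarcascade_frontalface_default.xml"

# Reduced pipeline helper; see the earlier sketch for the full parameter list.
def gstreamer_pipeline(flip_method=2):
    return (
        "nvarguscamerasrc ! "
        "video/x-raw(memory:NVMM), width=3280, height=2464, framerate=21/1, format=NV12 ! "
        "nvvidconv flip-method=%d ! "
        "video/x-raw, width=816, height=616, format=BGRx ! "
        "videoconvert ! video/x-raw, format=BGR ! appsink" % flip_method
    )

face_cascade = cv2.CascadeClassifier(FACE_CASCADE_PATH)
cap = cv2.VideoCapture(gstreamer_pipeline(), cv2.CAP_GSTREAMER)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detect faces in the grayscale frame and draw a rectangle around each one
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 0, 0), 2)
    cv2.imshow("Face Detect", frame)
    if (cv2.waitKey(1) & 0xFF) == 27:  # press Esc to quit
        break
cap.release()
cv2.destroyAllWindows()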
This test compiles a C++ video test from simple_camera.cpp using g++ and the OpenCV libraries.
In the latest Jetson Nano image, the OpenCV libraries and header files are installed in a different location from the one assumed by the example code, so the include and library paths need to be specified when compiling:
g++ -std=c++11 -Wall -I/usr/include/opencv4 simple_camera.cpp -L/usr/lib/aarch64-linux-gnu -lopencv_core -lopencv_highgui -lopencv_videoio -o simple_camera
You may need to open simple_camera.cpp and edit the flip_method in the main function.
int main()
{
…
int framerate = 60 ;
int flip_method = 2;
./simple_camera
Avoid corrupting the SD card image by shutting down the system correctly, either from the Ubuntu desktop power menu or with sudo shutdown -h now in a terminal.
Allow the power supply to discharge fully before reconnecting it, otherwise the Jetson Nano may not boot.
Congratulations! You have now set up and configured your Jetson Nano Developer Kit, and connected and tested the Pi Camera V2 module using the Python and C++ code examples. Everything is ready for you to start exploring the AI image processing capabilities of the Jetson Nano.
NVIDIA have produced a series of Deep Learning examples and Tutorials that you can follow at: https://developer.nvidia.com/embedded/learn/get-started-jetson-nano-devkit#next