Use Case:
Ingesting sensor, image, voice and video data from moving vehicles, running deep learning on board the vehicle, and transporting the data and messages to remote data centers via Apache MiniFi and NiFi over secure site-to-site (S2S) HTTPS.
Background:
The NVIDIA Jetson TX1 is a specialized developer kit for running a powerful GPU as an embedded device for robots, UAVs and other specialized platforms. I envision its use in field trucks for intermodal, utilities, telecommunications, delivery services, government and other industries with field vehicles.
Installation and Setup
You will need a workstation running Ubuntu 16.04 with enough disk space and network access; it is used to download all the software and push it over the network to your NVIDIA Jetson TX1. You can download Ubuntu here (https://www.ubuntu.com/download/desktop). Fortunately I had a MiniPC with 4 GB of RAM that I reformatted with Ubuntu to serve as the host PC for building my Jetson. You cannot run this from a Mac or Windows machine.
You will need a monitor, mouse and keyboard for your host machine and another set for your NVIDIA Jetson.
First, boot your NVIDIA Jetson, set up WiFi networking and make sure your monitor, keyboard and mouse work.
Make sure you download the latest NVIDIA JetPack on your host Ubuntu machine (https://developer.nvidia.com/embedded/jetpack). The version I used was JetPack 3.1, which included 64-bit Ubuntu 16.04, cuDNN 6.0, TensorRT 2.1 and CUDA 8.0.
Initial login: ubuntu/ubuntu
After installation it will be nvidia/nvidia.
Please change that password; security is important, and this GPU could do some serious bitcoin mining.
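A minimal sketch of doing that once you are logged in (assuming the stock nvidia account from JetPack; adjust if your user differs):
# change the password for the account you are logged in as
passwd
# or change it for the default nvidia user from another sudo-capable account
sudo passwd nvidia
With the credentials changed, the package installation below prepares the Jetson for the build.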
sudo su
apt update
apt-get install git zip unzip autoconf automake libtool curl zlib1g-dev maven swig bzip2
apt-get purge libopencv4tegra-dev libopencv4tegra
apt-get purge libopencv4tegra-repo
apt-get update
apt-get install build-essential
apt-get install cmake git libgtk2.0-dev pkg-config libavcodec-dev libavformat-dev libswscale-dev
apt-get install libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev
apt-get install python2.7-dev
apt-get install python-dev python-numpy libtbb2 libtbb-dev libjpeg-dev libpng-dev libtiff-dev libjasper-dev libdc1394-22-dev
apt-get install libgtkglext1 libgtkglext1-dev
apt-get install qtbase5-dev
apt-get install libv4l-dev v4l-utils qv4l2 v4l2ucp
cd $HOME/NVIDIA-INSTALL
./installer.sh
Download and run the NVIDIA Jetson TX1 JetPack installer from the host Ubuntu computer: ./JetPack-L4T-3.1-linux-x64.run
This will run on the host machine for roughly an hour and requires a network connection between the two machines as well as a few reboots.
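Once JetPack finishes, it is worth sanity-checking the CUDA and cuDNN installation on the Jetson before continuing; a quick check, assuming JetPack placed CUDA under /usr/local/cuda as usual:
# confirm the CUDA toolkit version (expect 8.0 with JetPack 3.1)
/usr/local/cuda/bin/nvcc --version
# confirm the cuDNN packages are present
dpkg -l | grep cudnn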
I added a 64 GB SD card, as the storage on the Jetson is tiny; I would recommend adding a big SATA hard drive. Remount the card so it is writable:
umount /dev/sdb1
mount -o umask=000 -t vfat /dev/sdb1 /media/
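If you want the card mounted automatically after a reboot, one option (my own addition, not part of the original setup; the device name /dev/sdb1 can differ on your board) is an /etc/fstab entry:
# verify the device name first
lsblk
# append a vfat entry so the card mounts at /media on boot
echo "/dev/sdb1 /media vfat umask=000 0 0" | sudo tee -a /etc/fstab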
Turn on the fan on the Jetson:
echo 255 > /sys/kernel/debug/tegra_fan/target_pwm
Download MiniFi from https://nifi.apache.org/minifi/download.html or https://hortonworks.com/downloads/#dataflow. You will also need to install JDK 8.
sudo add-apt-repository ppa:webupd8team/java
sudo apt update
sudo apt install oracle-java8-installer -y
Download minifi-0.2.0-bin.zip, then:
unzip *.zip
cd minifi-0.2.0
bin/minifi.sh start
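After starting MiniFi, it is worth confirming the agent actually came up; a quick check, assuming the defaults of the extracted minifi-0.2.0 directory:
# check the MiniFi bootstrap status
bin/minifi.sh status
# watch the agent log for errors
tail -f logs/minifi-app.log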
In the next part, we will classify images.
Part 2: Classifying Images with ImageNet
Part 3: Detecting Faces in Images
Part 4: Ingesting with MiniFi and NiFi
Shell Call Example
/media/nvidia/96ed93f9-7c40-4999-85ba-3eb24262d0a5/jetson-inference-master/build/aarch64/bin/detectnet-console pic-007.png outputface7.png facenet
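Part 4 wires this call into the MiniFi flow. As an illustrative sketch only (the script name, output directory and use of the ExecuteProcess processor are my assumptions, not the original flow), a small wrapper that MiniFi could invoke might look like:
#!/bin/bash
# run-facenet.sh (hypothetical helper): run facenet detection on one image
# and print the annotated output path so a downstream flow can pick it up
BASE=/media/nvidia/96ed93f9-7c40-4999-85ba-3eb24262d0a5/jetson-inference-master/build/aarch64/bin
IN="$1"
OUT="/media/nvidia/output/$(basename "$IN" .png)-face.png"
cd "$BASE" && ./detectnet-console "$IN" "$OUT" facenet
echo "$OUT"
MiniFi's ExecuteProcess processor can call a script like this on a schedule and route the printed path to NiFi over S2S.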
Source Code
https://github.com/tspannhw/jetsontx1-TensorRT