AI AT THE EDGE
A Series of Three Workshops
Postponed until Further Notice
Local and embedded machine learning (ML) is a key component of real-time data analytics in emerging computing environments such as the Internet of Things (IoT), edge computing, and mobile ubiquitous systems. The goal of the workshops is to give hands-on training to professionals working in the area and to students who want to explore edge devices.
About The Workshops
There has been tremendous growth in deploying AI algorithms on edge devices, such as UAVs and surveillance cameras, to make stand-alone devices and gadgets intelligent. There is an increasing need for real-time intelligent data analytics, driven by a world of Big Data and by society's need for pervasive intelligent devices: wearables for health and recreation, smart city infrastructure, e-commerce, Industry 4.0, and autonomous robots. With huge volumes of data come memory constraints, privacy concerns, incompleteness, and uncertainty. As a result, there is a strong need for new ML methods that address the requirements of emerging real-world workloads, such as uncertainty, robustness, and limited data. To keep such methods deployable across a range of computing devices, and to bridge the gap between applications and computer hardware, we also need a variety of tools, and the first of these are the edge devices themselves.
AI at the Edge (Jetson Nano, Jetson TX2, ARM, CORAL, Compute Stick)
The first workshop will cover deep learning for deploying AI and computer vision on the Jetson TX2 and Jetson Nano. Participants will gain a stronger background in deep learning, learn to load and run a pre-trained deep neural network on the devices, and learn how to retrain the network with their own dataset to produce a live demo. The workshop is also designed to provide industry-level experience in developing embedded-systems projects on the widely used ARM architecture.
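The retraining step above can be sketched in miniature: keep a pre-trained "backbone" frozen and train only a small classification head on your own data. The toy below is a pure-Python illustration of that idea; `backbone_features`, the thresholds, and the synthetic dataset are all invented for this sketch (real Jetson workflows would use frameworks such as PyTorch or TensorRT instead).

```python
import math
import random

random.seed(0)  # make the synthetic dataset reproducible

# Toy stand-in for a frozen, pre-trained backbone: it maps a raw input
# vector to a small feature vector and is never updated during training.
def backbone_features(x):
    return [sum(x) / len(x), max(x) - min(x)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_head(samples, labels, lr=0.5, epochs=200):
    """Retrain only the final (head) layer on a custom dataset, leaving
    the backbone untouched -- the essence of fine-tuning a pre-trained
    network for a new task."""
    feats = [backbone_features(x) for x in samples]
    w = [0.0] * len(feats[0])
    b = 0.0
    for _ in range(epochs):
        for f, y in zip(feats, labels):
            p = sigmoid(sum(wi * fi for wi, fi in zip(w, f)) + b)
            err = p - y  # gradient of the log-loss w.r.t. the logit
            w = [wi - lr * err * fi for wi, fi in zip(w, f)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    f = backbone_features(x)
    return 1 if sigmoid(sum(wi * fi for wi, fi in zip(w, f)) + b) >= 0.5 else 0

# Two easily separable synthetic classes: low-mean vs. high-mean inputs.
low = [[random.uniform(0.0, 0.3) for _ in range(4)] for _ in range(20)]
high = [[random.uniform(0.7, 1.0) for _ in range(4)] for _ in range(20)]
w, b = train_head(low + high, [0] * 20 + [1] * 20)
```

Because only the tiny head is trained, this kind of retraining is cheap enough to run on a resource-constrained device, which is why the pattern is common on edge hardware.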
AI at the Edge (FPGAs)
This hands-on workshop will cover key aspects of modern programmable ML-on-chip and its applications to advanced scientific instrumentation and reconfigurable computing. The platform is based on an FPGA paired with a dual-core processor and is characterized by low cost and great versatility, allowing it to run different concurrent tasks such as high-performance multichannel data acquisition, processing, and transmission.
More Details Coming Soon!
AI at the Edge (Real Case Studies)
The emergence of practical computer vision technology creates vast opportunities for innovation in electronic systems and associated software. In many cases, existing products can be transformed through the addition of vision capabilities. One example of this is the addition of vision capabilities to surveillance cameras, allowing the camera to monitor a scene for certain kinds of events. In other cases, it enables the creation of new types of products, such as surgical robots and swimming pool safety systems. Companies that adopt vision technology ahead of their competitors will reap big rewards in many markets.
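The surveillance example above, a camera monitoring a scene for certain kinds of events, can be sketched with simple frame differencing. The sketch below assumes grayscale frames given as same-sized 2D lists of 0-255 values; the `pixel_delta` and `area_frac` thresholds are illustrative choices, and a production system would use a library such as OpenCV or a trained detector instead.

```python
def motion_event(prev_frame, curr_frame, pixel_delta=30, area_frac=0.05):
    """Flag an 'event' when enough pixels change between two frames.

    prev_frame/curr_frame: same-sized 2D lists of grayscale values.
    pixel_delta: per-pixel change needed to count as 'changed'.
    area_frac: fraction of changed pixels needed to raise an event.
    Both thresholds are illustrative, not taken from any product.
    """
    changed = 0
    total = 0
    for row_p, row_c in zip(prev_frame, curr_frame):
        for p, c in zip(row_p, row_c):
            total += 1
            if abs(c - p) > pixel_delta:
                changed += 1
    return changed / total >= area_frac

# A static 8x8 background, and the same scene with a bright 3x3 object.
background = [[10] * 8 for _ in range(8)]
intruder = [row[:] for row in background]
for r in range(3):
    for c in range(3):
        intruder[r][c] = 200
```

The appeal of this kind of logic on the edge is that the decision (event or not) is made on the camera itself, so only alerts need to leave the device rather than a continuous video stream.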