AI ON THE EDGE
The AI on the Edge workshop provided participants with a comprehensive understanding of how Artificial Intelligence can be deployed on edge devices to enable fast, efficient, and scalable real-time processing. The session highlighted the significance of bringing AI computation closer to the data source and demonstrated practical techniques used in modern embedded and IoT-based systems.
The workshop covered:
1. The fundamentals of Edge AI were introduced, highlighting its advantages over cloud-based models and showcasing real-world applications in smart devices, robotics, healthcare, and Industry 4.0.
2. Common edge hardware platforms such as Raspberry Pi, NVIDIA Jetson, Google Coral TPUs, and microcontroller-based ML boards were discussed, with emphasis on their capabilities and limitations.
3. Model-optimization techniques—including quantization, pruning, and converting models into lightweight formats like TensorFlow Lite and ONNX—were explained for efficient deployment on low-power devices.
4. Hands-on demonstrations allowed students to work with TinyML and edge-deployment tools, practicing inference execution, sensor data handling, and power-efficient AI design.
5. Practical applications such as real-time object detection, keyword spotting, anomaly detection, and sensor-driven AI solutions were showcased.
6. Key considerations related to security, latency, memory footprint, and power consumption were discussed to help attendees understand the trade-offs in Edge AI system design.
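The core idea behind the quantization technique mentioned in point 3 can be illustrated with a short, self-contained sketch. This is not the workshop's code; it is a minimal pure-Python illustration of affine (scale/zero-point) quantization, the mapping used by common 8-bit post-training quantization schemes. The function names and parameters are illustrative, not from any specific library.

```python
def quantize_affine(values, num_bits=8):
    """Map float values to unsigned integers using affine (scale/zero-point)
    quantization -- the core idea behind 8-bit post-training quantization.

    Note: illustrative sketch only; real toolchains (e.g. TensorFlow Lite)
    handle per-tensor/per-channel ranges, calibration, and int8 kernels.
    """
    qmin, qmax = 0, 2 ** num_bits - 1
    lo, hi = min(values), max(values)
    # The representable range must include 0.0 so that zero maps exactly.
    lo, hi = min(lo, 0.0), max(hi, 0.0)
    scale = (hi - lo) / (qmax - qmin) or 1.0  # guard against zero range
    zero_point = round(qmin - lo / scale)
    # Round to the nearest integer level, clamping to the valid range.
    q = [max(qmin, min(qmax, round(v / scale + zero_point))) for v in values]
    return q, scale, zero_point


def dequantize_affine(q, scale, zero_point):
    """Recover approximate float values from the quantized integers."""
    return [scale * (qi - zero_point) for qi in q]
```

Because each value is rounded to the nearest quantization level, the reconstruction error after a quantize/dequantize round trip is bounded by half the scale, which is why 8-bit weights often preserve model accuracy while cutting memory use by roughly 4x versus float32.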
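The sensor-driven anomaly detection mentioned in point 5 can likewise be sketched with a lightweight, memory-bounded approach suitable for microcontrollers. This is an illustrative example, not the workshop's material: it flags readings that deviate strongly from a rolling window of recent values (a z-score test), and all names and thresholds here are assumptions chosen for the demo.

```python
from collections import deque
from math import sqrt


class RollingAnomalyDetector:
    """Flag sensor readings that deviate strongly from a rolling window.

    Illustrative sketch of edge-friendly anomaly detection: fixed memory
    (a bounded deque), no model file, and only integer/float arithmetic.
    """

    def __init__(self, window=32, threshold=3.0, warmup=8):
        self.buf = deque(maxlen=window)  # bounded history of recent readings
        self.threshold = threshold       # z-score above which we flag
        self.warmup = warmup             # samples needed before scoring

    def update(self, x):
        """Ingest one reading; return True if it looks anomalous."""
        anomalous = False
        if len(self.buf) >= self.warmup:
            mean = sum(self.buf) / len(self.buf)
            var = sum((v - mean) ** 2 for v in self.buf) / len(self.buf)
            std = sqrt(var) or 1e-9  # avoid division by zero on flat signals
            anomalous = abs(x - mean) / std > self.threshold
        self.buf.append(x)
        return anomalous
```

A usage pattern would be calling `update()` from the sensor-sampling loop and waking a heavier model (or a radio transmission) only when it returns True, which is one way the power-efficiency trade-offs from point 6 play out in practice.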