Presented by

  • Andy Gelme

    @geekscape
    https://geekscape.github.io

Andy started hacking as a teenager when microprocessors were first available and you had to build your own personal computer. His career has spanned the spectrum of computing … from consumer electronics products to Cray supercomputers. Various projects have involved building automation, the Internet of Things, establishing the Melbourne HackerSpace in 2009 and co-founding LIFX in 2012. Since the start of 2016, Andy has been developing distributed frameworks that combine real-time telemetry and video processing via Machine Learning (neural networks) for applications including robotics and drones.

Abstract

Artificial Intelligence, robotics and the Internet of Things (AIoT) are revolutionizing the world at an ever-increasing pace. However, the entrenched approaches for delivering applications, often web-based, are not the best way to deliver the full potential of these recent advances ... due to some serious impedance mismatches. Devices at the edge, from simple sensors up to sophisticated robots, need local computation combined with remote computation in the data centre ... connected via low-latency streams of rich data types, such as telemetry, video, audio, LIDAR and more. Ensembles of Machine Learning models will need to be placed strategically throughout the whole architecture.

These highly dynamic systems require that the concepts of failure and security are baked into the architecture ... rather than being tacked on as something that developers deal with at the application level. Above all, building such systems should be quick and fun to do ... and easy to diagnose when things go seriously pear-shaped! Managing and integrating all these technologies can be a soul-destroying challenge, because each one has its own set of terminology, frameworks, libraries and APIs ... resulting in a seemingly insurmountable Tower of Babel.

This presentation will introduce an open source distributed embedded framework that consolidates AIoT, media streaming, Machine Learning pipelines and robotics into a single, cohesive platform. The overall architecture, design and some implementation details will be covered, including distributed systems (Actor model), messaging (MQTT), streaming media (GStreamer), data flow pipelines, incorporating Machine Learning models and embedded devices as first-class members of the network, along with scaling up to large numbers of devices. There will be some live examples with hardware ... always a source of intense embarrassment and magic smoke escaping.

This open framework was used by a commercial product during 2022 ... and is being developed further throughout 2023. A brief portion of this presentation will cover that technical use-case ... no marketing, I promise! The aim is to release an open-source framework that is being used commercially at scale.

YouTube: https://www.youtube.com/watch?v=htbzn_xwEnU
LA Archive: http://mirror.linux.org.au/pub/everythingopen/2023/clarendon_auditorium/Thursday/Aiko_Services_Building_an_open_framework_combining_AIoT_media_robotics_Machine_Learning.webm
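
To give a flavour of the kind of plumbing being described, here is a minimal sketch of an edge device acting as an MQTT-addressable actor. This is illustrative only and is not the framework's actual API: it assumes the paho-mqtt 1.x client library, a broker on localhost, and made-up topic names and a "ping" command.

    # Minimal sketch (not the framework's API): an edge "actor" whose mailbox
    # is an MQTT topic.  Assumes paho-mqtt 1.x and a broker on localhost:1883.

    import json
    import paho.mqtt.client as mqtt

    ACTOR_NAME = "camera_01"                 # hypothetical actor / device name
    TOPIC_IN   = f"aiot/{ACTOR_NAME}/in"     # mailbox: commands arrive here
    TOPIC_OUT  = f"aiot/{ACTOR_NAME}/out"    # telemetry / replies published here

    def on_connect(client, userdata, flags, rc):
        client.subscribe(TOPIC_IN)           # receive messages addressed to this actor

    def on_message(client, userdata, message):
        command = json.loads(message.payload)
        # Actor-style handling: react to one message at a time, no shared state,
        # publish any results back onto the network.
        if command.get("action") == "ping":
            client.publish(TOPIC_OUT, json.dumps({"actor": ACTOR_NAME, "reply": "pong"}))

    client = mqtt.Client()
    client.on_connect = on_connect
    client.on_message = on_message
    client.connect("localhost", 1883)
    client.loop_forever()                    # single-threaded loop dispatching messages

A peer can then publish {"action": "ping"} to aiot/camera_01/in and watch aiot/camera_01/out for the reply; the framework discussed in the talk layers discovery, pipelines, media streaming and ML model placement on top of this style of message-driven building block.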