End-to-End Federated Learning for Autonomous Driving Vehicles
2021 (English). In: Proceedings of the International Joint Conference on Neural Networks, IEEE, 2021. Conference paper, Published paper (Refereed)
Abstract [en]
In recent years, as the computational capability of edge devices has grown, companies have become eager to investigate and adopt suitable ML/DL methods to improve their service quality. However, with the traditional learning strategy, companies must first build a powerful data center to collect and analyze data from the edge and then perform centralized model training, which turns out to be inefficient. Federated Learning has been introduced to solve this challenge. Because of its characteristics, such as model-only exchange and parallel training, the technique can not only preserve user data privacy but also accelerate model training. The method can handle real-time data generated at the edge without consuming large amounts of valuable network transmission resources. In this paper, we introduce an approach to end-to-end on-device Machine Learning by utilizing Federated Learning. We validate our approach with an important industrial use case in the field of autonomous driving vehicles: wheel steering angle prediction. Our results show that Federated Learning can significantly improve the quality of local edge models and reach the same accuracy level as the traditional centralized Machine Learning approach without its negative effects. Furthermore, Federated Learning can accelerate model training and reduce communication overhead, which demonstrates the strength of this approach for deploying ML/DL components to real-world embedded systems.
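To make the model-only exchange and aggregation mentioned in the abstract concrete, the following is a minimal sketch of Federated Averaging (FedAvg) for a steering-angle regressor. It is not the paper's implementation; the linear model, synthetic client data, and hyperparameters are hypothetical placeholders chosen purely for illustration.

```python
# Illustrative FedAvg sketch: only model weights leave each client,
# and the server averages them. Model and data are hypothetical.
import numpy as np

def local_train(weights, X, y, lr=0.01, epochs=5):
    """One client: gradient-descent steps on a linear model y_hat = X @ w."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)   # MSE gradient
        w -= lr * grad
    return w                                       # raw data never leaves the device

def fed_avg(client_weights, client_sizes):
    """Server: data-size-weighted average of the client models."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Hypothetical edge data: each client holds its own (features, steering angle) pairs.
rng = np.random.default_rng(0)
true_w = rng.normal(size=8)
clients = []
for _ in range(4):
    X = rng.normal(size=(200, 8))
    y = X @ true_w + rng.normal(scale=0.1, size=len(X))
    clients.append((X, y))

global_w = np.zeros(8)
for _ in range(10):                                # communication rounds
    local_ws = [local_train(global_w, X, y) for X, y in clients]
    global_w = fed_avg(local_ws, [len(y) for _, y in clients])
```

In each round, clients train in parallel on their local data and transmit only the updated weights, which is why the approach preserves data privacy and keeps communication overhead low compared with shipping raw sensor data to a central data center.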
Place, publisher, year, edition, pages
IEEE, 2021.
Series
Proceedings of ... International Joint Conference on Neural Networks, ISSN 2161-4393, E-ISSN 2161-4407
Keywords [en]
Federated Learning, Machine learning, Heterogeneous computation, Software engineering, Autonomous vehicles, Embedded systems, Quality of service, Autonomous driving, Computation software, End to end, Learning strategy, Model training, Service quality, Traditional learning, Training speed, Data privacy
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:mau:diva-48761
DOI: 10.1109/IJCNN52387.2021.9533808
ISI: 000722581704014
Scopus ID: 2-s2.0-85115839157
ISBN: 978-1-6654-3900-8 (electronic)
ISBN: 978-1-6654-4597-9 (print)
OAI: oai:DiVA.org:mau-48761
DiVA id: diva2:1623193
Conference
2021 International Joint Conference on Neural Networks (IJCNN), 18-22 July 2021, Shenzhen, China
Available from: 2021-12-28. Created: 2021-12-28. Last updated: 2022-08-05. Bibliographically approved