Publications from Malmö University
End-to-End Federated Learning for Autonomous Driving Vehicles
Chalmers University of Technology.
Chalmers University of Technology.
Malmö University, Faculty of Technology and Society (TS), Department of Computer Science and Media Technology (DVMT). ORCID iD: 0000-0002-7700-1816
2021 (English). In: Proceedings of the International Joint Conference on Neural Networks, IEEE, 2021. Conference paper, Published paper (Refereed)
Abstract [en]

In recent years, with the development of computation capability in devices, companies are eager to investigate and utilize suitable ML/DL methods to improve their service quality. However, with the traditional learning strategy, companies need to first build up a powerful data center to collect and analyze data from the edge and then perform centralized model training, which turns out to be inefficient. Federated Learning has been introduced to solve this challenge. Because of characteristics such as model-only exchange and parallel training, the technique can not only preserve user data privacy but also accelerate model training. The method can easily handle real-time data generated at the edge without consuming valuable network transmission resources. In this paper, we introduce an approach to end-to-end on-device Machine Learning by utilizing Federated Learning. We validate our approach with an important industrial use case in the field of autonomous driving vehicles, wheel steering angle prediction. Our results show that Federated Learning can significantly improve the quality of local edge models and reach the same accuracy level as the traditional centralized Machine Learning approach without its negative effects. Furthermore, Federated Learning can accelerate model training and reduce communication overhead, which shows that this approach has great strength when deploying ML/DL components to various real-world embedded systems.
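The abstract refers to federated averaging in general terms (local training on each vehicle, exchange of model parameters only, server-side aggregation). The sketch below is a minimal, illustrative example of that scheme and is not the authors' implementation: the linear model, synthetic data, client count, round count, and learning rate are all hypothetical stand-ins for the end-to-end steering-angle network and vehicle data described in the paper.

```python
# Minimal FedAvg sketch (illustrative only; not the paper's implementation).
# Each "client" stands in for one vehicle holding private local data.
import numpy as np

rng = np.random.default_rng(0)

N_CLIENTS = 4          # hypothetical number of edge vehicles
ROUNDS = 20            # hypothetical number of federated rounds
LOCAL_EPOCHS = 5       # local training passes per round
LR = 0.1
N_FEATURES = 8         # stand-in for image features; the paper uses camera frames

# Synthetic per-vehicle datasets: features -> steering angle
true_w = rng.normal(size=N_FEATURES)
clients = []
for _ in range(N_CLIENTS):
    X = rng.normal(size=(200, N_FEATURES))
    y = X @ true_w + 0.05 * rng.normal(size=200)
    clients.append((X, y))

def local_update(w, X, y):
    """Run a few epochs of gradient descent on one client's local data."""
    for _ in range(LOCAL_EPOCHS):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)
        w = w - LR * grad
    return w

# Federated averaging: only model weights are exchanged, never raw data.
w_global = np.zeros(N_FEATURES)
for _ in range(ROUNDS):
    local_weights = [local_update(w_global.copy(), X, y) for X, y in clients]
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    w_global = np.average(local_weights, axis=0, weights=sizes)

mse = np.mean([np.mean((X @ w_global - y) ** 2) for X, y in clients])
print(f"global-model MSE after {ROUNDS} rounds: {mse:.4f}")
```

Running the script shows the weighted average of locally trained models converging toward the shared target, which mirrors the mechanism by which local edge models improve without centralizing the raw data.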

Place, publisher, year, edition, pages
IEEE, 2021.
Series
Proceedings of ... International Joint Conference on Neural Networks, ISSN 2161-4393, E-ISSN 2161-4407
Keywords [en]
Federated Learning, Machine learning, Heterogeneous computation, Software Engineering, Autonomous vehicles, Embedded systems, Quality of service, Autonomous driving, Computation software, End to end, Learning strategy, Model training, Service Quality, Traditional learning, Training speed, Data privacy
HSV category
Identifiers
URN: urn:nbn:se:mau:diva-48761
DOI: 10.1109/IJCNN52387.2021.9533808
ISI: 000722581704014
Scopus ID: 2-s2.0-85115839157
ISBN: 978-1-6654-3900-8 (digital)
ISBN: 978-1-6654-4597-9 (print)
OAI: oai:DiVA.org:mau-48761
DiVA, id: diva2:1623193
Conference
2021 International Joint Conference on Neural Networks (IJCNN), 18-22 July 2021, Shenzhen, China
Available from: 2021-12-28 Created: 2021-12-28 Last updated: 2022-08-05 Bibliographically approved

Open Access in DiVA

Full text is not available in DiVA

Other links

Publisher's full text
Scopus

Person

Olsson, Helena Holmström
