Microsplit: efficient splitting of microservices on edge clouds
Umeå University, Faculty of Science and Technology, Department of Computing Science (Autonomous Distributed Systems Lab). ORCID iD: 0000-0001-9249-1633
Department of Computer Science and Engineering, Chalmers University of Technology, Gothenburg, Sweden.
Cloud Systems and Platforms, Ericsson Research, Stockholm, Sweden.
Umeå University, Faculty of Science and Technology, Department of Computing Science (Autonomous Distributed Systems Lab). ORCID iD: 0000-0002-2633-6798
2022 (English). In: 2022 IEEE/ACM 7th Symposium on Edge Computing (SEC), IEEE, 2022, pp. 252-264. Conference paper, Published paper (Refereed)
Abstract [en]

Edge cloud systems reduce the latency between users and applications by offloading computations to a set of small-scale computing resources deployed at the edge of the network. However, since edge resources are constrained, they can become saturated and bottlenecked under increased load, resulting in an exponential increase in response times or in failures. In this paper, we argue that an application can be split between the edge and the cloud, achieving better performance than full migration to the cloud while releasing precious resources at the edge. We model an application's internal call graph as a directed acyclic graph (DAG). We use this model to develop MicroSplit, a tool for efficiently splitting microservices between constrained edge resources and large-scale distant backend clouds. MicroSplit analyzes the dependencies between the microservices of an application and, using the Louvain method for community detection (a popular algorithm from network science), decides how to split the microservices between the constrained edge and distant data centers. We test MicroSplit with four microservice-based applications in various realistic cloud-edge settings. Our results show that MicroSplit migrates up to 60% of the microservices of an application with a slight increase in mean response time compared to running on the edge, and a latency reduction of up to 800% compared to migrating the entire application to the cloud. Compared to other state-of-the-art methods, MicroSplit reduces the total number of services on the edge by up to five times, with minimal reduction in response times.
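To make the splitting idea concrete, the sketch below builds a toy microservice call graph and partitions it with Louvain community detection using NetworkX. It is a minimal illustration of the general technique named in the abstract, not the MicroSplit implementation: the service names, call weights, and the rule that keeps the entry-point community on the edge are all invented for this example.

# Hypothetical sketch of call-graph partitioning with Louvain community
# detection (NetworkX >= 2.8). Services, weights, and the placement rule
# are illustrative assumptions, not taken from the MicroSplit paper.
import networkx as nx

# Directed call graph: an edge (caller, callee, weight) where the weight
# is an illustrative call frequency between the two services.
calls = [
    ("frontend", "cart", 40),
    ("frontend", "catalog", 55),
    ("cart", "checkout", 10),
    ("checkout", "payment", 10),
    ("checkout", "shipping", 8),
    ("catalog", "recommendation", 25),
    ("recommendation", "catalog-db", 25),
]
dag = nx.DiGraph()
dag.add_weighted_edges_from(calls)

# Louvain works on undirected weighted graphs; this toy DAG has no
# reciprocal calls, so collapsing directions keeps the weights intact.
undirected = dag.to_undirected()
communities = nx.community.louvain_communities(undirected, weight="weight", seed=42)

# Toy placement rule (an assumption, not the paper's policy): keep the
# community containing the user-facing entry point at the edge and
# migrate the remaining communities to the backend cloud.
edge_group = next(c for c in communities if "frontend" in c)
cloud_groups = [c for c in communities if c is not edge_group]

print("keep at edge:", sorted(edge_group))
for i, group in enumerate(cloud_groups, start=1):
    print(f"migrate to cloud, group {i}:", sorted(group))

In this toy setting the community boundaries fall where cross-community call weights are lowest, which is what makes modularity-based clustering a plausible heuristic for deciding which groups of services can tolerate the extra edge-to-cloud latency.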

Place, publisher, year, edition, pages
IEEE, 2022. pp. 252-264
Keywords [en]
Edge clouds, Microservices, Service mesh, Louvain community detection
National subject category
Computer Sciences
Research subject
Computer Systems
Identifiers
URN: urn:nbn:se:umu:diva-202481, DOI: 10.1109/SEC54971.2022.00027, ISI: 000918607200019, Scopus ID: 2-s2.0-85146644109, ISBN: 978-1-6654-8611-8 (electronic), ISBN: 978-1-6654-8612-5 (print), OAI: oai:DiVA.org:umu-202481, DiVA id: diva2:1727375
Conference
IEEE/ACM 7th Symposium on Edge Computing (SEC), Seattle, WA, USA, December 5-8, 2022
Available from: 2023-01-16. Created: 2023-01-16. Last updated: 2024-04-08. Bibliographically approved.
Part of thesis
1. Edge orchestration for latency-sensitive applications
2024 (English). Doctoral thesis, comprehensive summary (Other academic)
Alternative title [sv]
Orkestrering av distribuerade resurser för latenskänsliga applikationer
Abstract [en]

The emerging edge computing infrastructure provides distributed and heterogeneous resources closer to where data is generated and where end-users are located, thereby significantly reducing latency. With the recent advances in telecommunication systems, software architecture, and machine learning, there is a noticeable increase in applications that require processing times within tight latency constraints, i.e., latency-sensitive applications. For instance, numerous video analytics applications, such as traffic control systems, necessitate real-time processing capabilities. Orchestrating such applications at the edge offers numerous advantages, including lower latency, optimized bandwidth utilization, and enhanced scalability. However, despite its potential, effectively managing such latency-sensitive applications at the edge poses several challenges, such as constrained compute resources, that hold back the full promise of edge computing.

This thesis proposes approaches to efficiently deploy latency-sensitive applications on the edge infrastructure. It partly addresses general applications with microservice architectures and partly addresses the increasingly important video analytics applications for the edge. To do so, this thesis proposes various application- and system-level solutions aiming to efficiently utilize constrained compute capacity on the edge while meeting prescribed latency constraints. These solutions primarily focus on effective resource management approaches and on optimizing incoming workload inputs, considering the constrained compute capacity of edge resources. Additionally, the thesis explores the synergy effects of employing application- and system-level resource optimization approaches together.

The results demonstrate the effectiveness of the proposed solutions in enhancing the utilization of edge resources for latency-sensitive applications while adhering to application constraints. The proposed resource management solutions, alongside application-level optimization techniques, significantly improve resource efficiency while satisfying application requirements. Our results show that our solutions for microservice architectures significantly improve end-to-end latency by up to 800% while minimizing edge resource usage. Additionally, the results indicate that our application- and system-level optimizations for orchestrating edge resources for video analytics applications can increase the overall throughput by up to 60%.

Place, publisher, year, edition, pages
Umeå: Umeå University, 2024. p. 46
Series
UMINF, ISSN 0348-0542 ; 24.04
Keywords
Edge Computing, Resource Management, Latency-Sensitive Applications, Edge Video Analytics
National subject category
Computer Sciences
Research subject
Computer Science; Computer Systems
Identifiers
urn:nbn:se:umu:diva-223021 (URN), 978-91-8070-350-5 (ISBN), 978-91-8070-351-2 (ISBN)
Public defence
2024-04-29, Hörsal UB.A.240 - Lindellhallen 4, 13:00 (English)
Opponent
Supervisor
Note

Incorrect date of publication on the posting sheet. 

Available from: 2024-04-08. Created: 2024-04-08. Last updated: 2024-04-08. Bibliographically approved.

Open Access in DiVA

Full text not available in DiVA

Other links

Publisher's full text | Scopus

Person

Rahmanian, Ali; Elmroth, Erik
