Microsplit: efficient splitting of microservices on edge clouds
Umeå University, Faculty of Science and Technology, Department of Computing Science (Autonomous Distributed Systems Lab). ORCID iD: 0000-0001-9249-1633
Department of Computer Science and Engineering, Chalmers University of Technology, Gothenburg, Sweden.
Cloud Systems and Platforms, Ericsson Research, Stockholm, Sweden.
Umeå University, Faculty of Science and Technology, Department of Computing Science (Autonomous Distributed Systems Lab). ORCID iD: 0000-0002-2633-6798
2022 (English). In: 2022 IEEE/ACM 7th Symposium on Edge Computing (SEC), IEEE, 2022, pp. 252-264. Conference paper, published paper (Refereed)
Abstract [en]

Edge cloud systems reduce the latency between users and applications by offloading computations to a set of small-scale computing resources deployed at the edge of the network. However, since edge resources are constrained, they can become saturated and bottlenecked under increased load, resulting in an exponential increase in response times or in failures. In this paper, we argue that an application can be split between the edge and the cloud, allowing for better performance than full migration to the cloud while releasing precious resources at the edge. We model an application's internal call graph as a directed acyclic graph (DAG) and use this model to develop MicroSplit, a tool for efficient splitting of microservices between constrained edge resources and large-scale distant backend clouds. MicroSplit analyzes the dependencies between the microservices of an application and, using the Louvain method for community detection (a popular algorithm from network science), decides how to split the microservices between the constrained edge and distant data centers. We test MicroSplit with four microservice-based applications in various realistic cloud-edge settings. Our results show that MicroSplit migrates up to 60% of the microservices of an application with a slight increase in mean response time compared to running entirely on the edge, and a latency reduction of up to 800% compared to migrating the entire application to the cloud. Compared to other methods from the state of the art, MicroSplit reduces the total number of services on the edge by up to five times, with minimal reduction in response times.
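
The abstract describes MicroSplit's core mechanism: model the application's call graph as a DAG, group tightly coupled microservices with the Louvain community-detection method, and then decide which groups stay on the edge and which move to the cloud. The sketch below illustrates that idea only; it is not the authors' implementation. The service names, call weights, the use of NetworkX's Louvain routine, and the "keep the entry point's community on the edge" placement rule are all illustrative assumptions.

```python
# Minimal sketch of call-graph partitioning with Louvain community detection.
# Assumptions (not from the paper): service names, call weights, and the
# placement rule below are hypothetical; NetworkX (>= 2.8) stands in for the
# paper's tooling.
import networkx as nx
from networkx.algorithms.community import louvain_communities

# Hypothetical microservice call graph; edge weights are observed call rates.
calls = [
    ("frontend", "cart", 120),
    ("frontend", "catalog", 300),
    ("cart", "checkout", 80),
    ("checkout", "payment", 75),
    ("catalog", "recommendation", 200),
    ("recommendation", "catalog-db", 180),
]
G = nx.DiGraph()
G.add_weighted_edges_from(calls)

# Louvain groups services that call each other heavily into the same community.
# The undirected view is used because modularity-based grouping of tightly
# coupled services does not need call direction for this sketch.
communities = louvain_communities(G.to_undirected(), weight="weight", seed=42)

# Illustrative placement rule: keep the community containing the user-facing
# entry point ("frontend") on the edge and migrate the rest to the cloud.
for i, group in enumerate(communities):
    placement = "edge" if "frontend" in group else "cloud"
    print(f"community {i} -> {placement}: {sorted(group)}")
```

Running the sketch prints each detected community with its assigned location; in MicroSplit the analogous decision would be driven by dependencies measured in a real service mesh rather than hard-coded weights.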

Place, publisher, year, edition, pages
IEEE, 2022. pp. 252-264
Keywords [en]
Edge clouds, Microservices, Service mesh, Louvain community detection
HSV category
Research subject
computer engineering
Identifiers
URN: urn:nbn:se:umu:diva-202481
DOI: 10.1109/SEC54971.2022.00027
ISI: 000918607200019
Scopus ID: 2-s2.0-85146644109
ISBN: 978-1-6654-8611-8 (electronic)
ISBN: 978-1-6654-8612-5 (print)
OAI: oai:DiVA.org:umu-202481
DiVA, id: diva2:1727375
Conference
IEEE/ACM 7th Symposium on Edge Computing (SEC), Seattle, WA, USA, December 5-8, 2022
Available from: 2023-01-16 Created: 2023-01-16 Last updated: 2024-04-08 Bibliographically approved
Part of thesis
1. Edge orchestration for latency-sensitive applications
2024 (English). Doctoral thesis, comprehensive summary (Other academic)
Alternative title [sv]
Orkestrering av distribuerade resurser för latenskänsliga applikationer
Abstract [en]

The emerging edge computing infrastructure provides distributed and heterogeneous resources closer to where data is generated and where end-users are located, thereby significantly reducing latency. With the recent advances in telecommunication systems, software architecture, and machine learning, there is a noticeable increase in applications that require processing within tight latency constraints, i.e., latency-sensitive applications. For instance, numerous video analytics applications, such as traffic control systems, necessitate real-time processing capabilities. Orchestrating such applications at the edge offers numerous advantages, including lower latency, optimized bandwidth utilization, and enhanced scalability. However, despite this potential, effectively managing latency-sensitive applications at the edge poses several challenges, such as constrained compute resources, that hold back the full promise of edge computing.

This thesis proposes approaches to efficiently deploy latency-sensitive applications on the edge infrastructure. It partly addresses general applications with microservice architectures and partly addresses the increasingly important video analytics applications for the edge. To do so, this thesis proposes various application- and system-level solutions aiming to efficiently utilize constrained compute capacity on the edge while meeting prescribed latency constraints. These solutions primarily focus on effective resource management approaches and on optimizing incoming workload inputs, considering the constrained compute capacity of edge resources. Additionally, the thesis explores the synergy effects of employing application- and system-level resource optimization approaches together.

The results demonstrate the effectiveness of the proposed solutions in enhancing the utilization of edge resources for latency-sensitive applications while adhering to application constraints. The proposed resource management solutions, alongside application-level optimization techniques, significantly improve resource efficiency while satisfying application requirements. Our results show that our solutions for microservice architectures significantly improve end-to-end latency by up to 800% while minimizing edge resource usage. Additionally, the results indicate that our application- and system-level optimizations for orchestrating edge resources for video analytics applications can increase the overall throughput by up to 60%.

Place, publisher, year, edition, pages
Umeå: Umeå University, 2024. p. 46
Series
UMINF, ISSN 0348-0542 ; 24.05
Keywords
Edge Computing, Resource Management, Latency-Sensitive Applications, Edge Video Analytics
HSV category
Research subject
computer science; computer engineering
Identifiers
URN: urn:nbn:se:umu:diva-223021
ISBN: 978-91-8070-350-5
ISBN: 978-91-8070-351-2
Public defence
2024-04-29, Hörsal UB.A.240 - Lindellhallen 4, 13:00 (English)
Opponent
Supervisors
Note

Incorrect date of publication on the posting sheet.

In publication: UMINF 24.04

Available from: 2024-04-08 Created: 2024-04-08 Last updated: 2024-04-23 Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Person

Rahmanian, Ali; Elmroth, Erik
