Microsplit: efficient splitting of microservices on edge clouds
Umeå University, Faculty of Science and Technology, Department of Computing Science (Autonomous Distributed Systems Lab). ORCID iD: 0000-0001-9249-1633
Department of Computer Science and Engineering, Chalmers University of Technology, Gothenburg, Sweden.
Cloud Systems and Platforms, Ericsson Research, Stockholm, Sweden.
Umeå University, Faculty of Science and Technology, Department of Computing Science (Autonomous Distributed Systems Lab). ORCID iD: 0000-0002-2633-6798
2022 (English). In: 2022 IEEE/ACM 7th Symposium on Edge Computing (SEC), IEEE, 2022, p. 252-264. Conference paper, Published paper (Refereed)
Abstract [en]

Edge cloud systems reduce the latency between users and applications by offloading computations to a set of small-scale computing resources deployed at the edge of the network. However, since edge resources are constrained, they can become saturated and bottlenecked under increased load, resulting in an exponential increase in response times or in failures. In this paper, we argue that an application can be split between the edge and the cloud, yielding better performance than full migration to the cloud while releasing precious resources at the edge. We model an application's internal call graph as a directed acyclic graph (DAG) and use this model to develop MicroSplit, a tool for efficiently splitting microservices between constrained edge resources and large-scale distant backend clouds. MicroSplit analyzes the dependencies between the microservices of an application and, using the Louvain method for community detection (a popular algorithm from network science), decides how to split the microservices between the constrained edge and distant data centers. We test MicroSplit with four microservice-based applications in various realistic cloud-edge settings. Our results show that MicroSplit migrates up to 60% of an application's microservices with only a slight increase in mean response time compared to running entirely on the edge, and with a latency reduction of up to 800% compared to migrating the entire application to the cloud. Compared to other methods from the state of the art, MicroSplit reduces the total number of services on the edge by up to five times, with minimal reduction in response times.
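
As a rough illustration of the approach described in the abstract (not the authors' implementation), the sketch below builds a small, made-up call graph of microservices, runs Louvain community detection on it with networkx, and applies a hypothetical placement rule that keeps the community containing the user-facing entry point on the edge and offloads the remaining communities to the cloud. The example services, call weights, placement rule, and the use of networkx's louvain_communities are all assumptions; MicroSplit's actual splitting criteria are described in the paper itself.

# Illustrative sketch only (assumptions, not the MicroSplit implementation):
# model a microservice call graph as a weighted DAG, detect communities of
# tightly coupled services with the Louvain method, and place whole
# communities either on the edge or in the cloud.
import networkx as nx
from networkx.algorithms.community import louvain_communities

# Hypothetical call graph: nodes are microservices, weights are call rates.
calls = [
    ("frontend", "catalogue", 120), ("catalogue", "catalogue-db", 120),
    ("frontend", "cart", 80), ("cart", "cart-db", 80),
    ("frontend", "orders", 30), ("orders", "payment", 30),
    ("orders", "shipping", 30), ("orders", "orders-db", 30),
]
call_dag = nx.DiGraph()
call_dag.add_weighted_edges_from(calls)

# Louvain is classically defined on undirected graphs, so collapse the call
# directions but keep the weights that express coupling between services.
communities = louvain_communities(call_dag.to_undirected(), weight="weight", seed=0)

# Hypothetical placement rule: the community containing the user-facing entry
# point stays on the constrained edge; every other community is offloaded.
ENTRY_POINT = "frontend"
edge_services, cloud_services = set(), set()
for community in communities:
    if ENTRY_POINT in community:
        edge_services |= community
    else:
        cloud_services |= community

print("edge :", sorted(edge_services))
print("cloud:", sorted(cloud_services))

In a real deployment the call graph and its weights would come from runtime traces (the paper's keywords suggest a service mesh as the data source), and the placement step would also have to respect the edge node's capacity; this sketch omits both.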

Place, publisher, year, edition, pages
IEEE, 2022. p. 252-264
Keywords [en]
Edge clouds, Microservices, Service mesh, Louvain community detection
National Category
Computer Sciences
Research subject
Computer Systems
Identifiers
URN: urn:nbn:se:umu:diva-202481
DOI: 10.1109/SEC54971.2022.00027
ISI: 000918607200019
Scopus ID: 2-s2.0-85146644109
ISBN: 978-1-6654-8611-8 (electronic)
ISBN: 978-1-6654-8612-5 (print)
OAI: oai:DiVA.org:umu-202481
DiVA, id: diva2:1727375
Conference
IEEE/ACM 7th Symposium on Edge Computing (SEC), Seattle, WA, USA, December 5-8, 2022
Available from: 2023-01-16 Created: 2023-01-16 Last updated: 2024-04-08. Bibliographically approved
In thesis
1. Edge orchestration for latency-sensitive applications
2024 (English). Doctoral thesis, comprehensive summary (Other academic)
Alternative title [sv]
Orkestrering av distribuerade resurser för latenskänsliga applikationer
Abstract [en]

The emerging edge computing infrastructure provides distributed and heterogeneous resources closer to where data is generated and where end-users are located, thereby significantly reducing latency. With the recent advances in telecommunication systems, software architecture, and machine learning, there is a noticeable increase in applications that require processing within tight latency constraints, i.e., latency-sensitive applications. For instance, numerous video analytics applications, such as traffic control systems, necessitate real-time processing capabilities. Orchestrating such applications at the edge offers numerous advantages, including lower latency, optimized bandwidth utilization, and enhanced scalability. However, despite this potential, effectively managing such latency-sensitive applications at the edge poses several challenges, such as constrained compute resources, that hold back the full promise of edge computing.

This thesis proposes approaches to efficiently deploy latency-sensitive applications on the edge infrastructure. It partly addresses general applications with microservice architectures and partly addresses the increasingly important video analytics applications for the edge. To do so, this thesis proposes various application- and system-level solutions aiming to efficiently utilize constrained compute capacity on the edge while meeting prescribed latency constraints. These solutions primarily focus on effective resource management approaches and on optimizing incoming workload inputs, considering the constrained compute capacity of edge resources. Additionally, the thesis explores the synergy effects of employing application- and system-level resource optimization approaches together.

The results demonstrate the effectiveness of the proposed solutions in enhancing the utilization of edge resources for latency-sensitive applications while adhering to application constraints. The proposed resource management solutions, alongside application-level optimization techniques, significantly improve resource efficiency while satisfying application requirements. Our results show that our solutions for microservice architectures significantly improve end-to-end latency by up to 800% while minimizing edge resource usage. Additionally, the results indicate that our application- and system-level optimizations for orchestrating edge resources for video analytics applications can increase the overall throughput by up to 60%.

Place, publisher, year, edition, pages
Umeå: Umeå University, 2024. p. 46
Series
UMINF, ISSN 0348-0542 ; 24.05
Keywords
Edge Computing, Resource Management, Latency-Sensitive Applications, Edge Video Analytics
National Category
Computer Sciences
Research subject
Computer Science; Computer Systems
Identifiers
urn:nbn:se:umu:diva-223021 (URN)
978-91-8070-350-5 (ISBN)
978-91-8070-351-2 (ISBN)
Public defence
2024-04-29, Hörsal UB.A.240 - Lindellhallen 4, 13:00 (English)
Opponent
Supervisors
Note

Incorrect date of publication on the posting sheet.

In publication: UMINF 24.04

Available from: 2024-04-08 Created: 2024-04-08 Last updated: 2024-04-23. Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Rahmanian, Ali; Elmroth, Erik
