Balancing compression and prediction: a hybrid autoencoder-LSTM framework for cloud workloads
Umeå University, Faculty of Science and Technology, Department of Computing Science. ORCID iD: 0000-0002-8097-1143
Umeå University, Faculty of Science and Technology, Department of Computing Science.
Sun Microsystems, Neubiberg, Germany.
Umeå University, Faculty of Science and Technology, Department of Computing Science. ORCID iD: 0000-0002-2633-6798
2025 (English). In: BDCAT 2025 - IEEE/ACM International Conference on Big Data Computing, Applications and Technologies, Co-Located Conference UCC 2025, Association for Computing Machinery (ACM), 2025, article id 10. Conference paper, Published paper (Refereed)
Abstract [en]

Accurate future workload prediction is an essential step for proactive resource allocation and efficient provisioning in cloud computing environments. Deep learning strategies have proven successful for this task, but they face challenges due to the high dimensionality of monitoring data, extensive preprocessing requirements, and computational overhead. In this paper, we propose a hybrid framework that integrates autoencoders for workload compression with Long Short-Term Memory (LSTM) networks for time-series forecasting. Unlike prior studies, our approach systematically analyzes the trade-off between compression ratio and predictive accuracy, demonstrating how dimensionality reduction can improve both scalability and robustness, thereby reducing the computational burden associated with processing massive-scale monitoring data. Experiments conducted on both synthetic and real-world datasets demonstrate that the proposed method achieves up to 60% data compression with minimal reconstruction loss, while also improving prediction accuracy compared to baseline LSTM models. We evaluate the overall performance of the framework using various metrics, including data reduction ratio, prediction accuracy, and the effects of different compression stages on predictive performance. Additionally, we quantify the computational savings in terms of CPU usage, memory footprint, and training/inference times, confirming the framework's feasibility for real-world deployment. These results underscore the potential of integrating compression and prediction to achieve scalable, accurate, and resource-efficient management of cloud workloads.
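The compress-then-forecast pipeline the abstract describes can be illustrated with lightweight stand-ins: PCA (the optimum a linear autoencoder converges to) in place of the autoencoder, and a one-step least-squares autoregression in place of the LSTM. The synthetic trace, the dimensions (20 metrics, 8 latent dimensions, i.e. 60% compression), and the window length below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic cloud-workload trace: T timesteps x D correlated monitoring metrics
T, D, K = 500, 20, 8                     # K latent dims -> 1 - 8/20 = 60% compression
t = np.arange(T)
periods = np.linspace(20, 120, K)
latent_true = np.stack([np.sin(2 * np.pi * t / p) for p in periods], axis=1)
X = latent_true @ rng.normal(size=(K, D)) + 0.05 * rng.normal(size=(T, D))

# "Autoencoder" stand-in: PCA projection with tied encoder/decoder weights.
mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
enc = Vt[:K].T                           # D x K projection onto top-K components
Z = (X - mu) @ enc                       # compressed latent sequence
X_rec = Z @ enc.T + mu                   # reconstruction from the latent sequence
recon_rmse = float(np.sqrt(np.mean((X - X_rec) ** 2)))
compression = 1 - K / D

# Forecaster stand-in: one-step autoregression on the latent sequence,
# fitted by least squares over sliding windows of `lag` latent vectors.
lag = 5
A = np.stack([Z[i:i + lag].ravel() for i in range(T - lag)])
W, *_ = np.linalg.lstsq(A, Z[lag:], rcond=None)
pred_rmse = float(np.sqrt(np.mean((A @ W - Z[lag:]) ** 2)))

print(f"compression={compression:.0%} recon_rmse={recon_rmse:.3f} pred_rmse={pred_rmse:.3f}")
```

The point the sketch makes is the one the paper studies: forecasting on the compressed sequence operates on K-dimensional vectors instead of D-dimensional ones, so the forecaster's input size (and with it training/inference cost) shrinks with the compression ratio, while reconstruction error bounds what the compression stage can lose.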

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2025. article id 10
Keywords [en]
Autoencoders, Cloud computing, Data compression, Information extraction, Workload prediction
National subject category
Computer Systems; Computer Sciences
Identifiers
URN: urn:nbn:se:umu:diva-248586
DOI: 10.1145/3773276.3774300
Scopus ID: 2-s2.0-105026855587
ISBN: 9798400722868 (electronic)
OAI: oai:DiVA.org:umu-248586
DiVA id: diva2:2031426
Conference
12th IEEE/ACM International Conference on Big Data Computing, Applications and Technologies, BDCAT 2025, Nantes, France, 1-4 December, 2025.
Research funder
Knut and Alice Wallenberg Foundation, KAW 2019.0352
eSSENCE - An eScience Collaboration
Available from: 2026-01-23 Created: 2026-01-23 Last updated: 2026-01-23 Bibliographically approved

Open Access in DiVA

fulltext (1006 kB), 10 downloads
File information
File name: FULLTEXT01.pdf
File size: 1006 kB
Checksum: SHA-512
d6ee9e99f802fc4b57578239468938eaa4e8a86ccce454e11c1f4c8d01452077c8e8519abbc40d264483b2da9eb187f0668f72be0c36f99db1a96045ce7f7d5b
Type: fulltext
Mimetype: application/pdf

Other links
Publisher's full text
Scopus

Person
Kidane, Lidia; Townend, Paul; Elmroth, Erik
