Balancing compression and prediction: a hybrid autoencoder-LSTM framework for cloud workloads
Umeå University, Faculty of Science and Technology, Department of Computing Science. ORCID iD: 0000-0002-8097-1143
Umeå University, Faculty of Science and Technology, Department of Computing Science.
Sun Microsystems, Neubiberg, Germany.
Umeå University, Faculty of Science and Technology, Department of Computing Science. ORCID iD: 0000-0002-2633-6798
2025 (English). In: BDCAT 2025 - IEEE/ACM International Conference on Big Data Computing, Applications and Technologies, Co-Located Conference UCC 2025, Association for Computing Machinery (ACM), 2025, article id 10. Conference paper, published paper (peer reviewed).
Abstract [en]

Accurate future workload prediction is an essential step for proactive resource allocation and efficient provisioning in cloud computing environments. Deep learning strategies have proven successful for this task, but they face challenges due to the high dimensionality of monitoring data, extensive preprocessing requirements, and computational overhead. In this paper, we propose a hybrid framework that integrates autoencoders for workload compression with Long Short-Term Memory (LSTM) networks for time-series forecasting. Unlike prior studies, our approach systematically analyzes the trade-off between compression ratio and predictive accuracy, demonstrating how dimensionality reduction can improve both scalability and robustness, thereby reducing the computational burden associated with processing massive-scale monitoring data. Experiments conducted on both synthetic and real-world datasets demonstrate that the proposed method achieves up to 60% data compression with minimal reconstruction loss, while also improving prediction accuracy compared to baseline LSTM models. We evaluate the overall performance of the framework using various metrics, including data reduction ratio, prediction accuracy, and the effects of different compression stages on predictive performance. Additionally, we quantify the computational savings in terms of CPU usage, memory footprint, and training/inference times, confirming the framework's feasibility for real-world deployment. These results underscore the potential of integrating compression and prediction to achieve scalable, accurate, and resource-efficient management of cloud workloads.
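The abstract describes a two-stage pipeline: an autoencoder compresses each high-dimensional monitoring snapshot into a low-dimensional latent vector, and an LSTM forecasts the next latent state, which can then be decoded back to metric space. The sketch below illustrates that shape of pipeline only; the layer sizes, latent dimension, and module names are illustrative assumptions, not the architecture from the paper.

```python
# Illustrative sketch of an autoencoder + LSTM forecasting pipeline
# (all dimensions and names are assumed, not taken from the paper).
import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    def __init__(self, n_metrics=64, latent=16):  # 64 -> 16: illustrative compression
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_metrics, 32), nn.ReLU(),
                                     nn.Linear(32, latent))
        self.decoder = nn.Sequential(nn.Linear(latent, 32), nn.ReLU(),
                                     nn.Linear(32, n_metrics))

    def forward(self, x):
        z = self.encoder(x)           # compressed representation
        return self.decoder(z), z     # reconstruction + latent code

class LatentForecaster(nn.Module):
    def __init__(self, latent=16, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(latent, hidden, batch_first=True)
        self.head = nn.Linear(hidden, latent)

    def forward(self, z_seq):                 # z_seq: (batch, window, latent)
        out, _ = self.lstm(z_seq)
        return self.head(out[:, -1])          # predict the next latent vector

ae, forecaster = Autoencoder(), LatentForecaster()
x = torch.randn(8, 24, 64)                    # 8 windows of 24 steps, 64 metrics each
recon, z = ae(x.reshape(-1, 64))              # compress every snapshot
z_seq = z.reshape(8, 24, 16)                  # regroup latents into sequences
z_next = forecaster(z_seq)                    # forecast next compressed state
x_next = ae.decoder(z_next)                   # decode back to metric space
```

In a sketch like this, the autoencoder would be trained on reconstruction loss and the LSTM on next-step error in latent space; the compression ratio is set by the ratio of `n_metrics` to `latent` (here 4:1).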

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2025. Article id 10
Keywords [en]
Autoencoders, Cloud computing, Data compression, Information extraction, Workload prediction
HSV category
Identifiers
URN: urn:nbn:se:umu:diva-248586
DOI: 10.1145/3773276.3774300
Scopus ID: 2-s2.0-105026855587
ISBN: 9798400722868 (digital)
OAI: oai:DiVA.org:umu-248586
DiVA, id: diva2:2031426
Conference
12th IEEE/ACM International Conference on Big Data Computing, Applications and Technologies, BDCAT 2025, Nantes, France, 1-4 December, 2025.
Research funder
Knut and Alice Wallenberg Foundation, KAW 2019.0352
eSSENCE - An eScience Collaboration
Available from: 2026-01-23 Created: 2026-01-23 Last updated: 2026-01-23 Bibliographically approved

Open Access in DiVA

fulltext (1006 kB), 17 downloads
File information
File: FULLTEXT01.pdf, File size: 1006 kB, Checksum: SHA-512
d6ee9e99f802fc4b57578239468938eaa4e8a86ccce454e11c1f4c8d01452077c8e8519abbc40d264483b2da9eb187f0668f72be0c36f99db1a96045ce7f7d5b
Type: fulltext, Mimetype: application/pdf

Other links

Publisher's full text; Scopus

Person

Kidane, Lidia; Townend, Paul; Elmroth, Erik

