TOKON: TOKenization-Optimized Normalization for time series analysis with a large language model

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

While large language models have evolved rapidly toward general artificial intelligence, their ability to analyze time series data remains limited. To address this limitation, we propose a novel normalization technique that accounts for the inherent nature of tokenization. The proposed Tokenization-Optimized Normalization (TOKON) simplifies time series data by representing each element with a single token, reducing the token count by a factor of 2 to 3. Additionally, we introduce a novel prompt for time series forecasting, termed Time Series Forecasting with Care (TSFC), to further enhance forecasting performance. Experimental results demonstrate that TOKON improves root mean square error (RMSE) for multi-step forecasting by approximately 7% to 18%, depending on the dataset and prompting method.
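The abstract does not give TOKON's exact formulation, but the core idea, normalizing each series element so it serializes as a single short token rather than a multi-token decimal, can be illustrated with a minimal sketch. The function name `tokon_like_normalize`, the min-max scaling, and the integer level count are all assumptions for illustration, not the authors' published method:

```python
def tokon_like_normalize(series, num_levels=100):
    """Illustrative sketch (not the published TOKON algorithm):
    min-max scale values into small integers 0..num_levels-1 so that
    each element serializes as one short token instead of a
    multi-character float that a tokenizer splits into several tokens."""
    lo, hi = min(series), max(series)
    scale = (num_levels - 1) / (hi - lo) if hi != lo else 0.0
    return [round((x - lo) * scale) for x in series]

series = [101.37, 105.92, 99.48, 110.05, 103.11]
norm = tokon_like_normalize(series)

# Compare serialized lengths: raw floats vs. normalized integers.
raw_text = " ".join(f"{x:.2f}" for x in series)
norm_text = " ".join(str(v) for v in norm)
print(norm)                       # small integers in [0, 99]
print(len(raw_text), len(norm_text))
```

The character (and hence token) count of the normalized serialization is several times smaller, consistent with the 2-to-3-fold token reduction the abstract reports; the actual paper's mapping and denormalization step would differ in detail.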

Original language: English (US)
Title of host publication: 22nd International Conference on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology, ECTI-CON 2025
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9798331522230
DOIs
State: Published - 2025
Event: 22nd International Conference on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology, ECTI-CON 2025 - Bangkok, Thailand
Duration: May 20, 2025 – May 23, 2025

Publication series

Name: 22nd International Conference on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology, ECTI-CON 2025

Conference

Conference: 22nd International Conference on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology, ECTI-CON 2025
Country/Territory: Thailand
City: Bangkok
Period: 5/20/25 – 5/23/25

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Computer Networks and Communications
  • Computer Science Applications
  • Information Systems
  • Energy Engineering and Power Technology
  • Electrical and Electronic Engineering
