Table 1 Comparison between methods for tourism demand forecasting.
From: Time series transformer for tourism demand forecasting
Category | Subcategory | Representative Models | Advantages | Limitations |
---|---|---|---|---|
Time Series Models | Basic Models | Naive, AR, MA, ES, HA | - Simple and easy to implement (e.g., Naive works well for short-term stable data) - Low computational cost | - Cannot handle seasonality - Requires stationary data - Poor long-term forecasting performance |
Time Series Models | Advanced Models | SARIMA, SARIMAX (see the sketch after the table) | - Captures seasonality and trends (SARIMA) - Supports exogenous variables (SARIMAX) - High interpretability | - Complex parameter tuning - Limited ability to handle nonlinear relationships |
Econometric Models | Static Models | Linear Regression, Gravity Model | - Transparent and interpretable - Can analyse causal relationships between variables | - Assumes linearity, ignores dynamic changes - Requires high-quality data |
Econometric Models | Dynamic Models | VAR, ECM, TVP | - Captures time-varying features (e.g., changes in consumer preferences) - Suitable for multivariate interaction analysis | - Underperforms AI-based methods |
AI-based Models | Machine Learning Models | SVR, k-NN (see the sketch after the table) | - Moderately interpretable | - Requires manual feature engineering - Underperforms deep learning methods |
AI-based Models | Deep Learning Models | ANN | - Strong nonlinear fitting ability - Suitable for high-dimensional data | - Poor at handling sequential data - Black-box nature - Requires large training datasets |
AI-based Models | Deep Learning Models | RNN, LSTM, Bi-LSTM | - Handles long-term dependencies (LSTM) - Bidirectional feature capture (Bi-LSTM) - High predictive performance | - Limited to few layers and difficult to converge - Sensitive to hyperparameters - Still a black box - Requires large training datasets |
AI-based Models | Deep Learning Models | Transformer (see the sketch after the table) | - Excellent long-sequence modelling - Parallel computing efficiency - Outperforms LSTM in many fields - Interpretability via attention visualization | - Not yet applied in tourism forecasting - Requires large training datasets |
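
To make the advanced time series baseline concrete, the following is a minimal Python sketch of fitting a seasonal SARIMAX model with one exogenous regressor. The synthetic monthly arrival series, the (1,1,1)(1,1,1,12) orders, and the exchange-rate regressor are illustrative assumptions, not settings from the paper.

```python
# Minimal sketch (assumptions, not the paper's setup): SARIMAX baseline on a
# synthetic monthly tourist-arrival series with one exogenous variable.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
months = pd.date_range("2010-01", periods=120, freq="MS")
arrivals = pd.Series(
    1000 + 50 * np.sin(2 * np.pi * np.arange(120) / 12) + rng.normal(0, 20, 120),
    index=months, name="arrivals",
)
exchange_rate = pd.Series(rng.normal(1.0, 0.05, 120), index=months, name="fx")

model = SARIMAX(
    arrivals,
    exog=exchange_rate,            # SARIMAX: exogenous driver of demand
    order=(1, 1, 1),               # non-seasonal (p, d, q)
    seasonal_order=(1, 1, 1, 12),  # seasonal (P, D, Q, s) for monthly data
)
fit = model.fit(disp=False)

# Forecast 12 months ahead; future values of the exogenous variable must be supplied.
future_fx = np.full((12, 1), 1.0)  # assumed future exchange-rate values
forecast = fit.forecast(steps=12, exog=future_fx)
print(forecast.head())
```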
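
The "manual feature engineering" limitation of classical machine learning models can be illustrated with a short sketch in which lagged demand values are constructed by hand before an SVR is fitted. The lag window, kernel, and synthetic data are assumptions for illustration only.

```python
# Minimal sketch (illustrative assumptions): SVR on hand-crafted lag features.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def make_lagged(series, n_lags=12):
    """Build a supervised (X, y) set from a 1-D series using the last n_lags values."""
    X, y = [], []
    for t in range(n_lags, len(series)):
        X.append(series[t - n_lags:t])
        y.append(series[t])
    return np.array(X), np.array(y)

rng = np.random.default_rng(1)
series = 1000 + 50 * np.sin(2 * np.pi * np.arange(132) / 12) + rng.normal(0, 20, 132)

X, y = make_lagged(series, n_lags=12)       # the manual feature-engineering step
X_train, X_test = X[:-12], X[-12:]
y_train, y_test = y[:-12], y[-12:]

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X_train, y_train)
print("MAE:", np.mean(np.abs(model.predict(X_test) - y_test)))
```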
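
Finally, a minimal sketch of how a Transformer encoder could be wired for one-step-ahead demand forecasting. The dimensions, learned positional embedding, and last-position pooling are illustrative assumptions, not the architecture proposed in the paper.

```python
# Minimal sketch (assumed architecture, not the paper's model): a window of past
# demand values is encoded with self-attention and the last position is
# projected to a one-step-ahead forecast.
import torch
import torch.nn as nn

class TinyTransformerForecaster(nn.Module):
    def __init__(self, d_model=32, nhead=4, num_layers=2, window=12):
        super().__init__()
        self.input_proj = nn.Linear(1, d_model)                     # scalar demand -> d_model
        self.pos_emb = nn.Parameter(torch.zeros(window, d_model))   # learned positions
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, 1)                           # one-step-ahead output

    def forward(self, x):                                           # x: (batch, window, 1)
        h = self.input_proj(x) + self.pos_emb
        h = self.encoder(h)                                         # self-attention over the window
        return self.head(h[:, -1, :])                               # forecast from the last position

model = TinyTransformerForecaster()
past = torch.randn(8, 12, 1)                                        # 8 series, 12 past steps
print(model(past).shape)                                            # torch.Size([8, 1])
```

The attention weights inside the encoder can also be extracted and visualized, which is the interpretability advantage noted in the table.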