Table 3 Parameter settings of the various models.
From: The development of CC-TF-BiGRU model for enhancing accuracy in photovoltaic power forecasting
| Model | Label | Parameter setting |
|---|---|---|
| CC-BP | #1 | net.trainParam.goal = 0.0001; net.trainParam.lr = 0.001; net.trainParam.epochs = 500 |
| CC-ELM | #2 | Input layer: 3 nodes; hidden layer: 1 node; output layer: 30 nodes |
| CC-LSTM | #3 | Hidden layer 1: 15 nodes; hidden layer 2: 18 nodes |
| CC-Transformer | #4 | sequence_length = 10; batch_size = 64; feature_size = 250; num_layers = 1; nhead = 10; num_epochs = 100 |
| CC-Informer | #5 | features = MS; seq_len = 384; label_len = 192; pred_len = 96; enc_in = 8; dec_in = 8; c_out = 8; d_model = 512; n_heads = 8; learning_rate = 0.0001; loss = mse |
| CC-XGBoost | #6 | max_depth = 4; learning_rate = 0.05 |
| CC-BiGRU | #7 | Hidden layer 1: 10 nodes; hidden layer 2: 20 nodes |
| CC-GBDT | #8 | n_estimators = 10; learning_rate = 0.001 |
| CC-GBDT-BiGRU | #9 | n_estimators = 10; learning_rate = 0.001; hidden layer 1: 10 nodes; hidden layer 2: 20 nodes; \(\rho = 0.3\) |
| CC-TF-BiGRU | #10 | n_estimators = 10; learning_rate = 0.001; hidden layer 1: 10 nodes; hidden layer 2: 20 nodes |
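To make the BiGRU rows (#7, #9, #10) concrete, the sketch below shows one way the two-hidden-layer bidirectional GRU (10 nodes in hidden layer 1, 20 nodes in hidden layer 2) could be assembled in PyTorch. This is a minimal illustration, not the paper's implementation: the input feature count, sequence length, and single-output forecasting head are assumptions added here for a self-contained example.

```python
import torch
import torch.nn as nn


class StackedBiGRU(nn.Module):
    """Illustrative two-layer bidirectional GRU matching the hidden-layer
    sizes in Table 3 rows #7/#9/#10 (layer 1: 10 nodes, layer 2: 20 nodes).
    input_size and the linear output head are assumptions, not from the paper."""

    def __init__(self, input_size: int = 8, hidden1: int = 10, hidden2: int = 20):
        super().__init__()
        # First BiGRU layer: input features -> 10 hidden units per direction
        self.gru1 = nn.GRU(input_size, hidden1, batch_first=True, bidirectional=True)
        # Second BiGRU layer consumes the concatenated forward/backward outputs (2 * hidden1)
        self.gru2 = nn.GRU(2 * hidden1, hidden2, batch_first=True, bidirectional=True)
        # Map the final time step's concatenated states to a single PV power value
        self.fc = nn.Linear(2 * hidden2, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.gru1(x)          # (batch, seq_len, 2 * hidden1)
        out, _ = self.gru2(out)        # (batch, seq_len, 2 * hidden2)
        return self.fc(out[:, -1, :])  # forecast from the last time step


# Shape check with a dummy batch: 64 samples, 10 time steps, 8 features (assumed sizes)
model = StackedBiGRU()
y_hat = model(torch.randn(64, 10, 8))
print(y_hat.shape)  # torch.Size([64, 1])
```

The tree-based settings in rows #6 and #8 (e.g., max_depth, n_estimators, learning_rate) map directly onto the constructor arguments of standard XGBoost and scikit-learn gradient-boosting regressors, so they need no separate sketch.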