IEEE/CAA Journal of Automatica Sinica
Citation: J. Chen, K. Liu, X. Luo, Y. Yuan, K. Sedraoui, Y. Al-Turki, and M. C. Zhou, “A state-migration particle swarm optimizer for adaptive latent factor analysis of high-dimensional and incomplete data,” IEEE/CAA J. Autom. Sinica, vol. 11, no. 11, pp. 2220–2235, Nov. 2024. doi: 10.1109/JAS.2024.124575
High-dimensional and incomplete (HDI) matrices commonly arise in a wide range of big-data-related applications. A latent factor analysis (LFA) model can perform efficient representation learning on an HDI matrix, and its hyper-parameter adaptation can be implemented through a particle swarm optimizer (PSO) to meet scalability requirements. However, a conventional PSO suffers from premature convergence, which causes accuracy loss in the resulting LFA model. To address this issue, this study merges the information of each particle’s state migration into its evolution process, following the principle of a generalized momentum method, to improve its search ability, thereby building a state-migration particle swarm optimizer (SPSO) whose theoretical convergence is rigorously proved in this study. SPSO is then incorporated into an LFA model to implement efficient hyper-parameter adaptation without accuracy loss. Experiments on six HDI matrices indicate that an SPSO-incorporated LFA model outperforms state-of-the-art LFA models in prediction accuracy for the missing data of an HDI matrix, with competitive computational efficiency. Hence, SPSO ensures efficient and reliable hyper-parameter adaptation in an LFA model, enabling practical and accurate representation learning for HDI matrices.
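The abstract describes augmenting the classical PSO velocity update with each particle’s state-migration information in the spirit of a generalized momentum method. The paper’s exact update rule is not reproduced here; the following is a minimal sketch, assuming the standard PSO equations plus a hypothetical momentum-like term `gamma * (x - x_prev)` that feeds the particle’s latest state migration back into its velocity. The objective `sphere`, the coefficient `gamma`, and all parameter values are illustrative assumptions, not the authors’ settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    # Illustrative objective to minimize (not from the paper).
    return float(np.sum(x ** 2))

def spso_sketch(f, dim=2, n_particles=10, iters=200,
                w=0.6, c1=1.5, c2=1.5, gamma=0.1):
    """Sketch of a momentum-augmented PSO; `gamma` weights the
    hypothetical state-migration term and is an assumption here."""
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))  # positions
    v = np.zeros_like(x)                            # velocities
    x_prev = x.copy()                               # previous states
    pbest = x.copy()
    pbest_f = np.array([f(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()            # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Classical PSO update plus the state-migration term,
        # which acts like momentum on the particle's last move.
        v = (w * v
             + c1 * r1 * (pbest - x)
             + c2 * r2 * (g - x)
             + gamma * (x - x_prev))
        x_prev = x.copy()
        x = x + v
        for i in range(n_particles):
            fi = f(x[i])
            if fi < pbest_f[i]:
                pbest_f[i], pbest[i] = fi, x[i].copy()
        g = pbest[np.argmin(pbest_f)].copy()
    return g, float(pbest_f.min())

best_x, best_f = spso_sketch(sphere)
```

In this sketch `x - x_prev` equals the particle’s last displacement, so the extra term effectively raises the inertia on recent motion; keeping `w + gamma` below 1 preserves damping, consistent with the convergence concern the paper’s analysis addresses.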