S-LCG: Structured Linear Congruential Generator-Based Deterministic Algorithm for Search and Optimization

2026-05-06
Neural and Evolutionary Computing
AI summary

The authors developed a new optimization method called Structured Linear Congruential Generator (S-LCG), which uses a special number generator to explore possible solutions efficiently. Their method avoids repeating the same checks, converts numbers into multi-dimensional points to improve search coverage, and balances trying new solutions with refining good ones. They tested it on many benchmark problems and found it often gets very close to the best answer, doing better than other popular algorithms. It also works well on real engineering problems and only needs one main setting to be adjusted.

Keywords
Linear Congruential Generator, optimization algorithm, exploration-exploitation, benchmark functions, Marsaglia lattice, surrogate objective function, premature convergence, engineering design optimization, stochastic search, deterministic optimization
Authors
Ahmed Qasim Mohammed, Haider Banka, Anamika Singh
Abstract
This study presents a novel deterministic optimization algorithm based on a special variant of the Linear Congruential Generator (LCG). While conventional algorithms generally operate directly within the search space, the proposed technique follows a two-level architecture: an external loop adaptively balances exploration and exploitation, while an internal loop evaluates solutions. The method is motivated by the intrinsic structure of the generator, hence the name Structured Linear Congruential Generator (S-LCG), which enjoys a number of unique characteristics: 1) a memoryless scheme that produces non-overlapping sequences from distinct seeds, eliminating evaluation redundancy; 2) a bit-splitting representation that converts LCG states into multi-dimensional points, overcoming the Marsaglia lattice effect; 3) adaptive exploration-exploitation of the generator space, which implicitly optimizes a smooth surrogate objective function; and 4) a constant information-gathering rate that avoids premature convergence. Extensive testing on 26 benchmark functions across dimensions d = 2 to 30 shows that S-LCG comes within 1% of the global optimum in 83.3% of 138 cases (100% at d = 2, 81.2% at d = 30), while the nearest competitor, GA, achieves 75.4%. Statistical validation shows that S-LCG outperforms eight state-of-the-art binary algorithms. Its practical value is further confirmed by validation on three constrained engineering design problems. In summary, S-LCG offers a strictly reproducible optimization framework that requires tuning only one sensitive parameter.
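To make the two core ingredients concrete, here is a minimal sketch of how an LCG sequence seeded deterministically can be split bit-wise into multi-dimensional points. This is not the authors' implementation: the constants A, C, M are the standard Numerical Recipes LCG parameters, and the `bit_split` helper with its `bits_per_dim` parameter is a hypothetical stand-in for the paper's bit-splitting representation, shown only to illustrate the idea of mapping generator states into the unit hypercube.

```python
# Illustrative sketch only: the paper's actual S-LCG variant may use
# different constants and a different bit-splitting scheme.

A, C, M = 1664525, 1013904223, 2**32  # common Numerical Recipes LCG constants


def lcg_states(seed, n):
    """Yield n successive LCG states from a given seed.

    Because the recurrence is deterministic, the same seed always
    reproduces the same sequence, and distinct seeds start at distinct
    positions of the generator's cycle.
    """
    x = seed % M
    for _ in range(n):
        x = (A * x + C) % M
        yield x


def bit_split(state, d, bits_per_dim=8):
    """Split a 32-bit state into d coordinates in [0, 1).

    Consecutive groups of `bits_per_dim` bits become one coordinate each,
    a simple stand-in for the bit-splitting representation described in
    the abstract.
    """
    coords = []
    for i in range(d):
        chunk = (state >> (i * bits_per_dim)) & ((1 << bits_per_dim) - 1)
        coords.append(chunk / (1 << bits_per_dim))
    return coords


# Map a short LCG sequence to 3-D points in the unit cube.
points = [bit_split(s, d=3) for s in lcg_states(seed=42, n=5)]
for p in points:
    print(p)
```

Slicing distinct bit groups per coordinate (rather than taking successive raw LCG outputs as coordinates) is one way to avoid the Marsaglia lattice effect, in which consecutive LCG outputs fall on a small number of hyperplanes.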