Show simple item record

dc.contributor.author  Dengiz, B.
dc.contributor.author  Alabas-Uslu, C.
dc.contributor.author  Dengiz, O.
dc.date.accessioned  2023-04-12T07:52:09Z
dc.date.available  2023-04-12T07:52:09Z
dc.date.issued  2009
dc.identifier.issn  0160-5682  en_US
dc.identifier.uri  http://hdl.handle.net/11727/8765
dc.description.abstract  The most widely used training algorithm for neural networks (NNs) is back propagation (BP), a gradient-based technique that requires significant computational effort. Metaheuristic search techniques such as genetic algorithms, tabu search (TS) and simulated annealing have recently been used to cope with major shortcomings of BP, such as the tendency to converge to a local optimum and a slow convergence rate. In this paper, an efficient TS algorithm employing different strategies to balance intensification and diversification is proposed for the training of NNs. The proposed algorithm is compared with other metaheuristic techniques from the literature on published test problems, and is found to outperform them in the majority of the test cases.  en_US
dc.language.iso  eng  en_US
dc.relation.isversionof  10.1057/palgrave.jors.2602535  en_US
dc.rights  info:eu-repo/semantics/closedAccess  en_US
dc.subject  neural networks  en_US
dc.subject  supervised training  en_US
dc.subject  heuristics  en_US
dc.subject  tabu search  en_US
dc.subject  simulated annealing  en_US
dc.subject  genetic algorithms  en_US
dc.title  A Tabu Search Algorithm for the Training of Neural Networks  en_US
dc.type  article  en_US
dc.relation.journal  JOURNAL OF THE OPERATIONAL RESEARCH SOCIETY  en_US
dc.identifier.volume  60  en_US
dc.identifier.issue  2  en_US
dc.identifier.startpage  282  en_US
dc.identifier.endpage  291  en_US
dc.identifier.wos  000262581600014  en_US
dc.identifier.scopus  2-s2.0-58449100709  en_US
dc.relation.publicationcategory  Article - International Peer-Reviewed Journal  en_US


Files in this item:

Files  Size  Format  View

There are no files associated with this item.

This item appears in the following collection(s).
