This book presents the development of single- and multi-layer fractional-order neural networks that incorporate fractional-order activation functions derived using fractional-order derivatives. Activation functions are essential in neural networks because they introduce nonlinearity, enabling models to learn complex patterns in data. However, traditional activation functions suffer from limitations such as non-differentiability, vanishing gradients, and inactive neurons at negative inputs, which can degrade network performance, especially for tasks involving intricate nonlinear dynamics. To address these issues, fractional-order derivatives from fractional calculus have been proposed; such derivatives can model complex systems with non-local or non-Markovian behavior.

The aim is to improve wind power prediction accuracy using datasets from the Texas wind turbine and the Jeju Island wind farm under various scenarios. The book explores the advantages of fractional-order activation functions in terms of robustness, faster convergence, and greater flexibility in hyperparameter tuning. It includes a comparative analysis of single- and multi-layer fractional-order neural networks against conventional neural networks, assessing their performance with metrics such as mean square error and the coefficient of determination. It also examines how imputing missing data with machine learning models affects network performance.

The book demonstrates the potential of fractional-order activation functions to enhance neural network models, particularly for predicting chaotic time series. The findings suggest that fractional-order activation functions can significantly improve accuracy and performance, underscoring the importance of advancing activation function design in neural network analysis. The book is also a valuable teaching and learning resource for undergraduate and postgraduate students conducting research in this field.
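For readers who want a concrete starting point, the sketch below shows one possible construction of a fractional-order activation: taking the Caputo fractional derivative of order alpha (0 <= alpha < 1) of the ordinary ReLU, which for positive inputs evaluates to x^(1-alpha)/Gamma(2-alpha) and reduces to ReLU at alpha = 0. This is an illustration only, not the book's own formulation; the names used here (fractional_relu, alpha, the toy weights and inputs) are assumptions and do not come from the book's datasets or models.

    import numpy as np
    from scipy.special import gamma

    def fractional_relu(x, alpha=0.5):
        # Illustrative fractional-order activation: the Caputo fractional
        # derivative of order alpha (0 <= alpha < 1) of ReLU f(x) = max(x, 0).
        # For x > 0 this is x**(1 - alpha) / Gamma(2 - alpha); at alpha = 0
        # it reduces to the ordinary ReLU.
        x = np.asarray(x, dtype=float)
        return np.power(np.maximum(x, 0.0), 1.0 - alpha) / gamma(2.0 - alpha)

    # Minimal single-layer forward pass using the fractional activation.
    # The random inputs, weights, and bias below are placeholders, not the
    # wind power datasets discussed in the book.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(4, 3))   # 4 samples, 3 features
    W = rng.normal(size=(3, 1))   # single output neuron
    b = np.zeros(1)
    y_hat = fractional_relu(X @ W + b, alpha=0.7)

Because the exponent 1 - alpha flattens the response for large positive inputs, the order alpha acts as an extra tunable hyperparameter alongside the usual weights, which is one way to read the book's claim of greater flexibility in hyperparameter tuning.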