
A Comparative Approach to Accelerate Backpropagation Neural Network Learning using different Activation Functions

Article

Last updated: 27 Dec 2024

Subjects

-

Tags

-

Abstract

Slow convergence and long training times are still the disadvantages
often mentioned when neural networks are compared with other competing
techniques. One of the reasons for slow convergence in backpropagation learning is
the diminishing value of the derivative of the commonly used activation functions
as the node outputs approach the extreme values, namely 0 or 1. In this paper, we propose
eight activation functions to accelerate learning by reducing the number
of iterations and increasing the convergence rate. Mathematical derivations of the
errors for the output and hidden layers using these activation functions are
presented. Statistical measures are also obtained using these different activation
functions. Through the simulation results, these activation functions are analyzed,
compared, and tested. The analytical approach indicates considerable improvement
in training times and convergence performance.
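
The eight proposed activation functions are not reproduced in this abstract; as a minimal sketch of the problem the paper addresses, the snippet below shows how the derivative of the standard logistic sigmoid shrinks toward zero as a node's output approaches 0 or 1, which in turn shrinks the backpropagated error and slows weight updates.

```python
import numpy as np

def sigmoid(x):
    """Standard logistic activation."""
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(x):
    """Derivative of the logistic function: sigma(x) * (1 - sigma(x))."""
    s = sigmoid(x)
    return s * (1.0 - s)

# As the pre-activation grows in magnitude, the output saturates near 0 or 1
# and the derivative (hence the weight update) nearly vanishes.
for x in [0.0, 2.0, 5.0, 10.0]:
    print(f"x = {x:5.1f}  output = {sigmoid(x):.5f}  derivative = {sigmoid_derivative(x):.6f}")
```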

DOI

10.21608/asc.2009.158226

Keywords

Neural Networks, Neural Network Learning, Backpropagation, Activation Functions, Convergence speed

Volume

3

Article Issue

1

Related Issue

23273

Issue Date

2009-06-01

Receive Date

2021-03-21

Publish Date

2009-06-01

Page Start

83

Page End

102

Print ISSN

1687-8515

Online ISSN

2682-3578

Link

https://asc.journals.ekb.eg/article_158226.html

Detail API

https://asc.journals.ekb.eg/service?article_code=158226

Order

7

Type

Original Article

Type Code

1,549

Publication Type

Journal

Publication Title

Journal of the ACS Advances in Computer Science

Publication Link

https://asc.journals.ekb.eg/

MainTitle

A Comparative Approach to Accelerate Backpropagation Neural Network Learning using different Activation Functions

Details

Type

Article

Created At

23 Jan 2023