When constants are important
In this paper the author discusses several complexity aspects of neural networks, commonly known as the curse of dimensionality. The focus is on: (1) size complexity and depth-size tradeoffs; (2) the complexity of learning; and (3) precision and limited interconnectivity. Results have been obtained for each of these problems treated separately, but little is known about the links among them. The author starts by presenting known results and tries to establish connections between them. These show that very difficult problems arise, namely exponential growth in space (i.e., precision and size) and/or time (i.e., learning and depth), when resorting to neural networks for solving general problems. The paper presents a solution for lowering some of the constants by exploiting the depth-size tradeoff.
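As a minimal illustration of the depth-size tradeoff mentioned in the abstract (not taken from the paper itself), the following Python sketch computes n-bit parity two ways: with a log-depth tree of 2-input XOR gates, and with the textbook depth-2 threshold-gate construction. The function names and the gate/depth bookkeeping are illustrative assumptions, not the paper's constructions.

```python
def parity_tree(bits):
    """Parity via a binary tree of 2-input XOR gates:
    n - 1 gates, depth ~ceil(log2(n)), fan-in 2."""
    level = list(bits)
    gates, depth = 0, 0
    while len(level) > 1:
        nxt = []
        for i in range(0, len(level) - 1, 2):
            nxt.append(level[i] ^ level[i + 1])  # one 2-input XOR gate
            gates += 1
        if len(level) % 2:      # odd element carries over unchanged
            nxt.append(level[-1])
        level = nxt
        depth += 1
    return level[0], gates, depth

def parity_threshold_depth2(bits):
    """Parity via the classic depth-2 threshold construction:
    n hidden units h_k = [sum(bits) >= k], each with fan-in n,
    plus one output unit with alternating +/-1 weights."""
    n = len(bits)
    s = sum(bits)
    hidden = [1 if s >= k else 0 for k in range(1, n + 1)]
    # alternating weights +1, -1, +1, ... leave 1 iff s is odd
    out = sum((-1) ** (k - 1) * h for k, h in enumerate(hidden, start=1))
    return (1 if out >= 1 else 0), n + 1, 2

bits = [1, 0, 1, 1, 0, 1, 0, 1]        # five ones, so parity is 1
print(parity_tree(bits))               # (1, 7, 3): depth log2(n), fan-in 2
print(parity_threshold_depth2(bits))   # (1, 9, 2): depth 2, fan-in n
```

The comparison shows the flavor of the tradeoff: the shallow network needs only two layers but pays with fan-in n (and, in weighted variants, higher weight precision), while the deep tree keeps fan-in 2 at the cost of about log2(n) layers. Constant-level effects of this kind are what the paper analyzes.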
Year of publication: 2008-02-04
Authors: Beiu, V.
Subject: mathematics, computers, information science, management, law, miscellaneous; NEURAL NETWORKS; LEARNING; RESEARCH PROGRAMS; SIZE; WEIGHTING FUNCTIONS; ALGORITHMS
freely available
Similar items by subject
- Constrained blackbox optimization: The SEARCH perspective / Hanagandi, V., (2009)
- Characterization of nonlinear dynamic systems using artificial neural networks / Urbina, A., (2010)
- Benjamin, A.S., (2010)