On limited fan-in optimal neural networks
Abstract
Because VLSI implementations do not cope well with highly interconnected nets (the area of a chip growing as the cube of the fan-in), this paper analyses the influence of limited fan-in on the size and VLSI optimality of such nets. Two different approaches will show that VLSI- and size-optimal discrete neural networks can be obtained for small (i.e., lower than linear) fan-in values. They have applications to hardware implementations of neural networks. The first approach is based on implementing a certain sub-class of Boolean functions, IF_{n,m} functions. The authors will show that this class of functions can be implemented in VLSI-optimal (i.e., minimizing AT^2) neural networks of small constant fan-ins. The second approach is based on implementing Boolean functions for which the classical Shannon's decomposition can be used. Such a solution has already been used to prove bounds on neural networks with fan-ins limited to 2. They generalize the result presented there to arbitrary fan-in, and prove that the size is minimized by small fan-in values, while relative minimum-size solutions can be obtained for fan-ins strictly lower than linear. Finally, a size-optimal neural network having small constant fan-ins will be suggested for IF_{n,m} functions.
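The Shannon decomposition mentioned in the abstract expands a Boolean function around one variable, f = (x AND f|_{x=1}) OR (NOT x AND f|_{x=0}), which bounds the cost of realizing f with gates of fan-in 2. The following Python sketch is illustrative only (it is not from the paper); it merely counts the gates produced by the naive recursive expansion to show why size grows exponentially when fan-in is fixed at 2:

```python
# Illustrative sketch (not the paper's construction): count fan-in-2
# gates used by a full recursive Shannon expansion of an arbitrary
# Boolean function on n variables.

def shannon_size(n):
    """Gates needed when f on n variables is expanded recursively
    via f = (x AND f1) OR ((NOT x) AND f0) until literals remain."""
    if n <= 1:
        return 0          # a single literal needs no gate
    # each expansion step costs 1 NOT + 2 AND + 1 OR = 4 gates,
    # plus two subcircuits on the (n-1)-variable cofactors
    return 4 + 2 * shannon_size(n - 1)

print([shannon_size(n) for n in range(1, 6)])  # → [0, 4, 12, 28, 60]
```

The closed form is 4(2^{n-1} - 1), i.e., O(2^n) size for the worst case at fan-in 2; the paper's contribution is analyzing how this trade-off changes as the fan-in bound is allowed to grow.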
 Authors:
 Beiu, V; Makaruk, H E; Draghici, S
 Los Alamos National Lab., NM (United States)
 Wayne State Univ., Detroit, MI (United States). Vision and Neural Networks Lab.
 Publication Date:
 Mar 1998
 Research Org.:
 Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
 Sponsoring Org.:
 USDOE Assistant Secretary for Human Resources and Administration, Washington, DC (United States)
 OSTI Identifier:
 654140
 Report Number(s):
 LA-UR-97-4314; CONF-971235
ON: DE98004666; TRN: AHC2DT05%%228
 DOE Contract Number:
 W-7405-ENG-36
 Resource Type:
 Conference
 Resource Relation:
 Conference: 4th Brazilian symposium on neural networks, Goiânia (Brazil), 3-5 Dec 1997; Other Information: PBD: Mar 1998
 Country of Publication:
 United States
 Language:
 English
 Subject:
 99 MATHEMATICS, COMPUTERS, INFORMATION SCIENCE, MANAGEMENT, LAW, MISCELLANEOUS; NEURAL NETWORKS; INTEGRATED CIRCUITS; FUNCTIONS; SIZE; OPTIMIZATION
Citation Formats
Beiu, V, Makaruk, H E, and Draghici, S. On limited fan-in optimal neural networks. United States: N. p., 1998.
Web.
Beiu, V, Makaruk, H E, & Draghici, S. On limited fan-in optimal neural networks. United States.
Beiu, V, Makaruk, H E, and Draghici, S. 1998.
"On limited fan-in optimal neural networks". United States. https://www.osti.gov/servlets/purl/654140.
@article{osti_654140,
title = {On limited fan-in optimal neural networks},
author = {Beiu, V and Makaruk, H E and Draghici, S},
abstractNote = {Because VLSI implementations do not cope well with highly interconnected nets (the area of a chip growing as the cube of the fan-in), this paper analyses the influence of limited fan-in on the size and VLSI optimality of such nets. Two different approaches will show that VLSI- and size-optimal discrete neural networks can be obtained for small (i.e., lower than linear) fan-in values. They have applications to hardware implementations of neural networks. The first approach is based on implementing a certain sub-class of Boolean functions, IF_{n,m} functions. The authors will show that this class of functions can be implemented in VLSI-optimal (i.e., minimizing AT^2) neural networks of small constant fan-ins. The second approach is based on implementing Boolean functions for which the classical Shannon's decomposition can be used. Such a solution has already been used to prove bounds on neural networks with fan-ins limited to 2. They generalize the result presented there to arbitrary fan-in, and prove that the size is minimized by small fan-in values, while relative minimum-size solutions can be obtained for fan-ins strictly lower than linear. Finally, a size-optimal neural network having small constant fan-ins will be suggested for IF_{n,m} functions.},
doi = {},
url = {https://www.osti.gov/biblio/654140},
journal = {},
number = {},
volume = {},
place = {United States},
year = {1998},
month = {3}
}