What networks filter data using autofilter?


The last major category of application is data filtering. An early network, the MADALINE, belongs in this category: it removed echoes from phone lines through a dynamic echo-cancellation circuit. More recent work has enabled modems to operate reliably at 4800 and 9600 bits per second through dynamic equalization techniques. Both of these applications use neural networks that were incorporated into special-purpose chips.
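The adaptive element behind these chips is the Widrow-Hoff LMS (least-mean-squares) rule on which the MADALINE's units were based. A minimal sketch of echo cancellation with LMS, where the three-tap `echo_path` and step size `mu` are illustrative assumptions for simulation, not values from the original systems:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated far-end signal and a fixed echo path (unknown to the canceller).
far_end = rng.standard_normal(5000)
echo_path = np.array([0.6, 0.3, 0.1])     # illustrative echo taps

w = np.zeros(3)        # adaptive filter taps, same length as the echo path
mu = 0.05              # LMS step size (illustrative)
errors = []

for n in range(2, len(far_end)):
    x_win = far_end[n - 2:n + 1][::-1]    # most recent three input samples
    d = echo_path @ x_win                 # echo heard at the near end
    y = w @ x_win                         # canceller's echo estimate
    e = d - y                             # residual echo after cancellation
    w += mu * e * x_win                   # Widrow-Hoff LMS weight update
    errors.append(e * e)

# After adaptation, the filter taps approach the true echo path and the
# residual echo shrinks toward zero.
```

The same loop, with the desired signal taken from a training sequence instead of an echo, is the dynamic equalization a modem performs on a distorting phone line.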

Recirculation
Recirculation networks were introduced by Geoffrey Hinton and James McClelland as a biologically plausible alternative to back-propagation networks. In a back-propagation network, errors are passed backwards through the same connections that are used in the feedforward mechanism with an additional scaling by the derivative of the feedforward transfer function. This makes the back-propagation algorithm difficult to implement in electronic hardware.

In a recirculation network, data is processed in one direction only and learning is done using only local knowledge. In particular, that knowledge comes from the state of the processing element and the input value on the particular connection being adapted. Recirculation networks use unsupervised learning, so no desired output vector needs to be presented at the output layer.

The network is auto-associative: there are the same number of outputs as inputs. The network has two layers between the input and output layers, called the visible and hidden layers. The purpose of the learning rule is to construct in the hidden layer an internal representation of the data presented at the visible layer.

An important case of this is to compress the input data by using fewer processing elements in the hidden layer. In this case, the hidden representation can be considered a compressed version of the visible representation. 

The visible and hidden layers are fully connected to each other in both directions. In addition, each element in both the hidden and visible layers is connected to a bias element. These bias connections have variable weights that learn in the same manner as the other variable weights in the network.
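The architecture just described can be set up directly. A minimal NumPy sketch, in which the layer sizes and the logistic transfer function are illustrative assumptions (the text does not fix either):

```python
import numpy as np

rng = np.random.default_rng(1)

n_visible, n_hidden = 8, 3      # fewer hidden units -> compressed representation

def sigmoid(a):
    """Logistic transfer function (an assumed choice)."""
    return 1.0 / (1.0 + np.exp(-a))

# Fully connected in both directions, each direction with its own weights,
# plus a variable bias weight for every hidden and visible element.
W = rng.normal(scale=0.1, size=(n_hidden, n_visible))   # visible -> hidden
V = rng.normal(scale=0.1, size=(n_visible, n_hidden))   # hidden -> visible
b_h = np.zeros(n_hidden)
b_v = np.zeros(n_visible)

x = rng.random(n_visible)       # input presented at the visible layer
h = sigmoid(W @ x + b_h)        # hidden (compressed) representation
x_rec = sigmoid(V @ h + b_v)    # reconstruction back on the visible layer
```

With eight visible elements and three hidden elements, `h` is the compressed version of the visible representation described above.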

[Figure: An Example Recirculation Network]
The learning process for this network resembles the bi-directional associative memory technique. The input data is presented to the visible layer and passed on to the hidden layer. The hidden layer passes the incoming data back to the visible layer, which in turn passes the result back to the hidden layer and beyond to the output layer. It is on this second pass through the hidden layer that learning occurs. In this manner the input data is recirculated through the network architecture.

During training, the output of the hidden layer on the first pass is the encoded version of the input vector. The output of the visible layer on the next pass is the reconstruction of the original input vector from the encoded vector on the hidden layer. The aim of learning is to reduce the error between the reconstructed vector and the input vector. This error is also reflected in the difference between the outputs of the hidden layer on the first and final passes, since a good reconstruction means the same values are passed to the hidden layer both times around.

Learning also seeks to reduce the reconstruction error at the hidden layer. In most applications of the network, an input signal is smoothed by compressing and then reconstructing the input vector on the output layer. The network acts as a lowpass filter whose cutoff is controlled by the number of hidden nodes.
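The recirculation passes and their purely local updates can be put together in a short training loop. A sketch under stated assumptions: the logistic transfer function, the layer sizes, the learning rate `eps`, and the regression parameter `lam` (which mixes the original input back into the reconstructed visible state, as in Hinton and McClelland's formulation) are all illustrative choices, not values given in the text:

```python
import numpy as np

rng = np.random.default_rng(2)
n_visible, n_hidden = 8, 3
eps, lam = 0.1, 0.75            # learning rate and regression parameter (illustrative)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

W = rng.normal(scale=0.1, size=(n_hidden, n_visible))   # visible -> hidden
V = rng.normal(scale=0.1, size=(n_visible, n_hidden))   # hidden -> visible
b_h, b_v = np.zeros(n_hidden), np.zeros(n_visible)

data = rng.random((20, n_visible))      # unsupervised: inputs only, no targets

def reconstruct(X):
    """Encode on the hidden layer, then decode back on the visible layer."""
    H = sigmoid(W @ X.T + b_h[:, None])
    return sigmoid(V @ H + b_v[:, None]).T

err_before = np.mean((data - reconstruct(data)) ** 2)

for epoch in range(500):
    for x0 in data:
        h1 = sigmoid(W @ x0 + b_h)                        # first pass: encoding
        x2 = lam * x0 + (1 - lam) * sigmoid(V @ h1 + b_v) # regressed reconstruction
        h3 = sigmoid(W @ x2 + b_h)                        # second pass: learning here
        # Local updates: each weight uses only the activities at its two ends.
        V += eps * np.outer(x0 - x2, h1)
        b_v += eps * (x0 - x2)
        W += eps * np.outer(h1 - h3, x2)
        b_h += eps * (h1 - h3)

err_after = np.mean((data - reconstruct(data)) ** 2)
```

No error is propagated backwards through the feedforward connections: the visible-to-hidden weights learn from the difference between the first-pass and second-pass hidden outputs, and the hidden-to-visible weights from the difference between the input and its reconstruction, which is what makes the scheme attractive for hardware.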

Beyond this filling of niches, neural network research is progressing in other, more promising application areas. The next section of this report surveys some of these areas and briefly details the current work, to suggest to the reader where neural networks might offer solutions: language processing, character recognition, image compression, and pattern recognition, among others.
