"Essentially, a neural network is a way of processing information. It consists of a number of units, or nodes, arranged in layers; each unit is connected to several other units elsewhere in the network. Units can be either active or inactive: active units send signals to other units that either excite or inhibit them.... A single unit may receive competing signals from many other units, the combined effect of which will turn it either on or off. Information entering the system at one end, in the form of a pattern of activity among the units in the first layer, is processed through the network and eventually emerges as an output - the activity of the final layer. The interest of these structures lies in the fact that they can learn. By modifying the strengths of the connections between the units according to certain rules, a network can then generalise correctly to patterns it has not 'seen' before."
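The mechanism the quote describes can be illustrated with a minimal sketch: units whose net input turns them on or off, weighted connections, and a rule that adjusts connection strengths when the output is wrong. This trains a single unit (a perceptron, the simplest case of such a network) to compute logical AND; the names and parameters here are invented for illustration, not taken from any source in the quote.

```python
def step(z):
    """An active/inactive unit: it fires (1) only if its net input is positive."""
    return 1 if z > 0 else 0

def train_perceptron(samples, epochs=20, lr=1):
    w = [0, 0]  # connection strengths from the two input units
    b = 0       # bias (threshold) of the output unit
    for _ in range(epochs):
        errors = 0
        for (x1, x2), target in samples:
            pred = step(w[0] * x1 + w[1] * x2 + b)
            err = target - pred
            if err:
                # Learning rule: strengthen or weaken each connection
                # in proportion to the error and the input activity.
                w[0] += lr * err * x1
                w[1] += lr * err * x2
                b += lr * err
                errors += 1
        if errors == 0:  # every training pattern is now classified correctly
            break
    return w, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND)
print([step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in AND])  # [0, 0, 0, 1]
```

A single unit like this can only learn linearly separable patterns; the multi-layer arrangement the quote describes is what lets networks learn, and generalise over, more complex ones.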
Disclaimer: this quote appears here only to spark discussion. It is not endorsed one way or the other. Make up your own mind. Or just refresh the page for another viewpoint. From a collection assembled by the late Chris Brand.