Nowadays, most people have heard of Artificial Intelligence, and the industry is awash with definitions. The one that we think makes the most sense comes from the “Artificial Intelligence and Life in 2030” report from Stanford University (2016):

“AI is a science and a set of computational technologies that are inspired by the ways people use their nervous systems and bodies to sense, learn, reason, and take action”

At Adverai, we think of AI as the set of skills a machine demonstrates when it mimics skills typical of human beings, such as planning, learning, and problem-solving. Interestingly, AI has evolved so that it not only mimics the function of the human brain: its structure is also inspired by that of the brain. In both cases, the fundamental unit of information processing is the neuron.

Biological Neurons

Within the Central Nervous System, which consists of the brain and the spinal cord, the neuron is the main processing and signalling unit. Neurons generally consist of a cell body, or soma, and two types of extension that protrude from it: the dendrites and an axon.

Figure 1: Representation of a biological neuron. The arrows indicate the stimuli flow.

Dendrites take their name from the Greek word for tree and are branched extensions that convey electrochemical stimulation received from other neurons to the cell body (the “inputs”). The axon, on the other hand, conducts electrochemical stimulation from the cell body to the dendrites of other neurons (the “outputs”). A neuron can have multiple dendrites, but never more than one axon.

Depending on its type, a neuron's output can be either excitatory or inhibitory. When a neuron transmits a signal to another neuron, that signal is added to all of the other inputs the target neuron receives from its peers. If the summed excitatory and inhibitory inputs exceed a certain threshold, an output is triggered and propagated down the axon; if the threshold is not met, no output occurs.
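This thresholding behaviour can be made concrete with a toy model in the spirit of McCulloch and Pitts. The sketch below (our own simplification, with illustrative numbers, not a biological simulation) treats excitatory inputs as positive values and inhibitory inputs as negative ones, and fires only when their sum exceeds the threshold:

```python
def fires(inputs, threshold=1.0):
    """Toy threshold unit: excitatory inputs are positive,
    inhibitory inputs are negative. The 'neuron' fires only
    when the summed stimulation exceeds the threshold."""
    return sum(inputs) > threshold

# Two excitatory inputs outweigh one inhibitory input: an output is triggered.
print(fires([0.9, 0.6, -0.4]))   # True
# A stronger inhibitory contribution keeps the sum below threshold: no output.
print(fires([0.9, -0.5, -0.4]))  # False
```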

Information Processing

Neurons never function in isolation. They are always organised in circuits that process specific kinds of information. As with the individual neuron, directionality is fundamental to information processing in neuronal circuits. Generally speaking, one or more inputs reach a specific circuit, that circuit processes the information, and finally one or more outputs are transmitted. A simple example of a circuit is the myotatic spinal reflex, or “knee-jerk” reflex. When a hammer taps the knee tendon, an input is sent to the spinal cord. A small local circuit processes the information and emits two different outputs back to the leg. The first output inhibits the flexor muscle, relaxing it; the second excites the extensor muscle, making it contract and extend the leg.

Neuroplasticity

Neuroplasticity is the ability of the brain to adapt to environmental changes. As discussed above, neurons are part of local circuits, and they can redefine the role they perform in those circuits by adapting their form and/or their function. New connections, or synapses, can form between neurons both at the dendrites and at the axon terminals; existing connections can be reinforced, and unused connections can be removed.

Figure 2: A small local circuit that processes one or more inputs and produces one or more outputs.

Neuroplasticity occurs as neurons respond to the stimuli they receive from other neurons. The trigger can be spontaneous activity, as when activity before eye opening helps form the topographic map in the primary visual cortex of the young brain, or external factors, such as learning how to play the guitar. Neuroplasticity mediates the acquisition of knowledge and skills, and is therefore fundamental both to the normal development of the brain and to defining who we are as individuals. Learning is achieved by tweaking pre-established representations from memory. In other words, it builds on information that is already stored and re-adapts it to the current need.

If we already know how to play the guitar, we do not have to start the learning process from the beginning when learning to play the bass.

Artificial Neurons

An Artificial Neural Network (ANN) is a computational model of a biological neural network, designed to approximate the functioning of the human brain. As in the brain, the main processing unit of an ANN is a neuron, often called a node. A node receives weighted inputs from other nodes, much as a biological neuron receives stimulation through its dendrites. The node sums its inputs and applies an activation function to the sum in order to produce its output. The purpose of the activation function is to reshape the summed inputs with a nonlinear function. This matters because most real-life data is nonlinear, and nodes must be able to learn to represent real-life patterns.

Figure 3: Representation of a node. The darker the arrow the higher the weight of the input.
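A minimal sketch of such a node, written in Python with NumPy (the input values, weights, and choice of sigmoid activation are our own illustrative assumptions, not a fixed recipe), looks like this:

```python
import numpy as np

def sigmoid(z):
    """Nonlinear activation: squashes any real number into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def node_output(inputs, weights, bias):
    """An artificial node: weight each input, sum them up, then
    apply the activation function to produce the output."""
    return sigmoid(np.dot(inputs, weights) + bias)

# Three inputs with different weights (cf. the darker arrows in Figure 3).
print(node_output(np.array([0.5, 0.1, 0.9]),
                  np.array([0.8, 0.2, 0.6]),
                  bias=-0.5))
```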

As with neurons in biological networks, nodes never function in isolation; they are organised in ANNs. The simplest example of an ANN is the feedforward neural network, in which nodes are organised in interconnected layers and information flows forward from each layer to the next. There are three types of layer, each with its own type of node (a sketch of the full forward pass follows this list):

Input Layer – External data is fed to the input layer, whose role is to pass this information on to the hidden layer.

Hidden Layer – The hidden layer receives its data from the input layer and processes the information. Whereas there is always exactly one input layer, a network can have zero, one, or multiple hidden layers.

Output Layer – The output layer gathers the processed information from the hidden layer, performs the final computations, and feeds the result out to the “real world”.
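To make the three layer types concrete, here is a minimal forward pass through a network with one hidden layer, sketched in NumPy. The layer sizes, random weights, and sigmoid activation are our own illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Weight matrices for a network with 3 input, 4 hidden, and 2 output nodes.
W_hidden = rng.standard_normal((3, 4))
W_output = rng.standard_normal((4, 2))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    """Information flows strictly forward: input -> hidden -> output."""
    hidden = sigmoid(x @ W_hidden)     # hidden layer processes the inputs
    return sigmoid(hidden @ W_output)  # output layer produces the result

# The input layer simply passes the external data into the network.
print(forward(np.array([0.2, 0.7, 0.1])))
```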

Learning for Artificial Neural Networks

Unlike biological networks, ANNs are generally trained from scratch once the desired structure of the network has been defined. At the beginning of the training process, the weights of the connections among nodes are randomly initialised.

During this phase, the neural network can be seen as a structured blank slate. Next, training data is streamed through the nodes. The output is measured, compared with the desired output, and an error is calculated. An optimisation algorithm then adjusts the connection weights among the nodes to minimise that error. The process repeats over and over, reducing the error each time. A low error indicates that the neural network has learned to represent the patterns in the original data.
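The loop below sketches this cycle for a single node, using plain gradient descent on a mean squared error. The toy dataset, learning rate, and number of iterations are assumptions we have chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy training data: two inputs per example, each with a known response.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([0.0, 1.0, 1.0, 1.0])   # a simple OR-like pattern

weights = rng.standard_normal(2)     # random initialisation: a "blank slate"
bias = 0.0
lr = 0.5                             # learning rate (an assumed value)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(2000):
    out = sigmoid(X @ weights + bias)       # measure the output
    error = out - y                         # compare with the desired output
    grad = error * out * (1.0 - out)        # gradient of the squared error
    weights -= lr * (X.T @ grad) / len(X)   # adjust the connection weights
    bias -= lr * grad.mean()

print(np.round(sigmoid(X @ weights + bias), 2))  # approaches [0, 1, 1, 1]
```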

Figure 4: A feedforward fully connected artificial neural network.

However, an ANN can also learn from pre-existing representations or, more simply put, can be built on a pre-trained network. This process is termed fine-tuning, and it likewise consists of adjusting the weights among the nodes through optimisation algorithms.
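Continuing the sketch above, fine-tuning amounts to swapping the random initialisation for weights copied from a previously trained network, often with a smaller learning rate so that existing knowledge is only nudged. The values below are hypothetical:

```python
import numpy as np

# Instead of a random "blank slate", start from pre-trained weights...
pretrained_weights = np.array([2.1, 1.9])  # hypothetical values from a prior task
pretrained_bias = -0.8

weights = pretrained_weights.copy()
bias = pretrained_bias
lr = 0.05   # a smaller learning rate: nudge, rather than overwrite, the weights

# ...then run the same optimisation loop as before on the new task's data.
```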

An ANN “learns” by updating the weights among its nodes. The network is fed data with known responses, and it modifies its weights until it is able to reproduce the desired outcomes. The greater the range and variety of the input data, the more accurately a network can represent specific patterns.

As an example, imagine two ANNs that have been trained to recognise cats in photographs. A network trained on thousands of images of the same cat will perform far worse than a network trained on ten thousand images of a hundred different cats. The broader its exposure to information, the better a network will generalise.

Biology vs Machine

While similar with respect to structure, functional units, and learning, ANNs are not yet capable of mimicking the human brain on many classes of complex tasks. Biological neural networks do a better job of generalising: they have more complex, dynamic architectures and access to far richer data. This allows them to move into different domains quickly via minor weight adjustments and to be more efficient at problem-solving.

The structure of ANNs is simpler and static. Unlike biological networks, they are unable to modify their topology in response to external stimuli. Furthermore, ANNs are orders of magnitude smaller in their number of units, and their learning algorithms are comparatively naïve. They are unable to work on heterogeneous tasks simultaneously: a network that learns how to count is not able to recite the alphabet.

Final Thought

Software, though, is not the only direction in which AI is evolving. Today, existing chips of the kind found in ordinary computers are adapted to fit the demands of AI. In the future, AI is likely to run on special hardware chips that also mimic the structure of the human brain. This cutting-edge area of research, known as “neuromorphic computing”, aims to produce chips designed for AI that run tasks faster and with reportedly up to 1,000 times less energy.

It is hard to foresee how AI will develop in the coming years. Nevertheless, it will certainly continue to approximate the human brain ever more closely, even as it is adapted to answer different questions.

To understand how Adverai can help you create the meta-understanding of marketing, explore our website or get in touch by email.

Machine Learning Engineer at Adverai
