Imagined long ago, reality today
Alan Turing, the mathematician recognized for his contribution to breaking the Nazis' Enigma machine during the Second World War and for laying the foundations of computer science, was one of the first to speak of it, back in 1950. But it would take almost another half-century for artificial intelligence to see its first real applications. And history moves fast: it is now expected that companies will see their activities radically transformed by AI in the very near future.
Machine learning vs deep learning
The term artificial intelligence in fact covers quite different realities, including two distinct technological approaches: "machine learning" and "deep learning". With the first, a computer learns some rather basic notions, which nevertheless allow it to mimic, as it were, cognitive processes close to those of the human brain. The second relies on far more complex algorithms, modelled on our neurons and synapses. The result: the machine learns partly on its own, developing capacities whose limits are still hard to imagine.
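To make the contrast concrete, here is a minimal, purely illustrative sketch in Python. The weights and inputs are invented for the example, not learned from real data: in classic machine learning, a human picks the features and a simple model maps them to an output; in deep learning, stacked layers of artificial "neurons" transform the input step by step.

```python
import math

def machine_learning_predict(x):
    # Classic machine learning: a human chooses the features, and a simple
    # model (here, a linear rule with hand-set illustrative weights) maps
    # them to a yes/no decision.
    weights = [0.7, 0.3]  # in a real system, these would be fitted to data
    score = sum(w * xi for w, xi in zip(weights, x))
    return 1 if score > 0.5 else 0

def deep_learning_predict(x):
    # Deep learning: layers of artificial "neurons" (here, one tiny hidden
    # layer with a tanh activation) build their own internal representation
    # of the input before the output layer makes the decision.
    hidden = [math.tanh(0.5 * xi + 0.1) for xi in x]  # hidden layer
    score = sum(0.8 * h for h in hidden)              # output layer
    return 1 if score > 0.5 else 0

print(machine_learning_predict([0.9, 0.2]))  # -> 1
print(deep_learning_predict([0.9, 0.2]))     # -> 1
```

In practice, the difference is one of scale and autonomy: a real deep network has thousands to billions of such neurons, and learns all of its weights itself from examples rather than having them set by hand.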