The roots of machine and deep learning lie in the study of the human brain, but while that may have been the starting point, modern AI doesn't function like a human brain. Because modern machines are structured differently from human brains, it stands to reason that they learn and process information in different ways as well.
In a fascinating blog post, the Google Translate team describe how they use machine learning to translate between languages even when the model has never seen a direct example of that language pairing - a capability they call zero-shot translation. The system effectively maps all languages into a shared internal representation developed during training, clustering sentences with the same meaning together regardless of language - an approach to linguistics no human would ever take.
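The mechanism behind this is surprisingly simple. A minimal sketch of the idea, as the team described it (this is an illustration of the data preparation trick, not their actual code): a single model is trained on many language pairs at once, and the only change to the training data is a token prepended to each source sentence naming the desired target language. Requesting a pairing the model never saw in training is then just a matter of prepending the corresponding token.

```python
# Sketch of the target-language-token trick behind multilingual
# neural machine translation (illustrative only; the real system
# trains a large sequence-to-sequence model on this data).

def make_example(source_sentence: str, target_lang: str) -> str:
    """Prepend a target-language token, e.g. '<2es>' for Spanish."""
    return f"<2{target_lang}> {source_sentence}"

# Training data might cover English<->Japanese and English<->Korean...
train = [
    (make_example("Hello", "ja"), "こんにちは"),
    (make_example("こんにちは", "en"), "Hello"),
    (make_example("Hello", "ko"), "안녕하세요"),
    (make_example("안녕하세요", "en"), "Hello"),
]

# ...yet the same token scheme lets us *request* Japanese->Korean,
# a pairing for which no direct training example exists (zero-shot):
zero_shot_input = make_example("こんにちは", "ko")
print(zero_shot_input)  # <2ko> こんにちは
```

Because every pairing flows through one shared model, translating between unseen pairs only works if the model has learned some common internal representation of meaning - which is what makes the result so striking.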
Transfer learning of this type, combined with flexible design, has the potential to offer insights you'd never see from a human perspective. The history of science is full of discoveries made in one field that were later applied, to everyone's surprise, in a completely unrelated one. As machine learning develops, we will hopefully see similarly unexpected leaps forward.
The question the Google team themselves posed captures it best: is the system learning a common representation in which sentences with the same meaning are represented in similar ways regardless of language - in other words, an "interlingua"?