Nice article!
When I hear "scientific click-bait", I can only think of the "Futurism" Facebook page, which has mastered the art of click-bait in the scientific domain.
I think I also came across their article about the 11 dimensions the brain can encode...
I think this miscommunication between the scientific world and the public is a major problem. Making people believe false information like "we only use 10% of our brain" is not very serious in general, but sometimes this misunderstanding of our progress can lead to fear.
I'm referring to all these discussions about A.I., when for now AI is in general simple regression algorithms or basic "artificial neuron" networks (and even there, an artificial neuron is often just a matrix multiplication followed by a simple function), still very far from the Terminator-like super robots people are imagining.
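To make that concrete, here is a minimal sketch (Python/NumPy, with made-up example numbers) of what an "artificial neuron" actually computes: a weighted sum, i.e. a matrix multiplication, plus a simple squashing function. Nothing sentient is involved.

```python
import numpy as np

def artificial_neuron(x, w, b):
    """One 'artificial neuron': a weighted sum of the inputs plus a bias,
    passed through a simple non-linearity (here a sigmoid)."""
    z = np.dot(w, x) + b                # the "matrix multiplication" part
    return 1.0 / (1.0 + np.exp(-z))     # squash the result into (0, 1)

# A whole layer is the same idea with a weight matrix instead of a vector.
x = np.array([0.5, -1.2, 3.0])          # made-up input features
W = np.random.randn(4, 3)               # 4 neurons, each with 3 weights
b = np.zeros(4)
layer_output = 1.0 / (1.0 + np.exp(-(W @ x + b)))
print(layer_output)
```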
Unfortunately, this is a general issue in reporting, whether it is scientific or not. It is difficult to get people interested in nuance, in incremental changes, so flashy headlines are the norm. This is a difficult problem to solve though ...
The vocabulary we use to talk about these issues is also very important: when you hear "Artificial Intelligence learns how to read", there is an implicit tendency to associate all the cognitive functions tied to "intelligence" and "reading" with it, leading people to believe that some basic sentient being has actually learned to read. This is of course completely wrong: a machine learning algorithm was able to find correlations in a big set of numbers. Very cool indeed, but there is no sense of meaning or purpose behind it.