Wednesday, December 28, 2016

Brains - 10 deep flaws and why AI may be the fix

Every card has a number on one side and a letter on the other.
If a card has a D then it has a 3 on the other side.
What is the smallest number of cards you have to turn over to verify whether the rule holds?

 D      F       3      7
(Answer at end)
Most people get this wrong, due to a cognitive weakness we all share: confirmation bias. We look for examples that confirm our beliefs, whereas we should look for examples that disconfirm them. This, along with many other biases, is well documented by Kahneman in Thinking, Fast and Slow. Our tragedy as a species is that our cognitive apparatus, and especially our brains, evolved for purposes very different from their contemporary needs. This makes things very difficult for teachers and trainers. Our role in life is to improve the performance of that one organ, yet it remains stubbornly resistant to learning.
1. Brains need 20 years of parenting and schooling
It takes around 16 years of intensive and nurturing parenting to turn young brains into adults who can function autonomously. Years of parenting, at times fraught with conflict, while the teenage brain, as brilliantly observed by Judith Harris, becomes obsessed with peer groups. This nurturing needs to be supplemented by around 13 years of sitting in classrooms being taught by other brains, a process that is painful for all involved: pupils, parents and teachers. Increasingly, this is followed by several years at college or university, to prepare the brain for an increasingly complex world.
2. Brains are inattentive
You don't have to be a teacher or parent for long to realise how inattentive and easily distracted brains can be. Attention is a necessary condition for learning, yet it is precisely what brains struggle to sustain.
3. Fallible memories
Our memories are limited not only by the narrow channel that is working memory but also by the massive failure to shunt what we learn from working to long-term memory. And even when memories do make it into long-term memory, they are subject to further forgetting, even reconfiguration into false memories. Every recalled memory is an act of recreation and reconstitution, and therefore fallible. Without reinforcement we retain and recall very little. This makes brains very difficult to teach.
4. Brains are biased
The brain is inherently biased: not only can it be sexist and racist, it harbours dozens of cognitive biases, such as groupthink and confirmation bias, that shape and limit thought. Beyond these it has severe weaknesses and inherent tendencies: motion sickness, overeating, jet lag, phobias, social anxieties, violent tendencies, addiction, delusions and psychosis. This is not an organ that is inherently stable.
5. Brains need sleep
Our brains sleep eight hours a day; that's one third of life gone, down the drain. Cut back on this and we learn less, get more stressed, even ill. Keeping the brain awake, as torturers will attest, drives it to madness. Even when awake, brains are inattentive and prone to daydreaming. This is not an organ that takes easily to being on task.
6. Brains can’t upload and download
You cannot pass your knowledge and skills to me without a huge amount of motivated teaching and learning; AI can do this in an instant.
7. Brains can't network
Our attempts at collective learning are still clumsy, whereas networked, collective learning and intelligence is a defining feature of modern AI.
8. Brains can't multitask
This is not quite true, as brains regulate lots of bodily functions, such as breathing and balance, while doing other things. However, brains don't multitask at the level that demanding cognitive work requires. What seems like multitasking is actually rapid switching between tasks.
9. Brains degenerate and die
As it ages, the brain's performance falls and problems such as dementia and Alzheimer's occur. This degeneration varies in speed and is unpredictable. And in the end, there is that single, fatal objection: it dies. Death is a problem, as the brain cannot download its inherent and acquired knowledge or skills. It is profoundly solipsistic. Memories literally disappear. The way we deal with this is through technology that archives such acquired experience in technical media, such as print, images and now data.
10. Brains don't scale
Brains are impressive but they're stuck in our skulls and limited in size, as women would not be able to give birth if they were bigger. There are also evolutionary limits in terms of what can be supported on top of our bodies, along with heat and energy requirements. The bottom line, however, is that warm brains don't scale.
'Artificial Intelligence' has two things wrong with it: the word 'Artificial' and the word 'Intelligence'. Coined by John McCarthy in 1956, the term has survived the ups and downs of AI's fortunes, but that is not to say that these two rather odd words capture the field's reach and promise. In fact, they seem, at times, to be a liability.
'Artificial' intelligence is pejorative
Artificial suggests something not real. As a word it stands in direct opposition to what is real. It suggests something unnatural, a mere imitation. This dash of pejoration debases the concept, and it lies behind many of the dystopian attitudes people have towards AI. Rather like artificial grass or artificial limbs, AI's successes, no matter how astonishing, can feel second-rate and inferior. An even stronger pejorative suggestion is the idea that it is fake or counterfeit, the 'artificial' as something feigned or bogus. Because the word explicitly compares the abilities of man and machine, brains and computers, anthropomorphic judgements tend to sneak in. It defines the field as simply copying what humans or human brains do, whereas it tends to do things that are very different. The human brain may not be the benchmark here. Man may not be the measure of the machine.
Homo Deus
Harari, in Homo Deus (2016), proposes an argument that eliminates the artificiality of AI. Homo sapiens is, like all other living beings, an evolved entity, shaped by natural selection, which is profoundly algorithmic. These algorithms exist independently of the substrate in which they run. 2 + 2 = 4 is the same whether it is calculated on wooden blocks, the plastic beads of an abacus or the metal circuits of a calculator. It doesn't matter in which form an algorithm resides. We should conclude that there is no reason to suppose that our organic abilities will not be replicated, even surpassed. In other words, algorithmic power lies in the power of the maths to solve problems and come up with solutions, not in how accurately it mimics human abilities.
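To make that substrate-independence point concrete, here is a minimal sketch in Python (my own illustration, not Harari's) that computes the same sum on three different 'substrates': native machine arithmetic, counting wooden-block-style tokens, and pure bitwise logic. The answer is 4 in every case.

def add_integers(a, b):
    # Substrate 1: native machine arithmetic.
    return a + b

def add_blocks(a, b):
    # Substrate 2: "wooden blocks" - pool two piles of tokens and count them.
    return len(["block"] * a + ["block"] * b)

def add_circuits(a, b):
    # Substrate 3: "metal circuits" - addition built from bitwise logic alone
    # (valid for the non-negative integers used here).
    while b:
        carry = a & b
        a = a ^ b
        b = carry << 1
    return a

assert add_integers(2, 2) == add_blocks(2, 2) == add_circuits(2, 2) == 4
print("2 + 2 = 4, whatever the substrate")

The algorithm is the same in each case; only the representation differs, which is the point of the argument.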
Better than brains
A stronger argument is that there is every reason to suppose that other substrates will be better. The brain evolved for an environment in which it no longer operates. Limited in size by the need to pass through the birth canal and to be carried on top of a standard skeleton, it is suited to survival in a place and time with very different needs, and it carries severe limitations as a result.
Neophobia
One thing we do have in our favour is the fact that our brains have almost certainly evolved in tandem with our use of technology. The extraordinary explosion of activity around 40,000 years ago suggests that tools and technology played a key role in shaping our brains. However, there is one fascinating downside. It also seems that neophobia (fear of the new) increases with age, which means antipathy towards new technology and AI is likely to be a feature of the brain's defence mechanisms.
Neophobia is not new
Neophobia, fear of the new, is not new. No doubt some wag in some cave was telling their kids to 'put those axes away, they'll be the death of you'. Ever since Socrates, who thought writing an ill-advised invention, people have reacted with predictable horror to every piece of new technology that hits the street. It happened with writing, parchments, books, printing, newspapers, coffee houses, letters, the telegraph, the telephone, radio, film, TV, railways, cars, jazz, rock n' roll, rap, computers, the internet, social media and now artificial intelligence. The idea that some new invention rots the mind, devalues the culture, even destroys civilisation is an age-old phenomenon.
Pinker on neophobia
Stephen Pinker sees neophobia as the product of superficial thinking about cognition, thinking that conflates "content with process". The mind and human nature are not that malleable, and are obviously not subject to any real evolutionary change in such a short period of time. Sure, the mind is plastic, but it is not a blank slate waiting to be filled with content from the web. It is far more likely that the neophobes themselves are unthinking victims of the familiar, destructive syndrome of neophobia.
Neophobia as a medical and social condition
Interestingly, the medical evidence suggests that neophobia, as a medical condition, is common in the very young, especially with new foods. It fades throughout childhood and flips in adolescence when the new is seen as risky and exciting. Then it gradually returns, especially during parenthood, and into our old age, when we develop deeply established habits or expectations that we may see as being under threat.
Tool of our tools
Neophobia exaggerates the role of technology. Have we 'become the tool of our tools', as Thoreau would have us believe? There is something in this, as recent research suggests that tool production in the early evolution of our species played a significant role in cognitive development and in our adaptive advantage as a species. So far, so good. But far from shaping minds, the more recent internet is largely being shaped by minds. Social media has flourished in response to a human need for user-generated content, social communication and sharing. Input devices have become increasingly sensitive to human ergonomics and cognitive expectations, most recently through natural language processing and voice.
That is not to say that what we use on the web is in some way neutral. Jaron Lanier and others do expose the intrinsic ways software shapes behaviour and outcomes. But it is not the invisible hand of a malevolent devil. All technology has a downside. Cars kill, but no one is recommending that we ban them.
The internet, as Pinker explains, is not fundamentally changing 'how we think' in any deep sense. It is largely speeding up the finding of answers to our questions through search, Wikipedia, YouTube and so on, speeding up communications through email, WhatsApp and the like, and speeding up commerce and fundraising. It provides scale, and everyone can benefit.
Neophobia as a brake on progress
Thomas Kuhn and the evolutionist Wilson saw neophobia as a brake on human thinking and progress, as individuals and institutions tend to work within paradigms, encouraging 'groupthink', which makes people irrationally defensive and unsupportive of new ideas and technologies. As Bertrand Russell said, "Every advance in civilisation was denounced as unnatural while it was recent". Religion, for example, has played a significant role in stalling scientific discovery and progress, from denial of the fact that the earth revolves around the sun to resistance to medical advances. Education is a case in point.
We have the late, great Douglas Adams to thank for this stunning set of observations:
1) Everything that’s already in the world when you’re born is just normal;
2) Anything that gets invented between then and before you turn 30 is incredibly exciting and creative and with any luck you can make a career out of it;
3) Anything that gets invented after you're 30 is against the natural order of things and the beginning of the end of civilisation as we know it, until it's been around for about 10 years, when it gradually turns out to be alright really.
Answer:
Gut feelings are wrong. The commonest answers are D, or D and 3, but the correct answer is D and 7.
D – yes: if it had any number other than a 3 on the other side, it would falsify the rule.
F – it doesn't matter what's on the other side; the rule says nothing about F.
3 – a popular choice, but whatever letter is on the other side makes no difference; the rule doesn't say that only D cards carry a 3.
7 – if there were a D on the other side, that would falsify the rule, so it must be turned over.
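For anyone who wants to check this mechanically, here is a minimal Python sketch (my own illustration, not part of the original puzzle) that brute-forces the hidden side of each card and reports which cards could ever falsify the rule. For simplicity it assumes the hidden letters and numbers are drawn from those shown on the table.

visible = ["D", "F", "3", "7"]
letters = ["D", "F"]   # assumed candidate letters for hidden sides
numbers = ["3", "7"]   # assumed candidate numbers for hidden sides

def violates(letter, number):
    # The rule: if a card has a D, it has a 3 on the other side.
    return letter == "D" and number != "3"

# A card is worth turning over only if some hidden side could break the rule.
must_turn = []
for face in visible:
    hidden_options = numbers if face.isalpha() else letters
    for hidden in hidden_options:
        letter, number = (face, hidden) if face.isalpha() else (hidden, face)
        if violates(letter, number):
            must_turn.append(face)
            break

print(must_turn)   # ['D', '7'] - only these two cards can falsify the rule

Turning over the 3 can never falsify the rule, which is exactly why the intuitive answer is wrong.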
