Sunday, 16 December 2018

Are the Young More Creative/Innovative? - Not Really (Part 2)


In Part 1, I looked at the prevailing perception that younger people (under 30) are more innovative than their older peers.  In reality, scientists and business innovators do their most effective work only after reaching an age of about 40.  Fundamental research by Nobel Laureates and profit-directed innovation (patents) in industry show a close similarity in the relationship between innovation frequency and age. Given the very different motivations of the two groups, this is an interesting finding. The slide shows the age at the time of innovation.
Slide 1:
  
Over the 20th century, the age at the time of innovation shifted upwards by 6 to 8 years.  The shift happened for Nobel laureates and for innovators in business alike, as shown in the next slide:
Slide 2:

I would like to understand what factors determine the dynamics of creativity/innovation, and how one might boost innovation output.


AGE: Young children are more creative as they interpret the world around them with few preconceived notions.  Child prodigies can manage astounding feats, but most burn out rather quickly: 'the road from kid-genius to adult-dud is a well-travelled one'.  The available evidence does not support the notion that child geniuses mature into prolific innovators - in most cases their later contribution has been minimal.
Even laureates who make significant contributions in their twenties do not always continue to make seminal contributions to fundamental research in later life. In physics, Heisenberg, Dirac, Einstein, Lawrence Bragg and many others did their Nobel Prize-winning research in their early to mid-twenties but did not publish much after the age of 40. One wonders if there is a limit to the total innovation that a human mind can create.
As we have discussed above, age itself is not a barrier to great innovations. John Goodenough is still innovating at age 94.  

Younger people might be better placed for creative activity, as they are less encumbered by the many demands that adult life brings, but they lack the experience and breadth of knowledge that focused innovation demands.
There are special situations, such as technological revolutions, that might override the need for a wide knowledge base and allow younger innovators to come to the fore.
This happened at the start of the first industrial revolution; at the beginning of the 20th century with the paradigm shifts of relativity and quantum mechanics; with the introduction of the personal computer in the 1970s; with the Internet in the 1990s; and so on.

IQ and Innovation: 
Intellect and achievement are far from perfectly correlated

About a hundred years ago, IQ tests were designed (1) to 'measure' intelligence.  Half the population has scores in the range from 90 to 109 (average IQ). Only 2.1% have an IQ greater than 130, and roughly 0.1% have an IQ above about 145 - these are the highly gifted, or geniuses.
You might think that those with an IQ of 130+ would be extremely innovative, but studies have shown that they do no better in life than the average population - 'children with high IQ turned out to be run-of-the-mill people'. It is fair to say that IQ tests examine some areas of intelligence but neglect others, such as creativity and social intelligence. An IQ of about 110 to 120 is all one needs to have a good chance of becoming a successful innovator.

The longest study of how successful high-IQ people are in real life was performed by Lewis Terman (1877-1956).  In 1921, Terman selected and followed the progress of 1,470 primary school children with IQs of 140 to 200 (the top 0.1% of the population by IQ) - they were called 'Termites'.
Termite progress reports were prepared and published as 'Genetic Studies of Genius'.  By adulthood, of the 730 Termites still reporting, Terman could classify them, according to their achievements, into three distinct categories - 20% As, 60% Bs, 20% Cs.
A - high achievers: professionals with high earnings, 98% with graduate degrees
B - average achievers
C - low achievers: blue-collar workers, some with no jobs at all, only 5% with graduate degrees

Remembering that in early life the Cs were 'geniuses' in the top 0.1% of the population, the results are deeply confusing.  Terman had believed that the Termites were destined to be the future elite.
Most schools, universities and employers still perceive that those who score high on IQ tests have the greatest potential, and formulate their selection policies accordingly. This is another example of the perils of perception - we know that a high IQ score has limited relevance for success in later life, but we still use it to determine our selection policies.  By doing this we might be squandering talent and doing immense harm, not only to individuals but to national prosperity.

Thankfully, such longitudinal studies provide much additional data that could be usefully analysed to help create a better picture of what contributes to success.  I shall return to this topic later in this blog.

Burden of Knowledge:  
'If I have seen further than others, it is by standing upon the shoulders of giants' - Isaac Newton

Successful innovators gain their insight from previously accumulated knowledge - one does not have to invent the wheel again.  In the STEM context, the accumulated knowledge is increasing rather rapidly, and an innovator has much more to learn before starting to innovate effectively.  More than 85% of Nobel Laureates in the sciences had a PhD; their average age at the time of the PhD was 26.2 years.  It is interesting to note that most laureates who did their award-winning work before the age of 30 did the research as part of their doctoral thesis (Einstein is an obvious exception here). In 2013, the median age of completion of a PhD in the USA was 29.9 years - a considerable increase over the age at which the laureates completed their PhDs in the 20th century. Most innovators in the sciences and in business do their best work after receiving their highest degree, and it seems reasonable to infer that the mean age of innovation is increasing partly because people are taking longer to acquire a solid knowledge base.

Let us look at the age at which Nobels were awarded in the 21st century.  The second slide in this blog shows the data - some numbers will be useful: in chemistry, until the year 2000, 66% of the laureates were aged 40 or less; however, in the 21st century no scientist under the age of 40 has been awarded the Nobel Prize in Chemistry.
In the 20th century, 60% of the prize-winning work in physics was done by those under the age of 40 - this has now fallen to less than 20% in the 21st century.

Opportunity Knocks! There are instances when innovation may happen at an earlier age.  I shall illustrate this by describing two such situations:

Physics in the early 20th Century:  The decades preceding the year 1900 were some of the most traumatic for physicists.  Classical physics (largely based on common sense) had been extremely successful, except that work over the previous 50 years or so had revealed deep anomalies between empirical measurements and theoretical expectations.  It was only with the emergence of quantum theory (begun by Max Planck, but driven mainly by Einstein from 1905) and relativity (developed single-handedly by Einstein) that physics was put on a much firmer footing; by 1935 there was a growing realisation that we had a good grasp of the laws of nature.
The genius of Einstein was to think laterally and bravely follow a completely new approach (look at the link for my 8-hour outreach course on Einstein's work).

For younger physicists, the period from about 1900 to 1935 was a golden opportunity: they could simply concentrate on the task at hand in their respective research groups and innovate without first acquiring a complete knowledge base.  Much of the seminal research was done by scientists at the graduate-student and postdoctoral level in European labs.  This is evident in the statistics of Nobel Prize winners and is largely responsible for the observed drop in the mean age at which award-winning work was done; as recognition was swift too, the age at which the Nobel was awarded also came down.

Arrival of the Personal Computer in 1975: Mainframe computers were bulky, occupying a big room, power hungry, and most required water cooling.  The first personal computers arrived in the mid-seventies; they were small, portable and inexpensive enough for an ordinary person to own and use, and they totally transformed the way we use computers.  A large amount of innovation occurred, particularly in software development.  Young innovators were extremely well placed to start businesses offering digital technology.  The people best placed to exploit this opportunity were those not too old to have settled into a family and work routine, nor too young to still be at college - the innovators born between 1950 and 1960 had the fortunate break, and they grabbed the opportunity with enthusiasm.  The next slide lists some of these entrepreneurs (inspired by Outliers):



Most of these people are multi-billionaires - some of the richest people in the world. They run mega-sized technology companies with a combined worth of more than 5,000 billion dollars.

It Matters Where You Come From: Innovation is having a good idea and then successfully implementing it.  The first part (creativity) is inspiration - one generally has many interesting, novel ideas every day.  How we efficiently convert these abstract ideas into reality is something I do not have a recipe for.  For sure, success needs discipline, persistence and encouragement - all of which require a supportive environment from early childhood.  Habits formed when you are young will stay with you - you develop 'good' habits if you are fortunate enough to be born into a family that has role models (normally your parents) and reasonable financial resources.

The most important thing appears to be that many successful entrepreneurs start young - I suggest you read this link to learn first-hand what entrepreneurs say about how they achieved success.

With determination and hard work, some became successful even though they had an unfavourable start in life - what they had in common was a will to succeed. They had grit: rock-solid resilience and the extra drive that keeps them focused on the end result.  A gritty individual approaches achievement as a marathon; his or her advantage is stamina.

Serendipity: The faculty of making happy and unexpected discoveries by accident - The OED

Many chance observations have led to Nobel Prize-winning work and successful inventions. Everyday items like Teflon, Velcro, nylon, penicillin, safety glass, sugar substitutes, plastics, Post-it notes, Viagra, the microwave oven and many more were accidental discoveries.  In the sciences, pulsars (neutron stars), the cosmic microwave background, neutrino oscillations and X-rays are just a few of many serendipitous discoveries.  See also for the role of serendipity in drug discovery.

In the context of innovation, serendipity does not mean pure luck.  Many discoveries appear to be lucky accidents in which the discoverer happened to stumble on the idea - but it is not so simple. On the basis of his or her previous knowledge and experience, the discoverer must be prepared to recognise the significance and potential of the observation. As Pasteur said: chance favours the prepared mind.

An analysis of 117 Nobel Prizes in Physiology or Medicine and in Chemistry over the past 25 years classified 14 discoveries as totally serendipitous, 72 as problem-driven, and 31 as hybrid, in which serendipity contributed substantially.

It helps if you are at a top university: While Nobel Prizes have been awarded to scientists who graduated from or worked in lesser-known universities, the bulk of the awards go to those affiliated with one of the top universities.  Wiki lists, with Nobel laureate numbers in brackets, Harvard (158); Cambridge, UK (118); Columbia, NY (96); Chicago (98); and U of California, Berkeley (107) among the top five. (There might be some double-counting in these numbers due to the problem of defining affiliation.)
Interestingly, out of 915 awards, US and UK scientists have won 368 and 132 respectively, despite having only 5% and 0.9% of the world's population.
Even within a country, not all regions do equally well.  In the USA, most Nobel Laureates come from New England and California, while in the UK the University of Cambridge dominates.
The people around you and the academic culture mean a lot in motivating a young graduate.  The reputation of a university draws talent and all-important research funding. The concentration of highly talented and motivated people who can exchange and discuss ideas (brainstorming) creates a synergy that moves innovative enterprise forward.  This is true not only of scientists but equally in the industrial context - although in business the need for confidentiality is greater.

Barriers to Innovation: There are many.  Our discussion already points to the potential damage done by selection processes based on IQ and by research funding that mainly supports problem-solving. Such policies disadvantage innovative activity.

The way things are organised just now, highly innovative people are promoted into administrative work that pays better but hinders effective innovation.  This might be particularly true in Japan, where innovation drops sharply at age 40 (slide in Part 1).

When scientists reach a preeminent position in their discipline, they guard it zealously.  New ideas may not be supported, or may be actively discouraged. In this context, the actions of the ageing Lord Kelvin make interesting reading (see the 3rd slide from the end in the link). There are many such examples.

In industry, the pole position is defended with all available might - start-ups are bought and dismantled, competition is suppressed, and so on - the energy/oil industry is a good example here.

Gender bias:  Women have formed a tiny percentage of innovators in the past, and the situation remains unsatisfactory.  We are essentially losing about half of the innovation potential of the human race.  I shall refer you to the many studies on this topic.  Thankfully, deeply held prejudicial beliefs about women in our society are gradually being dismantled, and one hopes that in the coming decades they will play a more prominent part in the advancement of knowledge and wealth creation.

Final Word: Writing this blog has been a pleasurable task - it has helped to organise my thoughts about creativity and innovation.  The way the human mind operates is rather mysterious (or, I should say, not well understood).  We have been able to identify some situations that help innovation, and changing the way we select and promote innovative activity could help.
Technology, in the form of artificial intelligence, may one day identify potentially fruitful lines of research and development - we are not there yet.  Interesting times ahead.

Love to hear your comments.  Please pass on the link to this blog to friends and family.

Sunday, 9 December 2018

Are the Young More Creative/Innovative? Not Really - Another Example of Perils of Perception

Creativity is coming up with new and useful ideas. Innovation is the successful implementation of those ideas.

People under 35 are the people who make change happen; people over 45 basically die in terms of new ideas... - Vinod Khosla (Venture Capitalist)

Age is, of course, a fever chill; that every physicist must fear.
He is better dead than living still; when he is past his thirtieth year. - Paul Dirac (1933 Nobel Laureate in Physics)

Mark Zuckerberg became CEO of Facebook (one of the biggest companies in the world) at age 23; Gauss had published his monumental Disquisitiones Arithmeticae by age 24, and Ramanujan died at only 32, yet both contributed enormously to mathematics.  When I talk to people, the common perception seems to be that the young are more creative than older adults - but is this true?  Do the facts bear this out?  Look a little more carefully and the reality is completely different. This perception is a classic example of how our brain overestimates the significance of outliers.  From the exceptional achievements of a few young innovators, we start to believe in a general rule. Once society starts to believe this perception, decisions are made on wrong premises and can be very damaging - decisions like which projects should receive venture capital funding, whether middle-aged employees should be trained, whether we need 'new blood' to move forward, and so on.

I want to disprove the myth that 'the young are more creative' by (i) giving examples of Nobel Laureates in the sciences, and (ii) discussing evidence that, relative to their younger colleagues, older business entrepreneurs contribute far more to innovation and wealth creation.
In Part 2, we shall try to understand how the myth of the 'greater creativity of the young' came about and why this perception is based on a false premise; we shall also identify several factors that help make people more effective innovators.

Nobel Laureates:  In the past 118 years, about 900 Nobel Prizes have been awarded for outstanding, original achievements in the sciences and other fields.  The statistical information available is impeccable and has been thoroughly analysed. The youngest recipient, Malala Yousafzai, was 17 and the oldest, Arthur Ashkin, was 96 (Ref, see also).  The following table summarises the data (1, 2) on the mean age of Nobel Laureates in the sciences:

Age (years) at            All Awards    Chemistry    Medicine    Physics

Time of Award             60            58±25        58±30       56±32
Prize-Winning Research    39±8.5        40.2±8.5     39.9±7.9    37.2±9.2
Highest Degree Earned     26.1±3.4      25.5±3.2     26.7±3.6    26.2±3.4

What is important for our discussion is the age at which the prize-winning research was done; it is nearly constant at 39±8 years across all disciplines.


Interestingly, we note from the table that, after receiving their highest university degree, the laureates on average waited about 13 years before performing their best work, and another 20 years or so before the work was acknowledged with the award.  This is no coincidence - Malcolm Gladwell, in his book Outliers: The Story of Success, discusses how it takes 10,000 hours of hard work to reach maturity in a field of endeavour; this amount of effort is required to start contributing at your highest level, and it does not seem to matter what field you are in - music, the arts, digital technology, medicine, etc. Besides hard work, other factors pertaining to your family, colleagues and prevailing cultural norms also determine how quickly and securely you get to be the best at what you are aiming for. The slide shows the effect of the environment on your performance:
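As a quick arithmetic check, the two waiting periods can be read straight off the 'All Awards' column of the table of mean ages:

```python
# A quick check of the gaps implied by the table above, using the
# mean ages in the "All Awards" column.
age_degree = 26.1    # highest degree earned
age_research = 39.0  # prize-winning research done
age_award = 60.0     # Nobel Prize awarded

print(age_research - age_degree)   # ~13 years from degree to best work
print(age_award - age_research)    # ~21 further years until the award
```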



Innovation in Business:  Entrepreneurs, managers, scientists and engineers innovate meaningful, marketable products and services that create wealth and improve a nation's quality of life.  Such innovations are based on knowledge, expertise, opportunity and government policy, but most of all on hard work (we all know the old adage: success is 5% inspiration and 95% perspiration).

The Information Technology and Innovation Foundation (ITIF) in the USA conducted, in 2016, a comprehensive study of award-winning innovators and international patent applicants.  A similar 2009 study in Japan broadly supports the ITIF study's conclusions.  I shall use the data from these studies to discuss the age profile of top innovators in the USA and Japan.
The ITIF study found that innovators tend to be experienced and highly educated; most hold advanced degrees in science and technology (76.3% had a postgraduate degree - an MSc or PhD).

The following table lists the median age of innovators in different sectors:

                  R&D 100 Awards                     46 years
                  Large Tech Companies             44
                  Life Sciences                           50
                  Information Technology            53
                  Material Sciences                     47



An even better indicator of innovation is the granting of patents which are more likely to create wealth and success: 

The situation in Japan is similar:
Japanese inventors appear to be about 8 years younger than their US counterparts.  This might be due to the much greater emphasis in Japan on academic work at school and university level, and to the fact that Japanese workers retire early.  However, in both Japan and the USA, the quality of patents, measured by their domestic economic value, is much superior for inventors in the more senior age groups (see the following slide)



Various studies have shown that the situation is the same in other OECD countries with older workers innovating far more than younger employees.

It should be appreciated that Nobel Prize winners engage largely in academic pursuits - fundamental research - without worrying about wealth creation, while for inventors in industry the main focus is wealth creation.  These are two very different groups of innovators, but as we have discussed above, and as the following slide summarises, the age of innovation is surprisingly similar in the two groups.  While age appears to be a significant factor, other important factors that help people innovate more effectively are also in play.



In this first part of my blog, I have found that people in the sciences and in business are most innovative at about age 40.  It is only later in life that the most impactful innovations are made.  There are some instances when young people in their twenties have made significant contributions, but these must be seen in the correct perspective.

In Part 2 of the blog I shall address several questions:

(i)  What factors help to make a person more innovative?
(ii) What is the role of IQ - are people with a higher IQ more innovative?
(iii) Why might the mean age of innovation be increasing?


Thanks for reading.

Wednesday, 28 November 2018

Ethics of Eating Meat - We Need to Factor-in Sustainability


Whether eating meat is ethical or not is a subject that arouses passion and interesting debate (see and comments).  What is lacking in most such discussions is the realisation that ethics is not a fixed entity but evolves to encompass the changing values of society.  As somebody aptly said, 'change is the only constant', and this is what we have to keep in mind when we address the question of the ethics of eating meat. For background, please see - particularly the section on behavioural ethics. (In this blog, I shall assume that ethics and morality refer to the same thing.)

Humans are omnivores - both meat and plants have formed our diet.  It is only in the past couple of thousand years or so that we have questioned the ethics of eating meat. Ironically, it was the control of fire, which increased meat consumption, that enabled the human brain to outgrow the brains of other animals, allowing us to ask questions about the ethics and morality of eating meat.
What people ate in the past, while a worthy subject for discussion, must not be the reason for deciding what we should eat in the future. Life is very different now from what it was even 100 years ago; looking forward, we need to appreciate the new reality of the 21st century and develop the ethics of what we eat, including meat, accordingly.

What is the new reality?  With reference to food, in rough proportions:

(1) 15% of the world population is undernourished, while 40% is overweight or obese.
(2) A third of agricultural produce is wasted or destroyed through our failure to manage food resources efficiently.
(3) We have switched, in a big way, to animal farming, which is 10 times less efficient than plant-based nutrition.
(4) A good proportion of corn and soya crops is used to raise cattle; this could instead feed the hungry of the world. Biofuels take away further food resources.
(5) Agriculture has gone big - 'agriculture is the way by which oil is converted into edible food' - and in the process it does a great deal of harm to the environment and climate.
(6) Population numbers keep growing and are expected to reach more than 9 billion by 2050.  Coupled with increased consumption per capita, much more food will be needed in the future.

We ignore these realities at our peril.  The ethics of what we eat has to be decided by what is sustainable - essentially, what kind of world do we want to live in and leave for future generations? That is the moral/ethical question.  I have addressed this in my recent blogs (1, 2, 3).  Reducing red-meat consumption is a definite conclusion of these analyses.

I can already hear the howls and shrieks from some that the human body cannot survive without eating meat - not enough protein, not enough B12; we shall become weak and unable to fight disease.  That will be the end of the human race.

The situation is not like that at all.  Apart from 4 years in the 1960s, I have been a vegetarian; I never take vitamin supplements, have kept a constant weight (BMI of 23) for the past 40 years, and visit my doctor once a year for a blood test.  I might have considered adding meat to my diet, but the current methods of meat production and processing send me a clear message to stay away.

So far, this has been a rational discussion - it appears it might even be possible to solve the food crisis the world is facing.  But that is where life gets more complex: humans love power, have a selfish streak, and some simply cannot tolerate others doing better, or even equally well. Then there is the human mind, which can be irrational some of the time (actually, I should write: quite a lot of the time).  Essentially, what I want to say is that those who have power will exercise it to grab far more than their fair share.  The so-called developed world has done so (imperialism and slavery in the past; waste and overconsumption now), and the newly 'developed' countries like China and India are following the OECD example.

Pseudoscience is also raising its profile.  Our political leaders no longer care to set examples that we can follow.  Lying in the face of evident truth is almost acceptable - thanks to our great new leader.  
I must say that I do not feel very hopeful that our moral compass will define a sensible, viable route for the future.

Thanks for reading...

Monday, 26 November 2018

A Magic Square based Party Game



Some time ago, I published a version of this game as part of a general blog on magic squares. From the feedback, it seems that people would like more instructions on how to use it, and also on how to change it to suit their own purposes.  I shall try to help with this in the following blog.

Let us start with a six-by-six grid that I have constructed - later I shall explain how to make such a grid yourself.  The game is described in the next four slides.  Yes, you can use negative and/or decimal numbers as well.
(click on a slide to see full page image; press Esc to return to main text)



Remember that the number you choose must not be one of the scored-out numbers (I have shown them masked in this illustration)


The point of the game is that one can choose any combination of the numbers as described above and the sum will always come out to be 50 - very convenient if the birthday or anniversary you are celebrating is the 50th.  

But what if the sum you want is different - say 20, or 70, or something else?  I shall describe how to make a grid with a different outcome for the sum.

Suppose you wish the sum to be equal to a number N.
To play with a six-by-six grid, you should choose 6 + 6 = 12 numbers that sum to N.

Let us choose a, b, c, d, e, f, p, q, r, s, t, u for the 12 numbers - they must sum to N.

Now, outside the grid, place the first six numbers (a to f) horizontally and the other six numbers (p to u) vertically.  In each cell, write down the sum of the horizontal and vertical numbers, as shown in the next slide.  This completes the grid and you are ready to play the game. Simple!
You do not need to have a 6 x 6 grid.  You can work with an m x m grid, but choose 2m numbers that add to N.
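The construction above can be sketched in a few lines of Python; the twelve generating numbers below are just one hypothetical choice summing to N = 50, not the ones used in the slides:

```python
import random

# Sketch of the grid construction described above, for a 6 x 6 grid
# with target sum N = 50.  The 12 generating numbers (a..f across the
# top, p..u down the side) are an arbitrary choice summing to N.
N = 50
row_nums = [1, 2, 3, 4, 5, 6]   # a..f  (sum 21)
col_nums = [2, 3, 4, 5, 7, 8]   # p..u  (sum 29; 21 + 29 = 50)
assert sum(row_nums) + sum(col_nums) == N

# Each cell holds the sum of its row number and its column number.
grid = [[r + c for c in col_nums] for r in row_nums]

# Picking one cell from each row, all in different columns, uses every
# row number once and every column number once - so the total is N.
cols = list(range(6))
random.shuffle(cols)            # any choice of distinct columns works
total = sum(grid[r][cols[r]] for r in range(6))
print(total)  # -> 50, whatever the shuffle produced
```

Playing the game amounts to repeatedly choosing a cell and striking out its row and column; the shuffled column list above simulates one complete play.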

Enjoy!
Pass the web link of the blog to friends and family.

Sunday, 25 November 2018

Why Do Humans Have two Front-Facing Eyes? An Analysis and Some New Ideas.

"Eyes in the front, the animal hunts. Eyes on the side, the animal hides."



Humans are primates and all primates have two front-facing eyes.  Why?  

Currently, the explanation goes like this: binocular vision provides a stereoscopic, or three-dimensional (3D), view that helps to locate and pinpoint objects more precisely.  This is good if you are a hunter/predator; it also helps in arboreal living, giving the ability to swing and jump more accurately between tree branches - good if you live in trees.

Early primates were indeed tree dwellers; besides finding insects to eat, they ate plant leaves and fruits.  None of the primates were predators in the usual sense of the term. Chimps are the closest relatives of humans, and among the great apes only humans and chimps eat meat, and only infrequently.  Traditional human societies appear to have relied more heavily on plant-based diets.

So one might think it more likely that front-facing eyes evolved to help in arboreal living - living in trees helped early primates stay safe from predators and also allowed easy access to tree leaves and fruits.  The problem with these theories is that the earliest primates were actually nocturnal and relied on smell more than vision.  But see also.

For many millions of years, the great apes have lived mostly on the ground and have not been hunters.  I would suggest that input from the front-facing eyes serves a more fundamental purpose and is not necessarily wholly related to predation or arboreal living. I base this suggestion on the fact that in the human brain, the neurons devoted to visual processing number in the hundreds of millions and take up about 30% of the cortex, compared with 8% for touch and just 3% for hearing.  Each of the two optic nerves, which carry signals from the retina to the brain, consists of a million fibres; each auditory nerve carries a mere 30,000.  It would be unusual for evolution to invest so much energy in visual processing if it only helped humans to carry out the relatively minor activities of hunting and tree dwelling.

So, why do primates have front-facing eyes?  In this publication, I want to examine the question in more detail: (1) by looking at the evolutionary tree of primates; (2) by discussing the pros and cons of front-facing eyes versus eyes on the sides of the head; and (3) by arguing that the complex society and environment in which humans have lived for the past million years or so requires the most elaborate vision system, and that a good part of the 'new brain' was earmarked for processing the 'superior' visual signals made possible by having two front-facing eyes.

(1) Primates: Let us look at the evolution of primates, which goes back some 60 million years.  The next two slides summarise the primate evolutionary tree:
(please click on a slide to see full page image; press Escape to return to main text)

Let us start with prosimians - the first primates. Prosimians are nocturnal and have large eyes with a tapetal (retro-reflecting) layer behind the retina to help night vision, but their eyes are not as well positioned for 3D vision as those of other primates. They have a well-developed sense of smell and hearing; a larger proportion of a prosimian's brain is devoted to smell than to vision.  Prosimians are insectivorous but also eat fruits.  It is plausible that two front-facing eyes helped the prosimians live in trees and search for insects during the night. However, it was only about 20 million years ago that ape eyes developed full bony sockets, full colour vision, etc.
For completeness, I list some common traits that primates (apes) share:




Among the apes, humans are unique in having a much larger brain relative to body mass - the human brain, at 1.3 kg, is almost three times the size of a chimp's even though the two species have similar body mass.  This divergence is due to the rapid development of the prefrontal cortex in humans, starting some 2 million years ago.

(2) Vision with side-facing and front-facing eyes: Most animals have either front-facing or side-facing eyes.  Conventional wisdom is that hunters/predators have front-facing eyes, as binocular vision provides greater accuracy in determining the distance and location of prey; animals that are preyed upon have side-facing eyes, as monocular vision provides almost a 360-degree view and helps in detecting an approaching predator.
I have drawn the following slide to explain the difference:
 
Depth Perception:  In binocular vision, we perceive the depth/distance of objects by receiving information from two different angles.  If we close one eye, we have monocular vision, as many animals with eyes on the sides of their head do. However, with one eye only, our depth perception does not seem enormously different; the question then arises - how do animals perceive the distances of objects with monocular vision?

We can do this because the brain uses a variety of depth cues.  The brain has a big memory bank and large processing power, and it seems to do a pretty good job of interpreting depth cues; binocular vision just makes depth perception so much better. For animals, it is a trade-off between limited 'good' coverage of about 180 degrees and nearly full 360-degree 'good-enough'/functional coverage - whichever gives the best chance of survival.
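The geometry behind the binocular advantage can be sketched in a few lines of Python (my choice of language; the numbers and the helper function are illustrative, not from any cited study).  Each eye reports the direction to a target; the difference between the two directions, together with the separation of the eyes, fixes the distance by triangulation:

```python
import math

def depth_from_disparity(baseline_m, angle_left, angle_right):
    """Triangulate distance from the directions seen by two eyes.

    baseline_m: separation between the eyes (human inter-pupil ~0.065 m)
    angle_left/right: direction to the target in radians, measured from
    straight ahead (positive = rotated towards the nose).
    """
    disparity = angle_left - angle_right          # vergence difference
    return baseline_m / (2 * math.tan(disparity / 2))

# A target 1 m ahead, eyes 6.5 cm apart: each eye turns in by half the
# total vergence angle, and triangulation recovers the 1 m distance.
half = math.atan((0.065 / 2) / 1.0)
print(round(depth_from_disparity(0.065, half, -half), 3))
```

Because the recovered distance varies as the reciprocal of the small disparity angle, a fixed angular measurement error translates into a distance error that grows roughly with the square of the distance - which is consistent with the finding, discussed below, that the binocular advantage fades beyond a few tens of metres.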
Wiki has a detailed article on depth cues; here, I shall discuss only some of the most important cues: 

Relative Size Cue:  If two objects (e.g., two trees) are known to be the same size and if one subtends a larger visual angle on the retina than the other, the object that subtends the larger angle appears closer.

By observing the angle projected by the object on the retina, the brain can determine the absolute distance of the object using the previous knowledge of its size.
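This cue is simple trigonometry, and a short Python sketch makes it concrete (the tree heights and angles are made-up illustrative numbers):

```python
import math

def distance_from_angular_size(true_size_m, angle_rad):
    """Known object size + the angle it subtends at the eye -> distance.
    For small angles this reduces to distance ~ size / angle."""
    return true_size_m / (2 * math.tan(angle_rad / 2))

# Two trees known to be 10 m tall: the one subtending 2 degrees is about
# twice as far away as the one subtending 4 degrees.
far  = distance_from_angular_size(10, math.radians(2))
near = distance_from_angular_size(10, math.radians(4))
print(round(far), round(near))   # ~286 m and ~143 m
```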

Similarly, the brain uses perspective (parallel lines converging in the distance) to reconstruct the relative distances of two parts of some large object (a building or landscape features).
     
Motion Parallax:  

Transverse Motion:  Parallax is the apparent change in position of an object relative to distant background objects resulting from a change in position of the observer.  
The relative motion of an object against the background objects gives hints about its distance; for example, when travelling in a train or a car, nearby objects pass quickly while far off objects appear stationary. The use of motion parallax for depth perception is widespread throughout the animal kingdom.  Birds bob their heads to achieve motion parallax, squirrels move in lines perpendicular to an object of interest to locate its position etc.
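The train example can be put in numbers.  For an object directly abeam of a moving observer, the object sweeps across the view at an angular rate equal to the observer's speed divided by the object's distance, so the distance follows directly (a minimal sketch with illustrative figures):

```python
def distance_from_parallax(observer_speed_mps, angular_speed_rad_s):
    """Motion parallax: an object abeam of a moving observer sweeps past
    at angular rate omega = v / d, so its distance is d = v / omega."""
    return observer_speed_mps / angular_speed_rad_s

# From a train at 30 m/s: a fence post sweeping past at 1 rad/s is 30 m
# away, while a hill creeping at 0.001 rad/s is ~30 km away - which is
# why nearby objects race past and distant ones appear stationary.
print(distance_from_parallax(30, 1.0), distance_from_parallax(30, 0.001))
```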

Radial Motion:  If an object is moving towards you, its image on the retina increases in size - the changing image size enables the observer to not only see the object as moving but also to perceive the distance and the speed of the moving object.  
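Remarkably, an approaching object's time of arrival can be judged without knowing its true size or speed: the ratio of its current angular size to the rate at which that angle is growing gives the time to contact (the 'tau' variable of D. N. Lee's work on visual control of movement).  A minimal sketch, with made-up numbers:

```python
def time_to_contact(angle_rad, angle_rate_rad_s):
    """Looming cue: angular size divided by its rate of expansion gives
    the time until the object reaches the eye, independent of its
    actual size or distance."""
    return angle_rad / angle_rate_rad_s

# A ball whose image subtends 0.10 rad and is growing at 0.05 rad/s
# will arrive in about 2 seconds.
print(time_to_contact(0.10, 0.05))
```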

Accommodation:  Mammals, birds and reptiles vary the optical power of the eye by changing the shape of the elastic lens.  Fish and amphibians vary the power by changing the distance between a rigid lens and the retina. 


The far point may usually be taken to be at infinity; focusing on an object at a finite distance then allows the brain to perceive that distance.

The muscles inside the eye (ciliary muscles) bring about the mechanical changes in the lens.  The sensation of the ciliary muscles contracting or relaxing to focus on a nearby object is sent to the visual cortex, where it is used to interpret the distance/depth of the object.  Accommodation is most effective in judging distances of less than 2 metres. In humans, the accommodation amplitude can be up to 15 dioptres.
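The dioptre figures above can be related to distance directly: relative to focusing at infinity, the extra lens power needed to focus an object at distance d is simply 1/d dioptres.  A short sketch (illustrative function name, standard thin-lens arithmetic):

```python
def accommodation_demand_dioptres(object_distance_m):
    """Extra optical power the lens must add, relative to focusing at
    infinity, to focus an object at the given distance: P = 1/d."""
    return 1.0 / object_distance_m

# An object 2 m away demands only 0.5 D; at 0.25 m it demands 4 D; and
# at about 7 cm the required 15 D matches the quoted human limit.  The
# steep rise at short range is why accommodation is most useful as a
# depth cue inside ~2 metres.
for d in (2.0, 0.25, 1 / 15):
    print(round(accommodation_demand_dioptres(d), 1))
```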

Recent Work:  Some fascinating work has been done in quantifying the superiority of binocular over monocular vision.  An interesting conclusion is that for large distances, greater than about 25 metres, the two have similar precision; but for closer objects, the precision of binocular vision, depending on the availability of depth cues from the surroundings, can be up to 40 times better than that of monocular vision (see 1, 2).

Visually Guided Behaviour:  Studies of visually guided behaviour, like walking over and around obstacles, have observed that walkers were quicker by about 10% when using binocular vision and also judged the height of obstacles with greater precision.  There was higher uncertainty with monocular vision, leading to greater reliance on feedback (from depth cues) in the control of movements.

Seeing Objects behind Obstacles 




It is not only in depth perception that binocular vision is superior; significantly, results from recent research also point to advantages of binocular vision in many aspects of daily activity.

Structure of the Eye:  Actually, nobody tells you how scrappy and incomplete the information is that the eye sends to the brain about the surrounding environment - the brain fills in the voids, and this can be dodgy.  To understand the 'trip-wire act' that vision processing is, we need to learn how the eye collects visual data about the surroundings.  Essentially, one thinks of the retina as an extension of the brain: it is at the retina that external light is received and converted into the electro-chemical signals which are then processed in different parts of the brain (see Section 2). The interpretation of the results again requires the brain to fill in a lot of details using guesswork, and can also be dodgy.  It is surprising that, most of the time, the brain appears to make a decent job of the whole process.
   
First, I describe the anatomy of the eye and the retina in the following slides. 





In humans, the retina is a 0.5 mm thick layer of cells and covers 72% of the spherical eyeball of 22 mm diameter. The retina covers about 150 degrees of the visual field in front of the eye.
The optic disc, where the bundle of about 1 million nerve fibres leaves the retina, is an area of about 3 mm².  It contains no light-sensitive detectors, creating a blind spot in the vision.
The light detectors are called rods and cones.  There are 100 million rods (sensitive to the brightness of light but not its colour), while the 7 million cones in the retina are sensitive to colour but not very sensitive to brightness.
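The mismatch between these numbers is worth pausing on: the quoted counts imply that the retina itself must compress its output massively before anything reaches the brain (a two-line sketch using the figures above):

```python
# ~107 million photoreceptors feed just ~1 million optic-nerve fibres,
# so retinal ganglion cells must summarise the raw signal ~100-fold
# before it ever leaves the eye.
rods, cones, fibres = 100e6, 7e6, 1e6
print(round((rods + cones) / fibres))   # ~107 receptors per fibre
```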

Cones are primarily concentrated in the central retina (see slide) as a hexagonal mosaic in the fovea and its surrounding macula (diameter = 5.5 mm). The fovea, the region of sharpest visual acuity, is very small and contains no rods.  The pit in the macula (the parafovea) is 1.5 mm in diameter. The area surrounding the fovea has the largest concentration of rods and the greatest sensitivity to light.
  
The fovea sees the spatial details and full colour of objects with the greatest sharpness - but over a tiny angular range, of the order of a few degrees.  The reason we can see the bigger picture, spanning almost 130 degrees, without moving the head is that our eyes constantly dart about, fixating for a fraction of a second and then moving on.  These jerky movements are called saccades.  We make about three saccades per second, each lasting between 20 and 200 milliseconds; we have no conscious control over saccades - the brain manages them.  While a saccade is happening, we are effectively blind: the brain does not use information picked up during a saccade but uses guesswork to fill in the details.
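Those saccade figures lead to a startling back-of-the-envelope result (a sketch using only the numbers quoted above):

```python
def fraction_blind(saccades_per_sec, saccade_duration_s):
    """Fraction of viewing time during which saccadic suppression
    makes us effectively blind."""
    return saccades_per_sec * saccade_duration_s

# With ~3 saccades per second, each lasting 20-200 ms, we are blind
# for somewhere between ~6% and ~60% of the time - yet perception
# feels continuous, because the brain stitches over the gaps.
print(fraction_blind(3, 0.020), fraction_blind(3, 0.200))
```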

How is Visual Information Processed by the Brain:  
{This section may be missed out without loss of continuity}

The information from the eye is carried by the axons of the retinal ganglion cells to the midbrain.  In the brain, visual processing is akin to an orchestra: clusters of cells in different parts of the brain co-operate to process different components of visual information, such as vertical or horizontal orientation, colour, size, shape, movement, etc.
The following two slides show a schematic of how visual information from the eye flows to the brain. I refer you to the original lectures for more information on this topic.



The visual information received is then analysed by various parts of the brain; the brain collects the results and constructs a picture of the external view.  Using memory and previously held information, the brain updates and fills in any missing information to form a full picture. All of this is done in the blink of an eye!  Generally, it does a good job too.
But a word of caution here:  it is not too difficult to fool/manipulate the brain.  It uses past experience to construct a picture from the somewhat incomplete information that the eye sends, and guesses what the missing information might be.  It is not too difficult for the brain to get the whole thing wrong.  Optical illusions (see an infographic with many examples here) and hallucinations provide fascinating case studies where the brain gets the results totally wrong. Eyes play a central role in meta-communication, and bizarre effects like the Uncanny Valley are observed directly as a result of the brain's processing of visual signals. Dreaming is another example, where the brain creates visual perception when no external input is present.

And it is not only in processing visual information - the brain is equally fallible in interpreting hearing, smell, taste and emotions, and in many other decisions it makes.


(3) Humans and Vision:  The above discussion leaves unanswered the question 'why do we have two front-facing eyes?' To some extent this is of academic interest only.  What we really want to understand is why evolution has invested so much - almost 30-50% of brain resources - in processing vision-related information. This is unique to humans, and we shall try to speculate why such a large proportion of the increase in human brain size over the last 2 million years might have gone to vision-related processing.

We learn about the world we live in through our senses - there are 21 accepted senses, including 4 belonging to vision (brightness and 3 colours: red, green and blue). Vision, sound, smell and touch are the only senses that provide us information about the external environment.  Smell and touch are relevant only for short distances; sound may be good for medium distances of a few km or less but has very poor spatial resolution. Vision is the only sense that properly connects us to the outside world and is our main way to interact with our surroundings. Humans rely heavily on vision to guide our behaviour and perceive the world.

Unlike all other animals, which are mainly concerned with the search for food and safety from predators, humans have learnt to manipulate the environment for their benefit.  Many factors - learning to use tools, controlling fire, agriculture, living in societies with mutually accepted rules and regulations, etc. - have elevated humans to become the most powerful species that has ever lived.  This has been made possible by the visual input processed by the brain.  Imagine the amount of visual information fed to the brain from the ever-increasing range of activities that humans are involved in - handling it would require a massive supercomputer.  It is no wonder that the human brain consumes more than 20% of basal metabolic energy even though it is only 2% of the body mass.  The brain also consumes energy at a more or less constant rate of 20 watts, whether you are solving maths problems, sleeping or sitting quietly on the sofa - it has to organise itself to be ready to analyse the next input efficiently.  (For comparison, the Titan supercomputer uses 4,000,000 watts of electricity!)
It seems the vision-perception department is always looking for more resources, and this might be the reason why the size of the human brain has increased so rapidly over the past million years or so.
Humans have retained two front-facing eyes because binocular vision provides far superior depth perception to monocular vision, and helps the brain to perceive reality more accurately.

The brain's perception of reality involves a lot of filling in of missing information by guesswork, and the reliability of the resulting perception is a matter for discussion.  There are many examples where our perception is wrong by a wide margin - but that is all we have just now.


Thanks for reading. 
Please pass the link of this blog to your friends and family.