Wednesday, August 23, 2017

Why I Love Bureaucracy

Don't be perplexed. I know you may be wondering how on earth someone can love bureaucracy, which stands for all the bad things - slowness, indifference, complexity, lethargy - but I argue back: If we don't love bureaucracy, how does it persist? If we hate it, why is the first thing we do when starting a business to draw an org chart? Why, when things go wrong, do we want to see a manager? Or, better still, why do most of us want to be managers? Why do we want a job description? Why do we fill in so many forms and want to fill in some more? Why do we love emails and calendars, show off our smartphones and smart watches, and want to prove how busy we are?

I love bureaucracy because it's everywhere. Call it any name, but our lives are bureaucratic. Every morning, when I plan the day and write down my to-do list, I am bowing to bureaucracy. As I run between meetings, cut short conversations, skip lunch or feel guilty about not responding to a message, I am celebrating it. There is no escape. The only way I can avoid being bureaucratic is by using a euphemism: Managing my life, I say.

I also love bureaucracy because, with it, you can rise to your highest level of incompetence. Yes, this is the Peter Principle, which I discovered in my youth, showed to my boss for a laugh, and got censured for. The idea is simple: If you are promoted to the next level for doing a job well, you stop being promoted only when you reach a level at which you don't do well. Common sense, but indeed problematic for all bosses.

Also, as I watched bureaucracies, I discovered a law. Okay, I did not discover it entirely - I only derived it from another, better-known law: Parkinson's Law, that of work expanding to fill the available time! Here is my Second Law of Bureaucracy (the first being Peter's): Bureaucracy expands to spend every available dollar, and then some more. Imagine a world where this is not true - where we can't create the need for that extra consultant - and you will know how bereft a place the world would be without all these bureaucratic hangovers.

But I love bureaucracy not just because it's everywhere, but because of its wonderful capacity for keeping us busy with useless work. We live in a world of activity without outcome: That's the point of Capitalism, in fact. Imagine asking the Reality TV stars, including the one in the White House, what they actually do. The point of work, particularly for those with lots of money, is not to work. Earning without breaking a sweat is the pinnacle of success: After all, that is a bureaucratic idea.

And, if you are against bureaucracy, think what that would really mean. 

That would mean that we would have to live without a job description - not just the one in the workplace, but also the ones in society and at home - and that we would have to constantly figure out the right thing to do without a set of instructions and stereotypes.

It would mean that we would have to think about the consequences of our actions, particularly about the sticky question of how they affect people we don't know. It would mean that we would not be able to hide behind our titles - whether that is President, Accountant or Husband - and would have to be answerable for what we do.

There would be no experts and lots of questions; fewer rules and more compassion; less would be reducible to technology and more would be open to understanding.

Life would be worse off, as we would have fewer things and more time, and yet have to care more for others and less about our own status.

If you thought bureaucracy was a bad word, what do you think of being whimsical? If slow is bad, how about unpredictable? If complex repels you, how about a world without many rules? If you want to be rid of lethargy, would you want to live without Facebook? Even being oneself - the status updates, the photos of touristy places you have been to, the kitschy food photos and cliché quotes - is an expression of our bureaucratic selves, the self that lives by the rules of success and wants to be a little bit like that incredibly powerful person we all secretly wish to be.

I love bureaucracy because life would be so meaningless without it.

Friday, August 18, 2017

Education and Automation

Working in Education, I have to confront conversations about Automation all the time: Will there be enough jobs for all the people we educate?

As with other things in life, there are 'Many Sides' to this debate too.

One side of the argument is that there are enough jobs, and that unemployment results from a skills mismatch. As evidence, one can simply cite the number of unfilled positions that companies report, or the poor applicant-to-offer ratio.

The other side of the argument is that jobs are really shrinking as more of them are automated, and that we should be preparing for a future in which most people will not find work. There is strong evidence for this as well: It is possible to show that job numbers, when compared like-for-like (without counting the new positions created by new companies or sectors), are often decreasing, not only in the developed countries but also in supposedly high-growth areas such as manufacturing in China. Also, despite all the talk of unfilled positions and skills mismatch, most wages are stagnant or decreasing in real terms.

This is just a snapshot of the arguments, but one can perhaps see even in this that the debate is ideological in nature. One could look at the same piece of data - the stagnation of wages - and draw very different conclusions. And, therefore, most data give no definitive answer about what is going to happen to jobs. One may write a book with a title like 'The Inevitable', as Kevin Kelly has done, but it is sobering to note that the Robots will take at least until 2020 to fold the laundry properly.

Besides, almost all the predictions about human obsolescence fail to take into account two significant factors.

First, most of these predictions are based, at least loosely, on Moore's Law, or the ability to double the processing capability of machines every 18 months or so. Such a pattern has held since the 60s, but assuming that the future will be like the past is exactly the kind of thing we are told to guard against. Indeed, there is no guarantee such a trend will continue, and the Robots may indeed stall at folding laundry, taking the next leap only very slowly.

Second, we completely overlook that whether we develop technology in one direction or another is actually a decision we make. And, in fact, it is a political decision. For example, it was known for a long time that women can do more than fold laundry, but it took us a while to accept them as equals in the scientific community. At a time when overpaid Google execs are writing memos based only on convenient facts, and an American President sees provocation in White Supremacist violence but finds none in the case of Islamic terrorism, we should stop pretending that politics does not matter in the decisions we make.

My point here is that automation and human obsolescence are not a secular, technological event, but a choice we are actively making. This is not about computer chips going beyond a certain threshold of capability - that still lies in the future, and is probabilistic - but has more to do with the climate of opinion today. Getting back to the dictum - the Future is not going to be like the Past - we may argue that this does not only mean greater automation, but may equally stand for different priorities. And, this position may actually be Optimistic rather than Pessimistic: I am arguing that, in the near future, it may appear to us that finding cures for diseases like Ebola, which kills poor Africans at this time and is therefore considered unimportant, is more worthwhile than developing driverless cars. In summary, automation is an investment decision, made within a certain context, and that context may change rapidly.

Also, something needs to be said about what passes for Optimism these days. That we have the wrong priorities - the point I made above - is taken as unscientific, anti-progress and pessimistic. Instead, the current prophets of artificial intelligence claim that automation will not only destroy jobs, but will create new ones: As evidence, they point to the track record of the Industrial Revolution, which destroyed manual jobs but created new ones instead. In fact, that it managed to create new jobs undermined the doomsday predictions of contemporary observers, Karl Marx included. We should ask whether or not this can continue.

This is actually one of the key features of capitalism: That it can create new jobs which have no immediate productive benefit. The magic machine of capitalism is not the more powerful computer, but advertising - the ability to manufacture desire and create new social expectations. If you think of the millionaire reality TV stars - or, even better, a reality TV President - you would see what I am referring to. In fact, celebrity culture and advertising form a self-generating loop, ad infinitum, and this has kept the job machine going.

This may happen again, but there is one caveat. Someone has to pay: So far, we have made the next generation pay for the earlier ones. I don't know about you, but I have this terrible feeling that we are that payback generation: The judgement day seems too close for comfort. This is indeed what all the austerity messages mean. And, when two contradictory messages - that the party must end, and yet that we have a magic machine to create jobs and prosperity endlessly - come from the inhabitants of perhaps the most important street in the world, one must pay attention and pause to think.

So, in summary, we may have to readjust our priorities, because both our technological capability and our ability to pay for creating jobs may not last forever. Universal Basic Income has been mooted as a solution, though it has found no favour in quarters where the 'celebrities' are happily paid millions. And, this should perhaps tell us that what we do with technology is political, and that Education - rather than being a passive producer of people for jobs - should be the shaper of these priorities and conversations.

Sunday, August 13, 2017

Brexit: The Remaining Problem

As Brexit starts to bite, the politics of it has come alive again. 

There are some clear signs that the British economy has started cooling. In a way, the experts were right: We have started paying the costs of Brexit. Yet they were wrong at the same time - the effects are slowly beginning to emerge, rather than appearing as a morning-after apocalypse. But it is inescapable that a long winter is around the corner.

This makes the politics of Brexit come alive again. The Remainers suddenly see a light, as the Leavers' claims are exposed as hoaxes and lies, and the economic effects of Brexit become clearer. Their mood is a combination of 'I-Told-You-So' and denial, as the weak and unstable government proves itself clueless about how to deal with Brexit. Suddenly, leaders who had bailed out - David Miliband, for example - are back in the conversation, urging MPs to push for a second referendum; there is talk of leadership changes, and even of a new party of Remainers emerging. The Liberal Democrats, a write-off after 2015, have suddenly found themselves a purpose, and believe that they can be that party of Remainers. In summary, Brexit has made British politics interesting again.

Except one thing: Where are the Remainers?

All those people who voted to Remain in June 2016, yours truly included, have come to accept Brexit as a fact of life. They have not suddenly become xenophobic or protectionist, but the intervening months have given them more perspective than the simplistic referendum question David Cameron put forth. In a way, the Remainers have become Post-Brexit, and have come to question whether Remaining in or Leaving the EU was actually the most important question.

I can, and shall, speak for myself, but I think these views are shared: Many Remainers no longer think that Brexit was merely about some disgruntled voters breaking with the establishment; there are real issues we should think more deeply about. We have watched, in the meantime, the surreal Presidency of Trump, and another General Election, all of which indicated the end of politics as usual. We have realised that Cosmopolitanism, a nice, cosy ideal, has a downside: Globalisation's losers, obscured from view, have claimed the centre-stage and forced us to rethink what openness really means. The Brexiters' dislike for Syrian refugees was disgusting, but what has come to light since is that the Remainers forgot their own backyard.

This soft underbelly of Liberal Internationalism now lies exposed. The Liberal Elite and the Professional Left have mixed up Internationalism - the common cause of the working class - with the global flow of International Capital and the pursuit of economic efficiency. This did not come from nowhere: The Liberal faith in progress, the Marxist assumption of Capitalism's forward march and the European belief in cultural superiority blended together in this new slogan of unity of the world's elite. This has made the conservatives the keepers of the social, the religious leaders the last refuge of the dispossessed, and the nationalists the champions of the cultural. It needed the jarring experiences of a Brexit, and the rise of a Trump, for the illusion of openness in the Liberal edifice to break down.

In a classic role reversal, the Remainers now find themselves not on the side of sympathy, but of selfishness; enlisted not for the cause of openness, as they set out to be, but for opportunism. The champions of the new Remain - Blair, Miliband, the Lib Dems - are pushing an old envelope and asking a question - whether or not to remain in the EU - that has been asked, and answered, already. They have not confronted the brave new Post-Brexit, Post-Trump questions, which will require new answers. Their arguments have no new ideas about how to address the challenges of deprivation and disaffection, no commitment to make globalisation work for everyone. The only thing they offer is a path back to a Golden Age that none of us can remember living in.

This is indeed the Remaining problem. One may not subscribe to the xenophobia and small-mindedness that Theresa May offers, but the opposite cause is equally bankrupt. The offer of 'Making Brexit work for everyone', the somewhat less sexy Corbyn slogan, may have more to it than we see. Indeed, the idealistic 'Making Globalisation work for everyone' is not on offer - at least, not from the current bunch of politicians. So, despite all the talk, except for a few people worrying about visa queues (having lived most of my life with an Indian passport, this does not bother me much), the Remainers don't have much going for them. That is, not until they get back to basics, and start asking the questions they should have confronted a long time ago.

Saturday, August 12, 2017

Training to Teach in Global Higher Education: Ideas For A Qualification

The idea came to me from various conversations in China and India: That teacher training in Higher Education is an urgent need and a significant opportunity.

This is counter-intuitive. Most Western institutions of Higher Education, autonomous as they are, train their own teachers. For Continuing Professional Development, the emphasis is on Research, and an established network of Conferences exists to foster the community. Teacher training is for schools, where the volume and turnover of teachers are high and skills need constant refreshing.

However, the expansion of Higher Education in the last decade in China, India and elsewhere brings into play a different reality altogether. 

First, the Higher Education institutions created in these countries in the last decade are different from the research-led institutions in the West: They are teaching institutions operating at a mass scale. Because the focus is on teaching at scale, appropriate teacher training is of great importance.

Second, these institutions face an acute talent shortage and high turnover. Teaching in Higher Education is not well remunerated, and few opportunities exist for professional development. Faculty members are recruited directly after completing their own education, and often lack the perspectives and skills required to be a successful university teacher. A training solution for new teachers is therefore in demand among institutions that want to retain their faculty and develop their skills.

Third, the quality of Conferences and the opportunities to develop professional networks are often quite limited too. A Conference ecosystem is springing up in response to the expansion in the number of teachers, but these conferences, modelled after Western ones, are research-centric and at odds with the requirements of teaching-focused institutions.

Besides these, teaching in Higher Education is also rapidly changing. Globalisation is a persistent reality, both in terms of access to talent (and the lack of it, as trained faculty often migrate abroad) and student preferences (students prefer more global institutions). Technological change is also making an impact, and its possibilities are instantly understood in the context of requirements in countries like China and India, though the solutions available are often immature and poorly implemented. Finally, appreciation and understanding of outcomes, always a challenge in the Higher Education sector, are critical in resource-poor, outcome-hungry developing countries, and this imposes a new set of demands on teachers.

What is ideally needed is a qualification and community for Higher Education teachers (in the broad sense, including people teaching in Technical schools) that is geared to the challenges and opportunities of the 21st Century. The demand for this is understood in countries such as China, where the state actively encourages faculty development and global exposure; it is also obvious in countries like India, though India, generally speaking, suffers from a greater level of 'not-invented-here' syndrome and actively resists change. However, even in countries like India, the allure of a foreign qualification for teachers is irresistible, and there is a strong business case for developing something to offer Indian teachers as well.

Despite the apparent opportunity, however, most Western Teacher Training institutions and qualification bodies are wholly unprepared to provide a solution. Apart from the fact that teacher training in Western nations is primarily a K12-focused affair, Education as a whole remains a very nationally-oriented area of research and conversation. 'Transnational' in Education often has imperial undertones - it means local practices being spread globally - and transplanting teaching models from a Western nation is hardly the solution the rapidly expanding mass Higher Education sector in China and India needs.

The Globalisation of Finance and Business has hardly reached academia, and while Western Universities have attracted millions of students, they attribute this success, perhaps rightly, to maintaining their British, American or Australian roots, rather than to their ability to understand and solve problems in the developing countries. This creates a curious duality - they claim education is a public good and try hard to protect public funding, while at the same time encouraging and serving International students' private needs and aspirations as faithfully as ever - but the Western University sector is completely oblivious to such inconvenient questions.

There is also a deep distrust of technology! Good teaching and the deployment of learning technologies are seen as oppositional activities. This is not necessarily so in the Developing World, where teachers worry less about having to write emails after work and more about the struggle to find even basic research papers or learning materials. Their commute to the classroom is often more troublesome and sweatier than a pleasant drive through Middle England, and they are therefore happier to explore how to teach online. And, besides, for a teacher in Indian Higher Ed, mastering the technology is a desirable advantage, not a self-defeating distraction.

And, finally, outcome-centricity is seen by most people in Western Higher Education as a sign of creeping managerialism (which it is). Higher Education institutions, with their public roots and ecclesiastical pretensions, do not want to be accountable for short-term, measurable results. There is an inherent contradiction between this and the pursuit of private advantage which Higher Education mostly represents, but this is one thing Western academics feel very strongly about. There is no such luxury in India and China, where hierarchy and accountability are facts of life. Surely, the practices there need a 21st Century update - often people in Higher Education are held accountable for the wrong things - but outcome-centricity would not come as a surprise to someone teaching in Higher and Professional education in developing countries.

In conclusion, I see a clear gap and a significant demand. I am well aware of the challenges of building a never-before solution in Education, particularly Higher Education: In a regulated industry, regulatory compliance replaces excellence, and a service that may make perfect sense under the logic of competitive markets may find few takers unless it is a regulatory requirement. I have applied my market-based logic to regulated sectors before and am well aware of the perils of such an approach. In planning Teacher Training, therefore, I am not just planning a programme to be launched under a private label, but rather one with the right credentials and, hopefully, the blessings of regulators in certain target countries. This may indeed be my next big project, and I am excited about it.

Friday, August 11, 2017

On Being A Hindu

I remember this awkward dinner conversation. I was with my colleague in Northern Ireland, and a friend of his joined our table. After we were introduced, he wondered at my name and asked what religion I belonged to. I went for the simpler answer and kept my doubts aside: "I am a Hindu," I said. That made him even more confused. "What's a Hindu?" he said. "Is that some kind of Muslim?"

When I tell this story to my friends in India, they are usually outraged. What an ignorant person, they say. It particularly upsets them that Hinduism was treated as a branch of Islam, when Hindus love to believe that everyone was originally a Hindu. I have reflected on this conversation since. It may indeed be that he simply did not know. He was particularly ignorant - just as ignorant as the lady who, standing inside the Irish Bar at Mumbai's ITC Grand Central hotel, asked my colleague (the same person, as it happened) where Ireland was. But confusion about Hinduism is more common than one may think. The 800 million Hindus live in one geographic corner of the world. This may make many people who live their lives contentedly within that region feel that Hinduism engulfs the world, but the reality is just the opposite: Most people live in blissful ignorance of something called the Hindus (the same people indeed wish they could ignore Islam as well).

For me, I had to go through several cycles of finding my identity. Like any Indian, I had several layers, and knowing how to describe myself has mostly been an act of negotiation. An Indian, I would most commonly say, despite my citizenship, because I defined myself by the Post-Imperial Republicanism that made India. This meant at once rising above my Hindu identity and being deeply into it, as the flavour of Hinduism I grew up with was, despite all the rituals and festivals, universalist. It fitted nicely with the idea of India then fashionable - with its emphasis on private faith, tolerance of other ideas and acceptance of the world as it is. There was casteism, but not in its virulent form of exclusion and violence; there was superstition, but in its comical manifestation of doing or not doing something on a particular day; but overall, this was a flexible, personalised religion, allowing me to pick and choose.

This may sound paradoxical to those who haven't had a similar experience. But an apocryphal story, which I first heard from Shashi Tharoor, the Indian statesman, captures the spirit. Mr Tharoor tells of a young man who had doubts and approached his father to learn about the Hindu religion. The father said he was too young, and perhaps they should have the discussion when he grew up. A few years later, the father offered to induct his son into Hinduism, but the son refused, stating he had already lost his faith. "Welcome to the atheist branch of the Hindu religion," his father said.

A Hindu would perhaps appreciate the story, but it is harder for others. Confirming that religion makes for awkward dinner conversations, I must mention another dinner, this time in Salt Lake City, Utah, when I was asked the same question. By then, things had changed for me: The extreme form of Hinduism that took over India had made me question my own prejudices and superstitions, and lose my faith, as much as someone born a Hindu possibly can. I said, "I am an atheist", perhaps smug in the comfort of belonging to the atheist branch of the Hindu religion. This stopped all conversation around the table - Salt Lake City is indeed a more religious place than most others on earth - and everyone looked at me with some kind of incomprehension. Finally, someone rode to my rescue: "You are not an atheist! You don't go around telling people not to believe in God. You may say you are agnostic, but not Atheist." I am certain this wonderful specificity of the English language and Christianity would be mostly lost on my Hindu brethren, but apart from the insight about religion's place at the dinner table, there is not much to be learnt from this.

But I must also perhaps explain why I started questioning my faith. The universalist, tolerant ideas that I grew up with dissipated rather quickly. What we have now instead is a different version - intolerant, ignorant and ritualistic - an opportunistic amalgamation of politics and religion that sanctions everything and yet controls everything, makes hatred its centrepiece and claims a pre-scientific heritage of universal truth. WhatsApp groups in London suburbs now discuss the merits of sprinkling cow urine on one's head; the Prime Minister of India straight-facedly claims that Lord Ganesha - the Hindu elephant god - was the first case of plastic surgery (taking the idea that the Ancient Egyptians knew some techniques of skin grafting to its absurd maximum); and people in India are regularly lynched on the mere suspicion of eating beef. At the same time, elaborate rituals are now performed in offices and businesses, consuming beef has become a public offence in some parts of India, and campaigns against Muslim actors and artists are now an acceptable nationalist indulgence. The astrologers are having a great time: Recently, one applicant told me that he had delayed sending his CV by two weeks - this cost him the opportunity - because the times were not favourable. The animistic, ritualistic religion that we thought we had left behind has arisen from the ashes as the only true faith: There is no longer any atheist branch of Hinduism.

Indeed, the torch-bearers of the new Hinduism recoil at Max Weber's categorisation of Hinduism as a Non-rational, Inactive religion (in Weber's world, Christianity was Rational and Active, and trumped the Rational but Inactive Confucianism and the Active but Irrational Islam), but fate is back in business. Samkara's dictum that Vedic rituals are for the ignorant has been totally forgotten, and the Gita's insight that Inaction may destroy one's humanity, something Max Weber completely missed, has been erased from consciousness as well (the more famous part of the verse says, "You have a right to action, but never to its fruits", but the latter part, "Never desire the ends, and never indulge in idleness", is rather forgotten). This new Hinduism, fashioned as a pure faith, is built around elements borrowed from proselytising religions: It is ritualistic and comes with provisions for conversion (which one can't technically do in Hinduism - hence the assumption that everyone was once a Hindu and conversions are merely purification rituals), and it is based on a rejection of the very universalist, tolerant faith that I knew as Hinduism.

In a way, therefore, this is the best and the worst of times to be a Hindu. Suddenly, a third possibility - between Ritualistic Hinduism and desolate Agnosticism - has opened up for people like me. It is essentially an invitation to rediscover a civilisation that has lasted for thousands of years and, despite being run over by invaders and moulded by many outside influences, maintained its essence of faith - in humanity, above all else. One could perhaps say that my aversion to Evangelical Hinduism has finally inspired me to get back to the basics - to read the Gita, which I should have done a long time ago - and find again the essential, rational civilisation that is founded on the idea of tolerance. My wanderings took me to the American pragmatists, and my foundational belief became "Ideas should not become Ideologies" - and yet, my journey home to the Hindu texts reveals essentially the same idea: an acceptance of the world as it is, with all its imperfections, diversities, redundancies and beauties.

Sunday, July 30, 2017

India in 2017: The Coordination of The Bihar State

I am now in India, after a gap of several months. A lot has changed in the few months since I was last here. Most visibly, the money has changed - the Rs. 1000 notes have vanished and the ubiquitous Rs. 500 notes have a new look, and there is a strange purple-pink Rs 2000 note in circulation, which very few people want to accept. It is one of the signs of the great experiment that is now underway in India, where even the most fundamental things can change overnight.

One such event, in the few days I have spent here, was the coordination of the Bihar state. Bihar, one of the most populous states in Eastern India, has a large lower-caste population, and has consistently rejected the upper-caste Hindutva politics of the ruling Bharatiya Janata Party (BJP). True, the BJP had participated in government in coalition with one of the other 'caste' parties in the past, but it was never the senior partner. And, in fact, the Bihar electorate dealt a severe blow to Mr Modi's ambitions in 2015, when the BJP was defeated rather decisively in the state elections by a coalition that included the Indian National Congress, the main opposition party, and Janata Dal (United), the BJP's erstwhile partners who had broken ranks citing Mr Modi's communal past.

And yet, this week, Mr Modi appeared triumphant again, as the Chief Minister of the state, Mr Nitish Kumar, set aside his scruples about the BJP's communalism and dumped his electoral allies. The excuse was the corruption cases against one of his coalition partners, but that is as flimsy as it gets: The corruption charges were old and well known. It was as if Mr Kumar had discovered a new scruple just as he discarded his old one about communalism.

In the world of Bihar politics, such U-turns are unsurprising. Mr Kumar is a known opportunist, and the only thing his latest move means is that he has given up on his Prime Ministerial ambitions, which were perhaps the real reason why he did not like the elevation of Mr Modi in 2014 and deserted the BJP. But this latest turn has more significance than his vanity, or even the disarray in the opposition. Mr Modi's demonetisation, which caused enormous difficulties for ordinary people and achieved little in removing 'black money' (the Rs 2000 note allowed a better mechanism for storing it), has achieved its key goal: He has demonetised the opposition! Since then, despite all the marginalisation of minorities and poor people, the BJP has appeared unstoppable, winning some big electoral victories, but also winning where it lost electorally - as in Goa or Manipur - where it could buy out the seats while the opposition, mainly Congress, watched helplessly. It has also been successful in destabilising state governments, encouraging internal rebellions that brought down Congress governments. Mr Kumar's latest move was perhaps prompted by a pragmatic acceptance of such prospects: He saw the writing on the wall and decided to be on the winning side.

As the journalist Shekhar Gupta pointed out, Bihar was a big one. Now the BJP controls all of India's big states, barring the Southern ones. The change in Bihar is being seen as a decisive turn towards the 2019 General Election, which Mr Modi is now almost certain to win. The opposition, unable to come up with any alternative ideas, looks toothless. The media has been hounded into submission. The courts are largely sympathetic to the government's agenda, and too inept and self-obsessed to change anything. The Government now has one of its own as President. Every opposition-ruled state is feeling the undercurrent of communal tension, which is usually the precursor of the BJP's ascendancy. Mr Modi's tactic of 'If you can't win them, break them' seems to be tearing apart the Liberal politics of India.

The worst fears we had about a Modi premiership have now come to pass, but it is plain that a majority of middle-class voters (though some, from the linguistic or religious minorities, remain strongly opposed) are still cheering him on. However, in the last several months, the nature of the Indian government has changed. As total power was achieved, the masks have started coming off. The claims of economic development have taken a backseat as the Indian economy has started slowing down (though the GDP figures were massaged to appear higher), and the political and social agenda of the ruling party has taken precedence. The global charm has also started fading, and the global media has started noticing Mr Modi's authoritarianism and his lack of interest in any fundamental reform, except in people's eating and dress habits.

This is the transformation I am witnessing now as I travel around India. I should guard against crying foul too soon, and I don't want to exaggerate and claim that India has already turned into a Dictatorship. The path it is taking possibly does not lead to totalitarianism, as different interests and personalities are competing for ascendancy rather than being subjected to the whims of one man. But not being like Nazi Germany is not a great achievement, particularly if India is on the way to becoming one of those states where the politics of the majority and the interests of a few corporations drive the agenda. India in 2017 and beyond will perhaps give future historians a new model of the illiberal state to ponder. And this 'coordination' of the Bihar state will possibly be seen as the all-important inflection point when the transformation became clear, and the social agenda finally and decisively trumped the rhetoric of economic development.

Wednesday, July 12, 2017

Automation Against Capitalism

Automation is Capitalism's great new prize and its most potent challenge. At once, it breaks the back of organised labour and puts into disarray the carefully constructed social system that we call Capitalism. It is Capital - that's what machines, robots and know-how are - becoming supremely productive and utterly meaningless at the same time. It is the realisation of a utopia, but also a moment of reality. It could potentially expand supply infinitely, as finite human time would no longer be required, while perversely limiting demand, as nothing that is produced could be bought.

The last bit is indeed the classic Marxist argument, but from the vantage point of the 21st century, we see something that Marx did not. First, though Marx made some very insightful predictions, the empire was still only taking shape, and at the time of Marx's death, the integration of the global economy was still in its infancy. Also, for Marx, nineteenth-century capitalism was a relentless pursuit of efficiency. It was about converting every scrap of human life into productive work, with just as much reward for the workers as was needed to preserve the scrap of human life they were allowed to live; the rest went to the owner of Capital. And, finally, Capital in Marx's world was a finite commodity: Remember, this was the time of the Gold Standard!

These are the three things we know now. First, Capitalism defied Marx's prediction of imminent demise by progressively expanding into the farthest reaches of the globe, bringing into its fold new consumers who would slave away their time to get a piece, if only a crumb, of the cake. Further, Marx did not see Capitalism's unique tendency to create meaningless jobs - jobs which have no productive use other than hooking people to an elaborate system of signs and desires - which gave it a 'viral' character. The system did not generate surplus merely by squeezing out productive efficiency; it created surplus by creating useless demand. This indeed wouldn't have been possible in Marx's world of sound money, but that world was long gone: the Gold Standard, and its successor, the Gold Exchange Standard, were conveniently binned, and an intricate but dubious system of fictitious capital was constructed to monetise the future.

Capitalism survived, and survived well. It advanced by breaking down traditional communities, ways of living and methods of transaction; it encompassed the globe and monetised every living moment. It created layers upon layers of useless jobs: the reality TV stars, models and celebrities, whose job is to create allure and keep us hooked; all those Consultants, who recycle received wisdom and specialise in making slide decks; all those myriad middlemen and salespeople, who sell fictitious financial products of dubious value to each other; so on and so forth. And all this was paid for with credit, created out of thin air by the modern financial system, predicated on people slaving away their future time in the pursuit of more.

And, what if they don't? We may be at the moment when the delusion of Capitalist sign-making reaches its pinnacle and fools itself in the business of making fools; when signs become so all-encompassing that reality is erased. As machines step into the workplace and take away jobs, not only does the semblance of shared prosperity vanish, but so does all that future labour time on whose basis the credit was built. Amid all the slickness of the Robot-produced future, the debts that built Capitalism will have to be reset, as there will be no one - or at least not enough people - to pay for them.

The Robotic future is therefore as calamitous for Labour power as for the current form of Capitalism. It is no straight road to Marx's "Hunt in the morning, Philosophise in the evening" utopia (a passage which he took out after he wrote it, apparently out of embarrassment), but rather a scary challenge to all those plotting the future: If Robots do all the work, who pays for the Debt? And, if we are to reset all credit, can the Robots be there at all?

Indeed, one knows the answer: We will find a way - we always find a way! And, indeed, we will. But all things that have a beginning have an end. We are perhaps living in the end times, when our ability to exploit the frontier and mine the future to create a system of illusory jobs and fictitious capital comes to a close. Surprisingly, a system's greatest triumph also looks like its end; that is usually how History plays out.

Incubators and Universities: Need For A New Model

As the crisis in jobs becomes apparent, many think that the way to maintain a Middle Class society is to be found in entrepreneurship. In their mind, it is a straightforward transition: People not finding jobs would start businesses. In some quarters, those looking for jobs are already maligned - 'Job Takers' they are called - as opposed to those committing themselves to the entrepreneurial journey, the 'Job Creators'. As always, the reality is harsher than the theory. But my point is not to challenge the idea that there should be more entrepreneurs. It is how to get there that I have questions about.

More specifically, my doubts are about the new trend of creating university-based incubators, US-style, in the universities of developing countries. These incubators are taking the place of 'Placement Offices', or what was euphemistically called the 'Industry Collaboration Office', becoming the last mile of the student's life cycle in a university or a business school.

The idea behind these incubators is to replicate the successes of the incubators in the top universities of the world. They are inspired by the stories coming out of the likes of Stanford and MIT. The governments are excited too, and treat the incubators as solutions to the jobs crisis on their hands. The trouble is, the universities in the developing world, particularly those in ex-colonies, are very different institutions from the American ones, and they are hardly designed to be hotbeds of innovation.

It is a mistake to see all universities as the same, when the Colonial University was set up with the very purpose of standardisation, of connecting colonial education to colonial employment. Indeed, the countries are now free, but most of them have maintained their colonial institutions and see modernity in the continuity of the traditions bestowed upon them by the Colonial administrations. This was specifically the intention of the British administrators, who appreciated the value of soft power long before the term was coined. And, among the institutions of the Colonial age, the universities were the most revered, seen as gifts of science and reason, an intimate ally of the modernising politicians who took over the running of these countries after the Colonialists left.

The universities, therefore, are factories for creating servants of the state. The whole university culture, with the possible exception of some elite technocratic institutions set up post-independence in some of these countries, is usually deeply rooted in the desire to maintain bureaucratic continuity, rather than to disrupt and innovate. Their students come looking for a qualification that will lead to a job, and their aspirations are more narrowly defined than those of their counterparts in metropolitan nations. The idea of the university as a fountainhead of innovation, therefore, stands on a false premise.

In a way, university-based incubators work against the grain of their host societies, where innovation mostly happens outside the universities. They also impose assumptions which are alien and unworkable, like a bias towards younger entrepreneurs, even though family support structures are different in many developing countries and people starting an enterprise at a relatively later stage of life are far more common. Indeed, investors sometimes work with assumptions they learned from American business schools, and override the considerations of the local labour market and society. But this is part of the problem, not a justification of a wrongly designed system.

In my mind, there are two things that need to happen. One, and this is close to my heart, is to create Enterprise Schools, built upon a culture of entrepreneurship, which will attract a specific kind of person and support them through a longer development cycle. Two, and this is perhaps more scalable, while the incubators may remain university-based - if only because of the lower real estate costs - they should mandatorily create mixed cohorts, drawing from the outside population and particularly including people who already have work experience.

In summary, my recommendation is that the incubation model needs to be reinvented for developing countries, rather than the plug-and-play approach that is now prevalent. This needs a conversation, not blind faith. Enterprise is not a straightforward solution to the jobs problem: it requires changing markets, newer opportunities and upsetting existing corporate primacy, and this, before everything else, needs an opening of minds and engagement at a different level.



Tuesday, July 11, 2017

What Does A Tech Mahindra Phone Call Say About The Indian IT Industry

Last week, a voice recording of an HR executive firing an employee at Tech Mahindra, a big Indian IT company, went viral. The employee was told that he was being fired not for any performance issue, but for 'cost optimisation'. He was told to resign by the end of the day, failing which he would be terminated the next day, losing all his exit benefits and not even getting a reference. When the employee pleaded that the notice was too short, he was told that the company could fire him summarily. When he sought an option to appeal, he was told there was none.

After this went viral, many weighed in, converging on the consensus that while the company might have the right to fire the employee, the manner was all too harsh. As for me, I thought it was coercive, and therefore illegal: I can't see how a company can fire an employee on disciplinary grounds because he failed to resign as told. In America, the aggregated claims of all employees fired in this manner would have made a multi-million dollar class action lawsuit.

Anand Mahindra, the Chairman of the company and a business leader who maintains an enlightened image, was quick to issue an apology on Twitter. His senior colleagues followed, in a damage-control exercise. It is not known whether anyone has actually been disciplined or fired for this stupidity.

The essence of these apologies was that the manner of the firing was harsh, which it undeniably was. However, the commentary that followed accepted the firings themselves as inevitable. The narrative coming out of Indian IT companies is that they have been caught out by a 'convergence' of several factors - automation, productisation, protectionism - and that their business models are changing. They hope to become more nimble, move up the value chain and come up with innovative solutions. These firings, harsh as they may be, are steps towards that better, brighter future.

This narrative is of course going nowhere, as the call shows. Legalities aside, anyone listening in on that call can't miss the contempt with which the employee was treated. At one point, he was told that he obviously couldn't appeal to the CEO (the question is, why not?). This is the layering of disdain that one sees on Indian streets - the guys in the big cars treat the guys in small cars with contempt, who in turn treat the scooterwallah with contempt, who then treats the pedestrians with contempt, and so on. Of course, Tech Mahindra can't become a magnet for world-class talent tomorrow just by firing a few unfortunate employees at the bottom of the food chain. If spreadsheet savvy created great companies, the world would be a different place today. Clearly the company treats its employees like cattle, and it is going nowhere with that culture.

Besides, the would-be super-innovator also seemed to have no idea about social media. Otherwise, why would it let loose an obviously untrained and emotionally deficient HR exec in a bullying match with its employees? Before they unleashed the 'best practices' in firing that they may have learned from some American company they love to ape, why did they not notice that there is an entire cottage industry of 'how to fire people' in America? The obvious answer is that they did not think about it. That should tell their customers how much they really understand about the world of social technologies.

The PR exercise that the senior execs are doing won't save the company, as it will only obscure the broader issues of commitment and culture. Nothing changes in a big company unless the share price plummets or the customers vote with their feet. The former will not happen because the spreadsheet boys will speak to the spreadsheet boys, buy their theory of 'convergence', and miss the signs of decay. The latter will not happen either, because the American customers were treating those Indian IT workers with funny accents as cattle in any case, and wouldn't care if a few thousand were fired. Until, indeed, the whole edifice comes crashing down again.

The Eurasian Moment in World Politics

The world of politics is changing profoundly. It is not just about the rise of the strongman rulers - President Xi of China, Prime Minister Abe of Japan, Prime Minister Modi of India or President Duterte of the Philippines - or their perennially ubiquitous counterparts in Mr Putin, Mr Erdoğan, Mr Netanyahu and Mr Zuma. The shift we are seeing is more than the shocks, such as Brexit or a Trump Presidency, or the ascendance of extreme nationalists like Marine Le Pen in France, Geert Wilders in the Netherlands or Norbert Hofer in Austria. The anti-Semitic rallies in Poland, the authoritarian Viktor Orbán in Hungary, the absurd Beppe Grillo in Italy and the abhorrent Golden Dawn in Greece are all part of a big shift, which is not just about the rise of nationalism and the breakdown of post-war institutions. There may be a more fundamental shift underway.

Discussion of such a shift is not new; it has been going on in scholarly circles for some time. But, since last year, it has reached the mainstream media, for good reason. It does seem that the anticipation of such a shift is now central to strategic decision-making in various large countries, including Russia, Germany, China and Turkey. And, after Trump's ascendance to the Presidency, it has become one of the key factors in strategic decision-making even in the White House.

I am referring to the shift of power from the Atlantic Seaboard to the Eurasian plain, something the nineteenth-century British geo-strategists foresaw. That their vision did not come to pass is perhaps because of the rise of America as a global power in the dying years of the nineteenth century, when American industrial might and the American military's ability and willingness to engage changed everything, followed by the Great War, the Russian Revolution and the subsequent dividing lines drawn through the world. Eurasia faded out of the spotlight as a strategic theatre as Europe emerged.

Indeed, this was not just a twentieth-century affair: Eurasia had dominated world history ever since the decline of the Romans, but its relative decline started with improvements in long-haul shipping and the voyages of Columbus and Vasco Da Gama. It was back in contention in the nineteenth century, with the Russian and British empires jostling for influence, until the Americans entered the fray (after a deeply divisive national debate) and changed everything. For the next hundred years or so, American power, primarily represented by the overwhelming power of its carrier groups, dominated the world. The unfortunate Eurasian expedition by the Russians into Afghanistan ended badly, and led to the breakdown of that empire.

There are several reasons to think this may now change. The global nature of American power is not well supported by shared prosperity at home, and domestic considerations may force a disengagement from wider global policing in favour of limited and specific engagements required for the 'national interest'. In many ways, this is a result of the over-reach of the Bush years and the consistent foreign policy failures under Obama, when America's overseas engagements became costly and meaningless. 'Isolationism', if we call it that, was always a force in American politics, but George W Bush's adventurism and Obama's indecision have so undermined the case for 'interventionism' that the former now makes sense to most Americans.

The change we see does not undermine the United States, which still controls the world's most powerful military and remains the biggest economy. It does, however, mean its disengagement from Europe and greater engagement in Eurasia. It also means an economic revival of the Eurasian region, as President Xi builds infrastructure and brings manufacturing and trade to inner China. And it means a great human movement, as Global Warming melts the Siberian ice cap and some of the great rivers running through South and South-East Asia start faltering (indeed, global warming may also mean that some coastal cities are completely lost).

From the vantage point of the Trump administration, which wants to reduce global engagements and restructure the American economy and society, such a shift is only problematic if one clings to the dated geo-politics of the post-Cold War world. They, along with many other nations, are adjusting to this new geopolitical reality. In a perverse way, Britain's shift - from Europe to the old Commonwealth - is also a pivot in this direction. Germany, with its greater engagement with China's OBOR, is already signalling its understanding of this shift.

I believe this shift is real, not just because of the geo-political logic but also because of the conscious actions of countries and leaders. There are countries which are blissfully oblivious - India seems to be one of them - while others, such as Britain, are scrambling as they see themselves losing out. We may be at a moment that comes once in many centuries, the turning of a long-term trend visible only from a long-view vantage point. This will impact not just politics - though that may be where it starts - but business, economies and the lives of people.

Monday, July 10, 2017

Ideas and Ideology

Ideas are fascinating and exciting. We live in a culture that celebrates ideas. In a sense, we now see all history as the history of ideas. It is ideas that make men great, and the great men are those who labour with ideas, either to bring them into being or to create impact with them. Entrepreneurs, our modern Heroes, are the idea-warriors, who put everything at stake to make their ideas work. Ideas, in short, are divine inspirations, whose blessing we all seek and whose existence makes us meaningful.

But there is a dark side to ideas, which never gets talked about. All the monstrosities of the last two hundred years have been committed in the name of ideas. And, indeed, if one counts religion as an idea, the history goes back much further. Just as we transformed the Great Men doctrine into a narrative of great ideas, we should also perhaps replace our evil men doctrine with a narrative of bad ideas.

However, I anticipate an objection: Many ideas which turned out to be pure evil did not appear so at first. It takes a purely evil man, such as Hitler, to make an idea, such as Race Theory, really evil. And thereon follows the usual Liberal vacuity: No ideas are inherently great or evil; it's what men make of them!

That is all nonsense. Ideas don't exist independent of men. We may make an idea sound like an object in itself, but ideas are really words and actions coming from people. They have no separate existence. And, besides, the notion that all great men are men of great ideas and yet an idea needs evil men to become evil is the have-your-cake-and-eat-it-too option.

It is time we had a reasoned debate about the downside of ideas. At every crisis point of history, this was quite obvious. For example, the Pragmatists in the United States, writing after the horrors of the Civil War (in which Oliver Wendell Holmes fought), understood it perfectly: "Ideas should not become ideology", as John Dewey would later maintain. Stalin and Mao took the idea of a perfect society just too far. But these are only the well-known examples. Untold crimes have been committed in the British Empire, the Commonwealth countries, the United States and other parts of the world, in the name of ideas. The modern state, all-seeing and all-powerful, has inflicted upon its people all kinds of forced behaviour, in the name of national interest and the common good. Austerity, a recent idea, which argues that the state should live within its means (though that does not apply to defence expenses or things like Monarchical maintenance), has also been taken to the extreme, yet has avoided scrutiny. When things have gone wrong, someone has fallen on his sword, but the idea has lived on.

Why do I write about this now? Because ideas are seductive, and the perfectibility of human beings is not monopolised by Dictators. These assumptions sit under every policy document, every technology business plan, every business school, every self-development formula, and the claims of theory, science and technology. They touch our daily lives every moment, and most of our lives are lived within the matrix of options set up by ideas of perfectibility and neat behaviour. And this idea is not just a passive framework: It is actively, intrusively, ubiquitous. There are nations around the world - India among them - where the quest for the creation of a pure people is real: The Republican, Democratic constitution that the country was set up with is being torn apart in the search for pure 'Indianness', just as the Japanese, the Chinese, the British, the Polish and the Hungarians have set upon similar journeys. The ideology of ideas is all-encompassing and inescapably alluring.

While I argue against the purity of ideas, the alternative, I am told, is relativism: if you don't believe in an idea, then you are a drifter, without roots, without a truth. But this, again, is the same fallacy of the purity of ideas, as if the Truth exists outside human consciousness. As we build our world, it is best to acknowledge our role in it; to accept that life isn't perfect and that our standards are largely defined by circumstances. Variability and malleability are the only truths of human existence. And so it should be.

Therefore, it is sensible to keep Dewey's dictum in mind: Ideas should not become ideology. We are better as observers than as judges; flexibility is an inevitable aspect of human existence. We are beings in time, our consciousness is fragile, temporal and grounded: So is our knowledge. If a bigger truth exists, the best we could do is to be sceptical about it and search for it, but never, never, never should we pretend to have found it.




Thursday, July 06, 2017

Evolution of Meritocracy: American Eugenics, Intelligence Testing and The Making Of Modern Meritocracy

Introduction 

In the second decade of the new millennium - now - new questions about human abilities and human worth have arisen. A vast industry of computerisation and the gradual rise of ‘machine intelligence’ have challenged the prospect of an ever-improving urban middle-class life, replacing a vast number of secretarial, administrative and other ‘middle ability’ jobs with computer programmes, cheap workers overseas and, increasingly, robots. Stagnant wages, disappearing jobs and the breakdown of the ‘American Dream’ in its many global variants have led to a new ‘struggle for existence’ in the workplace.


This technological phenomenon has also meant an inversion of the roles of Capital and Labour in the production process. With the decline of large factories and their unionised workforces in the West (replaced by large factories and their non-unionised labour in China and Indonesia), and with most people turned into keen consumers of the latest gadgetry, collective bargaining has fallen out of popular favour, and a new hero, the billionaire entrepreneur, has captured people’s imagination. With political acquiescence, falling taxes have accompanied rising corporate profits, and returns on wealth have far surpassed the growth in wage income, leading to unprecedented and ever-increasing levels of inequality. The rationale of this ‘winner takes all’ society is underpinned by a worship of the ‘smart’: an ethic of outsized reward for intellectually gifted individuals, sorted through a selective system of education and economic competition.

Michael Young, the British Socialist education thinker, created an odd term - ‘meritocracy’, mixing a Latin root with a Greek suffix - to paint a deliberate dystopia set in 2034. Meant as a critique of the British Education Act of 1944, which attempted to sort British society into selective Grammar Schools and non-selective Secondary Moderns on the basis of aptitude, tested nationally at age eleven, Young’s ‘Rise of Meritocracy’ ends badly, with a revolt of ‘the populists’. Written in the embarrassing shadow of Nazi eugenics, which seemed to have put the conflation of human intelligence and human worth on the wrong side of public opinion, Young’s dystopia was never supposed to come to pass. Yet, sixty years on, ‘meritocracy’ has become one of the key organising principles of society, particularly in America.

The Atlantic Monthly reports that ‘American society increasingly mistakes intelligence for human worth’, pointing out:

As recently as the 1950s, possessing only middling intelligence was not likely to severely limit your life’s trajectory. IQ wasn’t a big factor in whom you married, where you lived, or what others thought of you. The qualifications for a good job, whether on an assembly line or behind a desk, mostly revolved around integrity, work ethic, and a knack for getting along—bosses didn’t routinely expect college degrees, much less ask to see SAT scores. As one account of the era put it, hiring decisions were “based on a candidate having a critical skill or two and on soft factors such as eagerness, appearance, family background, and physical characteristics.”

The 2010s, in contrast, are a terrible time to not be brainy. Those who consider themselves bright openly mock others for being less so. Even in this age of rampant concern over microaggressions and victimization, we maintain open season on the nonsmart. People who’d swerve off a cliff rather than use a pejorative for race, religion, physical appearance, or disability are all too happy to drop the s‑bomb: Indeed, degrading others for being “stupid” has become nearly automatic in all forms of disagreement.

The same article mentions the ‘Darwin Awards’, which originated from a Usenet newsgroup in 1985 to ‘commemorate those who improve our gene pool by removing themselves from it’, and are ‘conferred’ on individuals who died while attempting something ‘stupid’, such as trying to climb out of one’s bedroom by an Ethernet cable. The excesses of Nazi racial policies might have been crucial in establishing the moral element of the Allied victory in the Second World War, but the modern fetish for the ‘smart’ makes this apparently cruel award appear humorous. The invocation of Darwin and genetic science, an idiosyncratic and seemingly harmless twist of popular culture, is symptomatic of the ‘episteme’: referencing science to make the meaningless respectable.

This essay is an attempt at a genealogical presentation of the modern idea of ‘meritocracy’, particularly in America, and of an institutional innovation that underpins its love for the ‘smart’: the SAT (originally the Scholastic Aptitude Test). The science of intelligence is hotly contested and has gone through several cycles of claims and debunking, but the SAT represents an institutional innovation that continued to exist and advance regardless of the state of scientific knowledge, while at the same time deriving legitimacy from scientific methods and practices. Created at the bidding of powerful men and institutions, it drew on the heritage of IQ testing, and yet it successfully relabelled itself when IQ testing fell out of favour. With the success of American commerce and American technological innovation, the SAT has become a global shorthand for meritocracy, enabling the rise of a testing-and-education industrial complex and spreading the ethic of ‘meritocracy’ globally.

The SAT is clearly an institution far more revered and consequential than the Darwin Awards, but it is essentially built on the same three elements: the idea of ‘aptitude’ (a shorthand for ‘intelligence’, invented, as will be discussed later, for the sake of public opinion) as a biological attribute of the individual; the popular understanding of Darwinian ‘Natural Selection’; and legitimation through science and scientific methods. This essay will present the narrative of the SAT in context, from its origins in Eugenics and the modern science of ‘intelligence’ testing to its current positioning as a universal shorthand for ‘merit’, exploring how a Eugenicist artefact, based on questionable scientific claims, has become hegemonic and come to provide justification for an unequal society today.

A Darwinism Without Darwin 

Charles Darwin wrote to Francis Galton, his half-cousin, on 23rd December 1869, on reading the latter’s Hereditary Genius (not fully, but only the first fifty pages at the time of writing, by Darwin’s own admission):


I do not think I ever in all my life read anything more interesting and original--and how well and clearly you put every point! ….. You have made a convert of an opponent in one sense, for I have always maintained that, excepting fools, men did not differ much in intellect, only in zeal and hard work; and I still think this is an eminently important difference.

In the book, Galton attempted a statistical analysis of data from biographies and biographical dictionaries of great men to show that eminence was measurable and inheritable. Galton was not the first to study genius and personality: Raymond Cattell, one of the later leading lights of the field, would describe the development of ‘human knowledge about personality’ in three historical phases, starting with a literary and philosophical phase, followed by a ‘proto-clinical phase’ of studies of mental illness, and finally the quantitative and experimental phase that Galton’s work ushered in. His selection of ‘eminent men’ was questionable, and despite claims of objectivity, it was influenced by archetypes of the Romantic Genius (Galton excluded a number of practical men of eminence, like politicians and civil servants). But his work was groundbreaking in its use of ‘multivariate analysis’, a statistical technique later perfected by Charles Spearman, which combines different and multifaceted observations into an aggregate measurement of a single underlying cause.
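To make the technique concrete, here is a minimal sketch - with invented scores, and numpy standing in for the hand calculations Galton, Pearson and Spearman actually performed - of how several correlated test observations can be collapsed into one aggregate measurement:

    import numpy as np

    # rows are individuals, columns are three hypothetical mental tests
    scores = np.array([
        [60, 65, 58],
        [72, 70, 75],
        [55, 50, 52],
        [80, 85, 78],
        [65, 60, 66],
    ], dtype=float)

    # standardise each test, then take the leading eigenvector of the
    # correlation matrix as the single aggregate 'factor'
    z = (scores - scores.mean(axis=0)) / scores.std(axis=0)
    corr = np.corrcoef(z, rowvar=False)
    eigenvalues, eigenvectors = np.linalg.eigh(corr)  # eigenvalues ascending
    loading = eigenvectors[:, -1]                     # vector of the largest one
    factor_scores = z @ loading                       # one number per individual
    print(np.round(factor_scores, 2))

The leading eigenvector plays the role of the single underlying ‘cause’; nothing in the arithmetic, of course, guarantees that such a cause really exists - which is rather the point of the debate that follows.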

Darwin was, however, no ‘convert’. His own thought about the nature of intelligence, as he pointed out in the letter, was in conflict with Galton’s idea of determination by inheritance. As Howard Gruber maintains, “(f)or adaptive behavioural change to precede and influence structural change, it is necessary that previously inherited structures do not completely determine behaviour.” However, beyond the questions of inheritance, there was a deep philosophical difference between Galton’s thesis about ‘genius’, which would underpin the Eugenics movement he was to conceive, and Darwin’s idea of the ‘Struggle for Existence’, which is particularly relevant to the present discussion.

While Darwinian theory shared a vocabulary with earlier thought about the ‘Struggle for Existence’, and Darwin famously and self-reportedly got his inspiration from reading Malthus, Darwin’s conception of ‘struggle’ diverged from these earlier ideas, which held that the ‘struggle for existence’ works to preserve the ‘integrity’ of the species by weeding out the weak and benefitting the strong. This notion of ‘struggle’ preserving the basis of a species was, in fact, anti-evolutionary, supporting the view maintained by John Crawford of the Ethnological Society of London: “Nature, in some cases, takes some pains for preserving the integrity of the species but never for its improvement by mutation.” The Malthusian struggle for existence, and Spencer’s ideas about the preservation of the ‘type’ of the species, were set in this tradition. Darwin, while using the shared vocabulary and metaphor, differed considerably:

In contrast to Spencer, Darwin thought evolution was more than the realisation of the archetype. By rooting the process of evolution in organic variations, he suggested that the notion of an ‘ideal’ or ‘type’ of a species was, in any case, nonsense. The ‘unfit’, in the sense of the variation from a supposed archetype of a species, might very well become the successful progenitor of a new one. A species could only be measured by its ability to propagate its kind, not against any idealised version of its essence and character.

Galton understood that the Darwinian ‘Struggle for Existence’ meant the poorer classes, ‘classes of a coarser organisation’ as he called them, would be favoured for their higher fertility rates, and his position was thus antithetical to Darwin’s idea of ‘the fittest’. Darwin, on the other hand, while appreciating the role of intellect in human evolution, pointed out that it was erroneous to believe that

there is some innate tendency towards continued development in mind and body. But development of all kinds depends on many concurrent favourable circumstances. Natural selection acts only in a tentative manner. Individuals and races may have acquired certain indisputable advantages and yet have perished from failing in other characters.

However, Galton and his colleague, Karl Pearson, remained committed to the pre-Darwinian idea of the ‘Struggle for Existence’ improving the ‘type’ of the race. In many ways, Galton and the Eugenics movement belonged in the tradition of the ‘cerebral physiology’ and phrenology of Gall, which ‘attempted to link moral and social behaviour with certain physical or physiological features of man’. Galton’s work was also deeply influenced by Adolphe Quetelet’s, with its notions of observability and potentiality, and by the latter’s use of statistical techniques. Galton and Pearson refined these techniques and applied them to measuring ‘intellect’ - Galton focusing on eminent people and Pearson studying the average individual - and pioneered ‘the multivariate experiment’, which studied many mental factors at once.

Jones (1980) maintains that “Darwinism served in [Galton’s] work only to ‘modernise’ what even in the nineteenth century was regarded by many as an archaic pseudo-science of mind.” Darwin acutely realised that Galton’s concept of the ‘fittest’ was an argument against his own idea of ‘natural selection’ - in fact, Galton was arguing for the suspension of natural selection within human societies. Yet Darwin’s own arguments in the Descent of Man in favour of differentiating animals from human beings in terms of intellect, designed to appease the many implacable enemies of ‘natural selection’ on the ground of human dignity, created an apparent ground on which Galtonian arguments could be launched. The deep philosophical difference over the non-realisation of a ‘species type’ through evolution became a moot point in public discussions when compared with the issue of the primacy of Man due to ‘his intellect’. Galton’s work, therefore, remained within the tradition of Darwinian science, forever entwining the Eugenics movement with the name and prestige of Darwin, and influencing later developments in the quest for intelligence both with its statistical techniques and with its assumptions about who the ‘fittest’ might be.

Intelligence and Ordering of Society 

The standard tool for measuring intellect - ‘intelligence’, as it would now be called - was pioneered by a Frenchman, Alfred Binet, Director of the Psychology laboratory at the Sorbonne. Binet’s first attempts at measuring intelligence were along the lines of the Physical Anthropology school, following Paul Broca’s methods of measuring skull sizes and correlating them with intellectual capacity. However, Binet’s initial studies produced only small differences, and he became aware of his own suggestibility while doing these experiments: his measurements of brain size shrank when he knew the subject was supposedly less ‘intelligent’. In 1904, when Binet was commissioned by the Minister of Public Education to identify the children who needed special education in schools, he spurned craniometry and instead devised a set of tests designed to measure the ‘mental age’ of children. Binet’s tests were a series of defined tasks of progressively greater difficulty, each level associated with a mental age. Children were assigned a mental age based on the highest level of tasks they could perform, and by subtracting the mental age from the chronological age, Binet developed his famous ‘Scale’ (the children with the largest gap between mental and chronological age needed the most support). After Binet’s death, the German psychologist W Stern modified the technique, dividing the mental age by the chronological age rather than subtracting the former from the latter - an important difference, as the children with the lowest mental age now had the lowest score, rather than the highest, as on Binet’s Scale - and this new score was called the Intelligence Quotient, or IQ.
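The difference between the two scoring schemes is easiest to see in a minimal sketch (the ages are invented, and the multiplication by 100 is the later reporting convention, not Stern’s original formula):

    def binet_scale(chronological_age, mental_age):
        # Binet: a larger gap between chronological and mental age
        # signalled a greater need for support
        return chronological_age - mental_age

    def stern_iq(chronological_age, mental_age):
        # Stern: divide rather than subtract, so the lowest mental age
        # now yields the lowest score
        return 100 * mental_age / chronological_age

    print(binet_scale(10, 8))  # 2 on Binet's Scale
    print(stern_iq(10, 8))     # 80.0 as an Intelligence Quotient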


Binet was all too aware of the limitations of the tests he devised, and insisted on three principles regarding the use of his tests, as Gould (1981) summarises:

  1. The scores are a practical device; they do not buttress any theory of intellect. They do not define anything innate or permanent. We may not designate what they measure as “intelligence” or any other reified entity.
  2. The scale is a rough, empirical guide for identifying mildly retarded and learning-disabled children who need special help. It is not a device for ranking normal children.
  3. Whatever the cause of difficulty in children identified for help, emphasis shall be placed upon improvement through special training. Low scores shall not be used to mark children as innately incapable.

These principles were at risk immediately after Binet’s death, as in the naming of the Intelligence Quotient, which inverted the Scale; and they were completely lost on H H Goddard, who introduced Binet’s tests to America and ‘reified’ their scores as innate intelligence. Goddard, a former school-teacher who was by then a devoted Mendelian, was studying ‘feeble-mindedness’ through a series of field studies in collaboration with Elizabeth Kite. In 1912, Goddard wrote a book - The Kallikak Family: A Study in the Heredity of Feeble-mindedness - to set forth his appeal to the public for improving the ‘racial type’ by identifying the feeble-minded and discouraging their propagation. In this book, which would become one of the most popular tracts of American Eugenics, Goddard brought together a host of ideas: “Binet’s Measurements; Mendel’s Laws; Galton’s calls for an experiment with natural controls; and Kite’s reports from the field.” Goddard popularised Binet’s work in America, translating his works and advocating their general use for Eugenic purposes. Ignoring Binet’s warnings, though, Goddard used the scores as a measure of innate intelligence and developed a ‘unilinear scale of intelligence’, classifying everybody, but with the specific purpose of recognising, limiting, segregating and curtailing the breeding of the ‘feeble-minded’. This was also to serve Goddard’s idea of ‘democracy’, which meant
that the people rule by selecting the wisest, most intelligent and most human to tell them what to do to be happy. Thus democracy is a method for a truly benevolent aristocracy.

Goddard’s enduring legacy in the popular culture is a word that he invented to describe the ‘feeble-minded’, Moron, but by 1928, Goddard’s ideas had changed, and he, more in line with Binet, believed that feeble-mindedness is not incurable and that the feeble-minded did not need to be segregated in institutions.

While Goddard introduced Binet’s tests to America, Lewis Terman, a Professor at Stanford, was their main populariser. Terman extended Binet’s tests to include ‘superior adults’ and created the new Stanford-Binet tests in 1916. Through testing and elimination, Terman created a standardised system in which an average child would be expected to score 100 (the level at which mental age equals chronological age), with a standard deviation of 15. This became the benchmark of all IQ testing ever since, with other test providers calibrating their tests against the Stanford-Binet without questioning its assumptions about how intelligence was defined and measured. The tests were extended to everyone, and they became hugely consequential in people’s lives, not just in school choice or employment, but in literal life-and-death matters: in some states, people with an IQ lower than 70 were exempted from capital punishment. Terman wanted to use these tests to bring “tens of thousands of these high grade defectives under the surveillance and protection of society. This will ultimately result in curtailing the reproduction of feeble-mindedness and in the elimination of an enormous amount of crime, pauperism and industrial inefficiency.” Terman promoted these tests as National Intelligence Tests and argued for universal testing, which, he claimed, would eliminate vice and crime and save the United States $500 million a year. Terman also studied the geniuses of the past, not unlike Galton and Pearson, and created a eugenic vision of technocracy - not a socially mobile one, but one defined by existing class and race prejudices, which Terman accepted as a given, the result of innate intelligence rather than something that could be upset by testing.
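The standardisation described above can be sketched as a simple rescaling. The raw scores below are invented, and the 1916 test itself worked with ratio IQs; this deviation-style arithmetic is a reading of the mean-100, deviation-15 convention the text describes, rather than Terman’s own procedure:

    import statistics

    raw = [12, 18, 25, 31, 36, 41, 47]   # hypothetical raw test scores
    mu = statistics.mean(raw)
    sigma = statistics.pstdev(raw)       # population standard deviation
    iq = [round(100 + 15 * (x - mu) / sigma) for x in raw]
    print(iq)                            # centred on 100 with a spread of 15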

Walter Lippmann, in a prescient critique of Terman’s endeavours, wrote:

The danger of the intelligence tests is that in a wholesale system of education, the less sophisticated or the more prejudiced will stop when they have classified and forget that their duty is to educate. They will grade the retarded child instead of fighting the causes of his backwardness. For the whole drift of the propaganda based on intelligence testing is to treat people with low intelligence quotient as congenitally and hopelessly inferior.

Lippmann’s fears have been realised, as is seen in The Atlantic article quoted above. However, Terman’s idea of universal testing was only partially realised through the endeavours of Robert M Yerkes, who convinced the United States Army to use IQ Tests for all its new recruits during the First World War, testing over 1.75 million people and classifying them, according to their IQ, to frontline or officer roles. The Army IQ Tests, despite being flawed institution because of its questionable methods and doubtful outcome (an average mental age of 13, for example) , were politically significant and was used to argue for restricted immigration for certain types of people (Eastern and Southern Europeans). However, the vastness of the Army IQ Tests produced enough data for its leading practitioners to reconcile their theories with empirical evidence, and despite several attempts, some sincere and others ingenious, many of them came to reverse their positions on the views they had earlier defended. One great example of this was C C Brigham, an Assistant Professor of Psychology at Princeton and one of Yerkes’ key associates in Army IQ Tests. Brigham wrote a book, A Study of American Intelligence, attempting to use racial arguments to justify the results of Army IQ Tests, only to recant it later and argue

Most psychologists working in the test field have been guilty of a naming fallacy which easily enables them to slide mysteriously from the score in the test to the hypothetical faculty suggested by the name given to the test. Thus, they speak of sensory discrimination, perception, memory, intelligence, and the like while the reference is to a certain objective test situation.

At the finest hour of IQ testing, its leading practitioners discovered an uncomfortable truth: far from being the objective reality they had believed in, ‘intelligence’ is simply whatever the intelligence tests measure, as E G Boring, one of Yerkes’ key assistants, famously stated.

‘A Natural Aristocracy Among Men’ 

When James Bryant Conant, President of Harvard, set out to reform Harvard in his first academic year, 1933-34, he began by creating a new kind of scholarship. Until this time, Harvard was a bastion of rich young students from private schools in New England, who lived in private apartments, often with a full retinue of servants and other attendants. The scholarships Harvard offered did not include accommodation, and were based on financial as well as academic criteria: this meant most scholarship students were day scholars from Boston, who lived with their parents and often had to leave the college if their academic performance fell short. Conant wished to change this and attract the best students from all over the country, rather than just New England. To achieve this, he wanted to design a new, full four-year scholarship that included room and board, with minimal conditions and no work requirements. Conant wanted to change the idea of a scholarship from a ‘badge of poverty’ to a ‘badge of honour’: a rich student, if he won the scholarship, would be declared a winner but given no money.


The problem with Conant’s expansive vision was in the way Harvard selected its students then: Through a set of tests set by College Entrance Examination Board, which focused exclusively on the mastery of the New England Boarding School curriculum. They were unusable for selecting public school students from Midwest, who Conant wanted to bring to Harvard. He, therefore, had to set the task of finding an appropriate test for two of Harvard’s Assistant Deans, Henry Chauncey, the future President of Educational Testing Service and the face of SAT in America, and Wilbur J Bender. Chauncey was already a convert to the idea of testing, particularly after attending a lecture at Harvard by William Learned, who was conducting an Eight Year study on behalf of Carnegie Foundation for Advancement of Teaching in the Pennsylvania School System. Learned was not an IQ tester, but rather a believer of standardised achievement testing, which was effective within a particular schooling district, or even perhaps a state schooling system. Chauncey, however, wanted to create a Census of Abilities - a test for aptitude rather than achievement - and this led Chauncey and Bender to Carl Brigham, the associate of Yerkes in Army IQ Tests.

After the war, when Army IQ testing was over, the new market for IQ testing was schools and other educational institutions. Brigham and his colleagues metamorphosed the Army IQ test for this new market, relabelling it the Scholastic Aptitude Test (SAT), on the claim that these intelligence tests had a higher level of validity - the ability to predict an outcome, in this case first-year academic performance - than the College Board tests. The IQ testers claimed that intelligence testing would have a validity of .60 - a validity coefficient being the correlation between test scores and the outcome they are meant to predict - whereas the College Board tests managed only .20. When these claims failed to materialise, Brigham pointed to the ‘social distractions’ of high-living Princeton and Yale students to explain away the difference. The College Board and the Army started using the SAT from 1926, for admissions to West Point and to institutions run by the Navy.
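What a ‘validity’ figure of .60 or .20 amounts to can be sketched in a few lines: the correlation between admission-test scores and the outcome to be predicted. All the numbers below are invented for illustration:

    from statistics import correlation  # available from Python 3.10

    test_scores    = [480, 520, 560, 600, 640, 680, 720]
    first_year_gpa = [2.1, 2.8, 2.5, 3.0, 3.2, 2.9, 3.6]

    # the 'validity' of the test is simply this correlation coefficient
    validity = correlation(test_scores, first_year_gpa)
    print(f"validity coefficient: {validity:.2f}")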

By the time Harvard started speaking to Brigham about using the SAT, Brigham had had a change of heart about IQ testing, and had formally retracted the claims made in his popular tract, A Study of American Intelligence. He published a second book in 1932, A Study in Error, and was privately writing:

The test scores very definitely are a composite including schooling, family background, familiarity with English and everything else. The “native intelligence” hypothesis is dead

Brigham wanted to use the SAT as a ready method of interview rather than as a test of native intelligence. But Conant, who was no Eugenicist, believed in native intelligence nonetheless, and Harvard adopted the SAT for its new scholarship examination in 1934, gradually expanding the reach of the ‘Conant Prize’.
 
Conant’s vision, however, was far more expansive than the new Scholarship programme at Harvard, and he wanted to, as he pointed out in his essay in Harper’s Magazine in 1938, “The Future of Our Higher Education”. Conant drew his inspiration from Thomas Jefferson, more specifically a letter Jefferson wrote to John Adams in 1813:

For I agree with you that there is a natural aristocracy among men...There is also an artificial aristocracy founded on wealth and birth, without either virtue or talents;.... The natural aristocracy I consider as the most precious gift of nature for the instruction, the trusts, and government of society. And indeed it would have been inconsistent in creation to have formed man for the social state, and not to have provided virtue and wisdom enough to manage the concerns of the society.

Conant disregarded Adams’ alarmed response - “Your distinction between natural and artificial Aristocracy does not appear to me well founded...I only say that Mankind have not yet discovered any remedy against irresistible Corruption in Elections to Offices of great Power and Profit, but making them hereditary” - as his ideas were also deeply influenced by Frederick Jackson Turner, Historian of the Frontier, whose central idea, that once the open frontiers of the American West was settled into and expansion of opportunity had disappeared, the American society would atrophy into European style class society without social mobility. In Conant’s view, public education is the way to maintain the vitality of the American society. Conant wanted - as he wrote in a later, one of his more radical essays - government to confiscate all property from time to time, and unseat the traditional elite through a new elite chosen democratically through testing and education.

The cornerstone of Conant’s idea to bring about the ‘Natural Aristocracy’ was a merger of all Test agencies and creation of a single, national, test provider administering aptitude tests for college admissions. Ironically, it was Carl Brigham who was standing on the way, who, by then, had become opposed to testing as a sorting device. On January 3, 1938, Brigham wrote to Conant a remarkable letter, calling the Army IQ Tests ‘atrocious’ and painting a dystopian picture of the day when Intelligence Testing would be ubiquitous:

If the unhappy day ever comes when the teachers point their students towards these newer examinations, and the present weak and restricted procedures get a grip on education, then we may look for the inevitable distortion of education in terms of tests.

Brigham would, however, pass away in 1943, at the age of fifty-two, and the unified testing agency he so opposed came into being as the Educational Testing Service (ETS), with Henry Chauncey as its President and James Bryant Conant as the Chair of its Board of Trustees, on 1st January 1948. ETS took over all College Board examinations, and the ACE Psychological Examination, the SAT’s main challenger, was discontinued by the American Council on Education soon thereafter. Though other local and for-profit test providers would continue to operate and compete with the SAT in different regional markets, the SAT’s pole position was guaranteed by its Ivy League credentials and government endorsements.

Despite a shaky start, ETS’ financial position was also secured by another nationwide testing operation for the United States military - the Selective Service System - which contracted ETS to administer the SAT nationwide to students in college. The idea was to leave students above a certain cut-off score in college and allow them to defer the draft, while sending the others out for military training. This was a test of enormous consequence, and it met great public opposition, including from none other than Conant, who believed in universal military training, without exceptions. ETS adopted clever public relations techniques, insisting that the SAT was not an ‘intelligence test’ but one of ‘Scholastic Aptitude’, and presenting the cut-off score as 50, to remind test-takers of a school grade rather than an IQ. These tests put ETS and the SAT firmly into the public imagination; more importantly, they were enormously profitable and secured ETS financially.

However, the final ‘victory’ of the SAT had to wait till 1958, when Clark Kerr, already a member of the Board of Trustees of ETS, became President of the University of California, the largest state university system in the United States. From 1958, ETS started offering the SAT at no cost to University of California applicants. In 1959, the UC system required all out-of-state applicants to take the test. But in 1962, the University dropped the SAT altogether, only to embrace it again when, in the wake of the California Higher Education Master Plan, which restricted university education to the top eighth of high school graduates, grade inflation took hold in high schools. The University stopped accrediting high schools from 1963, and by 1968-69, the SAT was required of all applicants to the university.

Conclusion: ‘An Oligarchy of Brains’

In 2015, more than 1.7 million students took the SAT examination, with another 3.8 million taking the PSAT, its practice version. The SAT has acquired such an aura that companies ask for SAT scores even from senior employees, sometimes in their 40s and 50s, who would have taken the test decades ago. Accepting the SAT as an admission criterion is no longer a choice universities and colleges can make on their own: the US News & World Report’s annual college rankings, the guide middle-class parents in America depend on for school choice, automatically downgrade an institution on student selectivity, an important criterion, if it does not ask for SAT scores.


This acceptance does not mean that the SAT has proved itself accurate in predicting test-takers’ ‘scholastic’ performance. Quite the contrary: its validity, the ability to predict academic performance, has dwindled, with one study putting the SAT’s ability to predict grades at about 15%. Its name changes - from Scholastic Aptitude Test to Scholastic Assessment Test to the current SAT Reasoning Test - were primarily meant to reflect the growing modesty of its claims, though the popular acronym, SAT, was always maintained as the shorthand for ‘merit’ in public perception. Lani Guinier (2015) called the SAT “the Wealth Test”, and this is well reflected in the table below:
Gross Annual Family Income    Average SAT Score (out of 2400), 2013 College-Bound Seniors
$0 - $20,000                  1326
$20,000 - $40,000             1402
$40,000 - $60,000             1461
$60,000 - $80,000             1497
$80,000 - $100,000            1535
$100,000 - $120,000           1569
$120,000 - $140,000           1581
$140,000 - $160,000           1604
$160,000 - $200,000           1625
More than $200,000            1714
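As a rough sketch of the gradient in the table above, one can correlate the bracket midpoints with the average scores; the midpoint assigned to the open-ended top bracket is necessarily a guess:

    from statistics import correlation, linear_regression  # Python 3.10+

    midpoints = [10, 30, 50, 70, 90, 110, 130, 150, 180, 250]  # $ thousands;
                                                               # 250 is an assumed
                                                               # top-bracket midpoint
    scores    = [1326, 1402, 1461, 1497, 1535, 1569, 1581, 1604, 1625, 1714]

    r = correlation(midpoints, scores)
    slope, intercept = linear_regression(midpoints, scores)
    print(f"r = {r:.2f}, roughly {slope:.1f} extra points per $1,000 of income")

The near-monotonic rise across every bracket is the point: whatever else the test measures, it tracks family income remarkably well.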

This may be so not just because of the inherent bias in the tests, of which Brigham was so acutely aware, but also because of the enormous test preparation industry that has grown around the SAT. The pioneer here was Stanley Kaplan, the Jewish entrepreneur who debunked ETS’ claim that the SAT was not coachable by building a billion-dollar test-prep business around it. Wealthy parents today, considering the SAT a fail-safe ticket in education and career, spend $20,000 to $30,000 a year on SAT preparation for their children. The SAT has also become a racial sorting mechanism, with African-American students averaging a score of 1278 against White students’ 1576 (out of 2400, 2013 data).

In conclusion, the SAT appears to have become the Trojan horse of Eugenics in society, embedded within Jefferson’s, and Conant’s, lofty dream of a ‘Natural Aristocracy’, and somewhat reaffirming Adams’ weary scepticism about the human tendency to make advantages hereditary. The economist Gregory Clark, in an inversion of Galton’s method, has shown how wealth has remained largely hereditary over the last eight centuries, undermining the moral claim of ‘meritocracy’. Yet, regardless of the evidence, the emergence of a ‘Cognitive Elite’ is now celebrated, and arguments about ‘love and marriage by IQ’ are respectable again.

Yet technological change, and the consequent disruption of middle-class life, opens up new questions about human abilities. Research by the psychologist Carol Dweck, for example, shows that children who grow up assuming that ‘intelligence’ is a fixed, biological attribute (a ‘Fixed Mindset’, Dr Dweck calls it) are less able to cope with change and uncertainty than children who grow up believing in zeal and perseverance (a ‘Growth Mindset’). Lani Guinier argues that ‘testocratic merit’ has undermined the ‘democratic merit’ of collaboration and conversation, urgently needed at a time when the quest for technological advancement may become self-defeating. In the end, Darwin may have had a point: in the quest to become good at one thing, we may have ignored other things crucial for our advancement, or even our survival.

(5425 Words)


Bibliography 

Primary Sources 

Darwin Correspondence Project, “Letter no. 7032,” http://www.darwinproject.ac.uk/DCP-LETT-7032 Accessed on 30th April 2017


Thomas Jefferson to John Adams, Letter, 28th October 1813, The Founder’s Constitution, Chapter 15, Document 61, from http://press-pubs.uchicago.edu/founders/documents/v1ch15s61.html

John Adams to Thomas Jefferson, Letter, 15th November 1813, The Founder’s Constitution, Chapter 15, Document 62, from http://press-pubs.uchicago.edu/founders/documents/v1ch15s62.html

Secondary Sources 
Books 
Cattell, Raymond B, The Scientific Analysis of Personality, Penguin Books, London, 1965
Clark, G, The Son Also Rises, Princeton University Press, Princeton, NJ, 2015
Dweck, Carol, Mindset: How You Can Fulfil Your Potential, Random House, New York, 2006
Gould, SJ, The Mismeasure of Man, W W Norton, New York, 1981
Gruber, H, Darwin On Man, E P Dutton and Co. New York, 1974
Herrnstein, R J and Murray, C, The Bell Curve: Intelligence and Class Structure in American Life, Free Press, New York, 1994
Jones, G, Social Darwinism and English Thought, The Harvester Press, Sussex, 1980
Lemann, Nicholas, The Big Test, Farrar, Straus And Giroux, New York, 1999
Shurkin, J N, Terman’s Kids, Little Brown and Company, Boston, 1992
Young, M, The Rise of the Meritocracy 1870-2033: An Essay on Education and Equality, Thames and Hudson, London, 1958
Zenderland, L, Measuring Minds: Henry Herbert Goddard and the Origins of American Intelligence Testing, Cambridge University Press, Cambridge, 1998



Websites and Periodicals 
Freedman, D, The War on Stupid People, The Atlantic, July/August 2016, Accessed from https://www.theatlantic.com/magazine/archive/2016/07/the-war-on-stupid-people/485618/ Accessed on 30th April 2017

Klein, R, More Students Are Taking SAT, Even As The Scores Fail to Improve, Huffpost Politics, September 4, 2015, Accessed from http://www.huffingtonpost.com/entry/2015-sat-results_us_55e751c6e4b0c818f61a56ce Accessed on 30th April 2017

Korn, M, Job Hunting: Dig Up The Old SAT Scores?, Wall Street Journal, March 25 2014 , Accessed from https://www.wsj.com/articles/job-hunting-dig-up-those-old-sat-scores-1393374186 Accessed on 30th April 2017
