Five decades after his death, Jawaharlal Nehru still generates passion among Indians who are animated by questions over the country’s future.  Writing from my perch in Los Angeles, I see, for example, nothing in American discussions of John F. Kennedy—another immensely charismatic leader of a democracy, assassinated just months before Nehru passed away in 1964—that remotely resembles the buzz that still surrounds Nehru.  We could say, of course, that Nehru was a titan among men, and that unlike Kennedy, who occupied the American presidency for less than three years, Nehru held sway as independent India’s first Prime Minister for seventeen years.  If Amartya Sen is to be believed, it is the argumentative Indian in every Indian that keeps Nehru visible to the Indian public, loathed and perhaps admired in equal measure.

To a great many of his detractors, Nehru is easily pigeonholed as a somewhat effete, Oxbridge-educated quasi-dreamer whose indecisiveness, socialist leanings, and moral high-handedness cost India its place in the pantheon of world powers. The principal elements of this narrative are too well known to require more than the briefest mention.  His disposition towards centralized state planning is said to have held India back for decades before the country, in the parlance of free-market cheerleaders, ‘opened’ itself up to the world:  thus the infamous ‘Hindu rate of growth’.  Even his defenders are constrained to admit that India’s record on various social fronts under Nehru is appalling, and one might cite the exceedingly slow progress in improving literacy as a prominent instance of misplaced priorities, though it must also be said that India’s record in the 25 years since neo-liberalization policies were introduced has, if anything, been worse.

Nehru’s political failings are described by his critics as even greater.   How often have we not heard that he faltered badly in turning the Kashmir dispute over to the United Nations?  We have been incessantly reminded that his naive trust in the Chinese, embodied in the slogan ‘Hindi chini bhai bhai’ [Indians and Chinese are brothers], was repaid with China’s invasion of India.  These criticisms are most often accompanied by jejune ramblings about how India would long ago have taken the world by storm had not fate unkindly intervened to remove India’s ‘Iron Man’, Sardar Patel, from the scene and thereby leave Nehru without a peer to question his dictatorial and yet highly confused exercise of authority.  (I wonder if it was the karma of nationalist Hindus that left them bereft of their leader, but we may leave that aside for the moment.)   Nehru’s defenders, on the other hand, appear to think that all they have to do is summon the fact that India has persisted as a democracy, a distinct achievement when we consider what has transpired in other countries that gained their independence from colonial powers.  His admirers naturally admit that his record in this respect is far from unblemished:  Nehru’s assent to the dismissal of the elected communist government of Kerala in 1959, and the take-over of Goa by the Indian government in December 1961, even if it was inspired by his distaste for the remnants of colonial rule in India, put a huge dent in his reputation as an upholder of the rule of law.  Nevertheless, the singularity of Nehru’s achievement in making India abide by democratic norms has been all too often stressed by his defenders.

Newspaper headlines on the Chinese invasion of India



Chinese Premier, Chou En-lai (left), with Nehru and an interpreter (right)


What, then, are we to make of Nehru?  Looking back at the twentieth century, there can be no doubt that in the aftermath of World War II, and in the period extending to the glorious defeat, by the (as more than one European social theorist would say) puny and rice-eating Vietnamese, of the mightiest military force ever assembled, decolonization was the greatest political force in the world.  Most Western, and especially American, historians have not seen it this way, obsessed as they are with the Cold War. The Cold War impinged on decolonization as well, since the Soviet Union, itself a totalitarian regime without an iota of capacity for tolerating dissent, was keen on being seen as a champion of the struggles of third world peoples against their imperial oppressors.  The United States, for its part, consistently came down on the wrong side of nearly every anti-colonial struggle, either choosing to remain on the sidelines or, more often, supporting the most reactionary elements.  What must be said unequivocally about Nehru is that he played a critical role in supporting anti-colonial and decolonization movements throughout the world.  India’s support of the African National Congress under Nehru is perhaps only the best-known instance of his principled support of resistance movements, but similar testimony has been furnished by other political leaders such as Julius Nyerere and Jomo Kenyatta.  His friendship with Paul Robeson, an African American artist of extraordinary gifts who was hounded by his own country’s political elites, is unfathomable except on the premise that both shared a profound aspiration for the freedom of all colored peoples.


Colonel Gamal A. Nasser of Egypt (left), Burmese Prime Minister U Nu (2nd from left), Nehru (2nd from the right), and an aide to Nasser, Major Salah Salem, ushering in the Burmese New Year in traditional costume at Bandung, Indonesia, April 1955.


Nehru Discovers an American to Admire: an article in the Chicago Tribune, 25 November 1958 (bottom left corner), about the growing friendship of Robeson and Nehru


We have heard in recent years that Nehru nurtured not just democracy but diversity.  We might inquire what precisely that means:  India is nothing if not ‘diverse’, but how exactly does one nurture diversity?  Does diversity increase if one nurtures it?  Those who are intolerant of minorities can justifiably be seen as not promoting ‘diversity’, but does the institution of multicultural policies, as in the United States, where the forces of homogenization are immense, lead to diversity?  Diversity’s advocates barely understand that today’s dictators are all required to undergo diversity appreciation courses.  One should say of Nehru, rather, that he was inspiringly ecumenical, and not merely a person of astonishingly wide reading with the proverbially insatiable curiosity of the polymath.  It is doubtful that a more ecumenical history of the world has ever been written than is contained in his Glimpses of World History (1934), a large collection of letters written by Nehru to his daughter Indira.  Letter 112 is almost apologetic about having given over more time and space to India than to other countries; and Letter 44 advises Indira that “we have the whole world to consider, and if a small part of it, even though it may be the part where we live, took up much of our time, we would never get on with the rest.”  Such ecumenism is not merely charming—it is a reminder of a cosmopolitanism that is still foreign to most people.  We have not yet begun to take the measure of the man known as Jawaharlal Nehru.

(Originally published as “The Measure of a Man” in the Sunday Times of India, 9 November 2014.)



Jeremy Seabrook, an independent writer, journalist, and chronicler of the human condition who has long had an interest in South Asia, probing especially the lives of those who inhabit India’s slums and Muslim ghettos, turns his attention in his most recent book to the workers of Bangladesh’s garment industries who clothe the world but, like the weavers of Bengal in colonial India, barely have enough to cover their own nakedness. The book takes its title from Thomas Hood’s elegy on the women workers in Lancashire’s textile mills, “The Song of the Shirt” (1843):  working “in poverty, hunger, and dirt,” moving their fingers to the command of “Stitch! Stitch! Stitch!”, they sewed at once, “with a double thread, A Shroud as well as a Shirt.”


If Bangladesh has emerged at all in the news in recent years, it is on account of the disasters that have befallen its garment industry, none as calamitous as the 2013 collapse of the Rana Plaza commercial building, under whose debris over 1,100 people were left dead.  Seabrook’s book is a searing indictment of the callousness of factory owners and others in the global system of the circulation of capital who are complicit in creating miserable working conditions for those employed in Bangladesh’s largest and most profitable industrial venture.  Yet it is also an extraordinary tribute to the workers who pile into Dhaka and other centers of the garment industry from all corners of the country.  Their lives are sketched not so much in detail as in poignantly suggestive prose.  Do people flee to that “washed out concrete jungle” that is called Dhaka, which Seabrook unflinchingly describes as one of the world’s ugliest cities, to escape the narrowing of human possibilities that Marx sought to capture in his brutal condemnation of the “idiocy of rural life”?  One might suppose that it is the aspiration to become something in life, or merely to earn a livelihood, that brings people to the city from the interior, but what would then make the story of Bangladesh so distinct?


Rana Plaza Building Collapse, Dhaka


The recent migrants coming into Dhaka “bring the sincerity and artlessness of rural life—qualities ripe for transformation into exploitable labour” (37); yet the city where they arrive with such hope becomes another prison, reduced as they are to 12-hour working days in windowless rooms and under the watchful eyes of cruel taskmasters.  But the specificity of their stories is to be derived from the particular conditions which push them to the city—the loss of ancestral land, the inability to feed too many mouths, the need for money to secure a dowry for a sister, and so on.  Perhaps more than anything else, Bangladesh is a land dense with rivers and scarred by cyclones.  The rivers teem with fish, but the river is also “thieving” and “hungry”; when the Meghna overflows, its ravenous maw swallows up whole houses, agricultural land, and people (105-9).


Garment Factory in the Mohakhali area of Dhaka, Bangladesh, March 2010.  Credit:  Clean Clothes Campaign.


In this short yet compelling book, Seabrook skillfully weaves together tales of peasants with stories of weavers, moving back and forth between the village and the city, water and land, lush green fields and the ramshackle appearance of South Asia’s urban patches, even the idioms of poetry and of history.  He is versed in historical sources but not burdened by them; and only someone with the sensibility of a poet can ruminate on the strange interplay of fire and water that has shaped the contours of the lives of his subjects.  Water, in the form of “tidal surges, cyclones and floods”, has “always been the most usual element that brings death to Bangladesh” (29); the villagers flee this menace to arrive as laborers in the city’s garment factories, often to be consumed by the fires that have time and again struck the garment industry’s ill-regulated factories and warehouses.  One might have thought that water fights fire, but in Bangladesh the two often collude—as if the poor did not have enough wretchedness in their lives.  There are other stray facts, almost poetic bits, that take the breath away:  who would have thought that Murshidabad, once the world centre of silk weaving and now a haunted culture, perhaps once accounted for “5 percent” of the world’s total product (176-77)?


The Song of the Shirt, writes Seabrook, “is a reflection on the mutability of progress.”  The dominant narrative of human progress presupposes “a deterministic and linear process,” but Seabrook insistently reminds us that “experience in the clothing and fabric industry suggests certain areas of the world are liable to periods of industrialization, but equally to de-industrialization” (17).  The most ambitious aspect of Seabrook’s intellectual enterprise, and not surprisingly the one that is most fraught with hazards of interpretation, is his attempt to suggest the various ways in which Bengal and Lancashire offer a mirror image of each other.  The demise of the weaving industry in Bengal under colonial rule “coincided with the rise of the mechanized production of cotton goods” in the greater Manchester area; even as Dhaka became a ghost town, “Manchester became the centre of intense economic dynamism”.  Lancashire is now utterly denuded of its textile mills, and Manchester has lost population; but, in “a ghostly replay of traffic in the other direction” of “the machine-made clothing” that wiped out Bengal’s famed spinners and weavers (17), Dhaka with its 2500 garment factories has become nearly the epicentre of the world’s manufactured clothing.  Seabrook invests much in this comparison, constantly suggesting similarities between 19th century Britain and contemporary Bangladesh with respect to the lives of the workers, the nexus between the state and manufacturers, working conditions, and much else.


One might be moved, by Seabrook’s comparisons, to accept that widely quoted French expression, “the more things change, the more they remain the same”.  But this temptation must be resisted, if only because a sensitivity to the politics of knowledge should introduce some caution in comparisons between Europe’s past and the global South’s present.  Nevertheless, in Song of the Shirt, Seabrook has accomplished the enviable task of rendering naked the social processes which have helped to clothe the world and disguise some unpalatable truths about the treacherousness of what is usually celebrated as entrepreneurial capitalism.


[Review of Jeremy Seabrook, The Song of the Shirt: Cheap Clothes Across Continents and Centuries (New Delhi:  Navayana, 2014;  ISBN:  9788189059644;  288 pp.;  Rs 495), first published as “The Textile Jungle”, The Indian Express (18 October 2014).]


In his recently concluded visit to the United States, where he addressed a jubilant crowd of around 19,000 people, Narendra Modi all but dedicated his government to the Non-resident Indians gathered to celebrate his triumph.  “You have given me a lot of love”, he told his admirers:  “This kind of love has never been given to any Indian leader, ever.  I’m very grateful to you.  And I will repay that loan by forming the India of your dreams.”  This was music to the ears of his devoted listeners, whose achievements Modi has promised to teach his countrymen and women to emulate:  “I want to duplicate your success.  What do we do to duplicate that success?”

Prime Minister Narendra Modi at Madison Square Garden, 28 September 2014.



The members of the Indian Civil Service who governed India after it became a Crown Colony were described as, and believed themselves to be, “heaven-born”.  Many Indian Americans similarly believe themselves to be not merely fortunate and hard-working but the vanguard of what may be described as a post-industrial Vedic civilization.  To understand what it is that enables Indian Americans, and mainly the Hindus among them, to think of themselves both as immensely spiritually gifted, as the true inheritors of a Vedic civilization, and as the ideal representatives of the world’s most advanced material culture, certain aspects of the history of Indian Americans must be revisited.  Though they are today the most educated and affluent ethnic group in the United States, they have long bemoaned their fate as an ‘invisible minority’.  Five decades ago, the Punjabi American farmer Dalip Singh Saund served three terms (1957-63) in the House of Representatives.   Until very recently, however, Indian Americans have scarcely made any other dent in politics.  But it is other forms of invisibility that touch a raw nerve:  as the savvy and yet aggressive young professionals who form part of the comparatively new Hindu American Foundation often point out, Hinduism is barely understood in the US and is, from their standpoint, unjustly maligned as a bizarre religion of false gods, demi-gods, demons, and such strange figures as Hanuman and Kali.


Hindus everywhere are inclined to believe that their religion, characterized by the notion of Vasudhaiva kutumbakam (‘the earth is one family’), uniquely fosters tolerance, but Hindus in the United States see themselves as especially blessed and charged with the dual mission of rejuvenating India and helping America fulfill its destiny as the mecca of multicultural democracy.  The formal dedication of many Hindu temples in the US, such as the Rama Shrine of the Hindu Temple of Greater Chicago, has taken place on July 4th, which marks the anniversary of American independence.  Hindus thus signify their acceptance of the idea that they share in the blessings of American “freedom”, while at the same time conveying to Americans that Hinduism permits a richer and more spiritual conception of freedom centered on the notion of self-realization.  The secular American formula, E Pluribus Unum, ‘From Many, One’, is countered by, and complemented with, the Vedic affirmation that ‘Truth is One; Sages Name It Variously’ (‘Ekam Sat Vipra Bahudha Vadanti’; Rig Veda 1.164.46).


Now A Rock Star:  The New York Times was one among many newspapers that described the reception Narendra Modi received at Madison Square Garden as befitting a rock star


Indian American Hindus are exceedingly astute in their understanding of how discourses of multiculturalism might be deployed in the US to their advantage.  Several years ago, a number of Hindu organizations rallied together in a concerted attempt to force alterations in history textbooks used in California schools.  They objected, for example, to the fact that such textbooks characterized Hinduism as a polytheistic rather than monotheistic faith, or that women in ancient India were described as having fewer rights than men.  American Hindus Against Defamation (AHAD) almost serves as a vigilante group, keeping a hawk-like lookout for those who offend against Hindu sentiments.   However, their support of “multiculturalism” in India, where the religious, linguistic, and cultural diversity dwarfs anything seen in the US, is remarkably muted.  Apparently, in their worldview, multiculturalism is much to be admired in the US even as it may safely be ignored in India.  Indeed, many Hindus in the US adhere to the view that the practice of their faith is not hobbled by the constraints that a pseudo-secular Indian state has imposed upon Hindus in their homeland.


There is a remarkable convergence in the worldview of the NRI—and the model of the successful NRI is the Indian American—and that of Narendra Modi.  The political ascendancy of a former tea vendor reminds Indian Americans of the opportunities made available to them in the supposed land of milk and honey, though such a narrative obscures the fact that many of the immigrant Indians who have done exceedingly well in the US came from already advantaged backgrounds.   NRIs and Modi alike crave to see a new, resplendent India that can take its place as a great power, but India in its present state is an embarrassment to them.  Its faults—the appalling poverty, the ramshackle appearance of every town, the indescribable filth in public spaces, widespread evidence of malnutrition and open defecation, and much else—need not be rehearsed at length, and Modi has signaled his attempt to meet such objections by launching the Swachh Bharat Mission.  But there are more compelling parts of the story, and the anxiety of influence extends much further.  The Indian middle classes and the non-resident Indians have long agonized over the fact that India, as a friend once remarked to me, is ‘the largest most unimportant country in the world’, and that the same Indians who flounder in their homeland make something significant of themselves outside India.


Modi at Madison Square Garden:  A Who's Who of Indian American corporate types


It is under these circumstances that Modi has appeared, to the Indian middle classes and to NRIs, as the appointed one.  The well-to-do physicians, software engineers, scientists, Silicon Valley entrepreneurs, and other professionals in the Indian American community have long hankered for an Indian leader who would be imposing and decisive, and they are convinced that India requires a strong dose of authoritarian leadership if it is to prosper.  They are much more hospitable to the idea of a prosperous authoritarian state than they are to the idea of an India that is flaunted as a democracy but registers poor growth and continues to be an insignificant player in world politics.  Modi’s concentration of power is calculated to furnish, from their standpoint, some of the advantages found in the Presidential system of government.  Yet Modi also stands for what they view as ‘spiritual India’, a land synonymous with great yogis, teachers of spiritual renown, and sacred rivers that are personified as goddesses.  Thus, in the figure of Narendra Modi, Indian Americans see the possibilities of a prosperous yet spiritual India which they believe is already embodied in their own life histories.


(First published in OUTLOOK [Print and Web editions], 20 October 2014, as ‘The Prophet of Boom Times’.)



In the Los Angeles Unified School District (LAUSD), one of the two or three largest school districts in the United States, this Thursday, September 25th, was a school holiday.  Some weeks ago, in studying the 2014-15 school calendar so that, as a parent of two teenage children who attend two different schools in the LAUSD school district, I could be better prepared in planning my children’s schedules and my own, I noticed that September 25th was listed as a holiday and described as “an unassigned day”.  The calendar doesn’t explain what an “unassigned day” means; and I wondered what the occasion might be for a school holiday.  Apart from the long winter recess, which of course revolves around Christmas, and the spring recess, LAUSD’s holidays generally follow the pattern found in the rest of the country, and the holidays are meant to mark significant milestones in the country’s history or celebrate the lives of notable individuals, such as Martin Luther King Jr—though, of course, the King holiday is of comparatively recent vintage, and aroused enormous resentment among those who were hostile to him or thought that far more eminent (white) Americans had not been similarly honored.  In California, though apparently not in most of the nation, Cesar Chavez is dignified (as indeed he should be) with a holiday:  the LAUSD school calendar lists April 6th as an “unassigned day”, but a note explains that the holiday is meant to mark the observance of Cesar Chavez’s birthday.  The United States also observes, rather strangely, President’s Day:  if the intent here was to celebrate the founding fathers who rose to the office of the President, or “great” American presidents, such as those figures—Jefferson, Washington, Teddy Roosevelt, and Lincoln—who have been conferred immortality at Mt. Rushmore, one can imagine many Americans nodding their heads in assent.
President’s Day in actuality marks the birthday of George Washington, but not every state celebrates it as Washington’s birthday; indeed, there is the tacit recognition, signified by the designation of “President’s Holiday”, that every American president is to be felicitated.  But why should that be so?  Are Lincoln, Grover Cleveland, Martin van Buren—assuming that anyone remembers him at all—Jimmy Carter, and George Bush to be equally honored?  Should war-mongers among the presidents be honored or rather pitied, critiqued, and ostracized?



This is all by way of saying that a good deal can be inferred about a country from its holidays.  That much should be obvious, once we set our minds to thinking about little things like these; though it is these little and often unremarked upon things that reveal far more about a nation than the more common representations that a country encourages and engenders about itself.  The world observes Labor Day, also known as International Workers’ Day, on May 1st—but, as is commonly known, in the United States Labor Day is observed on the first Monday of September.  One might explain this away as yet another instance of American exceptionalism, as yet one more illustration of some insatiable need on the part of the United States to signify its difference from others and proclaim itself as the last great hope of humankind.  Just about the only other country where Labor Day is similarly celebrated in September is Canada, but this is barely surprising:  notwithstanding its pretensions to being a ‘softer’ state than its neighbor to the south, more humane and sensitive to the considerations of common people, Canada is clearly incapable of having any independent policy and has slavishly accepted the American lead in most affairs of life.  (Yes, I am aware that Canada has nationalized health care.)  We need not be detained here by the history of how it transpired that the United States came to observe Labor Day in September:  suffice to say that a certain American president, Grover Cleveland, was alarmed at the proximity of Labor Day (May 1) to the commemoration of the Haymarket riot (May 4), and wanted to ensure that celebrations of Labor Day would not furnish a pretext to remember the communists and anarchists who, it was argued, precipitated the Haymarket riot.



For the present, however, I am rather more animated by how Thursday, September 25th, became a school holiday in Los Angeles—an “unassigned day”, though most other holidays are known by their proper names, such as Veterans Day, Thanksgiving Day, and so on.  (Schools in the Los Angeles school district are shut down for the entire week of Thanksgiving; the first three days of that week are also marked as “unassigned days”, though it is understood that they are appended to Thanksgiving Day and form part of a week-long recess.)  I am also struck by what appears to be a wholly unrelated fact, but one that on reflection helped me unravel this puzzle.  The University of California, Los Angeles (UCLA), where I have been teaching for two decades, is commencing the fall quarter rather late.  The fall quarter always begins on a Thursday, since later in the quarter Thanksgiving falls on a Thursday; this ensures that there are ten complete weeks of instruction.  Ordinarily, classes commence in the last week of September; this year, fall quarter instruction begins on Thursday, October 2nd.  As in almost any other major American university, the Jewish element is disproportionately reflected in faculty ranks; indeed, it would be no exaggeration to say that in some departments, whether at UCLA, UC Berkeley, Harvard, Chicago, Columbia, and other like institutions, Jewish faculty predominate.  (Thankfully, the American university is one institution where Jews could go about doing their work relatively unhindered, though this is scarcely to say that the university has always been free of anti-Semitism or that Jews did not have to struggle against all odds to find a hospitable home.)  And it is surely no coincidence that the Jewish New Year, or Rosh Hashanah, this year falls on Thursday, September 25th.


In poring through the LAUSD calendar for 2014-15, it becomes palpably clear that only the adherents of Christianity are openly permitted their holidays.  Nothing in the school calendar confers similar recognition upon Muslims, Hindus, Buddhists, Sikhs, Taoists, and so on; much the same can be said for the UCLA academic calendar.  The Buddha’s birthday, Eid al-Adha, Eid al-Fitr, Diwali:  none of these auspicious days is given the recognition that is conferred upon many of the principal holy days in the Judeo-Christian tradition.  Never mind the fact that universities such as UCLA are increasingly greedy for foreign undergraduate students, many of them Hindus and Muslims, since they furnish the dollars that help universities maintain their bloated administrations.  The Hindu can have his holy cows just as long as the cash cows make their way to America and its “world-class” universities.  We are accustomed to much noise about the greatness of America as a multicultural nation, and one is almost nauseated by the constant and rather pious sermons about the need to value “diversity”.  If Hermann Göring wanted to reach for his gun whenever he heard the word ‘culture’, I am tempted to reach for Shiva’s trident whenever I hear the word ‘diversity’.  There was never any doubt that the United States has been and remains a resolutely Christian nation; nevertheless, it is critical to inquire why, and that too in a state which describes itself as the vanguard of progressive thinking and liberal attitudes, the academic calendar reinforces the notion that we all live under the Christian dispensation.  In religious matters, it seems, there is to be little or no diversity, and certainly no parity among the religions.


Having said this, the question about “the unassigned day”, which turns out to be the Jewish New Year, remains to be resolved.  Why isn’t the day simply declared a Jewish holiday?  Does this subterfuge arise from the fear that if Jews are openly permitted their holidays, the practitioners of at least some of the other ‘world religions’ will have to be allowed similar concessions?  On the other hand, the idea that Jewish people might remain unrecognized is altogether impermissible in American society.  The Jewish presence in Los Angeles is considerable; in certain sectors of American society, among them higher education and the film industry, the Jewish element is all but indispensable.  Then there is the consideration, to which I have already alluded, that the Judeo-Christian tradition is commonly viewed as the bedrock of American society:  if that is the case, it becomes perforce necessary, and critically vital to any conception of American politics, that Jewish customs and traditions be acknowledged and given their just due.  Yet, to complicate matters further, a latent hostility to Judaism and to Jews is inextricably part of the Christian inheritance, and there is a tacit compact which underscores the idea that the Jew in America should never be altogether visible.  Here, as has so often been the case before, the liminal status of the Jew—thus the “unassigned day”—is once again reaffirmed.


Religious Holidays at Pacific University


There have been, and continue to be, societies where religious pluralism is understood differently.  In my previous blog, in reviewing a book on Iraq under sanctions, I was struck by the authors’ claim, which is substantiated by other accounts, that in Iraq each religious community was permitted its paid religious holidays before the commencement of the Gulf War.  To admit this much does not diminish the other horrors of living under a dictatorship.  India is scarcely without its problems, and no one could say that religious minorities have not experienced discrimination; but it is nonetheless an unimpeachable fact that Buddhists, Hindus, Muslims, Christians, and Sikhs are all recognized by the state and that the religious holiday calendar has some space for each community.  The notion that the founding fathers of the United States were deeply committed to the separation of church and state, and that this principle has ever since guided American society, is part of American ‘common sense’ and rarely questioned.  It is this cunning of reason, this fundamental dishonesty, which mars America’s engagement with the question of religious pluralism.






[A review article on Abdul-Haq Al-Ani and Tarik Al-Ani, Genocide in Iraq:  The Case Against the UN Security Council and Member States (Atlanta:  Clarity Press, Inc., 2012); 258 pp.]



Since sanctions have assumed a critical place over the last few years in the foreign policy of the United States and its dutiful allies, with consequences that have often been chilling and ominous, it becomes imperative to understand how sanctions came to be deployed as a blunt instrument of terror and domination in our times.  With the formation of the United Nations in 1945, and the resolution taken by member states to attempt to resolve conflicts between themselves through means other than war, sanctions were bound to assume an important place in the international regime of governance.  It was in 1959 that Albert Luthuli, then President of the African National Congress, implored the international community to impose comprehensive sanctions against South Africa and so “precipitate the end of the hateful system of apartheid.” Three years later, the General Assembly voted overwhelmingly in favor of the economic boycott of South Africa, but as Britain, the United States, West Germany, and Japan, which between them accounted for by far the greater portion of South Africa’s exports and imports, chose to remain indifferent to resolutions expressing the general will of the rest of the world, sanctions against South Africa did not then come into force.


The General Assembly, repeatedly drawing the attention of the Security Council to the threat posed by South Africa to international peace and security, insisted that action under Chapter VII of the UN Charter was “essential in order to solve the problem of apartheid and that universally applied economic sanctions were the only means of achieving a peaceful solution.”  Under the terms of articles 41-42 of this chapter, only the Security Council has the power to impose mandatory sanctions, and attempts to render South Africa compliant were vetoed by the three Western nations that are permanent members of the Security Council.  However, the tide of international opinion could not altogether be resisted, and in 1977 an arms embargo against South Africa was mandated.  In 1993, the African National Congress, which was then on the verge of officially acquiring power, pleaded with the world community to remove the sanctions against South Africa and restore it to a respectable place in the community of nations.


These few nuggets on the history of sanctions suffice as a prelude to the understanding of how the most draconian regime of sanctions ever imposed upon a nation led to its devastation.  Abdul-Haq Al-Ani & Tarik Al-Ani’s Genocide in Iraq, published by the small and independent Atlanta-based Clarity Press, presents a severe but cogently argued and well-documented indictment of the United Nations Security Council, the principal vehicle through which the United States, the rogue-in-chief of all nation-states, effected the wholesale destruction of Iraq.  The authors of this book—Abdul-Haq is an Iraqi-born, British-trained barrister who holds a doctorate in electronics engineering as well as one in international law, while Tarik Al-Ani is an architect, translator, and independent researcher who makes his home in Finland—mince no words in either describing the outcome of the sanctions or the inability of people to understand the implications of what transpired during the course of a decade.  “Imposing sanctions on Iraq”, they state in their conclusion, “was one of the most heinous of crimes committed in the 20th century.  Yet it has received little attention in the Anglo-American world.  Despite the calamitous destruction resulting from the sanctions, no serious attempts by legal professionals, academics or philosophers have been undertaken to address the full scope of the immorality and illegality of such a criminal and unprecedented mass punishment” (p. 222).

Poster on the genocidal impact of sanctions at an anti-war demonstration.



No one doubted that after Iraq’s invasion and occupation of Kuwait, it was incumbent upon the so-called ‘world community’ to show its strong disapproval of Saddam Hussein’s irredentist designs by enforcing comprehensive sanctions against Iraq.  This was accomplished by Resolution 661 of the UN Security Council, which urged all member states to adhere to a strict embargo on all exports from, and imports to, Iraq.  The resolution exempted from the embargo “supplies intended strictly for medical purposes, and in humanitarian circumstances, foodstuffs.” A committee of the Security Council, known as the Sanctions Committee, was set up to ensure compliance with the resolution, and to report its observations and recommendations to the Security Council (pp. 196-214).


Before sanctions were first enforced in the late summer of 1990, Iraq unquestionably had among the highest standards of living in the Arab world, a flourishing and prosperous middle class, and a formidable social welfare system that provided enviable material security to ordinary citizens. The economists Jean Dreze and Haris Gazdar noted that the “government of Iraq has a long record of active involvement in health care, education, food distribution, social security and related fields.  Notable achievements in these fields include free public health care for all, free education at all levels, food distribution at highly subsidized prices, and income support to ‘destitute’ households . . .” One of the more significant contributions of Genocide in Iraq is not merely to reaffirm the views of knowledgeable observers of Iraqi society, but also to offer a more sustained account of the achievements of the Ba’athist regime under Saddam Hussein. Chapters 3 & 4, on the economic development of Iraq and “the progressive social policies” of the Ba’ath regime, detail the strides made by Iraq in attempting to extend to a greater number of its people the benefits of a reasonably advanced social welfare state. These chapters ought to be nothing less than a revelation, particularly to those in the United States and Britain who allowed themselves to be led like sheep into believing that Iraq was nothing but a backward state full of hateful Muslims led by a blood-thirsty dictator. The accomplishment is all the more remarkable considering that Hussein was doubtless a brutal ruler who did not hesitate an iota to send to their deaths those politicians, activists, army men, public figures and opponents who might even remotely be construed as a threat to his own political survival and well-being.


According to the authors, the transformation sought by the regime was such as would confer the “benefits of development” upon “workers, peasants and other poorer classes” (p. 97); if this is at all true, that is certainly far more than what the United States attempts to do for its own working-class population.  “Prior to the 1990 Gulf War,” the authors state, “93% of Iraqis had access to health care and safe water.  Education was free, calorie availability was 120% of actual requirements, and GNP per capita was more than double its 1976 value” (p. 97).  The book is rich in empirical data:  we learn, for example, that between 1960 and 1990 the infant mortality rate diminished from 117 to 40 while under-5 mortality declined from 170 to 50 (p. 108), just as the number of doctors grew by over 500%, from 2,145 in 1968 to 13,621 in 1990 (p. 110).  Impressive as these achievements are, a testimony to the Ba’athist government’s progressive social policies, it is the authors’ delineation of a multicultural society that commands even greater attention and will certainly invite outright skepticism from those critics of Saddam Hussein who were pushing for war.  The authors argue that “up until the 2003 invasion, Iraq had been a very tolerant society with very responsible policies on religious freedom.  People grew up in mixed neighborhoods with no segregation between sects or religions” (p. 100).  They describe growing up in neighborhoods where Muslims, Christians, and Jews “lived side by side without any problem”; each religious community was permitted its paid religious holidays, a privilege that is not conferred on Muslims in predominantly Christian nations such as the US, UK, and Germany.
Though it is simply assumed by most people that religious minorities have always faced persecution in Iraq, leading to their migration and diminished numbers, the authors point out that Iraq’s Christian population grew from around 149,000 in 1947, or about 3.7 percent of the population according to census figures, to about 1 million in 1987, or close to 5 percent of the population (p. 103).

Cartoon by Mike Flugennock, 18 August 2007.  Source:  http://sinkers.org/stage/?m=200708



A campaign of sustained bombing, and seven years of the most severe sanctions ever inflicted against any nation, were to relegate Iraq, in the words of an official UN fact-finding team, to the “pre-industrial” age.  [United Nations, Economic and Social Council, Commission on Human Rights, Sub-Commission on Prevention of Discrimination and Protection of Minorities, “Forty-third session:  Summary Record of the 10th Meeting,” E/CN.4/Sub.2/1991/SR.10 (20 August 1991), 10.]  Insofar as socio-economic indicators are reliable criteria, Iraq joined the ranks of the under-developed nations and became economically regressive:  as oil revenues shrank dramatically, the little that remained of its decimated infrastructure after the bombing fell to pieces.  Iraq would soon have the highest rates in the world of maternal and infant mortality, and correspondingly the fewest hospital beds; an astronomical increase in diseases and mental illnesses was documented, and malnutrition, which had all but disappeared from Iraq before 1990, was estimated to have affected the majority of Iraqis by 1995.  A report released in 1997 by UNICEF described 1 million children in Iraq under the age of 5 as chronically malnourished, a condition that leads not only to stunted physical growth but to considerably reduced capacity for development and education, and it ominously added:  “Chronic malnutrition is difficult to reverse after the child reaches 2-3 years of age.”  One year after sanctions first went into effect, the real monthly earnings of unskilled laborers in Iraq had declined by nearly 95%, and were lower than the earnings of unskilled agricultural laborers in India, where poverty is endemic.

Infant and Under-5 Mortality Rates in Iraq,  1979-99



Severe as the sanctions were, they scarcely made a dent in the public imagination.  There can be no more notorious sign of this indifference than the remarks of the American Secretary of State, Madeleine Albright, who, when asked whether the sanctions could be justified in view of the mass starvation and death of Iraqi children, replied without a moment’s hesitation:  “We think the price is worth it.”  (Of course this notoriety did not prevent Albright from receiving the usual accolades from the establishment.)  Some scholars take the view that the sanctions policy of the United States cannot be impugned, since it is conducted under the rubric of the Security Council; if this is the case, then it becomes incumbent to conduct a close examination of the human rights implications of the sanctions policy of the Security Council itself.  This is the other signal contribution of Abdul-Haq Al-Ani and Tarik Al-Ani’s book:  its subtitle, “The Case Against the UN Security Council and Member States”, hints at the boldness of the argument, since the authors are quite certain that the Security Council, which ought to act strenuously to prevent genocide, became the agent for the genocidal destruction of a people and their nation.
Their argument, however, would have derived yet greater force if they had considered that, rather ironically, another (and far more widely representative) body of the United Nations, namely the General Assembly, would draw attention to the “Security Council’s greatly increased use of this instrument”, and to “a number of [attendant] difficulties, relating especially to the objectives of sanctions, the monitoring of their application and impact, and their unintended effects.”  The General Assembly was to recall the “legal basis” of “sanctions”, which are described in Article 41 of the UN Charter as “measures not involving the use of armed force in order to maintain or restore international peace and security”, in order to “underline that the purpose of sanctions is to modify the behavior of a party that is threatening the international peace and security and not to punish or otherwise exact retributions.”


In making a representation before the UN Commission on Human Rights in 1991, the non-governmental International Progress Organization made the more forceful point that “the continuation of the sanctions policy implemented through the United Nations Security Council” constituted a “grave and systematic violation of human rights and fundamental freedoms” of the entire population of Iraq, who were being denied even the most basic right, the right to life.  The figure often mentioned to indicate the number of Iraqis killed under sanctions is about one million between 1991 and 1995 alone; the respected British medical journal The Lancet gave a figure of 567,000 children who had died as a consequence of sanctions by 1995, while UNICEF estimated that 500,000 children had been killed on account of sanctions and the collateral effects of war.


Sanctions constitute a form of nearly invisible death, and ought to alert us to the fact that oppression in our times is increasingly masked.  We associate war with death and violence, but sanctions with human rights and non-violence:  as the former United States ambassador to the UN, Thomas Pickering, put it in a Security Council debate, “sanctions are measured, precise and limited.  They are a multilateral, non-violent and peaceful response to violent and brutal acts.” [United Nations, Security Council, “Provisional Verbatim Record of the Three Thousand and Sixty-Third Meeting,” S/PV.3063 (31 March 1992), 67.] This is, to put it mildly, a perverse, even macabre, view of sanctions, and just as strikingly it displays a singular naiveté about the nature of non-violence, which is erroneously equated with the mere absence of force.  Non-violence is not only, or even principally, a doctrine of abstention from force:  it requires us to take active measures for peace and the well-being of all, and it is obscene to suppose that the denial of basic amenities to people, including the right to life, might be construed as a respect for human rights.


There has been little endeavor to recognize the economic oppression of an entire people as a crime against humanity, indeed as a form of terrorism.  The prospects for the international rule of law can be nothing but appalling, as the American scholar John Quigley has noted, if the United States continues to act on the presumption that multilateralism is a worthwhile enterprise only if it “can control the outcome.”  It becomes imperative, then, to ask what place sanctions ought to have in an international order that purports to base itself on the principles of equity, the ‘rule of law’, and democracy.  Are sanctions viable only when they carry the moral opprobrium of a world-wide citizenry, as was evidently the case when sanctions were at long last imposed on South Africa, or should they continue to be available, as they are at present, to any modern nation-state that chooses to impose them unilaterally?  There is almost nothing to warrant the belief that the wide and systematic use of sanctions will serve the dual ends of ensuring a just world order and making targeted societies more open, just as there is compelling evidence to suggest that such wide and abusive use of sanctions exacerbates political repression within targeted nations and paves the way for greater inequities between nations, eroding both the ‘rule of law’ and respect for the international system.  I have discussed the legal and political implications of sanctions in greater detail elsewhere, but readers can turn to Genocide in Iraq with immense profit to understand both how sanctions are deployed as a modern means of ‘pacification’ and how crimes against humanity came to be perpetrated against an entire people without any consequences for the perpetrators.


“Things fall apart; the centre cannot hold; / Mere anarchy is loosed upon the world”:  so William Butler Yeats famously wrote in his much-quoted poem, “The Second Coming”.  Some in Britain, contemplating the prospect of the dissolution of the Union of England, Scotland, and Wales, effected in 1707 and modified in the twentieth century to accommodate the Unionists in Northern Ireland who resisted the idea of an independent Ireland, are warning of impending anarchy should a majority of Scots cast their ballots in favor of independence in Thursday’s referendum.  The beauty of the ballot, which will ask voters, “Should Scotland be an Independent Country?”, and have them signal their choice with a ‘yes’ or ‘no’, resides in its simplicity; and it is precisely this simplicity which is no doubt the envy of many around the world—among others, Palestinians, Kurds, Basques, Kashmiris, Nagas, Texans, even some Californians and, if we may constitute such people as a ‘nation’, the gun-toting fanatics of the National Rifle Association in the US—who would certainly like to weigh in on the question of their independence.  However, the appeal of the Scottish referendum resides in other considerations, too:  watching developments in Libya, Iraq, and Syria (and these are scarcely the only places where the question of secessionism and new political formations looms large), one admires the Scots for attempting to settle this question through something other than the gun.  Malcolm X might have thought the ballot little better than the bullet, and he doubtless had good reasons to do so in a country where in many places the African American could only cast his ballot at the risk of receiving a bullet in his chest, but in today’s politics too little constructive use is made of the ballot.
The Scottish referendum, if nothing else, gives one hope that American-style electoral democracy, a furious sound show signifying absolutely nothing except the lifelessness of an American politics that has been consumed in equal measure by money and sheer stupidity, is not the last word in electoral politics.

Many are the arguments that have been advanced by both the proponents and detractors of Scottish independence.  Not surprisingly, nearly all the arguments encountered in the mainstream media—print, digital, television, social networks—turn on the economic and what might be called the narrowly political.  Britain’s three major political parties, though here again there is little that really distinguishes them from one another any more, have spoken in one voice in suggesting that the dissolution of the Union will be a major blow to Scotland itself.  It has been argued that, bereft of its Union with England, Scotland would suffer job losses, lose the advantages of the British pound, and experience a flight of capital; as a small nation-state, it is likely to become quite invisible and would be without the benefit of the political and economic security umbrella under which it presently shelters.  The advocates of Scottish independence argue quite otherwise, insisting, before anything else, that the Scots must be in a position to decide their own future and political outcomes.  Scotland’s priorities, argue the proponents of independence, are poorly reflected in the constitution of the British government.  There is little appetite in Scotland, for instance, for foreign wars, and a good many people would be only too happy to be rid of the nuclear submarine base.  Scotland has 59 Members of Parliament in Westminster, but only one of them belongs to British Prime Minister David Cameron’s ruling Tory party.  On the economic front, the cheerleaders for Scottish independence have argued that Scots are much more hospitable towards the idea of a welfare state than the English, and working-class support for Scottish independence is particularly high.
The notion that revenues from the North Sea oil and natural gas fields would, in the event of independence, be used only for projects to advance the interests of the Scots is often trumpeted as the clinching argument, though it is germane to point out that the 8 billion dollars in North Sea energy revenues that the British government received in 2013 amount to only about three percent of the Scottish economy.

If there is to be a compelling argument for Scottish independence, it must surely also emanate from the tortuous history of the Union and the brutality with which the Scots were treated by the English for the greater part of two centuries.  To suggest this is by no means to excuse the Scots from the part they played in forging the British empire; indeed, they played a disproportionately prominent role in Indian administration.  But it is perhaps a truism that only those who have been brutalized go on to brutalize others, and the first principle for the student of colonialism is to come to the awareness that the English did not practice in their colonies in Asia or Africa anything that they had not first tested out on their subjects in Scotland and Ireland.  The story of how Europe underdeveloped its various others, not least in the British Isles and in what is called Eastern Europe—just what was “Eastern Europe” becomes amply clear from the writings of so-called Enlightenment giants such as Voltaire, for whom “Eastern Europe” was nothing more than the point where the allegedly savage and animal-like Slavs began to predominate in the population—need not be rehearsed at any great length at this juncture, but a few fragments of this history are essential to convey the enormity of English injustice.  Following the Jacobite uprising of 1745, an attempt by “Bonnie Prince Charlie” to win the British crown for the Stuarts, the Scottish Highland clansmen who had aided in this failed attempt bore the burden of callous retribution.  What the English effected in Scotland was nothing short of ethnic cleansing:  the clan system was destroyed, and in various other ways the English struck at the heart of the Scottish way of life.

“Last of the Clan”, a painting by Thomas Faed, c. 1865 (Kelvingrove Art Gallery and Museum, Glasgow)

The tartan plaid and kilt were banned by the Act of Proscription of 1746-47—in the precise language of the act, which would not allow for any lesser penalties, the offence of wearing Highland clothing would attract “imprisonment, without bail, during the space of six months, and no longer; and being convicted of a second offence” would make the offender “liable to be transported”.  Highlanders were deprived of the right to own arms, and Gaelic could no longer be taught in schools.  One might easily add to this list of persecutions, but nothing summarizes better what would become the pacification—an ugly word, which describes well how colonial powers acted with utter disregard for human life in their colonies—of the Scots than what is known to historians as the “Highland Clearances”, which led to the mass-scale removal of the population of the Highlands, leaving it, wrote the popular historian John Prebble, “void of most, possibly 85-90%, of its people, trees and forests.”

Memorial Stone marking the site of the Battle of Culloden, 1746


In his charming but now little-read book, Two Cheers for Democracy, E. M. Forster, while championing English-style democracy over other forms of government, withheld the third cheer.  The English, he argued, had one insufferable vice:  hypocrisy.  How far this is peculiar to the English rather than a common condition afflicting a good deal of humankind is a question that need not be addressed at the moment.  Taking my cue from Forster, I would say that the argument for Scottish independence certainly deserves two cheers.  England, frankly, has not been humbled enough:  its immigration policies continue to be rotten, its visa regimes for citizens of its former colonies are not merely absurdly insulting but draconian, its disdain for the contributions of its own working class to the shaping of a humane society is appalling, and virulent racism is encountered in nearly every aspect of English life.  The nonviolent break-up of Great Britain is a most desirable thing; one hopes that if the referendum for Scottish independence succeeds, it will eventually be a prelude to even more desirable outcomes, such as the break-up of the United States, which is far too big and powerful for its own good and certainly for the good of the rest of the world.  Secondly, no arguments are too strong for the devolution of power, the decentralization of authority, and autonomy for people who might choose their independence for ethnic, religious, linguistic, or other reasons.  There is, to put it in another language, an optimum size for a nation-state, and a great many nation-states are already far too big both to be governed efficiently and to give all their people equal opportunities for their just advancement in various domains of life.  Nevertheless, there is something to be wary of in the demand for Scottish independence:  nationalism is almost always accompanied by a diminishing capacity for self-reflection.
When the Union dissolves, who will the Scot set himself or herself up against to know better his or her own self?  This is the problem that nationalism has not yet been able to resolve, and there is little to suggest that Scottish independence will yield new wisdom on this old and intractable problem.

The emptying out of the Highlands:  A pamphlet on Scottish emigration, Glasgow, 1773



One of the many stories, based on a Sanskrit tale, that the late U R Ananthamurthy [21 December 1932 – 22 August 2014] used to tell often is of a cow named Punyakoti which would go out to graze in the forest in the country called Karnataka.  One evening, as the other cows made their way home, Punyakoti meandered into a particularly grassy area that was, however, the territory of a tiger named Arbutha.  As Arbutha was about to pounce upon the cow, Punyakoti pleaded that she might be allowed to go feed her calf before returning to become his dinner.  If the tiger was hungry, so was her calf; and the tiger ought to be sufficiently well-informed in dharma to know that a promise thus given would not be broken.  The tiger relents:  Punyakoti reaches home, feeds her little one, bids her farewell, and then presents herself before Arbutha.  Astounded by Punyakoti’s fidelity to truth and her capacity for sacrifice, Arbutha has a sudden change of heart and begins to undertake penance—or so states the Sanskrit original.  Recounting this popular story some years ago in an essay entitled ‘Growing up in Karnataka’, Ananthamurthy had this to say:  ‘It is the dharma of the tiger to be a flesh eater.  By a change of heart he cannot become a vegetarian.  He has no choice but to die.’  Contrary to the Sanskrit storyteller, the Kannada poet has Arbutha leap to his death:  ‘The Kannada poet is more convincing.  By a change of heart, the tiger can only die.  It is as absolute as that.’


Encapsulated in Ananthamurthy’s pithy commentary on ‘The Song of the Cow’ are many of the principal themes which shaped the literary oeuvre and worldview of an immensely gifted writer and critic whose death a week ago has robbed Kannada of its greatest voice, India of an extraordinarily decent man and supple writer, and the world, which sadly knew too little of him, of a storyteller and intellectual whose fecundity of thought and robust play with ideas shames many of those who style themselves cosmopolitans.  Much has been written on the manner in which Ananthamurthy, not unlike other sensitive writers and thinkers in India (and elsewhere in the global South), negotiated the tension between the global and the local, tradition and modernity; but, as is palpable from more than a merely cursory reading of his criticism and fiction, Ananthamurthy also remained engaged throughout his life with the tension between Sanskrit and the bhashas, the marga and the desi, and what he called ‘the frontyard’ and ‘the backyard’.  Ananthamurthy completed a doctorate in English literature, taught English at a number of institutions, and was completely at home in the masterworks of Western literature; and, yet, he was profoundly rooted in Sanskritic and especially Kannada literary traditions.  In reading Ananthamurthy, one is brought to an overwhelming, indeed humbling, awareness of his deep immersion in a thousand-year-old tradition stretching from Pampa, Mahadeviyakka, and Allama Prabhu through the Vijayanagar-era poet and composer Purandaradasa to his contemporaries Shivarama Karanth, Masti Venkatesh Iyengar, Bendre, Kuvempu, Adiga, and others.  In this, as in other respects, Ananthamurthy also inhabited a world where the simultaneity ‘of the ancient, the primitive, the medieval and the modern’ was ever present, not only in social structures but ‘often in a single consciousness’.
It is doubtful that anyone among the most celebrated of our writers who have made a name for themselves as notable exponents of the English novel or what might be termed global non-fiction has anything even remotely close to the knowledge that Ananthamurthy had of Indian bhashas.  In his essay, ‘Towards the Concept of a New Nationhood’, Ananthamurthy gave it as one of his ‘pet theories’ that ‘in India, the more literate one is, the fewer languages one knows.’  In ‘the small town where I come from,’ Ananthamurthy was to write, ‘one who may not be so literate speaks Tamil, Telugu, Malayalam, some Hindi, and some English.  It is these people who have kept India together, not merely those who may know only one language.’


Few Indian novels have been discussed as much as Ananthamurthy’s Samskara.  Fewer still, especially in India, are the creative people who have been entrusted with the care of institutions and intellectual enterprises and have not left them diminished.  Ananthamurthy was not only a celebrated writer, but someone who stood at the helm of important institutions—Mahatma Gandhi University, Kottayam, and the Film and Television Institute of India, Pune—and strengthened them.  As President of the Sahitya Akademi, he strove to ensure that all the languages under the academy’s jurisdiction received parity; moreover, he ensured the autonomy of the institution by prevailing upon the academy to reject the Haksar Committee’s recommendation that the academy’s president be appointed by the government on the advice of a search committee.  Those familiar with the Indian literary, artistic, and intellectual scene that extends well beyond the metropoles and even “provincial” capitals are more likely to remember Ananthamurthy as the principal mentor of that unique experiment which for decades has been taking place in Heggodu, Shimoga District.  Here, in the midst of areca nut plantations, the cultural organization Ninasam attracts students, workers, and villagers for a week-long annual course to discuss literature, movies, music, philosophy, and science.  Ananthamurthy unfailingly graced this gathering every year, nurturing the young and facilitating spirited conversations that lasted long into the night.


Ananthamurthy might, thus, be remembered for many different things, but nevertheless it is the categories through which he worked that mark his contribution to Indian literature and thought as distinct and enduring.  It would be a grave mistake to view him merely as staking out a middle ground:  taking a leaf out of Gandhi’s book, Ananthamurthy was quite certain that Western civilization was not good, not just for India, but even for the West.  Consider, for example, his literary, emotional, and intellectual investment in the idea of the sacred, though this is something that his Hindutva critics, who fancy themselves custodians of the Hindu tradition, can barely understand.  He has told the story of a painter who was traveling through villages in north India studying folk art; on one of these sojourns, he encountered a peasant from whom he learnt something bewildering:  ‘Any piece of stone on which he put kumkum became God for the peasant.’  Ananthamurthy understood well that nearly every place in India is sacred:  here Sita bathed, there Rama rested his weary body, and over there the gods dropped nectar.  But he takes the idea of the sacred much further:  place, bhasha, childhood—all these notions, so centrally a part of the worldview of Ananthamurthy, revolved around the idea of the sacred and the untranslatable.  Sacred, too, is the dharma of the writer, laid bare by Ananthamurthy in his Jnanpith Award acceptance speech:  ‘There is something wrong with us writers if we do not lose a few of our admirers with every new book that we write.  Otherwise, it may mean we are imitating ourselves . . .  We should never lose the capacity to say those things in which we believe when we are absolutely alone.’


First published in the Indian Express, 30 August 2014 (print and online).  


