
Archive for the ‘Politics of Knowledge Systems’ Category

The Fact of Being Black:  History, Culture, Politics IX

 It is not surprising that a good portion of even mainstream America should have unequivocally condemned the display in Charlottesville of right-wing terrorism.  President Trump cannot be counted among those who came down swiftly on the neo-Nazis and their kinsmen.  He did not merely prevaricate but, in a scarcely veiled attempt to exonerate “white supremacists”, took it upon himself to condemn “all extremist groups”—though even this disapprobation was late in coming—before, on August 15th, stating with greater conviction in his pathetically juvenile English that “there is blame on both sides”: “You had a group on one side that was bad. You had a group on the other side that was also very violent.”  To take only the examples of prominent public figures who cannot remotely be accused of having a liberal disposition, House Speaker Paul D. Ryan described the white supremacists as “repugnant”, while Senator John McCain called them “traitors” on his Twitter account.  Even Attorney-General Jeff Sessions, whose own commitment to civil rights is, to put it mildly, exceedingly questionable, but who as the country’s chief law-enforcement officer must at least put forward the semblance of some respect for the rule of law, was moved to admit that “the violence and deaths in Charlottesville strike at the heart of American law and justice.”

Street clashes in Charlottesville, 12 August 2017. Source:  Los Angeles Times.  Photograph:  Michael Nigro / Pacific Press.

The widespread outrage over white extremist violence that followed has doubtless been genuine.  The liberal constituency in the US is considerable, and most people in that community do not condone violence, at least not right-wing violence directed against other Americans.  Moreover, one can even subscribe to racist sentiments and yet forswear violence.  In the frenetic world of social media, the hashtag #thisisnotus was at once embraced by thousands.  They may have done so to bring to mind the better possibilities that reside in the American self and to invoke a necessary political solidarity for the present.  And yet I have the inescapable feeling that the crass affirmation, “this is not us”, creates a much smaller place for reflection and dialog than the unthinkable:  #thisisallofus.  One could invoke, of course, “the hooded Americanism” that historians of the KKK have documented in such meticulous detail, or the lynchings that were invitations to Sunday picnics in the Jim Crow South[i]; one could also point, if one stretched one’s canvas beyond the cruel deprivations to which black America has been subjected, to the genocidal tendencies that have conspicuously been part of the grand design of making and keeping America “great”.  Just how do these disingenuous expressions of outrage permit whiteness to remain unscathed even as white supremacists are banished, as they should be, to the realm of the barbaric and the unforgivable?

Lynching:  What a Jolly Good Show!  This lynching took place in Duluth, Minnesota, not in the Deep South.  Source:  https://sherielabedis.com/2015/03/29/new-report-on-lynchings-in-jim-crow-south/

White supremacism necessarily entails a profound adherence to whiteness, but (to borrow a phrase from the scholar George Lipsitz) “the possessive investment in whiteness” runs deep through American culture and manifests itself only occasionally as white nationalist ideology or outright fascist-style violence.  A large and growing body of commentary by liberals and left-leaning scholars has now made the idea of ‘white privilege’ a familiar part of American political discourse.  Such white privilege takes many forms, some obvious and others scarcely so, commencing with the assumption that is tantamount to the original sin, namely that America belongs to white people just as white people can rightfully, naturally, and preemptively call America their own.  The white American, unlike the African-American, Japanese-American, or Chinese-American, has never had to be hyphenated:  as Roland Barthes would have it, he belongs to the realm of the exnominated, those who never have to be named, those who can be universalized and whose rules become everyone else’s rules (Mythologies, 1972, trans. Annette Lavers [New York:  Farrar, Straus & Giroux]).  There are other less transparent forms of whiteness, though with even a little prodding they can be easily excavated.  Such, to take one example from studies of environmental racism, is the notion that non-white communities should have to bear the burden of toxic and nuclear wastes, pollutants, and the garbage produced in everyday life.

White privilege is perhaps best witnessed in the mounting critiques over US immigration policy and affirmative action in higher education.  The Trump regime has, contrary to common opinion, little interest in stemming illegal immigration; by law, those who are in the US “illegally” can be summarily deported.  This is apart from the consideration that illegal immigrants are an invaluable asset to the American economy.  To understand the true import of pervasive anti-immigrant sentiments, it is sufficient to understand that the slogan, ‘Take America Back’, means nothing but taking America back to the period before the Immigration and Nationality Act of 1965, which made possible Asian and African migration into the US and thereby slowly but surely altered the social fabric of American life. “Make America Great Again” is not only a slogan calling for the revival of manufacturing in the United States and once again turning the country into the predominant industrial power in the world:  it is also a call to make America white again.  It is thus legal, rather than illegal, immigrants who pose by far the greater problem for those who would like to see the US restored as a principally white dominion.

Similarly, the massive white unrest over affirmative action occludes two facts.  First, as every study has shown, and as is confirmed by a recent New York Times analysis extending to 100 universities, including Ivy League institutions and the flagship public universities, black and Hispanic students are today more rather than less underrepresented at such institutions than they were 35 years ago.  More significantly, it is almost never conceded that the entire system of higher education is effectively the consequence of an unwritten code of affirmative action over decades on behalf of white students. It is white entitlement, not the supposedly lower bar of admission for blacks and Hispanics, that has kept Asian Americans from predominating in elite American institutions.

In speaking of “the possessive investment in whiteness”, George Lipsitz was adverting to something more than white privilege; indeed, the more compelling part of his argument resides in the claim that “all communities of color suffer from the possessive investment in whiteness, but not in the same way.”[ii] Immigrant communities have, in their own fashion, sought to claim whiteness, or at least an approximation to it; whiteness has entered into the sinews, pores, arteries of American society.  Ironically, much of white America hasn’t quite fathomed its own overwhelming success; if it had, white Americans would not be staging, as they are today, a new secessionist movement.  Robert E. Lee, at least, would have understood the animated and largely cliché-ridden dispute over Confederate statues as fundamentally a proxy war over whiteness.  Even as he might have looked askance at having his own statues knocked down, he would likely have been pleased that the idea of secessionism continues to thrive.

 

[i] On the Ku Klux Klan and lynchings in the US, I would point readers to a few works, among them:  Leonard J. Moore, Citizen Klansmen:  The Ku Klux Klan in Indiana, 1921-1928 (Chapel Hill:  University of North Carolina Press, 1991); Allen W. Trelease, White Terror:  The Ku Klux Klan Conspiracy and Southern Reconstruction (Baton Rouge:  Louisiana State University Press, 1971; reprint ed., 1995); David M. Chalmers, Hooded Americanism:  The History of the Ku Klux Klan, 3rd ed. (Durham, North Carolina:  Duke University Press, 1987); and Witnessing Lynching:  American Writers Respond, ed. Anne P. Rice (New Brunswick, New Jersey:  Rutgers University Press, 2003).

[ii] See George Lipsitz, The Possessive Investment in Whiteness:  How White People Profit from Identity Politics (Philadelphia:  Temple University Press, 1998), 184.

 

(Concluded)

The two parts of this article were first published as a single piece in somewhat shorter form as “Whiteness and Its Dominion:  Letter from America”, in the Economic and Political Weekly (Mumbai) 52, no. 35 (2 September 2017).



The Fact of Being Black:  History, Culture, Politics VIII

“The problem of the twentieth century,” wrote the African American intellectual W. E. B. DuBois in 1903, “is the problem of the color-line.” Nearly every book on race relations in the United States that has been published since, especially over the last several decades, has dwelled, if implicitly, on the prescience of DuBois’s observation.  Writing on the 40th anniversary of the Emancipation Proclamation, which pronounced the slaves henceforth free and thus entitled to lay claim to the Jeffersonian formula of “life, liberty and the pursuit of happiness”, DuBois saw instead that the “very soul of the toiling, sweating black man is darkened by the shadow of a vast despair.”  That shadow, which the white man called “prejudice” and no more—something that could be undone, presumably, with education, cultivation of the virtues, goodwill, informed legislation, and social engineering—condemned the black person to “personal disrespect and mockery”, “ridicule and systematic humiliation”, indeed “the disdain for everything black.”  (See W. E. B. DuBois, The Souls of Black Folk [1903; Mineola, New York:  Dover Publications, 1994], v, 6, 9, 111.)

W. E. B. DuBois, 1868-1963.  Source:  The Poetry Foundation.

However emboldened black people in the slave-owning states may have felt at the end of the Civil War and through Reconstruction, a period that some unrepentant whites characterized as one marked by ‘Negro swagger’, their liberty, such as it was, did not last very long.  Black America had to be brought to its knees, a project that still continues however disguised the forms in which such oppression takes place, however loud the voices clamoring for diversity, multiculturalism, respect, and tolerance.  Though DuBois would have been scarcely alone in his assessment of how the black person had become disenfranchised and consigned to what he unequivocally termed “a second slavery”, he deployed a striking metaphor to characterize what had befallen America and “the souls of black folk” (p. 7).  Early in life, he says, it dawned on him that he was shut out of the white world “by a vast veil”. This “veil” is something like Churchill’s “iron curtain”, but DuBois pushes the metaphor much further.  The numerous 18th century slave revolts, which suggest that “the fire of African freedom still burned in the veins of the slaves,” had the effect of “veiling all the Americas in fear of insurrection.”  And yet more, since “the Negro” is himself born “with a veil”:  in what is the book’s most arresting insight, albeit one where the language is anticipated by Hegel in his discussion of the master-slave dialectic in the Phenomenology of Spirit, DuBois describes the veil as one which “yields him no true self-consciousness”; the Negro can only see “himself through the revelation of the other world”, through the eyes of the other.  DuBois termed this phenomenon “double consciousness” (pp. 3, 28, 7).  Malcolm X was among those who drew on this idea in distinguishing between the “Field Negro” and the “House Negro”:  though the former was able to maintain some, howsoever indistinct, form of autonomy, the latter was profoundly colonized, unable to see the world except through the eyes of the master.

Kenza Drider, wearing a niqab, was detained Monday by undercover police officers at a demonstration in front of the Cathedral of Notre-Dame in Paris, 11 April 2011.  Source:  New York Times; see: http://www.nytimes.com/2011/04/12/world/europe/12france.html

DuBois’s metaphor of veiling remains apposite for our times, and may yet have even greater salience, and not only because much of contemporary political discussion, and white anger, in the United States and Europe has swiveled around the figure of the veiled Muslim woman.  The ban on veiling, or more precisely on covering one’s face, in public has been in effect in France since April 2011.  Muslim women are not necessarily the only ones who are affected by this ban, nor are Muslim women mentioned explicitly; indeed, besides the burqa and niqab, the ban also covers masks, scarves, and helmets.  But, of course, the ban is targeted mainly at the practice of “Islamic veiling”.  Offenders are fined 150 Euros, or about US $165-180 depending on the rate of exchange.  Remarkably, one man, Rachid Nekkaz, had by April 2016 paid the fine on behalf of 1300 women charged with illegally veiling themselves in public, thus incurring a personal expense of 235,000 Euros.  This is in itself an extraordinary story, one that compels us to think anew about notions of tolerance and charity, and the ethos of hospitality:  but a story for another occasion.

The United States has no such ban on “Islamic veiling” or, more broadly, on covering one’s face in public.  Yet, it is white America that shrouds itself in a veil, unable to look upon itself, incapable of the self-reflexivity which would suggest both maturity and a capacity to confront the naked truth.  To unveil America’s unshakable grounding in a virulent and diseased whiteness, we can do little better than turn to the events that transpired not too long ago in a picture-postcard town in the state of Virginia, which housed the principal capital of the Confederacy.

 

What Happened at Charlottesville

Charlottesville, Virginia, a two-hour drive from the nation’s capital, was home to two of the country’s “founding fathers”, Thomas Jefferson and James Monroe.  Each served as the Governor of Virginia and as President of the United States, but Jefferson also has the distinction of being the founder of the University of Virginia and the architect of the university’s signature building, the Rotunda.  In recent years, Charlottesville, perhaps in keeping with the notion of a ‘university town’, acquired something of a reputation as an outpost of liberal thought in a state that has long been a bastion of conservatism.

In July 2014, the US National Bureau of Economic Research pronounced Charlottesville the “happiest” place in America.  In the received view, it is a small town with most of the assets and none of the liabilities—traffic gridlock, pollution, social anomie—of a big city.  The scenic Blue Ridge Mountains are nearby, the climate is temperate, and paeans there are many to the town’s supposed gastronomic refinements.  (This is surely one of the many ways in which the US has changed over the last few decades:  not only are tofu and yogurt widely available, and these were virtually ‘foreign’ foods in late 1976 when I first arrived in the US, but there is the cult of the chef and much hullabaloo over ingenious culinary creations.  Universities lure students and faculty with the promise of gastronomic delights—one of many recruitment tools.)

Happy are those who know little of the past, one might say: Charlottesville, not unlike the state of Virginia, has ugly racial antecedents.  Its black population was not permitted to build its own church until 1864, not coincidentally in the thick of the Civil War; even more ominously, considering that the US had taken part in two global conflicts to save the world from fascist tyranny and enshrine democracy as the supreme value, in 1958 the city responded to federal court orders to integrate white schools, issued in the wake of the US Supreme Court decision in Brown v. Board of Education (1954) that declared segregation unconstitutional, by closing all its white schools as part of a concerted strategy of resistance.  A similar strategy was pursued by other cities and school districts in many of the southern states.

Downtown Charlottesville, VA. (Photo: Payton Chung/Flickr)

If the town has indeed become more liberal, or more receptive to diversity, Charlottesville’s black people appear to think otherwise.  The black share of the population has fallen from 22 percent in 2000 to 19 percent at present [Eligon 2017]. Many will put this down to gentrification and rising rents, but those, of course, have been precisely some of the ways in which black people have been run out of town and excised from the white world.

It is in this pleasure dome of happiness, then, that white America erupted recently as it does every now and then.  The ancient Greeks and Indians were two peoples who understood that happiness is ephemeral; as the lawgiver Solon informs the vain king Croesus, “But in every matter it behooves us to mark well the end:  for oftentimes God gives men a gleam of happiness, and then plunges them into ruin.”  On the night of August 11th, as a prelude to the call by the white supremacist Richard Spencer to “Unite the Right”, white nationalists, neo-Nazis, and members of the Ku Klux Klan marched through the campus of the University of Virginia bearing torches and swastikas, all to the accompaniment of slogans such as “blood and soil”, “White Lives Matter”, and “You will not replace us”.

White supremacist and Neo-Nazi rally at the University of Virginia, 11 August 2017.  Photograph by Samuel Corum / Anadolu Agency / Getty

The following day, they gathered in force at a public park in Charlottesville.  The ostensible reason for this gathering was a decision by the town council to remove an equestrian statue of Robert E. Lee, the Confederate general who unsuccessfully attempted to lead the slave-holding states in secession from the Union.  These exponents of white terror found themselves facing a vigorous and much larger opposition composed of liberals, left activists, ordinary citizens—a motley crowd of decent people.  Clashes ensued; the police stood by:  much of the world, but not most of gun-loving America, would have watched in astonishment at the sight of people openly flaunting assault weapons, automatic rifles, and handguns. Before the day was over, a young neo-Nazi sympathizer had, with intense deliberation, plowed his car into the crowd of protestors, thereby killing 32-year-old Heather Heyer.

 

(To be continued)


 

Fifteen years ago, I delivered before the Regents’ Society at UCLA a lecture entitled, “Violence in the 21st Century:  The Terrorism of Categories and Invisible Holocausts.”  Mike Davis had published in late 2000 his magisterial book, Late Victorian Holocausts, but I do not recall that it was his work that had inspired the title of my talk; rather, it was the critical literature on the largely unrecognized genocidal aspects of “development” that had led me to my title.  A colleague who was present at my talk later told me that in Israel, where he had spent a good part of his life before moving to the US, such a lecture would be inconceivable.  He pointed, rather surprisingly, to my use of the word “holocausts” in the plural:  in Israel, only one holocaust is recognized as such.  It is “The Holocaust”, and to suggest that there may be other holocausts apparently diminishes the enormity of the Shoah, the one and only Holocaust, which took the lives of six million Jews—and, though this is not always mentioned, a sizable number of homosexuals, the Roma, and those earmarked by the Nazi state as ‘unfit to live’ on account of mental or other disabilities.  In Berlin, at least, there now exists a memorial to the others who were felled by the Nazi state’s murderous policies.

Yesterday was Holocaust Remembrance Day, or, in Hebrew, Yom HaShoah, marking the anniversary of the Warsaw Ghetto Uprising.  A full-page announcement, or “Open Letter”, published in the New York Times (24 April 2017) and authored by Dr. Moshe Kantor, President of the European Jewish Congress, commences with an observation by the British philosopher John Gray, who adverts to the fact that while “intellectual and scientific values accumulate in the world”, and are transmitted from one generation to another, “unfortunately ethical values” are not transmitted in this fashion and must be learnt anew by each generation.  This point has been argued by many others who have similarly pointed to the fact that technological changes have taken place at lightning speed over the last few decades but that the capacity of human beings for moral thinking has not changed very much.  To Dr. Kantor and Professor John Gray alike, the inescapable truth is that though everyone is aware of the Shoah, the new generation is “ethically uneducated” about its meaning and implications.  As Dr. Kantor points out, the numbers of Holocaust survivors are “dwindling”, but this is of course unavoidable; however, much more alarmingly, anti-Semitic incidents in English-speaking countries, which have been more hospitable to Jews than other European countries, have been rising sharply.

That there should be a “Holocaust Remembrance Day” is one of those truths that dare not be contested, except at the peril, as Dr. Kantor’s “Open Letter” unfortunately suggests, of being labeled anti-Semitic.  Not just on this day, but nearly every day, there is always the occasion to remind the world that it “should never forget” the unspeakable atrocities of the German killing machine.  The dozens if not hundreds of Holocaust Museums around the world stand forth as vivid reminders of the fact that one community at least has the power to invoke its past and shame everyone else into remembering. Who, however, remembers the perhaps half a million Bengalis who were killed in the genocide in what was then East Pakistan as it made its bid for independence in 1971?  Hardly anyone—indeed, I should say no one, barring the people of Bangladesh themselves.  Some people, but not very many, remember the 800,000 Tutsis butchered in the Rwandan genocide from a little more than two decades ago, but those numbers have already been dwarfed by those who have been killed in the Democratic Republic of Congo.  Rwanda will soon go the way of Bangladesh; it is doubtful that even the Congo will stay very much in the collective memory of the West or indeed the rest of the world.

Africa interests the West very little, except as a place for “investments”.  Let us, therefore, take a more complicated example.  56,000 American soldiers, or something in that vicinity, were killed during the Vietnam War, and the United States is littered with memorials to them.  To the best of my knowledge, not a single memorial mentions the three million Vietnamese who were killed in this war. It is not their names that I am suggesting should be recounted:  the very fact that 3,000,000 Vietnamese were killed is not recorded at any war memorial site.  It isn’t even certain how many Vietnamese were killed; in all such instances in Asia or Africa, some nice round figure seems to suffice.  Every single American life, on the other hand, must be etched in memory forever, doubtless because God has an especially soft spot for Americans, dead ones as much as those who are living—thus the familiar and noxious incantation of American political speeches, ‘God Bless America’.  The search for American soldiers “Missing in Action” in Vietnam is still ongoing; the budget for that mission runs into millions of dollars.  Every American life counts, as indeed it should.  Why American lives alone should count is a question that few are prepared to ask, though, paradoxically, many are prepared to answer.

Some Americans might well ask why the Vietnamese should be remembered in American memorials, since such memorials very much do the work of the nation-state and are intended to commemorate the lives of the Americans who laid down their lives.  Presumably, they will add, the memorials to the Vietnam War in Vietnam honor their own dead.  But one would think that in a Christian nation, which is what the United States has called itself, it should be at least just as important to remember those we hate, and those in whose killing one has been complicit.  What did Christ say in the Sermon on the Mount?  “For if you love those who love you, what reward do you have?  Do not even tax collectors do the same?  And if you greet only your brothers and sisters, what more are you doing than others?” (Matthew 5:46-47)  There is nothing whatsoever that is exceptional in remembering the American dead:  if forgiveness were to be sought at all, it is the Vietnamese dead who ought to be remembered.  Or perhaps it is only given to America to forgive, not to beg forgiveness?

The ethics of forgetting is not any less important than the ethics of remembering.  But of this I shall speak some other time.  For now, it suffices to stay with the idea that the call to remember cannot be dismissed.  But what exactly is to be remembered?  Only that six million Jews were killed and that the Nazis were engaged in annihilationist terror?  Does this entail submission, howsoever tacit, to the view that the suffering of Jews takes precedence over the suffering of others?  If it is “Holocaust Remembrance Day” that we are called upon to observe, does this confer recognition upon the Holocaust as the paradigmatic instantiation of genocide in modern times? Does Holocaust Remembrance Day give rise to the supposition that there is a hierarchy of suffering?  Does the suffering of some people count more than the suffering of others and, if so, on what theological and ethical view?  Unless “Holocaust Remembrance Day” is decisively disassociated from an insistence on the singularity of the Holocaust, it is very likely going to breed resentment rather than understanding and compassion.

 

 


 

Review-article on Ruby Lal, Coming of Age in Nineteenth-Century India:  The Girl-Child and the Art of Playfulness (Cambridge:  Cambridge University Press, 2013.  xvii + 229 pp.).

More than twenty-five years ago, the Indian economist and public intellectual Amartya Sen helped ignite a debate on the “endangered” status of girls and women in Asia and Africa when he argued that 100 million women were “missing”, a third of that number from India alone.  Discrimination against girls in India begins, as is now commonly known, in the womb itself. I recall reading, some three decades ago, a report about a hospital in Bombay where 50,000 fetuses had been aborted: one, just one, fetus was male.  Sen was by no means the first person to have broached this subject:  indeed, the girl-child in India had, by the 1970s, already been the subject of numerous government committee reports, but there was still little awareness of the various largely invisible forms of discrimination that affected girls and women adversely.  The various government commissions may, not all that ironically, have helped to bury the problem; but India is attentive to the likes of Amartya Sen, who has wide recognition in educated liberal circles in the West and has been lionized in India.  Just three years after Sen’s article was written, the Government of India outlawed prenatal sex discrimination with the passage of the Pre-conception and Prenatal Diagnostic Techniques (Prohibition of Sex Selection) Act [1994].  Soon thereafter, one could see the following sign in at least some hospitals:  “Here pre-natal sex determination (Boy or Girl before birth) is not done. It is a punishable act.”

It is Indian feminists rather than Sen, of course, who must be credited with whatever little reforms the Indian state has undertaken in the matter of rights of unborn girls, female children, and women.  Those who are familiar with the Indian principle of jugaad, which means, among other things, making do with the situation at hand, cutting corners, and finding a way out, would not be surprised to hear that sex selection still takes place.  It is not merely the case that most Indian laws are seldom and certainly imperfectly implemented, though this is part of the story:  more than ten years after the legislation was passed, only 400 cases had been registered under the 1994 act, and a mere two convictions had been procured.  What is more germane is that under the guise of aiming to screen for birth defects, amniocentesis is still carried out without any fear of penalty.  At Amritsar’s New Bhandari Hospital, for example, amniocentesis is widely practiced and openly advertised.  Kanan Bhandari, who is herself a gynecologist and married to the hospital’s proprietor, defends her clinic’s practices by distinguishing between amniocentesis and the “medical termination of pregnancy of fetuses older than 20 weeks.”  However, the measure of the girl-child in India can be taken in myriad other ways.  In many Indian households, to take one illustration, girls eat after boys, and women after men; moreover, girls are given less to eat than boys, and they may be given smaller portions of milk, eggs, and poultry.

Considering what the sociological literature on the girl-child has to say, the work of the historian Ruby Lal comes as a breath of fresh air.  Her monograph on the girl-child in 19th century India is of an altogether different genre, even if it is similarly animated by the desire to make visible certain forms of experience that undergird the lives of what she describes as the girl-child/woman.  By the early 19th century, the colonial state in India had embraced the view that a civilization was to be evaluated, and placed in a hierarchical scale, on the basis of how it treated its women.  India was found sorely wanting in this respect:  colonial texts offered lurid accounts of the practice of sati (widow-immolation), female infanticide, child marriage, and the prohibitions placed on widow-remarriage, even among widows who had not yet achieved puberty and had never consummated their marriage.  We need not be detained here by such considerations as whether the position of women in Britain was all that much better, and whether the sexual exploitation of girls was not rampant, particularly in view of the vulnerability of working-class women under the new conditions of industrialism.  In Britain, as in India, girls generally had little access to education. Likewise, there is by now a sufficiently large literature which has alerted us to the politics of representation and the difficulties that inhere in unmediated readings of colonial narratives.  What is most germane is that throughout the 19th century, the picture painted of Indian girls and women was generally one of doom and gloom, ensnared as they were by domesticity, servitude, or the iron laws of patriarchy that bound them to be unflinchingly obedient (as in the classic formulation of the Hindu law-giver Manu) to the authority, successively, of father, husband, and oldest son.

In Coming of Age in Nineteenth-Century India, Ruby Lal argues for a very different reading of the spaces available to girls and women for the expression of their subjectivity in 19th century north India even as “entire stages and spaces of female lives” were “wiped out” (39).  While she is mindful of the duties imposed upon females, and recognizes that many of her subjects found the spaces of freedom fleeting, she nevertheless takes it as her task to argue that a certain playfulness informs female lives, thus “allowing forms of self-expression and literary creativity that are not dependent on masculinist definitions of fulfillment” (39).  For too long playfulness has been seen as the prerogative of males, as their “exclusive province”, but Ruby Lal attempts to understand it also as “a nonpaternal practice of the feminine” (55).  To delineate the contours of such “playfulness”, she distinguishes between “making” a “woman”, which she characterizes in India and other societies as an invariably “male project”, and “becoming” a woman which allowed greater room for negotiation (30-34).  Becoming a woman, in her view, is not a mere “teleological proposition” (33), one that takes us from a girl to a young woman and then to the exalted state of motherhood and finally the aging matriarch.  Her hyphenated girl-child/woman figure points, in fact, to her interest in the idea of liminality—and where there is the liminal there is also the transgressive.

The ethnographic substance of Lal’s argument is played out in four chapters where she considers the space of the forest, the school, the household, and the rooftops.  She turns to an early 19th century text, the tale of Rani Ketki by the writer Insha-allah Khan (1756-1817), where the hero and the heroine meet in a forest.  She recognizes, of course, that parallels can be drawn with the Ramayana and the Mahabharata, and the scholar of Indian literature has to take great pains to ensure that these great pan-Indian epics do not colonize our understanding of texts and practices drawn from very different times and denude them of their local particularities.  Ruby Lal is not only sensitive to these considerations but shows how the trope of play is at work in this text:  as she points out, “the claim of writing a story in the Perso-Arabic script without using a single word of Persian or Arabic becomes all the more a claim about authorial agility and playfulness” (65).  In a similar vein, she describes Insha as “a theorist of playfulness” who systematized Urdu grammar and placed a heavy emphasis on decorum while being “committed to linguistic and gender playfulness” (69).  But what is singularly important for her argument is how the characters are constantly leaving behind the mohalla (the neighborhood) and the duties concomitant to respectable family living for the forest.  Lal describes this as a movement from the spaces of pedagogy to the spaces of pleasure.

The most distinct space for pedagogy, initially for boys alone, was of course the school.  By the third quarter of the 19th century, textbooks for girls had taken shape.  Lal’s narrative at this juncture revolves around Raja Shiv Prasad, an inspector of schools in the Benares region and a writer of books such as Vamamanranjan, or ‘Tales for Women’. In 1856, when he first assumed his post, there were no schools for girls; within a decade, 12,000 girls had been enrolled (98).  The matter of textbooks, particularly those focused on the study of history and morals, is too complex to be given any lengthy consideration; but Shiv Prasad’s textbooks are of interest to Ruby Lal since she seeks to understand how girls navigated the space of the school and received the learning that would enable them to engage in various forms of self-making.  The emerging centrality of the school in the 19th century as a form not only of socialization of children, but as a technology of governance and as a mode for creating national subjects, can scarcely be doubted.  Against such a backdrop, Lal’s analysis of the school as a site for “playfulness” is less than persuasive; indeed, the greatest strength of this chapter resides in her discussion of the debates surrounding “the standardization and the homogenization of languages, scripts, religions and communities” in late 19th century India (124).

Lal’s chapter on the “Woman of the Household” has similarly little to say on (to borrow from the subtitle) the “art of playfulness” and is focused on “a number of significant texts concerned with the upbringing and training of respectable (sharif) girls and women” (125).  These texts, not surprisingly, were concerned rather with the duties of girls and women, the modes of respectability, and the protocols of domesticity.  Her gaze extends to several texts, the “dominant motif” of which is sharafat or respectability (137); one of the texts in question has a section entitled “Concerning the Chastisement and Regulation of Wives” (139), not really a subject calculated to inspire hope that girls and women could readily escape the constraints placed upon them.  A much more promising space for tasting forbidden fruit was the rooftop of the home, which Lal in an imaginative stroke describes as “the forest” transplanted.  The rooftop was the extension of the home, used by women and servants, to take one illustration, to put up the day’s washing; however, in another register, it was also the place, not just for dalliances, but for reading and writing.  The scholar who is attentive to the practices of reading in India would do well to devote some attention to Indian homes with their rooftop terraces.  It was similarly the rooftop from which women, when they were still forbidden to take part in the political life of the nation, observed marches and demonstrations.  Drawing on Fatima Mernissi’s memoir of growing up in Fez, Morocco, in the 1940s, Ruby Lal quotes her to suggest what possibilities came to mind atop the terrace (198):  “So every morning, I would sit on our threshold, contemplating the deserted courtyard and dreaming about my beautiful future, a cascade of serene delights.  Hanging on to the moonlit terrace evenings, challenging your beloved man to forget his social duties, relax and act foolish and gaze at the stars while holding your hand, I thought, could be one way to go about developing muscles for happiness.  Sculpting soft nights, when the sound of laughter blends with the spring breezes, could be another.”

While Lal’s close readings of the texts and the literary history of 19th century north India yield some arresting insights, her argument seems forced at times, just as her neglect of a large swathe of literature that may be useful for her arguments is puzzling. More than six decades after it was first published, Johan Huizinga’s Homo Ludens:  A Study of the Play Element in Culture (1950) has still not been superseded in its depiction of the civilizing function of play and the play-forms that are encountered in poetry, philosophy and art.  Considering Ruby Lal’s interest in the categories produced by aesthetics, even Huizinga’s analysis of the play element in the baroque and the rococo could have been productive for her own work.  If Huizinga seems too far removed from the Indian context, though his canvas extends to the Mahabharata and the Upanishads, Indian readers might ponder over the relation between the Indo-Islamic or Urdu literature that she peruses and the stories that proliferate in north India on the playfulness of the gopis or the village women who engaged in constant play with the god Krishna.  As Ruby Lal doubtless knows, the mythopoetic world in which Krishna and the gopis are immersed was construed by the most positivist of the Indian nationalists as one of the principal sources of India’s subjection to colonial rule.

Ironically, then, for a book that promises to open up our understanding of the “art of playfulness”, Ruby Lal’s monograph gives insufficient play to the idea of play itself.  Nevertheless, her social history of play and pedagogy, refracted through the lens of the girl-child/woman, is not without promise.  Whatever the limitations of education in India, and those are severe, and whatever the merits, which are likewise considerable, of the meta-critique of education as the indispensable element in the liberal pharmacopeia, the education of the girl-child in India still remains the first door leading to a more enhanced and dignified conception of human life. The criminal neglect of the girl-child and woman in India will haunt the nation for decades to come. However, as Lal’s study amply shows, girls and women have displayed remarkable ingenuity and resilience alike in giving play to spaces to make them less restrictive. It is in the imaginative dialectic of play and pedagogy, as it were, that the promise of Indian girlhood and womanhood will come to fruition.

[Adapted from a review published in The Journal of Social History 49, no. 3 (Spring 2016), 752-54.]

 


(after a viewing of “The Man Who Knew Infinity”)

No matter how often one might have heard the story of Srinivasa Ramanujan, it never ceases to astound.  G. H. Hardy, the Cambridge mathematician with whose life Ramanujan’s story is inextricably intertwined, put it poignantly when he remarked that his collaboration with Ramanujan was “the one romantic incident in my life.”  Even those who are mathematically illiterate are touched by the story.  It is a romance that nothing can kill.  And when the life of a mathematician appears as a romance to ordinary people, then one can only turn to Hamlet’s admonishment to his friend:  “There are more things in heaven and earth, Horatio, Than are dreamt of in your philosophy.”

Srinivasa Ramanujan with one of his legendary notebooks.

However sophisticated the interpretations surrounding Ramanujan’s life and his extraordinary genius, the bare outlines of the story appear in a form that is inescapably present to every reader of the narrative, which goes something like this:  A little-known, indeed rather obscure, Indian mathematician was toiling away as an office clerk in Madras in the early part of the 20th century.

Srinivasa Ramanujan’s birth home in Kumbakonam, Tamil Nadu.

Though recognized by his peers in Madras as a man of unusual mathematical gifts, Ramanujan could find no one in his vicinity capable of understanding the theorems which he had a habit of recording in his notebooks.  Meanwhile, Ramanujan had published in the Journal of the Indian Mathematical Society.  Ramanujan eventually, and altogether fortuitously for the history of mathematics, came to the attention of G. H. Hardy, quite possibly the greatest mathematician of the day in the Anglo-American world. The two would commence a famous intellectual collaboration after Ramanujan had been brought over to Britain.  Alas, five years in Britain, while they would bring Ramanujan to the notice of fellow mathematicians all over the world, would also be his undoing.  The inhospitable climate and food took their toll on the fastidious Brahmin, and a year after his return to India in 1919 Ramanujan passed away at the age of 32.

G H Hardy, Cambridge mathematician.

A casual reading of Robert Kanigel’s The Man Who Knew Infinity, the book that inspired the film of the same name, might convey the impression that the Ramanujan-Hardy encounter is best read as a ‘culture clash’.  Hardy, writes Kanigel, was a “Fellow of Trinity College, the mecca of Cambridge mathematics, hence of English mathematics” (111); Ramanujan, on the other hand, was largely an autodidact, and was bereft of any degree.

The Great Court, Trinity College, Cambridge, 1914.

Though Ramanujan spent five years at Trinity College, and the two worked in close proximity throughout this time, Hardy was little aware that Ramanujan was a strict vegetarian and that his complete rejection of meat, fish, poultry, animal lard, and, I suspect, eggs was leaving him starved in a country that for centuries had remained clueless about vegetables.  (Now that Britain has been civilized by South Asians, at least this problem has been addressed.)  Even less would Hardy have understood that vegetarianism alone is construed by some as a religion—though, as shall be seen, Ramanujan’s religiosity went well beyond dietary preferences.  Watching this film, in which episodes pointing to the difficulties Ramanujan encountered in satisfying his hunger without violating the tenets of the vegetarianism with which he had grown up appear intermittently, brought to mind an evening in 1992 that I spent with T.G. Vaidyanathan, a comparatively little-published but maverick thinker (and even more so teacher) of great reputation.  TGV, as he was known to friends, was visiting New York; we walked to dinner; and when I inquired whether he had any preference for a particular cuisine, he stated only that he was a strict vegetarian.  What stays with me from our conversation that evening is TGV’s remarkable rendition of his faith:  Vegetarianism is my Bhagavad Gita, he told me.

 

So it was with Srinivasa Ramanujan, except that he further expressed himself as inspired by the Goddess.  Hardy, by contrast, was an unflinching atheist.  But this was not, as is commonly supposed, a clash between the mysterious and spiritual East and the logos-centered West.  True, there are moments when the film might appear to descend into such clichés, as when Hardy, in a moment of exasperation, berates Ramanujan for ignoring “proofs” and relying on “intuition”.  However, Kanigel wisely eschews the satisfaction of embracing the easy distinction between the spiritual Orient and the material Occident that continues to inform many popular readings of their encounter, gesturing instead at least at what are some of the more fundamental questions that emerge from the collaboration of these two minds.  Both Ramanujan and Hardy were consumed by numbers, though there is the arresting question about what we mean by numbers at all—and particularly very large numbers, broaching, shall we say, infinity.  What did either of them understand by numbers?  What, in turn, were the sources of their creativity, and what might the fact that Ramanujan was unschooled have to do with Hardy’s inability to comprehend how Ramanujan’s mind worked?  How, Hardy asks Ramanujan more than once in the film, do you know what you do know?  How do you arrive at these theorems?  Is there, in other words, a method to this madness—for surely it was madness that drove Ramanujan to his results and then to extinction?

 

The Hardy-Ramanujan narrative is a parable about the politics of knowledge and the incommensurability of knowledge systems. Against Hardy’s repeated insistence that Ramanujan offer “proofs”—which I would liken to the stations of the cross, the steps that culminate in the apotheosis of mathematical truth—for his theorems, the South Indian Brahmin countered that the “proofs” barely mattered. If a theorem was correct, then what need was there for proofs?  Hardy’s knowledge was more than merely bookish; nevertheless, he had been schooled in certain styles of mathematical thought and was bound to a bookish conception of mathematical rigor.  What Hardy barely recognized was that his own knowledge, formidable as it may have been when measured against other mathematicians, had constrained him; Ramanujan, in contrast, was unburdened by formal learning, and that was also the source of his extraordinary creativity.  To me, Sir, Ramanujan told Hardy, “an equation has no meaning unless it expresses a thought of God.”  Now Hardy could simply have dismissed this as a nonsensical remark, the residual effect of superstition from which the mind of a Hindu, no matter how much given over to the work of logos, is never entirely free.  Or he could have assimilated Ramanujan’s statement to a worldview for which he had some affinity, namely that mathematical truths have something of the ineffable about them, a beauty and purity which approximates spiritual truth.  Or he could have taken Ramanujan’s strange expression of truth as a tacit invitation to at least momentarily unburden himself, desist from proof-seeking, and allow a less charted framework of knowledge to inform his work.

Ramanujan (center) with other scholars at Cambridge University.

There are, as the film amply suggests, a great many other features that are important to an understanding of the Ramanujan-Hardy narrative and an appreciation of the immense odds against which Ramanujan had to struggle.  The racial element was always present, if not in their relationship, certainly in Cambridge and in wider mathematical circles:  an unschooled, “bloody Indian” had slowly but surely established himself in the Mecca of mathematics and cut the venerable dons of this institution down to size.  Kanigel misses out, however, on the politics of sexuality that is incipient in a narrative which has tacitly opposed a masculinized Hardy representing the imperial and ratiocinative vigor of Britain to an effeminized, vegetarian, superstitious Brahmin belonging to a subject race.  Their story, though it has never been read this way, is also a parable about how ostensibly neutered and highly objective forms of knowledge are also captive to dominant registers of masculinity.  But, amidst these and many other strands of thought that emerge from this story of the meeting of two minds, it is the politics of knowledge to which we must remain supremely attentive as we continue to grapple with this story.

The first of three postage stamps released by the Government of India in honor of Ramanujan, this one a 15 naye paise stamp issued in 1962, on the 75th anniversary of his birth. Few Indians have been conferred such official recognition.


On an October evening in 1977, Manubhai received word of the unexpected demise of his younger brother Dipak, an orthopedic surgeon based in New Jersey.  Manubhai has described his brother in Living, Dying thus:  “Dipak was a tall, handsome person, athletically built and inclined.  He had neither diabetes nor high blood pressure, nor excess weight—none of the ‘risk’ factors.”  No one in the family had ever complained of anginal pain; and, yet, at 30 years of age, Dipak had suffered a massive heart attack and passed away in his sleep.  It was a “rude shock” for Manubhai, but then “the head consoled the grieving heart, persistently driving home the point that death’s mathematics does its task governed solely by Pascalian probabilities, irreverent in the face of medical attempts at prevention, diagnosis and treatment.”  On reading this, I was reminded of Ralph Waldo Emerson’s starkly beautiful essay “Experience”, where he described the loss of his small son as akin to the “loss of a beautiful estate, no more” (or words to that effect).  He wrote of his experience, “I cannot get it nearer to me”, words that have disturbed his detractors and some of his admirers who opine that Emerson was unable to feel anything.  Quite to the contrary, Manubhai, like Emerson much before him, had a deeper understanding of death as a soulmate, a profound awareness that the laws of compensation cannot be denied, and that what appears as a tragedy in one’s own personal life “is but a part of the impartial, fully just, greater order.”  It would be superfluous to add that, as in the case of cancer, Manubhai remained an unrelenting critic of coronary care, which he did not deign to redeem even as a form of dignified plumbing.  His conclusion to the article that he wrote on “Coronary Care” for the aforementioned The Future of Knowledge and Culture sums up his views:  “Our advice to the lay and the learned is to stay away from the well-conceived but useless and harmless procedures comprising invasive coronary care.  The cardiologists and coronary surgeons are riding a tiger they fear to dismount, lest the dollar Niagara come to a sudden end.  Angiography, by itself untrustworthy, inevitably spawns—plasty and/or bypass, the trio comprising costly iatrogeny on a global scale.  A wise person avoids any assault on the coronary tree, no matter how sophisticated the laser, reamer, rotor or what have you.”

Any tribute to Manubhai that does not acknowledge his wry sense of humor, erudition, love of literature, and cheerfulness would be woefully incomplete.  I last saw him, I believe, in or around March 2009.  He invited me to a leisurely breakfast with him and Jyotibehn at his home, and two memories of that visit will persist with me to the end.  We had been discussing politics in Gujarat, and he was just as bothered as I was by the obscenity of some of the violence perpetrated in 2002.  Quite suddenly, Manubhai threw this question at me:  ‘What do you think is the holy book of the Gujaratis?’  I knew that he did not have the Bhagavad Gita in mind, nor the Tulsidas Ramacaritmanas, certainly not the Vedas; for a moment, I thought he might have had in mind the songs of Narsi Mehta, the great devotional poet.  But somehow I also sensed that Manubhai was laying a trap for me; and yet I could not bring myself to think of an answer beyond the ordinary.  I don’t now recall what I said; but whatever it was, it was not a patch on the brilliantly funny and incisive answer Manubhai had:  the cheque book!  We had a hearty laugh.  Later that morning, as we left his apartment, we made our way to the train station: for years, Manubhai had taken the local to KEM Hospital.  It was absolutely characteristic of him that he should travel in modesty:  however dreadful the cliché, “simple living, high thinking” seemed to furnish the motor to his life at every turn.

Manubhai died as he lived; moments before his death, I am told, he had been chatting and laughing away.  Not accidentally, one of the men he admired the most was J B S Haldane, a polymath who made significant contributions to physiology, genetics, evolutionary biology, statistics, biometry, and various other fields; more to the point, Haldane, an Englishman of considerable pedigree who was educated at Oxford and had published his first scientific paper at the age of 20, migrated to India in 1956 and eventually took up Indian citizenship.  Haldane, to Manubhai’s mind, stood for the other West—a West that was critical of its own past, tolerant of dissenting traditions, aware of the homology between colonial dominance and the suppression of women, religious minorities, and people of other ethnicities, a West with which, in other words, India could enter into partnership.  Haldane thought of India as a freer country than any other, and some of his thoughts may be surmised from his observation that “the people of Calcutta riot, upset trams, and refuse to obey police regulations, in a manner which would have delighted Jefferson. I don’t think their activities are very efficient, but that is not the question at issue.”  Percy Bysshe Shelley, be consoled:  it is not only poetry that makes nothing happen.  Haldane passed away in 1964, but not before he had written a poem on his hospital bed, “Cancer’s a Funny Thing”, from which Manubhai quoted frequently:

I wish I had the voice of Homer
To sing of rectal carcinoma,
Which kills a lot more chaps, in fact,
Than were bumped off when Troy was sacked . . . .

Cancer could be “rather fun”, says Haldane,

Provided one confronts the tumour
With a sufficient sense of humour.
I know that cancer often kills,
But so do cars and sleeping pills;
And it can hurt one till one sweats,
So can bad teeth and unpaid debts.
A spot of laughter, I am sure,
Often accelerates one’s cure;
So let us patients do our bit
To help the surgeons make us fit.

Manubhai was far ahead of his times, and it may take a few generations or more for us to understand the manner in which he lived and how he helped us all to become “fit”.

Coda:  Shortly after I finished writing this, by sheer coincidence my friend Ajay Singh sent me the following joke:

कार्डियोलोजिस्ट और गब्बरसिंह में क्या समानता है?
दोनो यही सलाह देते है कि तूने नमक खाया है अब गोली खा ।
(What is common to the cardiologist and Gabbar Singh?  Both come forward with this advice, ‘You ate salt, now bite the bullet.’)
To audiences familiar with the world of the commercial Hindi film, this joke will resonate strongly:  The outlaw Gabbar Singh, featured in the immensely popular film Sholay (“Embers”, 1975), shoots dead one of his henchmen, one of those who ate his salt, when he finds him no longer competent in discharging his duties.
(concluded)
See also parts I and II


In 1973, Dr. Manu Kothari and his associate, Dr. Lopa Mehta, published their voluminous tome, The Nature of Cancer, which I am tempted to describe as a war on the “war on cancer”.  The military metaphor has, of course, long been regnant in the US:  for well over a decade the American public and people overseas have been hearing about the “war on terror”, but this war was preceded by the “war on drugs”.  Neither war has been concluded; neither war is likely to be brought to a close; indeed, neither war has a foreseeable end, and the prosecutors of such wars, and their allies and friends in and out of government, have too much to lose if either war were brought to a decisive end.  All this is certainly true of the “war on cancer”, which has consumed hundreds of billions of dollars, perhaps trillions of dollars, thus far.

However, the war on cancer differs from the war on drugs and the war on terror in some fundamental respects.  The war on drugs is increasingly being recognized, except by the Republican Party—not, it should be noted, by some outlandish or extreme members of the party, since such a view presumes that there are sane or even intelligent members of the Republican Party, which is very much to be doubted—as an egregious error which has needlessly committed hundreds of thousands of Americans to prison terms, and similarly the war on terror has had more than its share of detractors.  But the “war on cancer” is construed, by every sector of the American public, as a holy mission:  to be sure, there are those who think that there might have been some scams, and a few people have doubted whether all forms of cancer research have been productive, but there is an overwhelming consensus that cancer is a deadly disease that must be exterminated and that no effort must be spared to stamp it out.

Cancer research draws in more funding than any other medical endeavor; the war on cancer has its foot-soldiers and generals; and donors and philanthropists, whose wealth is often ill-gotten, easily become heroes and celebrities in a culture where donations in the name of cancer research earn one goodwill and, if the gift is substantial enough, cultural capital in the form of a building or institute named after the donor.  It is a telling fact that in his highly celebrated “biography” of cancer, The Emperor of All Maladies, the talented writer and doctor Siddhartha Mukherjee entirely succumbs to this dominant narrative.  On reading him, one inescapably reaches the conclusion that if we soldier on, achieve “early detection”, and eliminate the scourge of smoking—but apparently not bother with the monstrous-sized polluting SUVs and pick-up trucks with which America has an undying love affair—victory will be at hand.

Dr. Manu Kothari had an entirely different view of cancer and what passes for “cancer research”.  His views would be distilled in two much shorter works, both co-authored with Dr. Lopa Mehta:  Cancer:  Myths and Realities of Cause and Cure (1979) and Living, Dying:  A New Perspective on the Phenomena of Disease and Dying (1992).  He unflinchingly put forward the view, which certainly did not win him any friends from among those in the cancer(ous) industry, and even gained him the opprobrium of establishment doctors alarmed at his broader views about the nature of disease, that the billions of dollars expended on finding a cure for cancer had not advanced our knowledge of the “disease” an iota.  Writing on cancer for The Future of Knowledge and Culture:  A Dictionary for the Twenty-first Century (Viking Penguin, 2005), co-edited by Ashis Nandy and myself, Manubhai put the matter quite succinctly in expressing his agreement with the view of some patients that the “treatment [was] worse than the disease.  Macfarlane Burnet, the Australian immunologist of wide renown, summed up in the 1970s the outcome of all cancer research in just two words:  precisely nil.”  As Manubhai was to add towards his conclusion, “On the medical claims about the early diagnosis and treatment of cancer, one could invoke Churchillian rhetoric:  Never in the history of science has so much untruth been told by so few to so many for so long.”

How, then, was Dr. Kothari inclined to think of cancer?  His views may, at first, seem wholly unpalatable:  “Cancer—far more benign than malignant mankind—is what it is, and does what it does, because of unalterable, unabrogable biorealities that attend this fascinating phenomenon.” Manubhai took it as an imperative that we must first understand death and look at it not as something to be feared, delayed, managed, ostracized, and repelled but rather as a friend, even as something that is to be revered.  He was critical of medical science for representing one disease or the other as the cause of death:  as he put it in Living, Dying, “Disease and death, in fact, are inherent components of man’s development, are governed by time and regulated by the herd, behave independently of each other and, in essence, are causally unrelated, death by itself being a programmed normal function performed by a living being.”

He argued that cancer occurs throughout the human lifespan; moreover, it is very democratic, and cancer’s “benevolence” could be inferred from the fact that it occurs everywhere “but in excess nowhere.”  He described cancer’s distribution as one in five:  one person bears the cancerous cross so that the other four might live.  Manubhai does not ask of us that we love cancer; but he does ask of us that we not hate it.  Once one understands that cancer is always with us, the very fibre of our being, we are no longer inclined to seek treatment:  he entirely rejected the idea of early screening, and deplored chemotherapy and radiotherapy as “despicable overkill by medicine.”  The fact that as a doctor, one remembered by his students as a very good one who did his profession proud, he was able to advance such views is a remarkable testament to his courage.  What is not less striking is that he had been articulating such a position for over four decades:  not surprisingly, one of his most ardent admirers was Ivan Illich, whose own Medical Nemesis, published one year after Manubhai’s The Nature of Cancer (1973), still remains the most trenchant critique of institutionalized forms of modern medicine.  Illich would go on to write the foreword to Manubhai’s smaller book on cancer.  Interestingly enough, the most recent exhaustive study on “early detection” all but confirms Dr. Kothari’s claims:  as reported by the New York Times on 20 August 2015, in an article headlined “Doubt Is Raised Over Value of Surgery for Breast Lesion at Earliest Stage”, “As many as 60,000 American women each year are told they have a very early stage of breast cancer — Stage 0, as it is commonly known — a possible precursor to what could be a deadly tumor. And almost every one of the women has either a lumpectomy or a mastectomy, and often a double mastectomy, removing a healthy breast as well.  Yet it now appears that treatment may make no difference in their outcomes. Patients with this condition had close to the same likelihood of dying of breast cancer as women in the general population, and the few who died did so despite treatment, not for lack of it, researchers reported Thursday in JAMA [Journal of the American Medical Association] Oncology.”

(to be continued)

See also Part I on this blog

