Kiran Raj Pandey's Blog

May 20, 2024

Translating scientific knowledge to improve people’s health

What kind of effort might have the greatest leverage in changing people’s health, especially in the developing world where resources are often limited? For decades, this question has been at the heart of debates on how to improve health around the world.

Efforts to answer this question have coalesced into two broad areas of inquiry: first, how best to organize our health systems, and second, how best to finance them. But is there a different way to think about where our efforts and resources can best be leveraged to change people’s health?

A few weeks ago I had an opportunity to engage with friends at Vital Strategies in Singapore to ponder these issues. There I made the case that how we use “scientific knowledge” may offer an alternative way of exploring this question. Specifically, for organizations like Vital Strategies that aim to maximize the impact of their work in improving public health and healthcare delivery around the world, understanding scientific knowledge as a point of leverage could be insightful. It could also inform where resources and programmatic priorities should lie.

The ecosystem of scientific knowledge production and use runs on a spectrum. At one end of that spectrum is the production of scientific knowledge itself. This part is often understood as the basic research leading to scientific discovery or invention: for example, the discovery that germs cause disease. At the other end of the spectrum is the use of that knowledge to benefit mankind — the use of antimicrobials to treat an infection. In the middle is the important task of translating the basic scientific discovery into a product or a service that benefits mankind.

The sequence of events that led to the creation of antimicrobials is illustrative. In 1882, building on incremental scientific advances made by Louis Pasteur and others, Robert Koch discovered the tubercle bacillus as the cause of tuberculosis. This discovery resulted in Koch’s postulates — the boundary conditions under which a germ may be implicated as the cause of a disease. Based on this scientific knowledge, a German physician named Paul Ehrlich conjectured that it might be possible to create a specific antimicrobial agent against the germs that cause diseases.

In 1910 — for the first time in human history — Ehrlich created Salvarsan, an antimicrobial agent that was astonishingly better at treating syphilis than existing therapies. Until then, the treatment of syphilis, a disease that was rampant at the time, had been quite unsuccessful, and not for want of trying. Salvarsan, a product of the scientific knowledge gained from our centuries-long quest to understand the causes of disease, succeeded in treating a disease considered the bane of human civilizations.

Soon after, Salvarsan was distributed to people around the world; millions of people benefited, and it paved the way for several later antimicrobial agents, including penicillin, that we use to this day.

This raises the question: which was the most important leg of the work that resulted in people being treated for microbial infections? Was it Pasteur and Koch’s initial scientific discovery, Ehrlich’s translation of that scientific knowledge into a successful product, Salvarsan, or the distribution and use of Salvarsan around the world?

A case — and a compelling one — may be made that, relative to the time, effort and resources that went into each of these three legs, the middle leg of translating scientific knowledge yielded the greatest impact. So for organizations and individuals intent on making the greatest impact with the limited time, effort and resources they can bring to the table, translating scientific knowledge and discovery into successful products and programs may be the best activity to engage in.

There are other reasons why the translation of scientific knowledge into products, programs, or even policies may be the most impactful activity. There has been a breathtaking expansion of the scientific knowledge base in the health sciences; however, many people in resource-limited settings have been unable to benefit from these advances because the technical know-how and the effort required to translate these bits of knowledge into usable programs, services or products do not exist in many communities.

For example, it has been known for quite some time that tobacco taxes are the most effective way of reducing cigarette smoking. However, the use of this knowledge to improve tobacco policies has been suboptimal in many countries. In Nepal alone, harmonizing tobacco control policies with the latest available scientific evidence could substantially reduce tobacco consumption and save tens of thousands of lives per year. Translating this bit of scientific knowledge into effective policy is one of the most effective ways of improving population health.

Doing translational work is not without its challenges though. Certain conditions have to align before this kind of work becomes possible. The first is to find oneself in a bridging role between where the knowledge is created and where the knowledge could be useful. It means having a ringside view of several intellectual and implementation domains that may span basic science, epidemiology and public health, clinical medicine, public policy, economics and business, and the social sciences. It also means being able to bridge geographies, cultures and disparate professional environments.

Recent technical advances in communications, artificial intelligence, genomics, and structural biology are poised to produce a Cambrian explosion of novel scientific knowledge and ideas. The opportunity to improve human health by translating these ideas into successful products and services to benefit all of mankind is tantalizing. Individuals and organizations situated to do such translational work should grasp this opportunity, because therein lies the chance to make the greatest impact on human health for the amount of effort expended.

October 12, 2022

A Clash of Ideas

CHAPTER FIVE: Up Is The Curve

October 16, 1973

Kuwait City, Kuwait

In one of the most egregious yet efficient land-grabs ever carried out at gunpoint, Israel expanded its territory fourfold during the Six-Day War of 1967, taking vast swathes of the Sinai Peninsula, Golan Heights, West Bank and East Jerusalem. Egypt, Syria, and Jordan, from which most of the territory was seized, were left reeling from this loss. Six years later, on October 6, 1973, Egypt and Syria hit back to avenge their losses: they launched coordinated attacks against Israel during the festival of Yom Kippur that caught Israel off guard. An initially reluctant United States rapidly provided $2.2 billion in support to Israel’s war efforts when it was found that the Soviet Union was backing Israel’s opponents. In protest of the US support, the Arab oil producers retaliated with a decision that sent shock waves throughout the world. On October 16, 1973, a meeting of the leaders of the Organization of Arab Petroleum Exporting Countries (OAPEC) decided to increase oil prices by a whopping seventy per cent and to embargo oil supplies to many countries, including the United States. Within a period of three months, oil prices rose fourfold. The unfettered economic expansion in the US and western Europe that had followed the Second World War had already been showing signs of an incipient slowdown at the start of the seventies; the oil embargo proved to be the last straw. By late 1973, commodity prices began to rise, industrial output slowed down, and the economies of Western Europe and North America went into recession.

Although the oil embargo was rather short — it was lifted in March 1974 amidst differences of opinion within OAPEC — the economic recession continued into the next year. While the economic ramifications of the chain of events triggered by the oil embargo were severe, the political implications, it may be argued, were even more consequential. In the industrialized economies of the global north, rising oil prices led to runaway inflation, wage pressures and economic hardship: there was a wave of major political discontent, the consequences of which would continue to unfold throughout the rest of the century. But it was among the nations of the global south that the oil embargo caused incipient political undercurrents to swell into momentous, history-altering events.

Through the 1950s, the membership of the United Nations had ballooned rapidly, from fifty-one at the time of its founding to 110 by 1962, and it continued to grow through the 1960s. The growing ranks of post-colonial and developing nations at the United Nations began to seek political expression. One of the major outlets for this political force was the formation of the G-77, a voting bloc these nations announced in 1964 at the first United Nations Conference on Trade and Development (UNCTAD) in Geneva to advance a common political and economic agenda on the global stage. The G-77 followed closely along the lines of the Non-Aligned Movement that had started in the 1950s. In the years that followed, the G-77 worked to tilt the global politico-economic landscape in favor of the developing nations of Asia, Africa and Latin America. This proposal to radically overhaul the global balance of power was labeled the New International Economic Order (NIEO). To extract favorable economic terms from the advanced economies of the global north, the proponents of the NIEO had considered the possibility of arm-twisting those economies with a commodity cartel; the unanticipated success of OAPEC’s oil embargo in bringing the global economy to a standstill energized them. Within a few months of the oil embargo, in May 1974, members of the G-77 were able to bring a resolution supporting the NIEO to the United Nations General Assembly (UNGA) Special Session. The swelling ranks of the global economic underclass at the UNGA, where one state counted for one vote, meant that the proposal passed despite the protest of some powerful advanced economies.

The lifespan of the NIEO as a vision for redistributive global politico-economic justice was rather brief, brought down among other things by rifts among the members of the global south. However, before it died an unceremonious but predictable death by the end of the decade, the NIEO planted ideological offshoots with lasting consequences in many areas of global cooperation. One such offshoot was the World Health Organization’s proposal of Health for All by the Year 2000.

In September 1978, 3,000 delegates representing 134 governments and sixty-seven international organizations came together for an International Conference on Primary Health Care in Alma-Ata, Kazakhstan, and passed a highly ambitious declaration to ensure health for all by the year 2000.[9] Although the passing of the NIEO in 1974 proved to be the proximate political fillip for the Alma-Ata Declaration, as it came to be known, the appetite for such a global initiative had been building since at least the previous decade. Throughout the sixties and the seventies, there had been increasing dissatisfaction with the narrow, vertical orientation of health systems and services across the developing world toward disease eradication, family planning and population control. In addition, attempts to build health systems in many developing countries were limited to building large hospitals in urban areas, often as a continuation of the legacy of colonial medical services that catered to the needs of the ruling class. In the 1970s, the successful build-up toward the NIEO had spurred several post-colonial nations to reject Western visions of development and modernization, including in health.

The attempt to transpose and transplant the urban-oriented health systems of the rich world — even while more than half of the people living in rural areas lacked any meaningful access to health services — was increasingly questioned. Instead, several people and organizations went on to articulate alternate visions of health systems focused on meeting people’s health needs by means of simpler and more holistic interventions. As opposed to the relentless focus on medical intervention of hospital-based health systems, or the ‘shot-gun’ approach of the disease eradication efforts, these new ideas for health systems focused on community-centered care; this could tackle the broader and more distal determinants of health in addition to providing relatively simple curative health interventions. The body of evidence proving the effectiveness of community-centered health systems was building throughout this period. In the early 1960s, lay village health workers, called ‘barefoot doctors’, were able to drastically improve health conditions in rural China by means of simple interventions that combined preventive care with curative care, and traditional healing systems with Western medicine. Similar efforts from Latin American countries like Costa Rica, Cuba, and elsewhere provided further proof-of-concept for these community-oriented health systems.

The work done in India by Carl Taylor provided some of the earliest scientific evidence and an intellectual blueprint for the global movement toward community-centered health systems. Between 1960 and 1975, Taylor led the Narangwal Rural Health Study in North India, where he scientifically evaluated the effectiveness of training lay village workers to provide health and social interventions. By means of meticulous field audits, Taylor showed that village health workers could diagnose and treat pneumonia without radiography, administer prenatal care, and offer nutritional and preventive health services to improve the health of rural populations. The work in Narangwal was later replicated elsewhere, including in Jamkhed, India. Experience and evidence also came from the Philippines, Costa Rica, Israel and South Africa. In South Africa, a community-based care model emerged that emphasized community participation in resource allocation and service prioritization.

Carl E. Taylor was born in 1916 in Mussoorie, India, to American medical missionaries. After a childhood spent mostly in his parents’ clinic in the Himalayan foothills, he returned to the United States to go to medical school. Following his clinical training, he returned to India to run a mission hospital in Punjab. However, a few years of clinical practice led him to believe that populations, and not individuals, should be the targets of his efforts, and he returned to the United States for a doctorate in public health. It was after this that he went back to Narangwal to do the breakthrough work of scientifically evaluating and establishing the effectiveness of health interventions provided by lay village health workers in treating pneumonia, malnutrition, and reproductive health problems. His work in Narangwal was formative in shaping his views on a more grassroots-based approach to health services.

‘What makes a difference in people’s health is not what physicians do, but what communities do,’ Taylor was once quoted as saying.[12] Taylor, who had conducted the first health survey of Nepal in 1949 and founded the Department of Preventive Medicine at Christian Medical College Ludhiana, India in the 1950s, later also founded the Department of International Health at Johns Hopkins University. He went on to be a major architect of the Alma-Ata Declaration.

Taylor’s views and ideas would later go global when they were taken up by the WHO and UNICEF. At the WHO, the charge of selling that vision to the world fell to an extraordinarily charismatic figure — the son of a Baptist preacher, Halfdan Mahler.

Halfdan Mahler, the third Director-General of the World Health Organization, followed Marcolino Candau in the post. Mahler abandoned his initial thoughts of following his father into theology, choosing instead to go to medical school. After finishing his medical and public health training in 1948, he spent a few years doing TB and community health work, and in 1950 spent a year directing the Red Cross’s TB work in Ecuador. In 1951, at 28, he came to the WHO and was posted to India, where he spent a decade as a Senior Officer in the National TB Program. After returning to Geneva from India in 1960, he led the WHO’s TB unit and rose through the ranks until he was promoted by Candau to Assistant Director-General in 1970, rising to Director-General three years later. Mahler’s upbringing as the son of a Baptist preacher had given him a strong moral compass that drew him toward the ideals of social justice. These values, along with his belief that existing health systems were not taking broader stock of the overall determinants of health, led him to develop a strong ideological bias toward community health systems that provided a broad array of basic services.

Excerpted from the book Up Is The Curve — A genealogy of healthcare in the developing world. Available here: Nepal, India, UK. Worldwide delivery available here.

September 24, 2022

The Era of Eradication

CHAPTER IV: Up Is The Curve

June 12, 1958

Minneapolis, United States

Who might have been the one individual to have done the greatest good for the world? Could a relatively unknown Soviet health minister be a contender? Some people make a convincing case for this.[1] Back in the late 1950s, a few years before John F. Kennedy declared that mankind would place a human on the moon by the end of the 1960s, an intrepid and visionary Soviet minister coaxed the world into eradicating smallpox. Speaking at the eleventh World Health Assembly (WHA) in Minneapolis in 1958, Viktor Zhdanov, the Soviet deputy minister of health, urged delegates from across the world to apply themselves to the important task of eradicating smallpox. Unsurprisingly, his speech did not elicit outright enthusiasm. However, the assembly delegates could not dismiss Zhdanov either.

There were several reasons for such an ambivalent response. This was 1958, after all: the Anglo-European Northern world had just carved itself and the rest of the world into first, second and third world categories. A health minister from the ‘second world’ (the Soviet Bloc) giving a speech in a ‘first world’ city, goading everyone to apply the force of their goodwill to eradicate a disease that by then was primarily afflicting people in the ‘third world’, was probably always going to have the response it did, owing to the geopolitical equation of the time. The delegates did not exactly know what to make of the proposal. Was Zhdanov serious? Was it a political move to counter the United States? Soon after the formation of the United Nations, the Soviets had withdrawn from the World Health Organization and other UN apparatuses, citing the United States’ undue influence over them. They had begun to reengage with the UN in 1956 following Nikita Khrushchev’s new focus on ‘peaceful co-existence’, and the 1958 WHA was the first one the Soviets were attending. The WHA delegates wanted to make sure that the Soviets felt welcome and stayed inside the tent of the relatively new World Health Organization. Without the powerful Soviets inside the tent, the organization risked irrelevance, as it would hardly be representative of the entire world, an idea its founders had worked hard to pursue.

However, this was not the first time such a proposal had been tabled. An earlier proposal for smallpox eradication in 1953 had gone nowhere. Additionally, the world had tried eliminating or eradicating diseases before: hookworm, yaws, malaria. These attempts had not gone well. Only a few years earlier, the WHA had committed to the hugely ambitious task of eradicating malaria from the world. The Global Malaria Eradication Program (GMEP), approved in 1955, was consuming most of the WHO’s energies as well as its resources.[4] There were those who had valid concerns about whether the world was capable of pulling off another disease eradication program on top of the hugely burdensome malaria campaign. It was therefore hardly surprising that the WHA delegates took time warming up to Zhdanov’s speech.

But Zhdanov, an able virologist, also turned out to be a persuasive diplomat and a wily negotiator. He understood the importance of selling his ideas, especially to the skeptical Americans, who had already committed themselves to the malaria eradication program. Zhdanov appealed to the delegates’ sense of moral purpose. He even quoted the American founding father and President Thomas Jefferson’s message to Edward Jenner, written a few years after Jenner’s 1796 discovery of the smallpox vaccine: ‘I avail myself of this occasion of rendering you a portion of the tribute of gratitude due to you from the whole human family. Medicine has never before produced any single improvement of such utility … You have erased from the calendar of human afflictions one of its greatest. Yours is the comfortable reflection that mankind can never forget that you have lived. Future nations will know by history only that the loathsome small-pox has existed and by you has been extirpated.’[5] Zhdanov made the case that the eradication of smallpox was an opportunity to realize Jefferson’s vision of a world free from the ‘loathsome small-pox’. To jump-start the eradication campaign, Zhdanov even committed 25 million doses of smallpox vaccine and programmatic assistance.[6] Such an evocative and persistent appeal made his proposal difficult to dismiss. On June 12, 1958, as the eleventh WHA was concluding, it passed resolution WHA11.54, asking the Director-General to explore the possibility of eradicating smallpox and report back to the executive board.

It was not merely at the WHA that Zhdanov had to use the power of persuasion to make a case for the eradication of smallpox. After Khrushchev’s policy changes, the Soviets had wanted to make a splashy re-entry on the global stage, and Zhdanov saw that he could exploit the emerging political landscape to advocate for smallpox eradication. However, even before he tabled the proposal at the WHA and convinced the world, he had to spend considerable energy winning over the skeptics at home — persuading them that smallpox could indeed be eradicated from the entire world. Zhdanov’s unflinching conviction did not come out of nowhere: he knew what he was getting the world into when he made that passionate appeal. In fact, in large swathes of the industrialized world, this had already been done. The last case of indigenous smallpox seen in the United States was in 1934, although imported cases were seen as late as 1949. In Britain, the disease had ceased to be endemic after the 1930s.[9] The Soviets had similarly eliminated the disease as a public health problem from their expansive territory, much of it inhospitable terrain hard to reach by road, rail or sea.

Earlier in his career, Zhdanov had spent a large part of his time working as an army doctor and later researching viruses. He had seen up close the success of the Soviet attempts at eradicating smallpox from their vast Union, so he spoke from experience. His efforts and persuasion fulfilled the hugely important and difficult role of proselytizing and convincing an unsure world of the feasibility of smallpox eradication. He deployed significant political capital, technical vision and moral conviction; he also exploited the favorable political tailwinds behind the Soviet delegation at the 1958 WHA to create momentum for the eradication of smallpox.

After the WHO Director-General reported back with a feasibility study, the 1959 WHA approved the Smallpox Eradication Program, but it was not until 1966 — when Americans started throwing their weight behind it — that the program began in earnest. Zhdanov’s speech was the catalytic event that built the political momentum for the eradication program. That momentum was carried forward by the inspired leadership of people like Donald A. Henderson, M. I. D. Sharma, Nicole Grasset and many others. By the end of the 1970s, smallpox had been eradicated from the planet.

If the smallpox eradication campaign had Viktor Zhdanov, its elder cousin, the malaria eradication program, had its own evangelist and proselytizer — Fred Soper. People like Zhdanov and Soper were proof that a select few individuals played outsized roles in determining the post-War health agenda across the world. In the early 1950s, Soper was almost single-handedly responsible for getting the world to commit to the astonishingly ambitious task of eradicating malaria. Soper, born in Kansas, was a physically imposing man with an equally formidable personality. He trained at Johns Hopkins and spent a considerable part of his professional life at the Rockefeller Foundation, mostly working on yellow fever and malaria eradication campaigns.[13] In a way, Soper picked up where William Gorgas had left off. In the 1930s, malaria eradication experts were divided on what the real enemy was: the malaria-causing protozoan, or the mosquito, the vector responsible for transmitting it. Soper decidedly belonged to the latter camp.

Even before the insecticidal properties of the chemical dichloro-diphenyl-trichloroethane (DDT) were discovered in the late thirties, Soper had been a big proponent of vector control via breeding ground inspections, drainage and chemical sprays — diesel oil, an arsenic compound called Paris green, and pyrethrum. In South America, he had perfected malaria inspections to the exactitude of a military drill. When the African mosquito Anopheles gambiae began spreading malaria in coastal Brazil, Soper had an 18,000-square-mile malaria-infested area sprayed in less than two years, a task thought impossible, and brought malaria under control. His malaria inspectors dressed in uniforms, and their days were scheduled down to the minute. They covered every single square foot of the designated area, inspecting and spraying as they went. It is said that once in Rio de Janeiro, there was an explosion at an arsenal close to where a malaria inspector was scheduled to be. Soper, fearing the worst, wrote condolences to the inspector’s widow and sent her a check. The next day, when the inspector was found to be alive and well, Soper summarily fired him for not sticking to his schedule.[12] Soper’s military-like approach to disease control saw similar success in controlling typhus in Cairo, Algiers and Naples during the Second World War.[12] It was in Italy, during this time, that Soper came across DDT for the first time.[12]

DDT was first synthesized in the 1870s. In 1938, Paul Müller, a chemist working for the Swiss company J. R. Geigy, was attempting to discover a chemical to protect clothes against moths when he stumbled upon the fact that traces of DDT were remarkably effective at killing houseflies. Soon it was found that DDT was a powerful insecticide. Several kilograms of the chemical were manufactured and sent to Geigy’s New York office for further testing and development. From there, DDT made its way to the US army. Diseases like typhus and malaria had been inflicting heavy casualties upon Allied forces during the Second World War, and the army soon began its own research into DDT. When the chemical’s efficacy in killing insects was confirmed, industrial production of tons of DDT began, in secret, and it was shipped to Italy. In Naples, the army was able to use DDT to control lice and virtually eliminate epidemic typhus, making it possible for Allied troops to advance through Italy.

September 14, 2022

Private Health Becomes Public

CHAPTER III: Up Is The Curve

February 28, 1918

Haskell County, Kansas, United States

Dean Nilson is a new recruit at Camp Funston, hauled in by the military machinery to support America’s involvement in the ongoing First World War. This February day, he is back home in Haskell County from his duty station a few hundred miles to the east at Camp Funston. The local newspaper, the Santa Fe Monitor, pleased by a local son’s return from his service to the country, publishes a story: ‘Dean Nilson surprised his friends by arriving at home from Camp Funston on a five days furlough. Dean looks like soldier life agrees with him.’ At the end of his furlough, Dean returns to the Camp with little idea that he and the others from Haskell County reporting back to duty could become agents of death of a kind the world has never seen before, even in the ravaging wake of a world war.

Since January that year, the sparsely populated farmlands of Haskell County have seen an unusual spike of influenza. The local doctor, Loring Miner, is taken aback by the severity of the influenza outbreak that season; somehow the disease appears to pick the healthiest young adults and fell them like a bullet shot point-blank. In no time, many of the people struck by the epidemic are losing their lives to the disease. Dr. Miner is so alarmed by the epidemic that he writes a report for Public Health Reports (now Morbidity and Mortality Weekly Report), a weekly publication of the U.S. Public Health Service.

As troops return to the Camp from their homes, Camp Funston starts to record a spike of a similarly fulminant variety of influenza. By the first week of March, a week after the Haskell natives returned to Funston, several recruits fall sick with influenza. Within three weeks, more than a thousand of the 56,000 troops at the Camp are down with the fulminant variety of influenza. In the next few weeks, Funston sends several of its troops to camps around the United States and to Europe. With them, the deadly variety of influenza probably arrives in Europe.

A century after the 1918 influenza pandemic, historians and epidemiologists still cannot pinpoint the origin of the pandemic with certainty. However, the historian John Barry presents a convincing case that the most likely origin might have been the corn and hog land of Haskell County, Kansas. While there is no easy way to establish with confidence that Barry is right, his suggestion does appear to be the most plausible. Regardless, the 1918 influenza pandemic is easily one of the deadliest in human history. By the time the pandemic subsided in early 1919, almost 500 million people — a third of the world’s population at the time — had been infected by influenza, resulting in the death of anywhere between 50 and 100 million people. In comparison, the War killed about 20 million people. More people were killed by influenza in one year than the War had managed to kill or wound in four. If the unfathomable devastation of the War was not terrible enough, an as-yet-undiscovered shifty little virus had been able to cause damage to the world several times greater. The specter of this pandemic would haunt the world for a long time to come.

As the First World War drew to a close in Europe, the first order of business was to convene an apparatus to institutionalize and maintain the fragile peace. The Paris Peace Conference that began in January 1919 resulted in the signing of the treaties of Versailles, Saint-Germain and Neuilly, along with the Covenant of the League of Nations. The League of Nations was inaugurated in January 1920, after which the Paris Peace Conference ended. At the Paris Conference, US President Woodrow Wilson had collapsed, it is often speculated, from weakness caused by the flu. This was symbolic of the kind of impact the flu epidemic had: at the time, the shadow of the epidemic was still fresh in everyone’s mind. In addition, because of the overall squalor during the War, pestilence had increased astronomically, giving rise to a typhus epidemic in Poland and Russia in 1919 that infected millions of people. Health scares like these had left matters of international cooperation in health at the top of everyone’s agenda at the time of the formation of the League of Nations. As a result, close on the heels of the Paris Conference, the newly formed Council of the League of Nations called for an international health conference in London in April 1920 to help the Council plan and establish a permanent health office of the League of Nations.

Revolutionary advances in steam engines and railroads in the eighteenth and nineteenth centuries had led to the exponential expansion of international travel and trade. This, in turn, led to a phenomenal increase in the rate at which diseases were transmitted from one corner of the world to another. Health scares like the flu, plague and cholera epidemics became a major public health threat and a recurring cause of disruption to international trade. In response to these threats, as early as 1851, the first international sanitary conference was held in Paris to work out rules and regulations governing the movement of goods and people across the world. This marked the first time a conference was organized to address matters of international health cooperation. In addition, developments in Great Britain also contributed to the acceptance of the idea that combined social action and cooperation were necessary in matters of health.

In 1838, Edwin Chadwick, a proponent of the miasma theory, in his capacity as secretary of the Poor Law Commission, produced a report on the sanitary conditions of the London working class, followed by another for Great Britain in 1842. The 1842 report, the Report on the Sanitary Condition of the Labouring Population of Great Britain, laid out with clarity how poor and filthy living conditions left the working class with half the life span of the richer classes. This created a societal impetus to consider sanitation as a means of ensuring the health of the general population. Health, until then a matter of personal concern and prayers, became a subject of social action: the personal became the public. The Public Health Act followed soon after, in 1848. This series of events in Great Britain helped establish the notion that health could be a matter that public bodies like governments needed to concern themselves with. The Paris sanitary conference of 1851 came against this backdrop.

The conference included two delegates, a physician and a diplomat, from each of twelve European countries. After a six-month-long deliberation, a convention and regulations were finalized and signed by the delegates. However, when most governments failed to ratify them, they fell by the wayside. The failure was partly due to procedural difficulties; more importantly, before the causes of cholera, plague and yellow fever were known, there was simply no common scientific consensus that delegates and governments could rally behind. For lack of scientific knowledge, the physicians had little to offer.

By the time the second international sanitary conference was organized in Paris in 1859, physicians were not even invited. Not much came out of that five-month-long conference. The third was held seven years later, in 1866, in Constantinople, where it was agreed that cholera was endemic in India and nowhere else, and that it was transmissible. While the air was thought to be the medium of transmission, a fleeting possibility that water too might be responsible was raised, citing John Snow’s work in London. Such was the nature of progress during those times. Several other sanitary conferences were held, mostly in Europe, that similarly beat about the bush and achieved little, if only because the scientific knowledge that could form the basis of a consensus agreement just did not exist. The best the conferences could aim for was to facilitate trade in Europe while keeping out pestilence from the outside world, which was thought to be the source of cholera, plague and yellow fever.

By the turn of the twentieth century, as a result of the work of scientists like Robert Koch, Alexandre Yersin, Kitasato Shibasaburo, Walter Reed and Carlos Finlay, significant scientific advances had been made in understanding the etiology of cholera, plague and yellow fever, as well as the vectors responsible for their transmission. This new knowledge base formed the foundation of joint international action on common public health problems. Beginning in the early twentieth century, international health cooperation took the form of standing bodies to facilitate joint international action. Two such bodies came into being at around the same time. The first was the Pan American Health Organization (PAHO), earlier called the Pan American Sanitary Bureau, established in 1902 with an initial mandate primarily to handle yellow fever epidemics along the trade routes of the Americas. The second was the International Office of Public Health, better known by its French name, Office International d’Hygiène Publique (OIHP), established mainly by European countries in Paris in 1907. The OIHP essentially took over the baton from the international sanitary conferences of Europe.

Both PAHO and the OIHP worked to come up with technical standards and to suggest rules and regulations for cooperation in controlling communicable diseases — all with the intent of facilitating trade among their member states. These organizations aggregated and transmitted to member states the latest surveillance information and also made available technical information on diseases of interest.

Although these organizations did important work for a few years, when the First World War broke out in Europe, their functions became limited. By the time the War was over, the fear of pestilence and epidemics weighed so heavily on everyone’s mind that the League of Nations Covenant explicitly articulated the need for a dedicated League of Nations Health Organization (LNHO). The fact that the devastating influenza and typhus epidemics of 1918 and 1919 had come not from the hinterlands of Asia or Africa — places Europeans were wont to blame as dens of disease — but from the middle of North America and Europe must have convinced Western countries of the need to address health issues with urgency. In its report, the London conference organized by the League of Nations wrote of the typhus epidemic in Poland that ‘the prevention of typhus in Poland and the spread of that disease across Poland is a matter which calls most urgently for united official international action’ and suggested that the League of Nations Health Office was ‘the sole organization sufficiently strong and authoritative to secure that the measures required are taken.’ Such were the hopes for a health organization under the ambit of the League of Nations. The League was envisioned as an umbrella organization meant to bring all extant bodies of international cooperation under one roof, and the health office under it hoped to fulfill a similar umbrella role in matters of health. The initial idea was for the Health Organization to subsume the other international health organizations like the OIHP; however, politics got in the way. The OIHP remained independent, and the LNHO had to maintain an uneasy existence alongside existing bodies like the OIHP and PAHO. The latter two continued to exert themselves in matters of international health cooperation on their home turfs of Europe and the Americas, and the Health Organization of the League of Nations had to seek new roles for itself.

Excerpted from the book Up Is The Curve — A genealogy of healthcare in the developing world. Available here: Nepal, India, UK. Worldwide delivery available here.

September 8, 2022

The March Of Western Medicine

Chapter II: Up Is The Curve

August 31, 1909

Frankfurt, Germany

It’s a gloomy day outside in Frankfurt. Inside, in a dark, cramped laboratory at the Royal Institute of Experimental Therapy, Paul Ehrlich and his Japanese trainee and colleague Sahachiro Hata are hunched over the table, hard at work injecting a series of experimental arsenicals into syphilitic rabbits. The dim lamp that lights up the study table illuminates the work area in a pale shade of amber. But that is enough for a pair fired up by an idea: the idea of making a chemical that can selectively attack the syphilis-causing bacterium — recently discovered and christened Treponema pallidum. Over the years, Ehrlich has been experimenting with the idea of creating a specific magic bullet — an antimatter of sorts — against disease-causing microbes. He has amassed a chest full of hundreds of chemicals, mostly based on arsenic, a known poison, that might fit the purpose. Ehrlich’s trainee Hata, who arrived from Japan to train at the Institute, has been skillfully conducting the experiments, trying out the arsenicals one after another on rabbits with syphilitic ulcers to see if they have any bactericidal activity against the spirochete. On this fall day, with Ehrlich watching over, Hata injects yet another arsenical, labeled №606 — the 606th chemical he has tried. Thus far, even after two years of relentless work, Hata and Ehrlich have not been lucky. Today, however, their luck is about to take a turn for the better. To their utter joy, by the next day, the ulcers injected with №606 show no syphilis-causing bacteria. In a few weeks, the syphilitic ulcers are completely healed.

The promising results vindicated Ehrlich. After decades of work, his ideas on magic bullets that would attack the parasite and spare the organ tissue were coming to be. In №606, called arsphenamine, the promise of antimicrobials had finally begun to come to fruition. In March 1910, Ehrlich presented his findings at a scientific conference in Germany. Soon after, №606 was picked up and manufactured by the German drug company Hoechst and marketed around the world as Salvarsan. Ehrlich had 65,000 vials of the drug sent to doctors all over the world for a therapeutic trial. In their tests, doctors overwhelmingly found that Ehrlich’s chemical was better than anything they had tried against syphilis until then. Within a year of its discovery, Salvarsan became available in clinics all over the world, with remarkable efficiency of distribution for that time. The drug was so well received that it soon became the most prescribed drug in the world. In fact, Salvarsan has the unique distinction of being called the world’s first blockbuster drug.

Toward the turn of the twentieth century, even as many colonial physicians and scientists were making their own breakthroughs in the far outposts of the colonial empire, Europe was emerging as the epicenter of groundbreaking discoveries in the biological sciences, taking Western allopathic medicine decidedly into the future. These developments would cement Western medicine — the heir of the 2,000-year-old Galenic legacy of the four humors and the miasma theory — as the prevalent form of medical practice, not only in the West but also elsewhere, over other forms like Ayurveda, the Chinese healing traditions or the German practice of homeopathy. The majority of this cutting-edge work was being done in Germany and in France: in Germany, people like Paul Ehrlich and his early mentor Robert Koch led the way.[9] In France, a lineage of scientists running from Laennec to Pasteur and their colleagues was breaking medicine from the shackles of its rudimentary past and bringing it to the forefront of science. The discovery of Salvarsan was a pioneering example of the inroads science was making in medicine. Until that point, the best medicine had to offer for the treatment of syphilis ranged from the mundane to the macabre — pills and potions, mercury fumigation, even plasmodium injections to induce malaria.

With Salvarsan, medicine finally had something that was remarkably effective. In the words of one writer, Salvarsan was the first drug treatment that ‘destroyed the disease and not the patient’, although, as we saw in the previous chapter, quinine could probably make a similar claim. Such an effective drug was not concocted with some black art, shamanism, or a random hunch, as was often the case with medical interventions until that point. The highly effective Salvarsan was made possible by a thorough scientific understanding of the causation of the disease, and by a pharmaco-therapeutic agent tailor-made through the application of that science.

Medicine was the sciences’ Johnny-come-lately. Even while the physical sciences had begun to make spectacular progress by the nineteenth century, medicine largely remained a set of best practices and a cultivated art. Medicine was bereft of the empiricism and experimentation that lay at the heart of the other sciences. It just did not use enough of the scientific method — deductive thinking, theorization and experimentation — nor did it offer much of the modern therapeutic and diagnostic armamentarium that it does today. It is no wonder that the natural scientists and empiricists — for whom anything that could not be measured and experimentally verified was not science — did not take medicine seriously.

It is hard to say with certitude when the practice of medicine started transitioning from mostly art to mostly science. But a safe bet might be the period that led to the development of Salvarsan: a period that started about three decades earlier, with Pasteur’s early ideas of the germ theory of disease — the idea that germs cause diseases — and Robert Koch’s experimental validation of the same. Koch’s postulates effectively established the conditions under which a germ could be implicated as the causative agent of a disease. He isolated the germ from a diseased subject, cultured it and transferred it to another subject, where it caused the same disease, and was able to isolate the same germ from the second subject as well — thereby proving that it was indeed the germ that caused the disease. Until Koch came up with his postulates, humankind did not know for sure that germs could cause disease: diseases like tuberculosis were thought to result from bad miasma or unbalanced humors, or were considered hereditary (even though the genetic basis of disease had not yet been discovered). Granted, there had been incremental work over the preceding two centuries by people like Antonie van Leeuwenhoek, Ignaz Semmelweis, Joseph Lister, John Snow and Louis Pasteur, leading up to Koch’s seminal work on anthrax that allowed germs to be implicated as causative agents of diseases.

Today, the idea that germs cause diseases appears rather prosaic and commonplace, but in 1882 it was anything but. The amount of abstraction and deductive thinking that Pasteur and Koch had to invest to come up with the idea was herculean, demonstrated by the fact that the idea had escaped humans for so long, and surely not for lack of trying. The germ theory of disease gradually displaced the 2,000-year-old theory of the four humors and the miasma theory of disease. Subsequently, within a period of thirty years, Paul Ehrlich and his colleagues had discovered an antimicrobial agent that specifically targeted the microbe that caused a disease.

While Koch’s germ theory of disease had an important role in bringing medicine to the hallowed grounds of science, Ehrlich’s efforts made sure that scientific knowledge had an application for the betterment of mankind. It was fortuitous that a major application of this newly discovered knowledge as the basis for therapeutics began with syphilis. Even as late as the early twentieth century, the disease had been a pernicious blight on the world: it was endemic the world over, and its overall burden in human history was comparable to that of the other great killers like the plague and cholera. Syphilis had, in fact, been anointed the third plague, a worthy successor to the two plague pandemics of the sixth and the fourteenth centuries. By the turn of the twentieth century, with improving sanitation, the prevalence of diseases like the plague, cholera and tuberculosis was gradually falling in the Western world. But syphilis was still festering, especially with increasing urbanization. Worse still, while the burden of diseases like tuberculosis and plague was mostly physical, the burden of syphilis was much more onerous — it was moral as well as physical.

In the prevailing moral landscape at the turn of the twentieth century, syphilis was always a byword for amorous crimes. There was no redemption from such invidious moral rot. Tuberculosis (TB), on the other hand, was the opposite: it had an aura of honorable pathos associated with it. In the eighteenth and the nineteenth centuries, TB was considered the disease of the creative class. The association between consumption (as TB was popularly called) and the creative class was so thick that it was assumed that TB gave creative people their artistic vigor — spes phthisica — as well as a heightened sex drive.[15] A long list of artists, writers, and intellectuals, including the Brontë sisters, John Keats, Anton Chekhov, George Orwell and several others, died of tuberculosis — it was no less than a badge of creative honor.[16,17] An equally impressive list of artistic and noteworthy people suffered from syphilis, but it was hardly ever mentioned in polite conversation. From Oscar Wilde to Nietzsche and Arthur Schopenhauer, princes and paupers — people from myriad walks of life suffered from syphilis, lived with the consequences and died from the disease. That Casanova suffered from syphilis should draw no surprise, but even Édouard Manet and Van Gogh went down with syphilis, as did Franz Schubert and Ludwig van Beethoven. Generals and monarchs were not spared either: the Henrys and the Georges of England and the Pauls of Russia were all afflicted with syphilis. The mighty Al Capone, who lorded over the streets of Chicago during Prohibition, died with a failing heart and a crazed mind, a consequence of late-stage syphilis, years after his release from Alcatraz.[18] If there was a disease as scorned as syphilis, it might have been leprosy — only because it was outwardly visible.

Syphilis had managed to cast a morbid penumbra over the collective human mind. The popular perception that syphilis might have been the result of an unholy communion between a leper and a prostitute is a testament to the unsalvageable reputation of the disease in the social milieu of the times.[19] A sizeable proportion of all the pills and potions available in the nineteenth century were tried and tested for the treatment of syphilis. Understandably, the social value of any effective treatment for syphilis was astronomical. Salvarsan — a product of science, unshackling humanity from syphilis — became the perfect ambassador to demonstrate to humankind the utility of scientific thinking and methodology as a basis for modern medicine.

It took us several millennia after civilizations began, and centuries after we had discovered the planets and the galaxies, to understand that germs give rise to diseases. This was because, unaided by microscopes, our eyes could not see germs. The abstractions that Koch had to conjure up in order to conclude that anthrax was caused by a bacterium, or that the tubercle bacillus caused tuberculosis, were enormous for this very reason. Once the germ theory of disease broke through the microscopic barrier, a whole new world — hitherto invisible — opened up. From there, antimicrobial therapy became a logical extension. With Ehrlich’s discovery of Salvarsan, medicine finally began to grow beyond the pseudo-scientific exercise and black art that it was accused of being.

Excerpted from the book Up Is The Curve — A genealogy of healthcare in the developing world. Available here: Nepal, India, UK. Worldwide delivery available here.

May 6, 2020

The Way It Was

Cannons fired all night. At the break of dawn, holy men took the Emerald Buddha on a boat procession, sprinkling sand and ceremonial water…

Continue reading on Medium »
