Archive for March, 2012

A short history of Leishmania vaccines…

In February of this year we saw the launch of the first human trial for a new vaccine for Visceral Leishmaniasis.

The trial was launched by the Infectious Disease Research Institute (IDRI) in Washington, USA, with plans to hold a further Phase 1 trial in India. The Bill & Melinda Gates Foundation is funding the Phase 1 clinical trials as part of the recently announced worldwide partnership with the WHO and 13 pharmaceutical companies to control or eliminate 10 neglected tropical diseases.

This new development in Leishmania vaccines can be added to a fast-expanding list of so-called “anti-poverty” vaccines: the famed RTS,S malaria vaccine, which last year proved effective (albeit not to levels some would deem completely effective); vaccines for rabies, hookworm and schistosomiasis; and a dengue vaccine expected before 2015.

Visceral Leishmaniasis is one form of a disease seen across much of the old and new world — across 88 countries — and one of the most common parasitic infections after malaria. In India it goes by its Hindi name, kala-azar — black fever — for the way it haunts those infected, whose skin becomes dark and gray. Wherever it strikes, kala-azar is the most deadly form of leishmaniasis.

For the sake of brevity, the history of a Leishmaniasis vaccine dates back to the 1940s and the practice known as “leishmanization”: the deliberate inoculation of infective, virulent Leishmania taken from the “exudate” of a skin lesion. Crude, unreproducible and wholly unsafe, leishmanization gave way to first-generation vaccines consisting of killed or live attenuated parasites.

Second-generation vaccines came much later, once we were able to genetically modify the Leishmania species themselves, or to use bacteria or viruses as surrogates carrying Leishmania genes.

Some methods rely on identifying, thanks to genome sequencing, proteins on the parasite’s surface that can be used to elicit an immunological response — most importantly, proteins expressed in more than one life cycle stage of the parasite.

More recently, the ability to manipulate the Leishmania genome — introducing or eliminating genes to create genetically modified parasites — has opened up the possibility of live attenuated parasite vaccines, a powerful alternative for developing a new generation of vaccines against leishmaniasis.

The prospect of DNA vaccines came only when it was discovered that directly injecting relatively small circles of DNA encoding foreign proteins could lead to a specific immune response. Only then was a new perspective on vaccine design imagined — one with no need for the invading parasite itself, driven instead by our advances in molecular biology and biotechnology.

Eliciting an immune response to Leishmania is easier said than done. Many early vaccines that showed promise failed to elicit the right kind of immune response — a problem made all the more complicated by the fact that Leishmania, as a parasite, lives within immune system cells (macrophages). Leishmania survive within host cells, hiding from and inhibiting the cell’s interior defences.

The IDRI vaccine, known as LEISH-F3 + GLA-SE, is a highly purified, recombinant vaccine. It incorporates two fused Leishmania parasite proteins and a powerful adjuvant to stimulate an immune response against the parasite.

With a slow realisation that the geographical range for leishmaniasis is expanding, a vaccine could not come at a better time. Spurred on by global warming, mass migration and rapid urbanization, cases are being reported in previously unaffected areas.

Vaccines are seen as the silver bullet — the game changer. Sophisticated pieces of science that are so simple in their function. For all we know about the way our own immune system works, there still lie large blind spots and gaps in our knowledge. The delicate balance and complexity hidden within the immune system is only made evident when diseases and germs find a way to avoid and exploit it. The possibility of a kala-azar vaccine is made all the sweeter by the simple fact that the leishmaniases are unique among parasitic diseases: a single vaccine could have the potential to protect against the other forms of the disease.


Originally appearing at endtheneglect.org

ResearchBlogging.org

Dunning, N. (2009). Leishmania vaccines: from leishmanization to the era of DNA technology. Bioscience Horizons, 2(1), 73-82. DOI: 10.1093/biohorizons/hzp004

Chakravarty, J., Kumar, S., Trivedi, S., Rai, V., Singh, A., Ashman, J., Laughlin, E., Coler, R., Kahn, S., Beckmann, A., Cowgill, K., Reed, S., Sundar, S., & Piazza, F. (2011). A clinical trial to evaluate the safety and immunogenicity of the LEISH-F1+MPL-SE vaccine for use in the prevention of visceral leishmaniasis. Vaccine, 29(19), 3531-3537. DOI: 10.1016/j.vaccine.2011.02.096

Nagill, R., & Kaur, S. (2011). Vaccine candidates for leishmaniasis: A review. International Immunopharmacology, 11(10), 1464-1488. DOI: 10.1016/j.intimp.2011.05.008

The logic of fashion cycles…

I long for the death of skinny jeans.

Those in the fashion business have to keep up with an ever-evolving scene. Trends and themes change from year to year, and for those who count, looking “so last year” is the worst crime one can commit. With a fashion industry always trying to stay at the leading edge of itself, fads are created and trends die. It is an endless, self-consuming cycle, best summed up by the phrase “black is the new black” — which tells you everything you need to know about the evolution of trends.

To the naked, untrained eye this might seem like an industry predicated on whim — one where kingmakers make kings based on flights of fancy. But, alas, no. There is some logic behind it.

Fashion is just one of many examples of culturomics, and a similar logic can be applied to many other indicators of culture — songs, baby names, even words. Researchers publishing in the open-access journal PLoS ONE describe how this all unfolds, and the detailed logic behind it.

Fashion, in its broadest definition, has always been associated with social stratification. Sticking with clothes, what you wear signals your status. I need not point out the implications of the terms “blue-collar” and “white-collar”; we use them explicitly to denote social status — so much so that entire nations build themselves around this social divide.

And therein lies the birth of a “fashion”. When individuals of low social status copy those of a higher one, a “fashion” arises. A peak is reached when the fashion becomes too popular; then the early adopters move on to find more original styles to make their own. The result is a periodic rise and fall of different fashions.

There is also the notion that a much more random mode of copying exists within a society — that people simply copy each other at random. But that way of looking at fashion doesn’t cover all the bases, and leaves things a little simplistic.

The researchers go one step further and propose that people have preferences for cultural traits, and that those preferences influence the copying process. Models in which fashions emerge purely from individuals signalling their social status, or from individuals randomly copying each other, do not satisfactorily reproduce the empirical observations: the status-only copying model, for example, accounts for brief fads but not for the long-term “classic” cultural traits that never seem to go out of style.

Of the “preference”, “neutral” and “status” models tested, only the newly proposed “preference” model could simulate the rise and fall of cultural trends.

It seems that individuals in a population are much more adaptive in their copying. Fashions, fads and trends arise not just from individuals copying others’ traits, but also from individuals copying the preference for the trait. A blue hat is just a blue hat until Kate Middleton puts it on — then a fashion is born.
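The copying dynamics can be sketched in a toy agent-based simulation — a minimal illustration of the idea, not the authors’ actual model (the agent count, trait count and probabilities here are made-up assumptions): each agent holds one trait plus a preference for every trait; agents copy a model’s trait only when they prefer it, occasionally copy the model’s preferences too, and rarely innovate a fresh trait.

```python
import random

def simulate(n_agents=50, n_traits=10, steps=2000,
             p_copy_pref=0.1, p_innovate=0.01, seed=42):
    """Toy sketch of preference-biased cultural copying.

    Returns the popularity (fraction of agents) of trait 0 over time,
    so its rise and fall can be inspected.
    """
    rng = random.Random(seed)
    # Each agent starts with a random trait and random preferences.
    traits = [rng.randrange(n_traits) for _ in range(n_agents)]
    prefs = [[rng.random() for _ in range(n_traits)] for _ in range(n_agents)]
    history = []
    for _ in range(steps):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        if i == j:
            continue
        if rng.random() < p_innovate:
            # Rare innovation: adopt a fresh trait at random.
            traits[i] = rng.randrange(n_traits)
        elif rng.random() < prefs[i][traits[j]]:
            # Copy the model's trait only if the observer prefers it...
            traits[i] = traits[j]
            if rng.random() < p_copy_pref:
                # ...and occasionally copy the preferences themselves.
                prefs[i] = prefs[j][:]
        history.append(traits.count(0) / n_agents)
    return history

freq = simulate()
print(f"trait 0 popularity ranged from {min(freq):.2f} to {max(freq):.2f}")
```

Because preferences spread alongside traits, a trait’s popularity can climb while its preference is common and collapse once preferences drift elsewhere — the periodic rise and fall the paper is after.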

The new model goes further to characterise the Kate Middletons of the world: “influential individuals are those who possess many traits that others prefer and, at the same time, have low preferences for widespread traits”. And interestingly enough, influential individuals are themselves not immune to outdated cultural trends: as the traits and preferences of a population change, so can the perceived status of a formerly influential individual.

In all, the paper describes a more complex system — one that accurately reflects what really goes on.

A lot of maths just for one blue hat.


Originally appearing in Australian Science Mag


Acerbi, A., Ghirlanda, S., & Enquist, M. (2012). The logic of fashion cycles. PLoS ONE, 7(3). DOI: 10.1371/journal.pone.0032541

World Poetry Day…

Stupid America

stupid america, see that 
chicano 
with a big knife 
on his steady hand 
he doesn’t want to knife you 
he wants to sit on a bench 
and carve christfigures 
but you won’t let him.

stupid america, hear that 
chicano 
shouting curses on the street 
he is a poet 
without paper and pencil 
and since he cannot write 
he will explode.

stupid america, remember 
that chicano 
flunking math and english 
he is the picasso 
of your western states 
but he will die 
with one thousand 
masterpieces 
hanging only from his mind.

 

by Lalo Delgado

Joseph Priestley and the story of dephlogisticated air…

DOCTER PHLOGISTON, The PRIESTLEY politician or the Political priest. Anonymous caricature, 1791

The facts, as they stand, are these: every creature, when respiring, releases phlogiston. Respiration, in fact, is simply to be considered a form of combustion. Anything that can burn contains phlogiston; substances, when burnt, release this weightless, invisible substance — an element of their being, their composition. The phlogiston is always in need of somewhere to go, and air is best: air can absorb it. Taking this mode of thinking to its furthest logical conclusion, the reason creatures “suffocate” is that there is nowhere for the phlogiston to go. Remove the air from around a living creature and the phlogiston has nowhere to go; respiration ceases, and the creature dies.

Every now and again we are reminded that science is trial and error, an ongoing process. A collection of our true best current understandings. Yesterday’s taboo and tomorrow’s cliché. And quite often vice versa. We dismiss the science of old for its slavish and almost religious adherence to dogma, classical and entrenched in superstition.

Alas, this is not a story of science “getting it wrong”, so to speak. This is the story of the accidental discoverer, Joseph Priestley — the man who stuck by his wrong theory right to the very end.

A mild-mannered chemist by nature, Joseph Priestley goes down in the history books as the discoverer of oxygen. Except that he didn’t discover it — and yet, at the same time, he did. The problem was that he didn’t realise what he had discovered. This new-found discovery he called “dephlogisticated air”.

You see, this was a recurring problem in the scientific life of Joseph Priestley.

“It has often been claimed that Priestley was a skillful experimenter who lacked the capacities to analyze his own experiments and bring them to a theoretical closure.”

In other words, Joseph was a man who lacked the ability to connect the dots. In this day and age he would probably be the inventor of many answers in search of a question (the guy who invented Twitter, for example). Or perhaps of carbonated water — soda water, something he named “mephitic julep”, developed in the hope that it would cure scurvy. When that didn’t quite work he gladly handed the recipe over to a Mr Schweppes, securing his place in an alternative history where we all drink Priestley bitter lemon with ice. Other examples of the talented Mr Priestley’s work include being the first to note the electrical conductivity of graphite, and the first to describe the use of India rubber to erase pencil marks.

A trait common to all scientists is stubbornness, and Joseph was no different. The phlogiston theory, for lack of a better metaphor, had a lot of holes in it. When something burns, its weight increases — yet if a substance loses phlogiston from its being, it should lose weight. A simple paradox with a simple solution: the phlogiston had negative weight!

For much of the 18th century this was the popular school of thought — until the French, rivals to the English and American way of science, in the person of Antoine Lavoisier, saw a different way to look at things and called the gas oxygen instead.

Joseph championed the colourless, odourless gas he had discovered as “dephlogisticated air” right into his old age — even after he had been harassed out of England and fled to America as a result of his incongruous opinions on God and politics. In 1796, eight years before his death, he published a final scientific paper on why the phlogiston theory was still valid, and why the antiphlogistic theory had got it wrong.

“And yet, not having seen sufficient reason to change my opinion, and knowing that free discussion must always be favourable to the cause of truth, I wish to make one appeal more to the philosophical world on the subject, though I have nothing materially new to advance. For I cannot help thinking that what I have observed in several of my publications has not been duly attended to, or well understood. I shall therefore endeavour to bring into one view what appears to me of the greatest weight, avoiding all extraneous and unimportant matter; and perhaps it may be the means of bringing out something more decisive in point of fact, or of argument, than has hitherto appeared.”

Ironic when you think that Joseph Priestley came to prominence at a time when many still believed, as Aristotle did, that there was but one “air”.

Joseph Priestley was an undoubtedly brilliant man, imprisoned by the four walls of his conviction.


Fara, P. (2010). Joseph Priestley: Docter Phlogiston or Reverend Oxygen? Endeavour, 34(3), 84-86. DOI: 10.1016/j.endeavour.2010.07.005

Wilkinson, D. (2004). The contributions of Lavoisier, Scheele and Priestley to the early understanding of respiratory physiology in the eighteenth century. Resuscitation, 61(3), 249-255. DOI: 10.1016/j.resuscitation.2004.04.007

Basu, P. (2003). Theory-ladenness of evidence: a case study from history of chemistry. Studies in History and Philosophy of Science Part A, 34(2), 351-368. DOI: 10.1016/S0039-3681(03)00022-0

Sternbach, G., & Varon, J. (2005). The discovery and rediscovery of oxygen. The Journal of Emergency Medicine, 28(2), 221-224. DOI: 10.1016/j.jemermed.2004.10.012

Space Worms…

We begin with a confession. There comes a time in every boy’s life when he realises — or rather, makes peace with the fact — that he’ll never play professional sport. Basketball, rugby, football, hockey… the sport of choice changes, but the realisation stays the same.

I mention this only as an analogy, a segue into the realisation that really matters: I have come to accept that I will never go into space. This realisation is not fuelled by a fear of flying, or vertigo, or any other physical barrier to space flight (apparently almost half of all the medication used by astronauts is sea-sickness tablets — sea-sickness in space is such a problem they even have a scale for it, the Garn scale; I did not know that). It is not based on anything tangible, nor on any kind of pragmatism.

Allow me to explain. I am a child of that generation lost between science and science fiction — born into, and living through, a time when space exploration was not only real, as told through NASA space shuttle launches, Hubble, Discovery, lunar landers and the rarely mentioned, unsuccessful Beagle II, but also hyper-real, as told through popular culture: a love affair with space cultivated by TV shows, movies and science fiction.

After all of this, how could a Space so romanticised by popular culture be frightening? How could infinite possibilities, far-off worlds and the chance to understand things we haven’t even come into contact with yet… how could all of that be scary?

The answer: I am a parasitologist. What that means is that for years I have tried to understand the inner workings of life… in one way or another, through detailed biochemistry and biology at the molecular level, stripping life down to its basic parts and trying to build it back up again. From a basic grounding in biological chemistry I have sought to apply what I learned to things more relevant, with a clearer goal. I chose disease. Not just any disease… diseases of a tropical and infectious nature. Parasites! Diseases that not only use you, but need you to survive. Diseases that prey on our vulnerabilities as well as our strengths. Diseases that use our patterns of everyday living against us to get what they need. Diseases that go out of their way to change us to help them procreate and survive.

Apply this to space and you begin to see my problem. Still, hidden within this is a fascination… a fascination predicated on the simple question “what happens when we do encounter alien parasites?”

Alas, with a planet that seemingly has nowhere left to explore, the only place left to look is towards the stars. With the planet’s impending doom (an exaggeration, just for effect), and with the ultimate survival of humanity dependent upon the colonization of other planetary bodies, there are those who have taken it upon themselves to get us there. There are myriad steps along the way, one of which lies in the tiny worms that have become the ultimate model organism — Caenorhabditis elegans. C. elegans shot to fame when it was discovered that these worms had survived the Columbia disaster.

C. elegans has been used as a model system to compare muscle protein synthesis in astronauts, and as the canary in the coal mine to detect in-flight radiation exposure. And when researchers wanted to see whether RNA interference could be used to block muscle protein degradation — they shot the worm into space.

All of these, however, were short-term experiments. Space is vast, so any voyage to another planet is going to take a while, and we need to know the long-term effects of space travel. Long-duration space flight poses any number of problems, and a key question is how to study them in a way that is applicable — and, most of all, cheap.

The study of physiological changes during long-duration space flight has lacked appropriate biological models that can be sustained in space for long periods. Researchers have recently developed a remote automated culture system that successfully grew multiple generations of the worms in low Earth orbit for six months.

Over 12 generations, the worms were grown and their development observed — particularly on feeding and re-feeding. The researchers found that the multicellular soil worm develops from egg to adulthood and produces progeny with the same timing as on Earth.

C. elegans has been the model organism of choice for many experiments and investigations right here on terra firma and that trend, thanks to this unique culture system, will continue into low Earth orbit, and hopefully beyond.


Originally appearing in Australian Science Mag


Oczypok, E., Etheridge, T., Freeman, J., Stodieck, L., Johnsen, R., Baillie, D., & Szewczyk, N. (2011). Remote automated multi-generational growth and observation of an animal in low Earth orbit. Journal of The Royal Society Interface, 9(68), 596-599. DOI: 10.1098/rsif.2011.0716

Etheridge, T., Nemoto, K., Hashizume, T., Mori, C., Sugimoto, T., Suzuki, H., Fukui, K., Yamazaki, T., Higashibata, A., Szewczyk, N., & Higashitani, A. (2011). The effectiveness of RNAi in Caenorhabditis elegans is maintained during spaceflight. PLoS ONE, 6(6). DOI: 10.1371/journal.pone.0020459
