Malaria in the post-genomics era: light at the end of the tunnel or just another train?
D L Gardiner,1 J S McCarthy,2 K R Trenholme3
  1. 1Malaria Biology Laboratory, Australian Centre for International and Tropical Health and Nutrition, Queensland Institute of Medical Research, Herston, Queensland, Australia
  2. 2Clinical Tropical Medicine Laboratory, Australian Centre for International and Tropical Health and Nutrition, Queensland Institute of Medical Research
  3. 3Malaria and Scabies Laboratory, Australian Centre for International and Tropical Health and Nutrition, Queensland Institute of Medical Research
  1. Correspondence to:
 Dr D Gardiner
 Queensland Institute of Medical Research, 300 Herston Road, Herston 4006, Queensland, Australia;


Malaria remains the third leading cause of death attributable to an infectious disease worldwide, with an estimated death toll of over 2 million per year, predominantly in sub-Saharan Africa. The first serious attempt to eradicate this disease was unsuccessful, and 50 years later, in 1998, a second programme, coined "roll back malaria", was started. While this programme is at present unlikely to reach its stated aims, the completion of the genome sequencing projects on the human host, the mosquito vector, and the malaria parasite offers new hope. It is probable that the burden of disease caused by the most malignant form of the parasite, Plasmodium falciparum, can be, if not eliminated, then effectively suppressed within a generation through novel treatments aimed at all three arms of malaria control.

  • malaria
  • vaccine
  • chemotherapy
  • vector control

Malaria, from the Italian "mal'aria" or bad air, is the most prevalent parasitic disease of humans, and is arguably one of their most ancient; it has influenced not only human evolution, but history as well. It was described in the fourth century BC by Hippocrates, who gave an accurate account of intermittent malarial fevers. He was also the first to record a link between the disease and the environment, noting that these fevers clustered in swampy areas. The ancient Romans were also afflicted, with epidemics commonly occurring in the marshy areas around Rome.1 Laveran first saw what we now know to be gametocytes of the malaria parasite Plasmodium falciparum in the blood of a French soldier in Algeria in 1880, and these findings were subsequently confirmed by Marchiafava and Celli in 1883.2 In 1897 Ross, while serving in India, described the mode of transmission via the mosquito vector.3 However, it was not until 1948, when Shortt and Garnham described exoerythrocytic schizonts in the livers of infected monkeys and subsequently in humans, that the complete life cycle of the parasite was elucidated4,5 (fig 1).

Figure 1

 Life cycle of the malaria parasite.

There are at least four species of malarial parasite that infect humans: Plasmodium falciparum, P vivax, P ovale, and P malariae. P knowlesi infections have also recently been reported in south east Asia.6 Most of the morbidity and almost all the mortality attributable to malaria is caused by P falciparum. Infection with this species leads to a wide range of clinical manifestations including fever, life threatening anaemia, and coma in children and non-immune adults. Primigravid women and their unborn infants are particularly at risk from P falciparum infection.


Figure 1 shows the life cycle of P falciparum. Infection begins when an infected female anopheles mosquito feeds on a human host and sporozoites are injected into the host's bloodstream. These sporozoites rapidly (within minutes) invade hepatocytes, where they multiply extensively to form exoerythrocytic schizonts, each containing up to 30 000 merozoites. Six to 16 days after infection (depending on the species) the schizont infected cell ruptures, releasing mature merozoites into the bloodstream. These merozoites invade red blood cells and undergo a second round of multiplication that lasts 48–72 hours and produces 16–32 merozoites per infected red cell. The released merozoites invade new red blood cells to continue the cycle.
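The scale of this amplification can be illustrated with simple arithmetic based on the figures above. The following Python sketch is purely illustrative: the per-cycle survival fraction is an assumed value standing in for immune clearance and failed invasion, not a measured clinical constant.

```python
# Back-of-envelope model of parasite amplification, using the figures
# quoted in the text. The survival fraction is an illustrative
# assumption, not a measured constant.

LIVER_MEROZOITES = 30_000    # upper bound per exoerythrocytic schizont
MEROZOITES_PER_CYCLE = 16    # lower bound of the 16-32 quoted per red cell
CYCLE_HOURS = 48             # asexual cycle length for P falciparum


def parasite_burden(liver_schizonts: int, days_in_blood: int,
                    survival_fraction: float = 0.1) -> float:
    """Estimate circulating parasite numbers after a period in the blood.

    survival_fraction crudely models merozoites lost each cycle to
    immune clearance and failed invasion (assumed, for illustration).
    """
    cycles = (days_in_blood * 24) // CYCLE_HOURS
    burden = float(liver_schizonts * LIVER_MEROZOITES)
    for _ in range(cycles):
        burden *= MEROZOITES_PER_CYCLE * survival_fraction
    return burden


# Even if 90% of merozoites are lost every cycle, a handful of liver
# schizonts produces hundreds of thousands of circulating parasites
# within a week.
print(f"{parasite_burden(5, days_in_blood=7):.2e}")
```

Such geometric growth explains why parasitaemia can escalate from an undetectable inoculum to clinical disease within one to two weeks.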

The asexual erythrocytic cycle is comparatively synchronous in the natural host, one cycle taking 48 hours in P falciparum. In synchronous infections the rupture of the infected cells and merozoite release is associated with the characteristic fever and acute symptoms of malaria.

Later in the infection some merozoites give rise to sexually differentiated forms (gametocytes). The trigger for this gametocytogenesis is unclear. When a female anopheles mosquito ingests the blood of a host containing malaria parasites, the red blood cells and asexual stage parasites are digested, while the gametocytes undergo further development to form macrogametocytes (female) or microgametocytes (male). In the mosquito stomach, the male and female gametes fuse to form a diploid zygote (the parasite is haploid during the rest of the life cycle), which develops into a motile ookinete that penetrates the midgut wall and forms an oocyst. As the oocyst matures, it divides to produce sporozoites that move to the salivary glands and are able to infect a new host when the mosquito next takes a blood meal, and so the cycle continues. This developmental cycle in the mosquito host takes about 12 days, depending on parasite species and ambient temperature.


The concept of malaria eradication was adopted by the World Health Assembly in 1955. Two years later the World Health Organisation launched a global campaign to eradicate this disease. This campaign led to the eradication of malaria in some countries and its suppression in many more. For a period it appeared that Russell's 1955 book Man's Mastery of Malaria would prove prophetic.7 Unfortunately, there has been a significant resurgence of malaria in more recent years. While estimates vary, up to 48% of the world's population, or 3 billion people, now live in areas at risk of malaria.8 The annual death toll has been estimated to be upwards of 2 million. Eighty five per cent of these deaths occur in sub-Saharan Africa, predominantly in children under 5 years of age.

Almost half a century after the first global malaria eradication programme was started, the roll back malaria initiative (RBM) was launched in 1998 by the WHO, UNICEF, UNDP, and the World Bank. Its aim was to provide a coordinated international approach to fighting malaria, with the goal of halving the burden of malaria by 2010. RBM is a global partnership established to enable countries and communities to take effective and sustainable action against malaria. The WHO strategy to roll back malaria includes: prompt treatment with effective drugs; effective use of insecticide treated materials and other vector control methods; intermittent preventive treatment in pregnancy; and emergency and epidemic preparedness and response.

Unfortunately, at the halfway mark, recent reports suggest not only that the RBM programme is unlikely to reach its stated goal by 2010, but that malaria morbidity and mortality have in fact increased significantly since the inception of the programme.9 Without increased support from donor countries the RBM initiative is likely to emulate its predecessor. Fortunately, there may be a light at the end of the tunnel.

In 2002 the malaria genome sequencing project was completed.10 It was performed by a consortium of laboratories, predominantly The Institute for Genomic Research (TIGR) and Stanford University in the United States of America and the Sanger Centre in Great Britain. In addition, the genomes of both the human host and the most important mosquito vector, Anopheles gambiae, have now been completed.11,12 The information gleaned from all three genomes, but particularly that of the parasite, is providing new insights that may lead to novel strategies to combat this disease.13 Already, this information has permitted the development of new approaches in all three arms of malaria control.


While control of malaria has traditionally relied on two arms—vector control and case management through chemotherapy—it is unlikely that these two measures alone will be sufficient to significantly affect the global burden of malaria in the long term. Only the third arm of a cheap, effective, and safe vaccine is likely to lead to a significant prolonged reduction in disease burden.


The first known effective antimalarial was quinine, often referred to as Jesuit's bark. It was derived from the bark of the South American cinchona tree, and first rose to prominence in the 17th century when, in an apocryphal story, the fourth Countess of Chinchón was cured of malaria while staying in Lima.1 Quinine remains an effective antimalarial drug, although resistance to it has been reported in Indochina.14

Until the development of resistance, perhaps the most effective of all antimalarial drugs was chloroquine, first produced in the 1930s. It was this drug, in combination with vector control measures, that initially produced the spectacular results seen in the first attempt to eradicate the disease in the 1950s and 60s. However, resistance to chloroquine emerged at around this time in at least two independent foci, one in south east Asia and the other in South America, and by the late 1970s drug resistance had a global distribution. Despite resistance, surveys in several countries across Africa show that chloroquine is still used to treat up to 84% of malaria infections, even where the drug is ineffective.15

The next generation of antimalarial drugs fared even worse, with resistance to sulfadoxine-pyrimethamine (Fansidar) developing within a year of its introduction into Thailand.16 Resistance to Fansidar is now widespread. Resistance to other antimalarial drugs, including mefloquine and atovaquone-proguanil, has also been reported, predominantly from south east Asia where these drugs are more commonly used than in Africa. In addition to resistance, adverse side effects limit the use of many of these drugs: mefloquine, for example, can cause potentially serious neuropsychiatric events, while doxycycline, an important prophylactic drug, cannot be used by young children or pregnant women.

Only one class of drugs, the artemisinin derivatives derived from the Chinese herb qinghao, has not seen the development of drug resistance. None the less, because of the short half life of these drugs in vivo, recrudescence of parasites is seen in patients undergoing short course treatment unless the artemisinin derivative is combined with another, longer acting drug such as mefloquine.17

Because of the widespread incidence of drug resistance, advocacy for multidrug therapies has increased. The rationale is to increase the efficacy of treatment; drug combinations can also shorten the duration of treatment, which in turn increases compliance and decreases the risk of resistant parasites arising through mutation during treatment. This topic is reviewed extensively by Kremsner and Krishna.18 However, the cost of combination therapy can be prohibitive, particularly in sub-Saharan Africa, where even now many effective treatments are not used because of their high cost.

While artemisinin derivatives, particularly in combination with other drugs, offer the best hope for the treatment of malaria, especially of multidrug resistant parasite strains,19,20 there have been a number of exciting new developments. Information gleaned from the malaria genome sequencing project has led to the identification of a number of novel drug targets in the parasite that can be exploited. Many of the enzymatic pathways in plasmodia are unique or differ significantly from those of the human host.21 The many and varied proteases of the parasite, particularly those involved in haemoglobin digestion such as the aspartic proteases (plasmepsins) and cysteine proteases (falcipains), are also possible new drug targets. A recent report showed that a number of the aspartic protease inhibitors of HIV are also effective antimalarials.22 The Medicines for Malaria Venture, a non-profit organisation created to facilitate discovery, development, and delivery of new affordable antimalarial drugs, is currently supporting the evaluation of many new classes of drugs. When, or if, these new drugs will become widely available is still far from certain, yet they do hold the promise that drug resistant malaria can be combated for the foreseeable future. Unfortunately, as with other newly developed drugs, the cost per treatment may be the rate limiting step to their general use.

While the first line treatment for P falciparum is antimalarial drug therapy, in comatose patients, or those developing severe malaria, antimalarial treatment may not take effect sufficiently rapidly to be of benefit; there is a pressing need to identify methods of reducing mortality in children and adults who have already developed infection.


Vector control has historically played a significant part in malaria control programmes, and was the key to the eradication of malaria from many areas of the world. The most successful malaria control programmes used the two elements of insecticides and bed nets.23 In areas of high endemicity, vector control programmes including insecticide impregnated bed nets have been highly successful in reducing morbidity and mortality from malaria, even though such approaches do not interrupt transmission.24 While an initial concern was that the reduction in the number of infections per year in high endemicity areas would prevent the development of naturally acquired immunity, and therefore shift the morbidity and mortality curves from young to older children, this does not seem to be the case.25,26

Today, the most important impediment to vector control is that in Africa only 1 in 50 children sleeps under an impregnated bed net.9,27 Furthermore, the cost of these bed nets is prohibitive to many residents of poor rural communities. Increased support from donor countries is required to ensure that impregnated bed nets reach a substantial proportion of the at risk population. Other problems include availability, initial acceptance, and replacement and re-impregnation of the nets.28 Even if problems with bed net distribution are overcome, the potential development of insecticide resistance threatens their long term effectiveness. Such resistance is not necessarily attributable only to the use of bed nets themselves, but also to agricultural use of pesticides.28,29

Completion of the Anopheles sequencing project has increased interest in vector control using a range of approaches. As noted above, insecticide resistance in vector populations is of concern. The availability of the mosquito genome sequence will lead to a better understanding of this resistance in wild populations, as well as the identification of new targets for insecticides. New approaches include research into the genetic control of vectors based on the propagation of sterility or other useful genetic traits, such as refractoriness to infection by the malaria parasite. This approach entails the development of transgenic mosquitoes that are either refractory to invasion by, or development of, plasmodia, or that carry a female specific dominant lethal gene that will kill the female progeny of matings with transgenic males.30,31 An Anopheles sp refractory to infection with the murine malaria P berghei has been developed.32 One disadvantage of these control programmes is that, like bed net programmes, they need to be maintained for long periods of time.

While genetic manipulation of the mosquito vector has potential in the long term if implemented with sufficient vigour, it is unlikely to be a useful tool in the short term because of extensive variation between anopheles vectors and high transmission intensities. In addition, other factors such as civil unrest, natural disaster, and "donor fatigue" influence the long term outcomes of these programmes. Only a safe and effective vaccine holds the promise of effective malaria control.


Support for the idea that a vaccine against malaria is feasible comes from a number of clinical findings. Firstly, immunity can be acquired as a result of natural exposure; secondly, it is possible to induce protection against experimental infections in animals and human volunteers. Passive transfer of purified immunoglobulins from "immune" people has also been shown to be protective. However, natural immunity is only acquired after many episodes of infection, as a major component of this protection is strain specific. Initially, immunity to malaria results in a reduction of the severe complications of the disease; after more extended exposure the person's immune response is able to control parasitaemia at low or undetectable levels. As a consequence, most of the severe disease in endemic areas occurs in young children before they have developed immunity.

A malaria vaccine would be a valuable tool that offers the potential to reduce both the morbidity and mortality of the disease. The parasite has a huge repertoire of possible antigenic variants; this, together with its ability to undergo rapid antigenic switching, presents an important challenge to vaccine development. It has been over 30 years since the first successful malaria vaccine was trialled.33 As it consisted of the bites of a thousand irradiated infected mosquitoes, the feasibility of this approach for large scale use was, until recently, dismissed.

While in theory a single vaccine that results in sterile immunity to malaria is the holy grail of all vaccine researchers, practical and theoretical considerations have led to three separate but overlapping approaches to vaccine development for malaria. These approaches are, firstly, a pre-erythrocytic vaccine to prevent invasion of hepatocytes by sporozoites, secondly a blood stage vaccine that acts as an antidisease vaccine for people who live in endemic areas, particularly pregnant women and children under 5, and thirdly a transmission blocking vaccine, or the so called altruistic vaccine that blocks parasite development in the mosquito.

A pre-erythrocytic vaccine would ideally not only prevent sporozoites invading hepatocytes but also induce cytotoxic cell mediated immunity against any infected hepatocytes. Several pre-erythrocytic vaccines have reached the clinical evaluation phase, with the lead candidate being the RTS,S recombinant protein vaccine. This vaccine targets the circumsporozoite antigen, which is expressed on both sporozoites and infected hepatocytes. The circumsporozoite protein is a surface protein synthesised by sporozoites as they develop in the salivary gland of the mosquito.34 In the vaccine, this antigen is fused with the hepatitis B surface antigen and expressed as a recombinant protein in yeast. In conjunction with the potent AS02 adjuvant, this recombinant protein has been shown to provide 30%–60% protection against experimental challenge. In field trials among semi-immune people, efficacy was about 70% over a nine week period after vaccination, but the vaccine offered no protection over the next six weeks,35 suggesting that the immune response generated was short lived, without a memory component. In recent phase IIb trials in African children this vaccine again showed an efficacy of 30% over a six month period but, more importantly, the incidence of severe malaria was significantly reduced.36 While this trial succeeded in showing the vaccine's safety and immunogenicity, it must be acknowledged that an efficacy of 30% is low and that the reduction in severe disease involved small numbers. It does, however, show that development of an effective vaccine is feasible. Other vaccines using variations of this theme are in clinical trials.

As well as recombinant protein vaccines, a number of DNA vaccines against the pre-erythrocytic stage are currently under development or in phase I and phase II trials.37 These include multigene vaccines that encode a number of sporozoite and liver stage antigens. While intense efforts have been made to develop DNA based vaccines, antibody production in humans immunised with these vaccines is generally low compared with that in some animal models. None the less these vaccines are safe, induce significant T cell responses, and show some efficacy against heterologous sporozoite challenge.38

A blood stage vaccine generally has two aims: either to prevent merozoite invasion of red cells, or to prevent the complications of the disease, whether through increased clearance of parasite infected red cells or through prevention of their sequestration in the microvasculature, itself an important cause of disease pathology. Progress in the development of such vaccines has been hampered by a number of factors, including a comparatively poor association between the findings of animal studies and human disease. Also, the correlation between a specific antibody response to a malarial antigen and clinical protection, or the results of experimental human challenge models, has not been uniform. One vaccine that received widespread attention in the early 1990s was SPf66, which contained sequences from three blood stage antigens combined with the tetrapeptide repeats of the circumsporozoite protein. Unfortunately, in large phase III trials this vaccine lacked efficacy.39,40

Most blood stage vaccines are directed at antigens of the merozoite that are important for red blood cell invasion, such as the apical membrane antigen (AMA1) and the merozoite surface antigens (MSP1, MSP2). These vaccines have been shown to be highly effective in animal models. However, their main problem seems to be that high antibody titres are required to prevent invasion: malarial antigens tend to be poor immunogens, and many of the adjuvants currently licensed for human use are insufficiently potent.

Another blood stage approach, with the potential to prevent the serious complications of malaria, is a vaccine directed at malarial antigens exposed on the surface of the parasitised erythrocyte. Such an immune response would result either in the removal of infected cells from the circulation or in the prevention of sequestration of the parasitised cells. Unfortunately, the immunodominant surface protein on the parasitised erythrocyte is the highly variable ligand PfEMP1, with each parasite genome having up to 50 variant members of this gene family. This ligand is thought to mediate cytoadherence of the parasitised red cell to various host endothelial receptors such as CD36 and ICAM1. None the less, while such a variable antigen is probably not a realistic vaccine target for all malarial infections, its use to protect against the effects of sequestration of parasitised erythrocytes in the placenta of primigravid women holds significant promise. This is because sequestration of parasitised erythrocytes in the placenta, and the risk of developing severe complications from malaria, occur predominantly during the first pregnancy, with decreasing complications in subsequent pregnancies. This is thought to be attributable to the induction of antibodies to the comparatively few PfEMP1 variants that can bind chondroitin sulphate A, thought to be the major host receptor expressed in the placenta.41

The transmission blocking, or so called altruistic, vaccines are the poor relations of malaria vaccine development: they have no potential market within the developed world, because they do not protect the parasitised host from infection but rather protect communities from infection if high coverage is achieved. These vaccines rely on preventing fertilisation of the sexual stages of the parasite in the gut of the mosquito vector. Antibodies generated in the host, taken up by the mosquito during its blood meal, block fertilisation or exflagellation. As some of these antigens are not expressed during the intra-erythrocytic cycle in the human host, they are not subject to immune selection and are generally more conserved than other target antigens, thus circumventing the parasite's defence of antigenic variation. Two candidate antigens under development are Pfs28 and Pfs25. Ironically, these vaccines are also the easiest to test, with mosquitoes simply being fed infected blood with or without antibodies, and the development of the parasite evaluated.

Many of the vaccine candidate molecules outlined above had their genesis before the start of the malaria genome sequencing project and represent, in some cases, ad hoc targets. Over 5000 genes have been identified in the annotated P falciparum database, and many new candidate targets exist for all three vaccine types. While a vaccine that induces sterile immunity is unlikely in the short term, even a partially effective vaccine would significantly reduce morbidity and mortality. Over time new candidates will be incorporated into vaccine formulations, which should lead to the development of more effective vaccines.


In the short term malaria is likely to remain a significant cause of morbidity and mortality worldwide, particularly in Africa. The RBM programme is unlikely to reach its stated aims without massively increased donor support. While resistance to the artemisinin derivatives has not yet been reported, their widespread use may eventually select for drug resistant parasites. Bed nets have proved to be an effective tool, but resistance to pyrethroid insecticides may eventually limit their effectiveness.

Information gleaned from the genomes and proteomes of the parasite and its two hosts has led to novel insights into the basic biology of this most insidious of parasites. Whether and when this will be translated into new drugs and vaccines cannot be foreseen. None the less, within the next 10–15 years there should be new classes of antimalarials, as well as vaccines that show sustained protective efficacy in children. Malaria vaccine research has progressed substantially over the past few years, with over 30 candidate vaccines in development, many of which are in clinical trials.42 A more significant problem than the production of these agents is the availability of the funding required to deliver them to the people most in need: children in sub-Saharan Africa. Distribution, education, and compliance are significant issues, and may be as important as the development of new tools to combat this disease.


  • Funding: this work was supported by Australian National Health and Medical Research Council (Grant Nos. 137211 and 290208), and by a generous donation from Mark Nicholson, Alice Hill and the Tudor Foundation.

  • Competing interests: none declared.
