History and importance of antimalarial drug resistance
The emergence of Plasmodium falciparum resistance to widely used antimalarial drugs such as chloroquine (CQ) has made malaria control and treatment much more difficult. This is particularly dramatic for Africa, where few affordable alternatives are available. Drug pressure has been identified as one of the key factors in the emergence and spread of resistance. The contribution of the extensive use and misuse of antimalarial drugs to the selection of resistant parasites became particularly evident during the Global Malaria Eradication campaign, launched by the World Health Organization (WHO) in 1955. The first reports confirming P. falciparum resistance to CQ came almost simultaneously in the early 1960s from South America and South-East Asia, where direct or indirect (through the use of medicated cooking salt) mass drug administration (MDA) had been implemented. Similar approaches were very limited in Africa, where P. falciparum resistance to CQ was first reported from the eastern region in the late 1970s and spread progressively westwards.

Most African countries still rely heavily on CQ as first-line treatment despite varying levels of resistance, although some have switched to sulphadoxine-pyrimethamine (SP) as the first-line drug. Unfortunately, the predicted useful therapeutic life of SP may be very short, probably because of its prolonged half-life, which increases the probability of selecting resistant strains and thus accelerates the development of resistance.

CQ resistance is not evenly distributed, and important differences can be found both within and between countries. It appears to have spread more rapidly in East Africa than in West Africa. Given the high level of CQ use in West Africa, other factors, such as the intensity of transmission, population immunity and population movements, should be considered when explaining these differing levels of resistance. Understanding such factors may help in devising strategies to contain the spread of drug resistance.