Clinical Epidemiology of Malaria in the Highlands of Western Kenya

Simon I. Hay, Abdisalan M. Noor, Milka Simba, Millie Busolo, Helen L. Guyatt, Sam A. Ochola, and Robert W. Snow


Emerging Infectious Diseases. 2002;8(6) 

We examined longitudinal, age-structured, clinical data on the frequency of admission for severe and complicated P. falciparum malaria at the three hospitals located above 1,600 m in the highlands of western Kenya. These data provided an opportunity to explore in more detail several generally accepted positions about the clinical epidemiology of malaria at high altitude in East Africa.

In these time series, the increase in malaria admissions at each of the three hospitals was concentrated in children <15 years of age (approximately two thirds of all admissions). Given the equivalent sizes of the at-risk populations below and above 15 years of age[36], one must assume that adults have developed a degree of functional immunity to the severe consequences of P. falciparum infection. The hypothesis that communities located at high altitude are prone to unstable, infrequent parasite exposure limiting the development of functional immunity before adulthood[8,9,30] is, therefore, not reflected in our data.

Complicated malaria warranting intensive clinical management is a problem every year at each hospital. Previous cross-sectional estimates of the prevalence of P. falciparum infection in children from birth to 10 years of age in homesteads in Kisii Central during 1990 suggested infection rates between 4.5% and 13%[42]. More recently (July 2000), the prevalence of P. falciparum infection was 10.3% in children from birth to 9 years of age (HL Guyatt, unpub. data). Neither the clinical epidemiology nor estimates of the prevalence of infection in the community corroborate the view that the high-altitude areas served by the hospitals in our study support unstable transmission. Transmission is better characterized as seasonal and meso-endemic[43].

"Highland" malaria is either a new phenomenon[16,17,18,23,24,25,30] or a reemergence of a previously prevailing epidemiology[21,44]. Our data confirm significant surges in malaria cases requiring intensive clinical management during specific years of the 1990s, reflecting substantial overall increases in the number of cases at each hospital. It is tempting to offer a series of explanations for these increases, invoking arguments for and against climate change, drug resistance, and land use changes; various authors discuss these arguments elsewhere[16,17,18,20,23,24,25,26,30,45,46,47,48]. We emphasize, however, the importance of considering population growth as the simplest explanation and note the close correspondence between the percentage increases in population in the districts served by each facility and the percentage rises in malaria cases. Much of sub-Saharan Africa over the last 3 decades, including the highlands of western Kenya, has experienced a high rate of increase in population size, resulting from high fertility rates and increasing child survival. In the populations served by the hospitals in our study, annual growth rates averaged 3.9%. Under such circumstances, even without any change in disease incidence, the number of cases would be expected to double over approximately 18 years. Clearly, without a concomitant investment in essential clinical services, beds, staff, and supporting infrastructure, the changing requirements for clinical management will have been perceived by most district-level public health officials as a crisis.
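The doubling-time figure above follows from compound growth arithmetic. As an illustrative check (the calculation is standard demography, not a method from this study), a population growing at a constant annual rate r doubles in ln(2)/ln(1 + r) years:

```python
import math

def doubling_time(annual_growth_rate: float) -> float:
    """Years for a population to double at a constant annual growth rate."""
    return math.log(2) / math.log(1 + annual_growth_rate)

# At the 3.9% average annual growth rate reported for these districts,
# caseloads would double in roughly 18 years even with no change in
# per-person disease incidence.
print(round(doubling_time(0.039), 1))  # ≈ 18.1 years
```

This is why, absent any shift in transmission, a hospital serving these districts in the late 1990s would face nearly twice the admissions of the early 1980s.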

Defining true epidemics is difficult[32]. For most public health workers, epidemics represent exacerbations of disease out of proportion to the normal level to which that facility is subject; these increases overwhelm the facility's ability to cope. Therefore, a slow but pervasive epidemic of clinical malaria may have emerged in the highlands of western Kenya, where lack of investment in the physical capacity to manage an increasing population has resulted inevitably in more malaria cases that require a basic clinical service. In addition to this demographic-to-service determinant, the western highlands are subject to acute seasonal transmission, as evidenced by the temporal distribution of cases (Figure 1A,B). These seasonal peaks in clinical disease exhibit marked between-year variations, and several years exhibit dramatic rises in severe and complicated disease (Figure 2A,B,C). Moreover, years of exceptional cases can be very different between health centers separated by no more than 10 km. With limited resources and bed capacities, these acute rises in disease incidence within a given year will undoubtedly put a considerable strain on any clinical service and represent a crisis[32].

We used a crude measure of transmission stability based largely on our understanding of patterns of acquired functional immunity[8,49]. The ACR was derived from hospitalized patients diagnosed with malaria. Many of the cases would not have been confirmed with any degree of reliability through microscopy or careful clinical exclusion of alternative causes for fever[50]. Our data and approach must therefore be interpreted with this caveat. Nevertheless, in other areas of Kenya where stable transmission is well established[51], notably coastal Kwale (ACR = 4,181/6,692 = 0.63 based on admissions data, 1984-1999) and lakeside Homa Bay (ACR = 18,686/35,703 = 0.52 based on admissions data, 1982-1999), many more children than adults are admitted to hospital with a malaria diagnosis, resulting in ACRs similar to those described in the highlands (R. Snow, unpub. data). Conversely, in an arid area of northeastern Kenya (Wajir), where a major malaria epidemic occurred in 1998, more adults than children were admitted to the hospital (ACR = 2,704/1,369 = 1.96 based on admissions data, 1988-2000)[52]. Despite poor malaria diagnosis in many routine clinical facilities, we believe that the ACR is one possible tool to rapidly assess the extent to which a community has sufficient parasite exposure to invoke some degree of clinical immunity early in childhood. This tool should be explored further within the context of malaria classification for epidemic-prone areas of Africa.
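The ACR arithmetic reported above can be sketched in a few lines. This is a minimal illustration of the ratios as quoted in the text, with the numerator and denominator assignment (adult admissions over child admissions) inferred from the figures given; the function name is ours, not from the study:

```python
def admission_case_ratio(adult_admissions: int, child_admissions: int) -> float:
    """Adult-to-child ratio of malaria admissions. Values well below 1 mean
    admissions are concentrated in children, consistent with stable
    transmission and functional immunity acquired early in childhood;
    values above 1 suggest epidemic exposure of a non-immune adult population."""
    return adult_admissions / child_admissions

# Figures as reported above (rounded in the text):
print(admission_case_ratio(4_181, 6_692))    # Kwale, 1984-1999: stable, ACR < 1
print(admission_case_ratio(18_686, 35_703))  # Homa Bay, 1982-1999: stable, ACR < 1
print(admission_case_ratio(2_704, 1_369))    # Wajir, 1988-2000: epidemic, ACR > 1
```

The contrast between the coastal/lakeside ratios (below 1) and the Wajir ratio (near 2) is what motivates the ACR as a quick screen for transmission stability.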

In high-altitude zones of western Kenya, clinical malaria has an acutely seasonal distribution, is comparatively concentrated in the pediatric population, and is a substantial public health problem every year. Occasional, but exceptional, temporal surges of disease occur in some years. We can assume that parasite transmission in this area of Kenya is stable and that a degree of functional immunity is acquired during early childhood. Low levels of parasite challenge have been found to be sufficient for early development of functional immunity[53]. We argue that large parts of the western highlands, located at a similar altitude, have ecologies similar to many other areas of Kenya with low, stable, but seasonal malaria. Treating the highland districts as special cases, with intensive investment in early detection, warning, and forecasting systems and frequent complex-emergency responses by government or nongovernmental organizations[33], may not be the most appropriate and cost-effective use of limited resources. Investment in sustainable approaches to vector control (spraying households with residual insecticide), promotion of individual protection (insecticide-treated bed nets), and effective case management is perhaps more likely to achieve long-term reductions in disease.
