Clinical Epidemiology of Malaria in the Highlands of Western Kenya

Simon I. Hay, Abdisalan M. Noor, Milka Simba, Millie Busolo, Helen L. Guyatt, Sam A. Ochola, and Robert W. Snow


Emerging Infectious Diseases. 2002;8(6) 

Abstract and Introduction

Malaria in the highlands of Kenya is traditionally regarded as unstable and limited by low temperature. Brief warm periods may facilitate malaria transmission and can therefore generate epidemic conditions among immunologically naive human populations living at high altitudes. The adult:child ratio (ACR) of malaria admissions is a simple tool we have used to assess the degree of functional immunity in the catchment population of a health facility. Examples of ACR were derived from inpatient admission data at facilities spanning a range of malaria endemicities in Kenya. Two decades of inpatient malaria admission data from three health facilities in a high-altitude area of western Kenya do not support the canonical view of unstable transmission; the malaria of the region is better described as seasonal and meso-endemic. We discuss the implications for malaria control options in the Kenyan highlands.
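The ACR described above is simply the ratio of adult to child malaria admissions at a facility: where transmission is stable and endemic, immunity is acquired in childhood and adults are rarely admitted (low ACR), whereas in non-immune populations adults contribute a larger share of admissions (high ACR). A minimal sketch of the computation, using hypothetical admission counts and function names rather than the authors' actual analysis code, might look like:

```python
def adult_child_ratio(adult_admissions: int, child_admissions: int) -> float:
    """Adult:child ratio (ACR) of malaria admissions.

    A low ACR is consistent with stable, endemic transmission
    (functional immunity acquired in childhood); a high ACR
    suggests a population with little acquired immunity.
    """
    if child_admissions == 0:
        raise ValueError("no child admissions recorded; ACR undefined")
    return adult_admissions / child_admissions

# Hypothetical annual admission counts, not data from this study
print(adult_child_ratio(30, 120))  # 0.25
```

The comparison across facilities with known endemicity levels is what gives the ratio its interpretive value; a single facility's ACR is only meaningful relative to such benchmarks.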

The temperate highlands of western Kenya were regarded by colonial settlers as safe havens from the surrounding malarious areas of Uganda and Kenya[1,2]. After World War I, malaria encroached into these highland communities as a result of wide-scale population settlement linked to transport and agricultural development[2,3,4,5,6], and malaria epidemics were frequently reported by the early 1930s[7,8,9,10,11]. These epidemics in the highlands caused concern to those in the colonial administration because of the economic importance of agricultural exports. During the 1950s and 1960s, control efforts such as indoor residual house-spraying, mass drug administration, or chemoprophylaxis effectively contained or prevented epidemics in some of these high-altitude areas[12,13,14,15].

In the late 1980s and early 1990s, a series of malaria "epidemics" were reported in Kenya and other communities located at high altitudes in the subregion[11,16,17,18,19,20,21,22,23,24,25,26]. Some authors have labeled these resurgences as a new typology variant, "highland malaria," demanding special attention in the new global commitment to Roll Back Malaria[27,28,29]. A generally accepted view has been that the transmission of Plasmodium falciparum in high-altitude communities is limited by low ambient temperature. Small changes in climate may therefore provide transiently suitable conditions for unstable transmission in populations that have acquired little functional immunity[8,9,30].

The highlands of Kenya constitute a densely populated, politically significant area, which serves as a major source of revenue and foreign exchange from agricultural exports. The Kenyan government has recently defined 15 districts in the highlands[31,32] as being prone to epidemics, meriting close inspection, preparation, and intervention[33]. We examine a time series of age-structured clinical malaria data derived from three hospitals with inpatient admission facilities in the highlands of western Kenya. These data provide an empirical basis for understanding the epidemiology of malaria and consequent strategic approaches to disease management and prevention in this area. A companion paper investigates the epidemiologic and statistical problems associated with defining true epidemics in these high-altitude locations and tests a variety of epidemic surveillance algorithms on the monthly malaria admissions series abstracted from these facilities[32].
