Many areas of the United States underlain by soils derived from continental glacial deposits generate elevated indoor radon levels (> 4 pCi/L). For example, Iowa (71 percent), North Dakota (63 percent), and Minnesota (46 percent) have some of the highest percentages of homes with elevated indoor radon levels in the State/EPA Residential Radon Survey.

Determining the radon potential of glaciated areas is complicated by several problems: 1) surface radioactivity is generally uncharacteristically low in glaciated areas and does not appear to correlate well with indoor radon values; 2) because glaciers redistribute the bedrock they override and entrain, the composition and physical properties of till soils do not necessarily reflect those of the underlying bedrock (transport distances were much farther for the continental glaciers of the Great Plains and Great Lakes regions, however, than for glaciers in New England or for valley glaciers); and 3) where glacial cover is thin, the radon potential may be a complex product of the glacial cover and the underlying bedrock.

Crushing and grinding of rocks by glaciers increases the mobility of uranium and radium in the resulting tills, allowing them to move readily downward through the soil profile with other mobile ions as the soils are leached.

Clay-rich tills in North and South Dakota and Iowa produce significant numbers of elevated indoor radon levels, whereas sandy tills in Michigan typically produce low to moderate levels. Some of the highest indoor radon levels in North and South Dakota are associated with deposits of glacial Lake Agassiz and other glaciolacustrine deposits. In contrast, glacial lake deposits in Wisconsin, Michigan, Illinois, and Indiana are typically associated with low radon levels. Differences in source-rock composition, permeability, and soil moisture conditions of the glacially derived soils are likely responsible for these dissimilarities.
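As a side note on the units used above: the 4 pCi/L threshold is the EPA indoor radon action level, and 1 pCi/L equals exactly 37 Bq/m^3 (since 1 Ci = 3.7 × 10^10 Bq). A minimal sketch of the conversion and threshold check follows; the function names are illustrative, not from the survey.

```python
# Illustrative sketch (not from the source report): unit conversion and
# action-level check for the radon concentrations discussed above.
PCI_PER_L_TO_BQ_PER_M3 = 37.0  # exact: 1 pCi/L = 37 Bq/m^3

def to_bq_per_m3(pci_per_l: float) -> float:
    """Convert a radon concentration from pCi/L to SI units (Bq/m^3)."""
    return pci_per_l * PCI_PER_L_TO_BQ_PER_M3

def is_elevated(pci_per_l: float, action_level: float = 4.0) -> bool:
    """Flag a reading above the EPA action level (> 4 pCi/L, as in the text)."""
    return pci_per_l > action_level

print(to_bq_per_m3(4.0))  # 148.0
print(is_elevated(5.2))   # True
```

So the action level of 4 pCi/L corresponds to roughly 148 Bq/m^3 in the SI units used in most non-U.S. radon literature.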