Historically, fire-danger rating stations have been located by convenience rather than by design. The resulting collection of stations may or may not sample the data the fire manager needs to obtain a useful picture of the variability of fire danger in the area. The objectives of this analysis were to determine how well an existing network sampled the fire danger and to determine the station spacing for an optimum network. The method of optimum interpolation was used to analyze the burning index and ignition component computed from the National Fire-Danger Rating System, assuming fuel model G and using weather data from 28 Federal and State fire-danger stations and 9 NOAA/NWS stations in Minnesota, Wisconsin, and upper Michigan. Two indicators of performance were studied: the coefficient of variation of interpolation, estimated from distance-correlation relationships, and the station spacing necessary to limit interpolation errors of the burning index to less than ±5 units. In the spring, minimum interpolation errors ranged from 15 to 20 percent of the station mean, increasing to 20 to 30 percent in the summer and decreasing only slightly in the fall. The station spacing required to limit the interpolation error to ±5 burning index units was smallest in the spring and largest in the summer. Strategies for refining fire-danger network design are discussed. Forest Sci. 30:1045-1058.
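The coefficient of variation of interpolation named above can be illustrated with a minimal sketch of the optimum-interpolation error calculation. Everything here is an assumption for illustration, not taken from the study: the exponential distance-correlation model `rho(d) = exp(-d / L)`, the correlation length `L`, the station coordinates, and the index mean and standard deviation are all hypothetical.

```python
import numpy as np

# Hypothetical correlation length scale (km) for an assumed
# exponential distance-correlation model; not a value from the study.
L = 100.0

def rho(d):
    """Assumed distance-correlation relationship rho(d) = exp(-d / L)."""
    return np.exp(-d / L)

def oi_error_cv(station_xy, target_xy, sigma, mean):
    """Coefficient of variation of interpolation at a target point.

    station_xy : (n, 2) array of station coordinates (km)
    target_xy  : (2,) coordinates of the interpolation point (km)
    sigma      : standard deviation of the index at the stations
    mean       : station mean of the index

    Returns the RMS optimum-interpolation error expressed as a
    percentage of the station mean.
    """
    # Station-station and station-target separation distances.
    d_ss = np.linalg.norm(station_xy[:, None] - station_xy[None, :], axis=-1)
    d_st = np.linalg.norm(station_xy - target_xy, axis=-1)
    R = rho(d_ss)               # station-station correlation matrix
    r = rho(d_st)               # station-target correlation vector
    w = np.linalg.solve(R, r)   # optimum (minimum-variance) weights
    # Normalized OI error variance: 1 - r'w, scaled by the index variance.
    err_var = sigma**2 * max(0.0, 1.0 - r @ w)
    return 100.0 * np.sqrt(err_var) / mean

# Example: three stations 50 km apart, interpolating at their centroid.
stations = np.array([[0.0, 0.0], [50.0, 0.0], [25.0, 43.3]])
target = stations.mean(axis=0)
cv = oi_error_cv(stations, target, sigma=10.0, mean=40.0)
```

Shrinking the station spacing raises the station-target correlations and drives `cv` toward zero, which is the mechanism behind the spacing-versus-error tradeoff summarized in the abstract.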