Visualizations of uncertainty in data are often presented to the public without explanations of their graphical conventions and are often misunderstood by nonexperts. The “cone of uncertainty” used to visualize hurricane forecasts is a case in point. Here we examined the effects of explaining graphical conventions on understanding of the cone of uncertainty. In two experiments, participants were given instructions either with or without an explanation of these graphical conventions. We examined the effect of these instructions both on explicit statements of common misconceptions and on users’ interpretation of hurricane forecasts, specifically their predictions of damage from the hurricane over space and time. Enhanced instructions reduced misconceptions about the cone of uncertainty as expressed in explicit beliefs, and in one experiment also reduced overall predictions of damage. Examination of individual response profiles for the damage-estimate task revealed qualitative differences between individuals that were not evident in aggregate response profiles. This research reveals mixed results for the effectiveness of instructions on comprehension of uncertainty visualizations and suggests a more nuanced approach that focuses on the individual’s knowledge and beliefs about the domain and the visualization.