Fibrillation potentials have been reported to decline in amplitude with time after denervation. The use of maximum fibrillation potential amplitude to judge the relative acuity of axonal loss ("old" vs. "new/recurrent") has been advocated, but with conflicting endorsements of the appropriate benchmark amplitude (100 μV vs. 250 μV). This investigation uses computer simulations to examine the rate of fibrillation potential amplitude decline expected from known values for muscle fiber atrophy and conduction velocity slowing over time after denervation. Factors that affect the amplitude and can lead to erroneous interpretation in the clinical scenario of partially denervated muscle are discussed. The use of maximum fibrillation potential amplitude criteria to determine the age of lesion onset in either totally or partially denervated muscle is fraught with technical and pathophysiological hazards of interpretation and should be applied cautiously, if at all, in clinical practice.
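The kind of decline the simulations address can be caricatured in a few lines. This is a minimal illustrative sketch, not the study's model: it assumes exponential fiber-diameter atrophy toward a residual floor and a linear amplitude-diameter scaling, and every parameter value (initial diameter, time constant, floor, starting amplitude) is a hypothetical placeholder, not a value from the investigation.

```python
import math

def fiber_diameter_um(t_days, d0=55.0, d_floor=15.0, tau=90.0):
    """Assumed exponential atrophy of fiber diameter (µm) after denervation,
    from an initial diameter d0 toward a residual floor d_floor."""
    return d_floor + (d0 - d_floor) * math.exp(-t_days / tau)

def fib_amplitude_uv(t_days, a0=600.0, d0=55.0, d_floor=15.0, tau=90.0):
    """Fibrillation potential amplitude (µV), assumed to scale linearly
    with fiber diameter; a0 is the amplitude at the time of denervation."""
    return a0 * fiber_diameter_um(t_days, d0, d_floor, tau) / d0

# Tabulate the hypothetical amplitude decline over the first year.
for t in (0, 30, 90, 180, 365):
    print(f"day {t:3d}: {fib_amplitude_uv(t):6.0f} µV")
```

Under these assumptions the amplitude falls monotonically but never reaches zero, which is one reason a single benchmark value (100 μV or 250 μV) maps onto lesion age only loosely; real muscle adds further variability (temperature, electrode position, fiber-size heterogeneity) that the sketch omits.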