Optical networks have been employed to meet the ever-increasing data transfer demands of grid applications, giving rise to the concept of the "optical grid". Task scheduling is an important issue in an optical grid, as it allocates both grid and optical network resources to accelerate application execution and increase resource utilization. However, most task scheduling algorithms are based on theoretical models and may therefore produce deviations between the scheduled results and the actual finish times of applications. Such deviations can lead to inefficient resource utilization and unsatisfied Quality of Service (QoS). This paper aims to improve the accuracy of task scheduling algorithms in optical grid environments. We first present a theoretical task scheduling algorithm and demonstrate that its scheduling results deviate from the actual finish times in a real optical grid environment. We then identify several factors that are likely to influence scheduling accuracy and use them to develop a realistic task scheduling algorithm. We evaluate both the theoretical and realistic task scheduling algorithms on our optical grid testbed. The experimental results show that the realistic task scheduling algorithm significantly improves scheduling accuracy.