Mainstream theories of visual perception assume that visual working memory (VWM) is critical for integrating online perceptual information and constructing coherent visual experiences in changing environments. Given the dynamic interaction between online perception and VWM, we propose that how visual information is processed during perception directly determines how that information is selected, consolidated, and maintained in VWM. We demonstrate the validity of this hypothesis by investigating what kinds of perceptual information can be stored as integrated objects in VWM. Three criteria for object-based storage are introduced: (a) automatic selection of task-irrelevant features, (b) synchronous consolidation of multiple features, and (c) stable maintenance of feature conjunctions. The results show that the outputs of parallel perception meet all three criteria, whereas the outputs of serial attentive processing fail all three. These results indicate that (a) perception and VWM are not two sequential processes but are dynamically intertwined; (b) there are dissociable mechanisms in VWM for storing information identified at different stages of perception; and (c) the integrated object representations in VWM originate from the “preattentive” or “proto” objects created by parallel perception. These results suggest how visual perception, attention, and VWM can be explained within a unified framework.