During communication, people rely on both linguistic messages and paralinguistic cues to convey information. A question then arises concerning the neurocognitive mechanisms underlying the integration of multimodal sources of information during communication. Against this background, the present study compared the early processing stages of network emoticons and emotional words, both of which carry emotional content, by recording the N170 component of event-related potentials. We found that the N170 elicited by emoticons showed higher amplitude and longer latency than that elicited by words over the right occipito-temporal region. Additionally, the brain activation pattern for emoticons showed right-hemisphere dominance, whereas that for words showed left-hemisphere dominance. These data indicate distinct neural bases underlying the processing of paralinguistic symbols such as network emoticons versus real words.