Accidents caused by errors and failures in human performance account for a large share of traffic fatalities and have become an important public-safety issue. They arise mainly from drivers' failure to perceive changes in traffic lights or unexpected conditions that occur on the road. In this paper, we devise a quantitative analysis for assessing drivers' cognitive responses by investigating the neurobiological information underlying electroencephalographic (EEG) brain dynamics in traffic-light experiments in a virtual-reality (VR) dynamic driving environment. The VR technique allows subjects to interact directly with a moving virtual environment rather than with monotonic auditory and visual stimuli, thereby providing interactive and realistic tasks without the risk of operating an actual machine. Independent component analysis (ICA) is used to separate and extract noise-free event-related potential (ERP) signals from the multi-channel EEG recordings. A temporal filter is applied to solve the time-alignment problem of the ERP features, and principal component analysis (PCA) is used to reduce the feature dimensionality. The dimension-reduced features are then fed into a self-constructing neural fuzzy inference network (SONFIN) to recognize the distinct brain potentials elicited by red, green, and yellow traffic-light events; the recognition accuracy reaches 87% on average across eight subjects in this visual-stimulus ERP experiment. These results demonstrate the feasibility of detecting and analyzing multiple streams of ERP signals that reflect operators' cognitive states and responses to task events.
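The processing chain summarized above (ICA source separation, epoch extraction, PCA dimension reduction) can be sketched as follows. This is a minimal illustration on synthetic multi-channel data, assuming scikit-learn's `FastICA` and `PCA`; the epoch length, component counts, and data shapes are arbitrary choices for the sketch, and the SONFIN classifier itself is omitted since no standard implementation exists.

```python
# Hedged sketch of the EEG-processing pipeline: ICA unmixing ->
# stimulus-locked epoching -> PCA dimension reduction. All data are
# synthetic; the final SONFIN classification stage is not shown.
import numpy as np
from sklearn.decomposition import FastICA, PCA

rng = np.random.default_rng(0)

n_channels, n_samples = 32, 5000
# Synthetic "EEG": a few latent sources linearly mixed into 32 channels.
sources = rng.standard_normal((n_samples, 4))
mixing = rng.standard_normal((4, n_channels))
eeg = sources @ mixing + 0.1 * rng.standard_normal((n_samples, n_channels))

# 1) ICA separates the recordings into independent components, from which
#    artifact-free ERP activity would be reconstructed in practice.
ica = FastICA(n_components=4, random_state=0)
components = ica.fit_transform(eeg)           # shape (5000, 4)

# 2) Cut fixed-length, stimulus-locked epochs (250 samples each, a
#    hypothetical window) and flatten each epoch into one feature vector.
epoch_len = 250
n_epochs = n_samples // epoch_len
epochs = components[: n_epochs * epoch_len]
features = epochs.reshape(n_epochs, epoch_len * 4)  # shape (20, 1000)

# 3) PCA reduces the high-dimensional ERP features before classification.
pca = PCA(n_components=5)
reduced = pca.fit_transform(features)         # shape (20, 5)

print(components.shape, features.shape, reduced.shape)
```

In the actual system, the reduced feature vectors would be passed to the SONFIN classifier to discriminate the red, green, and yellow traffic-light ERPs.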