Event-related potentials (ERPs) and oscillations (EROs) are reliable measures of cognition, but they require electroencephalographic (EEG) data time-locked to repetitive triggers, which are not available in continuous sensory input streams. However, real-life-like stimulation by videos or virtual-reality environments may serve as a powerful means of creating specific cognitive or affective states and may help to investigate dysfunctions in psychiatric and neurological disorders more efficiently. This study aims to develop a method for generating ERPs and EROs during video watching. Repeated luminance changes were introduced into short video segments while the EEGs of 10 subjects were recorded. The ERPs/EROs time-locked to these distortions were analyzed in the time and time-frequency domains and tested for their cognitive significance through a long-term memory test that included frames from the watched videos. For each subject, ERPs and EROs corresponding to video segments of recalled images with the 25% shortest and the 25% longest reaction times were compared. The ERPs elicited by the transient luminance changes displayed statistically significant fluctuations in both the time and time-frequency domains. Statistical analyses showed that a positivity around 450 ms, a negativity around 500 ms, and delta and theta EROs correlated with memory performance. Few previous studies have combined continuous video streams with simultaneous ERP/ERO experiments, and those relied on discrete task-relevant or passively presented auditory or somatosensory stimuli. The present study, by obtaining ERPs and EROs to task-irrelevant events in the same sensory modality as the continuous sensory input, produces minimal interference with the main focus of attention on the video stream.