I’m trying to show a visual stimulus for 1 s (i.e., 60 frames on my monitor) and to record the keypress and response time. The stimulus should disappear after 1 s if there is no keypress, or vanish immediately once a response is made. For the former condition I record offset - onset, which my script treats as a non-response. But for the latter I’m quite confused: should the response time be secs - onset (the timestamp returned by KbCheck), or offset - onset as well? I found that the two approaches differ only slightly (for example, 1.0003 s versus 1.0164 s, roughly one refresh interval apart). Is secs - onset the more accurate one? Here is my script:
Screen('Preference', 'SkipSyncTests', 0);      % keep the full sync tests for accurate timing
[w, wrect] = Screen('OpenWindow', 0, [0 0 0]); % full-screen black window
topPriorityLevel = MaxPriority(w);
Priority(topPriorityLevel);
numSecs = 1;                                   % intended stimulus duration
waitframes = 1;                                % flip on every refresh
ifi = Screen('GetFlipInterval', w);            % refresh interval (~16.7 ms at 60 Hz)
frame = round(numSecs / ifi);                  % number of frames in 1 s (60 here)
vbl = Screen('Flip', w); % I take this flip timestamp as the stimulus onset
onset = vbl;
for i = 1:frame
    Screen('DrawDots', w, [wrect(3)/2; wrect(4)/2], 10, [255 255 255], [], 1); % central white dot
    vbl = Screen('Flip', w, vbl + (waitframes - 0.5) * ifi);
    [keyisdown, secs, keycode] = KbCheck; % poll the keyboard; secs is the KbCheck timestamp
    if keyisdown
        break % stop drawing as soon as a key is pressed
    end
end
offset = Screen('Flip', w, vbl + (waitframes - 0.5) * ifi); % blank flip that removes the dot (stimulus offset)
Priority(0);
sca; % close the window and restore settings
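To make the comparison concrete, here is a minimal sketch (not part of my actual script) of the two response-time computations I am weighing, using the variables from the script above; treating the no-keypress case as NaN is only my assumption for illustration:

% Sketch only: the two candidate response-time definitions.
% onset, offset, secs and keyisdown come from the script above.
if keyisdown
    rt_kbcheck = secs - onset;   % timestamp of the KbCheck that detected the key
    rt_flip    = offset - onset; % timestamp of the blank flip that removed the dot
    fprintf('RT via KbCheck: %.4f s, via offset flip: %.4f s\n', rt_kbcheck, rt_flip);
else
    rt_kbcheck = NaN;            % no response within 1 s
    rt_flip    = offset - onset; % this equals the stimulus duration (~1 s)
end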