								Internal–External Attention Task
									(Dot Probe Paradigm)
SCRIPT INFO

Script Author: David Nitz (dave@millisecond.com) for Millisecond Software, LLC
last updated:  11-10-2015 by K.Borchert for Millisecond Software LLC

Copyright ©  11-10-2015 Millisecond Software


BACKGROUND INFO

											*Purpose*
	For details on the procedure implemented by this script refer to

	Mansell, W., Clark, D. M., & Ehlers, A. (2003). Internal versus external attention in 
	social anxiety: an investigation using a novel paradigm. Behaviour Research and 
	Therapy, 41, 555–572.

											  *Task*
Participants are asked to respond to a series of probes as quickly as possible while viewing a series of pictures 
on a computer screen, each displayed for approx. 25 s. The pictures include male and female faces with happy, angry,
and neutral expressions, as well as objects. 
Two types of probes are used: participants are asked to press the Spacebar whenever they 
feel a slight vibration, claimed to reflect changes in their physiology (internal) as measured by a sensor, and 
whenever they see an "E" flashed onto the screen (external).
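The trial flow above can be sketched in Python (an illustration only; the actual task is implemented in Inquisit script, and the timing and latency values here are assumptions, not taken from the script):

```python
import random

# 1 = internal probe (vibration), 2 = external probe ("E"), matching the
# probetype coding used in the data file description.
PROBE_TYPES = {1: "internal", 2: "external"}

def run_picture_trial(pic_duration_ms=25000, probe_onsets_ms=(5000, 12000, 20000)):
    """Simulate one picture presentation with embedded probes.

    pic_duration_ms: approximate display time per picture (~25 s).
    probe_onsets_ms: hypothetical probe onset times; in the real script these
                     come from one of four fixed timing patterns.
    """
    records = []
    for onset in probe_onsets_ms:
        probetype = random.choice([1, 2])
        # The participant presses the Spacebar for either probe type;
        # here we just generate a plausible placeholder latency.
        latency_ms = random.randint(250, 900)
        records.append({"onset_ms": onset,
                        "probetype": probetype,
                        "latency_ms": latency_ms})
    return records
```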


DATA FILE INFORMATION: 
The default data stored in the data files are:

(1) Raw data file: 'IEAT_raw*.iqdat' (a separate file for each participant)

build:							Inquisit build
computer.platform:				the platform the script was run on
date, time, subject, group:		date and time the script was run with the current subject/group number 
blockcode, blocknum:			the name and number of the current block
trialcode, trialnum: 			the name and number of the currently recorded trial
									(Note: not all trials that are run record data) 
stimulusitem:					the presented stimuli in order of trial presentation
response:						the participant's response
correct:						the correctness of the response (1 = correct; 0 = incorrect)
latency: 						the response latency (in ms)
/duration:						randomly selected trial duration (based on probepattern)
picture.currentpic.currentitem: stores the currently presented picture
/pictype:						the type of the current picture (angry, happy, neutral, object etc.)
/nprobes:						counts the number of probes per pictype in a block
/nprobesblock:					counts the number of total probes run in a block
/nextprobe:						1 = internal probe; 2 = external probe
/probetype:						1 = internal probe; 2 = external probe
/probeseq:						stores the current fixed probe sequence (one of 4 possible)
/probepattern: 					stores the current probe timing pattern:
									1 designates 'x1' (see list.x1)
									2 designates 'x2' (see list.x2)
									3 designates 'y1' (see list.y1)
									4 designates 'y2' (see list.y2)

(2) Summary data file: 'IEAT_summary*.iqdat' (a separate file for each participant)

script.startdate:				date script was run
script.starttime:				time script was started
script.subjectid:				subject id number
script.groupid:					group id number
computer.platform:				the platform the script was run on
script.elapsedtime:				time it took to run script (in ms)
/completed:						0 = script was not completed (script was prematurely aborted); 1 = script was completed (all conditions run)


STIMULI
	NOTE: The stimuli used in this script are not the same as in Mansell et al. (2003).
	Original face and object images are not in the public domain and/or may not be
	redistributed.

	Face images courtesy of the Face-Place Face Database Project
	(http://www.face-place.org/).
	Copyright 2008, Michael J. Tarr, Center for the Neural Basis of 
	Cognition, Carnegie Mellon University
	(http://www.tarrlab.org/).
	Funding provided by NSF award 0339122.
	
	Face stimuli released under the Creative Commons Attribution Share Alike license
	(https://creativecommons.org/licenses/by-sa/3.0/).
	
	Object images courtesy of the Object Data Bank.
	Copyright 1996, Brown University, Providence, RI.
	All Rights Reserved.

	Permission to use, copy, modify, and distribute this software and its 
	documentation for any purpose other than its incorporation into a 
	commercial product is hereby granted without fee, provided that the above 
	copyright notice appear in all copies and that both that copyright notice 
	and this permission notice appear in supporting documentation, and that the 
	name of Brown University not be used in advertising or publicity pertaining 
	to distribution of the software without specific, written prior permission.  
	Images produced by this software are also copyright Brown University and 
	may not be used for any commercial purpose.

	BROWN UNIVERSITY DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE, 
	INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR ANY 
	PURPOSE. IN NO EVENT SHALL BROWN UNIVERSITY BE LIABLE FOR ANY SPECIAL, 
	INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM 
	LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE 
	OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR 
	PERFORMANCE OF THIS SOFTWARE.
	
	
INSTRUCTIONS

EDITABLE CODE:
Check below for (relatively) easily editable parameters, stimuli, instructions, etc. 
Keep in mind that you can use this script as a template, so you can always modify the entire code to further customize your experiment.
