I am adapting the alcohol dot probe and go/no-go tasks from the library for use in an MTurk study, and I have a couple of questions.
1. If a participant uses a phone or other touchscreen device, how should I handle screen orientation? It would be better if the participant held the phone horizontally while completing the dot probe, since the image pairs are presented side by side. Can I lock the display to landscape orientation on phones/tablets?
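For context on what is possible in a browser-delivered task: the web `screen.orientation.lock('landscape')` call generally only works while the page is in fullscreen mode and is not supported everywhere (notably iOS Safari), so web experiments often fall back to detecting portrait orientation and showing a "please rotate your device" overlay until the viewport is landscape. A minimal sketch of that detection logic, with a helper name (`isLandscape`) that is my own invention rather than anything from the library:

```typescript
// Returns true when the viewport is wider than it is tall (landscape).
// In a real task you would call this from a window "resize" or
// "orientationchange" listener, passing window.innerWidth and
// window.innerHeight, and keep a "please rotate your phone" overlay
// visible (blocking the trial) while it returns false.
function isLandscape(viewportWidth: number, viewportHeight: number): boolean {
  return viewportWidth > viewportHeight;
}
```

Whether your experiment software exposes a built-in orientation lock or requires this kind of manual check is worth confirming in its documentation.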
2. In the original dot probe script, the probe is a character ("X") in a white font against a black background, and the participant responds with a key press ('e' or 'o') to indicate whether it appears in the left or right position. On a touchscreen, the participant would instead tap on or near the probe. I would want to record the x,y coordinates of the tap and then specify a range of correct responses in two-dimensional space (e.g., any point within 15% of the screen size from the center of the probe). Ideally, the script would record the exact x,y coordinates of the tap AND calculate whether they fall within the range that qualifies as "correct". Can you provide an example script that does this?
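To make the scoring idea concrete, here is a rough TypeScript sketch of that step. All names (`scoreTap`, `TapResult`) are illustrative, not from the library, and the "within 15% of the screen" criterion is interpreted here as a Euclidean radius of 0.15 × the smaller screen dimension around the probe center; you might instead prefer separate horizontal and vertical tolerances:

```typescript
interface TapResult {
  tapX: number;      // exact x coordinate of the tap, in px
  tapY: number;      // exact y coordinate of the tap, in px
  distance: number;  // straight-line distance from the probe center, in px
  correct: boolean;  // true if the tap fell inside the tolerance radius
}

// Score one tap against the probe location. In the browser you would
// call this from a "touchend" (or "pointerup") handler, passing
// e.changedTouches[0].clientX / clientY for the tap and
// window.innerWidth / innerHeight for the screen size.
function scoreTap(
  tapX: number, tapY: number,
  probeX: number, probeY: number,
  screenW: number, screenH: number,
  tolerance = 0.15,  // fraction of the smaller screen dimension
): TapResult {
  const radius = tolerance * Math.min(screenW, screenH);
  const distance = Math.hypot(tapX - probeX, tapY - probeY);
  return { tapX, tapY, distance, correct: distance <= radius };
}
```

For example, on a 1000×600 screen the tolerance radius is 90 px, so `scoreTap(520, 310, 500, 300, 1000, 600)` records the tap at (520, 310), a distance of about 22 px from the probe, and marks it correct; a tap far across the screen would be logged with its exact coordinates but marked incorrect. Logging the raw coordinates alongside the correctness flag also lets you rescore the data later with a different tolerance.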