Experiment slowing down towards the end


kev_dent
Group: Forum Members
Posts: 47, Visits: 153
kev_dent - 4/21/2020
I am having an issue with an experiment I have created.
The experiment requires participants to remember a set of stimuli, then watch a short movie before having their memory tested.

I have noticed that the experiment seems to slow down towards the end.
The movies are presented for longer than they should be, and they stutter.
The experiment also takes several seconds, maybe 10 or more, to quit after the final trial.
The computer is then very slow to restore its windows, and performance remains sluggish immediately after the experiment has run.

Can somebody advise me why this happens and what I can do to prevent these issues?

Best
Kevin

This very much sounds like the stimuli (here: probably primarily the videos) are simply exhausting the system's available resources.

OK, yes, that could be it. Is there a way to close / flush the movies from memory during the experiment?
Alternatively, are there any workarounds to reduce the load on the system?

> OK, yes, that could be it. Is there a way to close / flush the movies from memory during the experiment?

Not really, no.

> Alternatively, are there any workarounds to reduce the load on the system?

You might want to try reducing the videos' size and resolution and/or trying out different formats.
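For the size/resolution route, a common approach (a sketch only, assuming the sources are e.g. MP4 files and ffmpeg is available; filenames here are placeholders) would be something like:

```shell
# Downscale to 640 px wide (height auto-computed, kept even), re-encode
# with H.264 at a higher compression level, and drop the audio track if
# the experiment doesn't need it. Lower -crf means better quality but
# larger files; values around 23-28 are typical.
ffmpeg -i original.mp4 -vf scale=640:-2 -c:v libx264 -crf 28 -an smaller.mp4
```

Smaller, lower-resolution files take less memory to buffer and less CPU to decode, which is exactly the load at issue here.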

Currently the experiment is a single expt with a couple of hundred trials.
Would it be more efficient to break the experiment into multiple expts in separate scripts, using batch?

Yes, that might help. If you're running this offline (not on the web), then using command line syntax to run the separate scripts in sequence would be another (even better) option.
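As a sketch of the "separate scripts in sequence" idea (the command invoked below is a placeholder that just echoes the part name, not the software's actual command-line syntax):

```python
# Sketch: run each experiment part as its own OS process, so all memory
# held by one part (decoded movies included) is released when that
# process exits, before the next part starts.
import subprocess
import sys

def run_in_sequence(parts):
    """Run each script in its own process, in order; stop on first failure."""
    codes = []
    for part in parts:
        # Placeholder command that just echoes the part name; replace it
        # with the real runtime invocation, e.g. ["<runtime>", part].
        proc = subprocess.run(
            [sys.executable, "-c", f"print('completed {part}')"],
            check=True,
        )
        codes.append(proc.returncode)
    return codes

run_in_sequence(["part1.exp", "part2.exp", "part3.exp"])
```

Because each part is a fresh process, resource leaks cannot accumulate across parts the way they can within one long run.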

My experiment uses 256 different image files and a handful of movies, yet when it loads and the splash screen displays, the count of stimuli loaded takes a long time and goes over 1000!
I wonder what additional things are being loaded aside from the image files?

Dave
Group: Administrators
Posts: 12K, Visits: 98K
kev_dent - 4/21/2020

> My experiment uses 256 different image files and a handful of movies, yet when it loads and the splash screen displays, the count of stimuli loaded takes a long time and goes over 1000!
> I wonder what additional things are being loaded aside from the image files?

The count doesn't reflect the number of files downloaded, but rather the overall number of preparatory steps taken: parsing the script, putting things into memory, etc.

Download times will improve due to caching, which kicks in over time. Only the first few download attempts in a given geographical region will take longer.
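Purely as an illustration of why the counter can exceed the raw file count (the steps-per-item figure below is made up, not the software's actual behavior):

```python
# Illustration only: if each stimulus involves several preparatory steps
# (parse, load, decode, register, ...), the progress counter ticks once
# per step, not once per file, so it can far exceed the file count.
STEPS_PER_ITEM = 4  # hypothetical; the real number of steps is unknown

def progress_ticks(n_images, n_movies, steps_per_item=STEPS_PER_ITEM):
    """Total counter ticks for a stimulus set under this assumption."""
    return (n_images + n_movies) * steps_per_item

print(progress_ticks(256, 8))  # 264 items x 4 steps = 1056 ticks
```

So 256 images plus a handful of movies easily produces a count "over 1000" without any extra files being loaded.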