Near or Here

MuLab 5

Version 5.0.41 is available, with numerous improvements in speed, functionality and stability. I believe this is a release candidate (MuLab 5 is in open beta). A visit to the KVR MuTools forum can be enlightening.


Version 1.4 of the MindLights editor will deliver some welcome changes. Apart from being quicker all round, the change I most appreciate is the move from a point dialog that bobs around as you work to a stay-in-place properties panel. It is not yet available, but should appear here soon.


With an example of said beastie in hand, I will be writing a review in the next few days. Their website provides details of their unusual stimulation strategy.

From the Lab

It seems a closed-eye device can deliver color almost as well as a true open-eye device, even down into the blues. It’s all a matter of relative color intensity.




  • Camm  On March 3, 2013 at 1:41 am

    heya Craig…
    I am curious as to whether you think the upcoming Oculus Rift might make a great addition to an MWS setup. Instead of spending $300 or more on a decoder and glasses, this money could go towards a device which has a much wider set of uses and could possibly be even more powerful with the use of visualisation plug-ins and the like. My only concern is the latency and a possible mismatch between the audio and visual. How important is it, in your experience, for the two to be in direct sync? Is there a way of offsetting audio from visual in MWS that you know of?

    • CraigT  On March 3, 2013 at 6:53 am

      Hi Camm,

      The Oculus looks as though it is going to be a reasonably serious entrant in the VR field. I’ve played with lo-res VR glasses and the potential for complex stimulation is vast. I have discussed audio/visual latency with a couple of people who know enough to have an opinion, and my impression is that for most purposes it is not significant. Reading the Oculus material briefly, I’m not sure that the latency to which they refer is the same as that which I have investigated. Causing the leading or trailing edge of an audio/visual impulse to synchronise over the auditory and visual cortices would require real-time measurement and correction.

      Flash-and-beep AVS is well established – it is effective. Flash-and-beep can be delivered in many forms and I can see no reason why the Oculus shouldn’t provide great opportunities, but I think it is worthwhile using established technology while becoming familiar with AVS before venturing into something which may prove to be a quite different thing in itself.
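      To make the flash-and-beep idea (and Camm’s question about offsetting audio from visual) concrete, here is a minimal sketch. This is not MWS or Oculus code – the function name and parameters are my own invention for illustration. It simply builds a schedule of flash and beep onset times at a given entrainment frequency, with an optional audio offset so the beep can lead or lag the flash:

      ```python
      # Minimal sketch (hypothetical, not MWS code): build a synchronized
      # flash-and-beep event schedule for an AVS session.

      def avs_schedule(freq_hz, duration_s, audio_offset_ms=0.0):
          """Return sorted (time_ms, event) pairs for flash and beep onsets.

          audio_offset_ms > 0 makes each beep lag its flash;
          a negative value makes the beep lead.
          """
          period_ms = 1000.0 / freq_hz  # one flash/beep pair per period
          events = []
          t = 0.0
          while t < duration_s * 1000.0:
              events.append((round(t, 3), "flash"))
              events.append((round(t + audio_offset_ms, 3), "beep"))
              t += period_ms
          return sorted(events)

      # A 10 Hz (alpha-band) run for half a second,
      # with beeps lagging flashes by 5 ms:
      for time_ms, event in avs_schedule(10, 0.5, audio_offset_ms=5.0):
          print(time_ms, event)
      ```

      A real player would hand this schedule to the audio and video subsystems; the point is only that a fixed audio/visual offset is a one-line adjustment once the events are scheduled explicitly.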

      The big question is why you want to use AVS. If you have a therapeutic or functional need, or wish to conduct specific mind-state experiments, then “ordinary” AVS will be ideal – its protocols and techniques are well known.

      Let us know if you end up with an Oculus.

