Getting Started with Magic Eye FX

March 24, 2020

What is it?

Magic Eye-FX is a set of software applications designed to:

  1. provide users with opportunities to learn, develop and enhance their eye gaze skills.
  2. provide facilitators with an assessment tool to help measure eye gaze suitability.
  3. help practitioners with cognitive and communication assessment.

What are the criteria for success?

From a facilitator’s perspective, determining whether a learner possesses, or can develop, the skills required to use eyegaze for computer access, early stimulation activities and simple games is relatively straightforward. The learner will need to notice the stimuli presented on the monitor. This does not necessarily mean that they are controlling things at this stage, but there needs to be an awareness that something is happening on the screen.

It may take some time or several sessions before you arrive at a definite answer. It is important to evaluate what has been demonstrated, experienced and perceived, and use the data collected during the sessions to inform judgements.

Understanding your users and knowing how they respond to a range of stimuli is essential in helping to determine early outcomes!

Please take time to read through this guide before launching into Magic Eye-FX.

Cause & Effect

Understanding the cause and effect association of eye gaze can be a very difficult task, so facilitators must approach it thoughtfully and cautiously. It should be noted from the outset: EYE GAZE IS NOT SUITABLE FOR ALL USERS, and careful observation is required at the early stages. If learners show potential then the road beyond Magic Eye-FX is well established with a clearly defined progression route into Alternative and Augmentative Communication (AAC).

Eye Gaze and Complex Needs

Early Eye Gaze is the term used to describe eyegaze access for users at an early stage of cognitive development. It does not mean that users are necessarily young in age. In the UK the term ‘profound and multiple learning disability’ or PMLD is used to describe individuals who have a profound cognitive impairment alongside multiple disabilities including physical, sensory and/or health-related difficulties (Cartwright and Wind-Cowie, 2005; World Health Organization, 1992).

When assessing whether eyegaze is suitable for a user, we must do so on the basis of the evidence obtained from trialling the technology, and not on the presupposition that eyegaze is beyond the intellectual capabilities of any user labelled with PMLD/Complex Needs.

We have been developing eyegaze software applications for nine years and during this time we have met hundreds of PMLD users; several have gone on to demonstrate abilities far in excess of their perceived limits. It is for this reason that we would urge practitioners to try the technology out with as broad a selection of users as possible. It may transpire that you find some users to be significantly higher functioning than previously identified.

It is important to remember that users at an early cognitive level need to learn the skills of eyegaze, and this, like all things, will take time, practice and patience. It may be that some flourish and develop quickly, whilst for others, progress will be slow or just not possible, due to eye conditions and processing difficulties.

Evaluate Everything!

Make sure that you take notes and create a profile for each user that you trial eyegaze with. Record every nuance of the session, including the duration, screen engagement intervals, the applications that you use, notes about the environment, and the learner’s eye profile, position and disposition. This will enable you to chart and assess progress.
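As one way to keep these notes consistent across sessions, a simple structured record can be used. The sketch below is illustrative only; the field names are assumptions and are not part of Magic Eye-FX:

```python
from dataclasses import dataclass, field
from datetime import date

# Illustrative session record; field names are assumptions, not part of Magic Eye-FX.
@dataclass
class EyegazeSession:
    user_id: str
    session_date: date
    duration_minutes: float
    applications: list = field(default_factory=list)
    engagement_intervals_s: list = field(default_factory=list)  # screen-engagement bursts, in seconds
    environment_notes: str = ""
    eye_profile_notes: str = ""
    positioning_notes: str = ""

    def total_engagement_s(self) -> float:
        """Total time spent engaged with the screen this session."""
        return sum(self.engagement_intervals_s)

# Example: one twenty-minute session with three bursts of screen engagement.
session = EyegazeSession("user-01", date(2020, 3, 24), 20.0,
                         applications=["Splodge"],
                         engagement_intervals_s=[12.5, 30.0, 8.0])
print(session.total_engagement_s())  # 50.5
```

Recording engagement as a list of intervals, rather than a single total, preserves the pattern of looking and looking away, which the later sections suggest is itself informative.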


Things to consider

There are a variety of factors that will influence an eyegaze session. Some fall within the facilitator’s sphere of influence, such as the environment, the positioning of equipment and the choice of activity; others, such as the learner’s eye profile and cognition, facilitators have no control over.

To ensure that the best possible results can be achieved for the widest range of users, we would recommend facilitators pay particular attention to the following factors.


For early eyegaze access it is important to remove as many distractions as possible. For this reason, a dark, quiet location is recommended. Ideally, we want the monitor to be the dominant source of light in the room. With early eyegaze users it can be difficult to determine whether there is a visual impairment to contend with; reducing all unnecessary sources of light and visual clutter will give you the best starting point for success.

It is also important that we pay attention to sound levels and make efforts to minimise all unwanted sounds so that the user can focus their attention on the screen. We also recommend that only the people that need to attend the session are present.

It is useful to have a mounting solution that offers flexibility in the positioning of the monitor. We recommend Rehadapt Floor Stands (other stands exist, but these are currently the best in our opinion), as they can be used to position a monitor for a user who is lying on the ground or a user who is more than 6 feet tall in a standing frame. This level of flexibility makes it easy to position the monitor at the optimum height and distance for any user. Another major benefit is that the monitor can be orientated in the optimum position relative to the user without interfering with their natural, comfortable position, such as when a user’s head is tilted to the side.


If you have an adaptable, flexible monitor mounting solution as described, the user can be positioned almost anywhere. There are a few considerations that we make when we first encounter a new user. The position should be comfortable for the user. It is especially important during the first few encounters that the user is not required to sit in a particular way in order to access the technology. The objective is for the user to have fun and opportunities to succeed, and not to be deterred. We have successfully tracked users lying on waterbeds, on interactive floors, suspended in hoists (proceed with caution), lying on their backs, sitting in chairs and so on… Whatever suits the user goes!

You may wish to involve Occupational Therapists, Physiotherapists and other members of your multidisciplinary team to help ascertain a good solution going forward.


It is important for the facilitator’s head not to be next to the user’s, as the eye tracker will track the facilitator’s eyes rather than the user’s, making it difficult to ascertain who is controlling events. It is recommended that the facilitator be positioned either behind or slightly to the side of the monitor. This way, the user is more likely to engage with the screen, as the facilitator will be visible and prompts/vocalisations will come from an area close to the visual source. Please do not place yourself behind the user, as this can be very distracting. Alongside the user is fine, but do not make too much noise, as you are likely to inadvertently divert the user’s attention away from the screen.


What to Look Out For

Please remember that an eye tracker does not discriminate between eyes that can see and eyes that cannot. It is easy to misinterpret what is happening so we need to break things down and evaluate each outcome as a set of incremental steps:

      • Do learners attend to content on the screen?
      • Do they look around the screen seemingly aware that something is happening?
      • Do they appear to be interested?
      • Are they engaged?
      • What applications are you using?
      • How much time do they spend looking?
      • Are they fixating, scanning, tracking, targeting and so on?

Try to suppress the urge to verbally encourage or prompt too much – it can be hard to do, but this is important in helping learners work out the cause and effect association/eye screen relationship. When a user diverts their gaze away from the screen, wait until you are sure they are not going to voluntarily re-engage with the screen before intervening. We must allow time for users to look and look away – the more they do this, the more they are likely to be working out and processing for themselves. Screen Engagement activities focus on teaching that something can happen when you look at a blank screen.

Do not try to cover too much, too soon, and resist the urge to swap between apps too quickly. What you, as a facilitator/practitioner, find basic and easy to process will be completely different for your early eyegaze users. Repetition is very important so prepare for lots of it.




Understanding Calibration

Calibration is the process by which the eye tracker measures the characteristics of the eyes in order to accurately calculate gaze direction.

Whilst obtaining a good calibration is important for accurate eye control in a Windows environment, it matters less when we are using large targets, and less still when we are using applications suitable for early eyegaze users. Magic Eye-FX software can be seen as a training tool that may help a user build the skills necessary to achieve a calibration; more importantly, it does not require any user-specific calibration, so early users accessing Screen Engagement activities can use any calibration.

Calibration Offset

When using someone else’s calibration settings there will always be a slight offset between the user’s natural gaze point and the target on the screen. This can be anything from a small distance, where gaze proximity is so close as not to be noticed by the user, to, in some circumstances, as much as 200 mm. With this in mind, it is particularly important that you use Screen Engagement activities to observe a user’s functional visual field, and the Tobii Gaze Evaluator application once users have demonstrated some calibration potential.
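The offset described above is simply the distance between where the user is actually looking and where the tracker reports their gaze relative to a target. A minimal sketch of how such an offset could be quantified (the coordinates and threshold are illustrative assumptions, not Magic Eye-FX values):

```python
import math

def gaze_offset_mm(gaze, target):
    """Euclidean distance (mm) between a reported gaze point and the intended target."""
    return math.hypot(gaze[0] - target[0], gaze[1] - target[1])

# Example: with a borrowed calibration, a user looking at a target at (400, 300)
# in a mm-based screen space might register at (420, 315): 25 mm of offset.
offset = gaze_offset_mm((420, 315), (400, 300))
print(round(offset, 1))  # 25.0
```

Logging this kind of measurement over several sessions is one way to see whether a user is getting closer to being able to achieve their own calibration.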

Keyboard step-through

Tobii Gaze Interaction Settings provide a range of custom options to help achieve a calibration. One of the often-neglected functions is the keyboard step-through. This function enables the facilitator to trigger each calibration step manually, as opposed to the standard auto cycle. This can really help with sensory learners, as you can trigger the calibration step when you know the learner is looking at it.


Eye Gaze Methods

Tobii Eye-Enabled Software

Tobii Eye-Enabled Software introduces a proximity-based method for controlling mouse events based on a grid, or matrix, of cells. Eye-enabled software compensates for less accurate eye control by snapping the mouse to the centre of a cell as soon as a user’s gaze is detected anywhere within the cell frame. This is very useful for communication-based eye control software such as Tobii’s Sono Suite, as it reduces the need for pinpoint accuracy. If a user is predominantly using communication-based software it is not necessary to develop high levels of accuracy. If, on the other hand, a user is looking for computer access within a non-eye-enabled Windows environment, accuracy becomes significantly more important. To navigate effectively in a Windows environment a user will need to learn to control their gaze in a precise manner.
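The snapping behaviour described above can be sketched in a few lines. This is an illustrative approximation of the idea, not Tobii’s implementation; the screen dimensions and grid size are assumptions:

```python
def snap_to_cell(gaze_x, gaze_y, screen_w, screen_h, cols, rows):
    """Snap a raw gaze point to the centre of the grid cell that contains it."""
    cell_w, cell_h = screen_w / cols, screen_h / rows
    # Clamp so points on the far edge still fall in the last cell.
    col = min(int(gaze_x // cell_w), cols - 1)
    row = min(int(gaze_y // cell_h), rows - 1)
    return ((col + 0.5) * cell_w, (row + 0.5) * cell_h)

# Any gaze point inside a given cell yields the same centre point,
# so imprecise gaze still selects the intended cell cleanly.
print(snap_to_cell(300, 200, 1920, 1080, 4, 3))  # (240.0, 180.0)
print(snap_to_cell(410, 310, 1920, 1080, 4, 3))  # same cell centre: (240.0, 180.0)
```

This is why a coarse grid tolerates a large calibration offset: accuracy only needs to be good enough to land the gaze somewhere inside the correct cell.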

Windows Control (TGIS)

Tobii Windows Control allows users to move and execute mouse functions in exactly the same way as they would with an ordinary mouse. Magic Eye-FX utilises the Tobii Windows Control method, and requires Windows Control to be active, as the applications are designed to develop mouse control skills. The reason this software is not eye-enabled like Tobii Communicator is that we are interested in having events activate linearly across the screen; this enables us to quickly work out a user’s functional visual field and determine how eye movements influence events on the screen. For example, if you use the application Splodge, from the Screen Engagement app set, you will be able to tell very quickly how the eyes fixate and saccade, and see a visual representation of a user’s functional field of vision. This information is very useful for determining how we present content to a user in future sessions, the screen size we should be using, and the best positioning/location of the device.
