In orientation and navigation systems special emphasis has to be put on the software, because any underlying data representation has to be processed by it. The MoBIC consortium therefore sees a strong need for co-operative software to support blind and elderly people. The consortium has identified three ways in which software from different providers can work together smoothly. These will be defined in the MoBIC Software Interface (elsewhere referred to as MoSI).
MoSI level one will consist of a defined access to hardware devices. This is useful both for the easy exchange of hardware (a major issue in the rapidly changing field of hardware development) and for the joint use of hardware devices such as synthetic speech or GPS receivers.
No single system is likely to cover all aspects of orientation and navigation. To make it possible to switch automatically between different systems in a given situation, the consortium will provide a communication protocol that allows the system best suited to the current situation of a blind or elderly traveller to be chosen dynamically.
Orientation and navigation systems generally depend on the availability of data sources (such as map data, public transport data etc.). Although it seems unlikely that a common system for storing and accessing these data can be developed (mainly because of restrictions imposed by the commercial providers of the data), the consortium sees a need to share publicly available information (for instance timetables). In the MoBIC project a standardised access to these data is under development and will be made available to interested developers.
The first point to deal with is global access to hardware in a complex environment such as MS Windows. In modern operating systems (one can count Windows as at least half an operating system) it is never easy to address the hardware directly. In fact, it is not allowed, because the same hardware can be shared between several applications, and direct access could seriously compromise both the hardware and the software integrity of the system.
For this reason the idea of device drivers was developed. A device driver is special software which encapsulates the hardware for shared use by software programs. Most modern operating systems allow access to these drivers but often provide an additional software interface for the use of standard hardware (for instance, on the PC, the mouse, sound cards etc.). So nearly every hardware device is accessible in at least two ways: directly through its low-level device driver, or through the high-level software interface built on top of it.
The division into these two interfaces is advantageous for several reasons. First of all, the low-level interface may differ between hardware components, even within the same class of device. One can design the low-level interface to cover many similar devices, but it is difficult to cover them all. So whenever a new low-level interface becomes necessary, all software using the hardware device has to be changed. With the high-level interface, the vendor of the device can instead supply a library which provides access to the high-level functions specifically for this device.
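To make the two-level idea concrete, the following C sketch (with purely illustrative names such as LowLevelDriver and read_fix; these are not MoSI definitions) shows how a high-level function can stay unchanged while each vendor supplies its own low-level primitives:

```c
#include <stddef.h>

/* Hypothetical low-level interface: the primitives a vendor's
 * driver supplies for its particular device (illustrative only). */
typedef struct {
    int (*open_device)(void);
    int (*read_raw)(char *buf, size_t len);
    void (*close_device)(void);
} LowLevelDriver;

/* High-level interface: one function that works with any driver
 * implementing the primitives, so applications stay unchanged
 * when the hardware (and its low-level interface) is exchanged. */
int read_fix(const LowLevelDriver *drv, char *buf, size_t len)
{
    if (drv->open_device() != 0)
        return -1;                    /* device unavailable */
    int n = drv->read_raw(buf, len);  /* device-specific read */
    drv->close_device();
    return n;
}

/* A stand-in driver for a fictitious receiver, as a vendor
 * library might provide it. */
static int stub_open(void) { return 0; }
static int stub_read(char *buf, size_t len)
{
    const char *fix = "4807.038,N";
    size_t i = 0;
    while (fix[i] != '\0' && i + 1 < len) { buf[i] = fix[i]; i++; }
    buf[i] = '\0';
    return (int)i;
}
static void stub_close(void) {}
static const LowLevelDriver stub_driver = { stub_open, stub_read, stub_close };
```

Exchanging the device then means supplying a different LowLevelDriver; the application code calling read_fix is untouched.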
A second reason is that all software has errors. When programming complex functionality, errors are likely, so why should every application reinvent the wheel and make the same errors again? To avoid this, the more complex functionality is integrated in the high-level interface, which can be tested more thoroughly because of its higher modularity.
As one can see, a two-level interface is profitable for the application programmer. That is why MoBIC will use this kind of interface for standard devices as well as for its own hardware devices. How this is done is explained in the following sections.
The pre-journey system will mainly use standard components such as a mouse, a sound card, a printer or a touch pad. It is currently not known which additional hardware components will become available in the future; this part therefore covers both pre-journey and mobile hardware access in a general way.
Figure 1: MoTA access to hardware
The MoTA software will use a two-level interface as explained earlier. This means primarily that the software will only address the high-level functions provided by one (or more) interface DLLs (DLL stands for 'dynamic link library', a special kind of library in Windows). The interface DLL itself will use the system-integrated hardware access for standard devices (for instance the serial interface) and the device driver interface for non-standard devices.
It is possible (and at some points to be expected) that hardware will be shipped with a kind of high-level interface of its own. This makes it much easier to access the functionality, but can cause problems when the hardware is exchanged and non-standard protocols are in use.
According to the OSI-RM, level 5 is concerned with the problem of connection as well as with buffering of data and synchronisation. Translated into software terms, this means synchronising several applications that may be using the same hardware device. Any device driver used should therefore be able to share its resources (this includes speech output as well as positioning devices etc.) among several applications.
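A minimal sketch of such a sharing discipline might look as follows (the names device_acquire and device_release are assumptions for illustration; real MoBIC drivers would define their own scheme, and a production version would also need proper locking for truly concurrent access):

```c
/* One application at a time may hold the device; others are
 * refused until it is released. 0 means the device is free,
 * any other value is the identifier of the holding application. */
static int device_owner = 0;

/* Try to acquire the device for application app_id.
 * Returns 1 on success, 0 if another application holds it. */
int device_acquire(int app_id)
{
    if (device_owner != 0 && device_owner != app_id)
        return 0;          /* busy: shared resource in use */
    device_owner = app_id;
    return 1;
}

/* Release the device, but only if the caller actually holds it. */
void device_release(int app_id)
{
    if (device_owner == app_id)
        device_owner = 0;
}
```

The same discipline applies whether the shared resource is a speech synthesiser or a positioning receiver.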
The MoBIC consortium encourages all developers to keep the following requirements in mind:
OSI-RM level 6 defines conversions between character sets and the transfer of data types. For the software this can be interpreted as the standard protocols used to access the device driver functionality. In the MoBIC project the following standards are being used:
For the GPS/DGPS positioning device, the NMEA 0183 standard is used. It was defined by the National Marine Electronics Association for marine instrumentation and communication via ASCII sequences, and has turned out to be a powerful yet easy-to-manage protocol. The latest version of the standard is version 2.00, published in January 1992.
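As an illustration of how easy the protocol is to handle: an NMEA 0183 sentence starts with '$' and ends with '*' followed by a two-digit hexadecimal checksum, which is the XOR of all characters between '$' and '*'. The following C function verifies it:

```c
#include <stdio.h>

/* Verify the checksum of an NMEA 0183 sentence.
 * Returns 1 if the sentence is well formed and the checksum
 * matches, 0 otherwise. */
int nmea_checksum_ok(const char *sentence)
{
    if (sentence[0] != '$')
        return 0;                     /* sentences must start with '$' */
    unsigned char sum = 0;
    const char *p = sentence + 1;
    while (*p && *p != '*')
        sum ^= (unsigned char)*p++;   /* XOR of all payload characters */
    if (*p != '*')
        return 0;                     /* no checksum field present */
    unsigned int given;
    if (sscanf(p + 1, "%2x", &given) != 1)
        return 0;                     /* malformed hex digits */
    return sum == given;
}
```

For example, the commonly cited GGA sentence "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47" passes this check.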
For speech output there are several protocols, all of which were defined by the vendors of the available devices. The SSIL standard has turned out to be the most common one and is therefore used in MoBIC. One big advantage of this standard is the synchronisation of spoken text with user responses, which makes it possible to find out at which text position an interaction took place.
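The actual SSIL programming interface is not reproduced here; the following hypothetical C sketch merely illustrates the index-marker idea behind such synchronisation, using a marker byte (here '\x01') embedded in the text to be spoken:

```c
#include <stddef.h>

/* Given text with marker bytes '\x01' embedded between words,
 * return the 0-based number of the last marker passed before the
 * interruption offset, or -1 if no marker was reached yet.
 * This tells the application roughly how far speech had
 * progressed when the user interacted. */
int last_marker_before(const char *text, size_t interrupt_at)
{
    int marker = -1, count = 0;
    for (size_t i = 0; i < interrupt_at && text[i] != '\0'; i++)
        if (text[i] == '\x01')
            marker = count++;   /* remember the latest marker seen */
    return marker;
}
```

A real speech driver reports such marker positions through its own mechanism; the point is only that embedded markers let an application map an interruption back to a text position.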
Applications designed to support the mobility of blind and elderly people should be able to share the responsibility for guiding a person. This often means that they have to know about the abilities of concurrent applications. It could, for instance, mean that at a certain point one application is no longer able to guide the user, whereas another application could take over control. The same is true when a currently inactive application detects a threat to the guided person. A communication protocol between applications is therefore necessary, making it possible to take control as well as to hand control over to another application. To keep this protocol as simple as possible, only four different messages are defined:
The first message requests immediate control over the system. The currently active application has to stop accessing the IO devices (especially any output devices). The application requesting control should return it to the active application as soon as possible.
With the second message, an application can request reliability information from other applications to check which of them is best suited in a given situation, for instance because it wants to hand control over to another application.
With the third message, an application that finds the conditions have changed and that it is now better suited to guide the person can request control. The active application is informed of the request and can hand over control. It is not required to honour the request, because the currently active application may still be the one best suited to the current situation.
With the fourth message, the active application can hand over control when it finds that it is no longer able to guide the user, or that another application is better suited to do so in the current situation.
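As an illustration, the four messages and the active application's decision rule could be sketched as follows (the identifiers and the reliability comparison are assumptions for illustration, not the actual MoSI definitions):

```c
/* The four MoSI control messages, represented as an enum. */
typedef enum {
    MSG_IMMEDIATE_CONTROL,   /* take control at once (e.g. danger)  */
    MSG_RELIABILITY_REQUEST, /* ask how well suited an app is now   */
    MSG_CONTROL_REQUEST,     /* polite request to take over control */
    MSG_HAND_OVER_CONTROL    /* active app passes control on        */
} MosiMessage;

/* Decide whether the active application gives up control.
 * own/other are reliability values for the current situation. */
int should_yield(MosiMessage msg, int own_reliability, int other_reliability)
{
    switch (msg) {
    case MSG_IMMEDIATE_CONTROL:
        return 1;   /* must stop output immediately */
    case MSG_CONTROL_REQUEST:
        /* not required to honour it if still best suited */
        return other_reliability > own_reliability;
    default:
        return 0;   /* other messages do not transfer control */
    }
}
```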
Communication takes place using the Windows message mechanism. The details of the messages are explained in the part "Application Communications Description".