Making the X Window System More Accessible, the DACX Project


The following narrative was recorded during a presentation entitled, "Making the X Window System more Accessible, the DACX Project". This presentation occurred at the California State University Northridge (CSUN) Conference, March 18th, 1994, in Los Angeles, CA. The presenters included Earl Johnson from Sun Microsystems, Inc., Beth Mynatt from the Georgia Institute of Technology, Will Walker from Digital Equipment Corp., and Mark Novak and Dr. Gregg Vanderheiden from the Trace Research and Development Center, Madison, Wisconsin.

Each section of the presentation which follows is preceded by a speaker's name to allow you to better follow the discussion. The "audience" was also encouraged to ask questions during the presentation. If you have any questions after you have read through this presentation, please direct them to Mark Novak at the Trace Center (608) 262-6966 or "novakme@macc.wisc.edu". Please do not reproduce or rebroadcast this narrative without permission of the presenters.

Thank you. Mark Novak


Mark:

Good morning. I'd like to welcome you to our presentation entitled "Making the X Window System More Accessible, the DAC X or DACX Project."

I would like to introduce our speakers for this morning's presentation: Earl Johnson, Manager of Enabling Technology from Sun Microsystems; Beth Mynatt, Research Scientist at the Georgia Institute of Technology; Will Walker, Engineer and Senior Programmer at Digital Equipment Corp.; and my name is Mark Novak, I'm a project engineer on staff at the Trace R&D Center, Madison, WI. Also presenting with us this morning is Dr. Gregg Vanderheiden, the Director of the Trace R&D Center. Please keep in mind that we are just a small group representing DACX.

For our presentation this morning, I am going to continue with a brief introduction on DACX. Then Will is going to discuss the X Windows System. Finally, Beth and Earl are going to present the results of some of the accessibility work that the DACX group has been involved in. We are going to cover a lot of topics. If you have questions, all the presenters have agreed to take questions on the fly.

DACX is an acronym that stands for the Disability Action Committee for X. DACX consists of a group of individuals from various backgrounds, some affiliated with universities or colleges, some from industry, people who are or have worked for workstation manufacturers, people who just have a concern, consumers and the like, but all with a common goal to make the X Windows System more accessible. DACX as a group has been meeting for about 18 months. We had our initial meeting as a group at Closing The Gap in the fall of 1992. Since that time, we have had three large group meetings. We met at the Gap last fall, and we've met here at CSUN twice. This is not a lot of face-to-face meetings for a group working on this project, but what has really been nice is the power of computers, and especially with workstations, since we do a lot of our work at our individual work sites and talk to each other via electronic mail. So that is really the medium by which we work together, by electronic mail. Oh yes, I'm reminded that we occasionally have to answer the phone.

Again, DACX is a group of people who came together with a common goal and common interest: to work on the X Window System to make it more accessible. I want to get it across right up front that it is an open group. We are very serious about the work we are trying to accomplish, and we have some very concrete goals and very big objectives that we want to achieve, but we are an open group and everyone is welcome to participate. We typically communicate by electronic mail, as I said, and I will put up the DACX Project electronic address a little bit later. But for those of you that don't have email capability right now, postal mail is just fine, so don't feel that you can't become a part of this organization.

People who attended the first DACX meeting decided to work to make X accessible in two specific areas, at least initially because of our limited resources. These two areas were making X accessible for people with visual impairments, and for people with mobility impairments. Beyond that we hope to branch out and look at all the types of different access needs.

As for the global goals of the DACX project, the long term, what we would like to see down the road, we can narrow DACX's long-term goals into three specific topics. One is to come up with access solutions that work across multiple platforms, across multiple computer companies' equipment. DACX members do not see any real benefit in coming up with a solution that works on a Hewlett-Packard workstation but doesn't work on an IBM workstation. We are not approaching access solutions for X in that regard at all. The solutions we are working on will work on all the major workstation manufacturers' equipment. So that is an up-front given. We are also trying to build access solutions in a modular way, one that Trace has been using in the field for several years, by standardizing the features which we try to build in. I think some of the discussions later about the mobility access issues might make some of this more clear to you.

The second major goal of the DACX group is to try to do this as economically as possible for both the consumer and the workstation companies. Throughout my experience in the disability field, I've found the easiest way to do that is to build access into the products themselves. It saves a lot of time and effort on both the developers' and consumers' sides if the features which people need to gain access to X and the computer are built in from the ground up and are there with everything. These features are done transparently, so if you need them, you can find them and use them. If you don't need them, you perhaps are not even aware that they are there.

In no particular order, the third goal of the DACX group, and perhaps the most important, was simply getting this diverse group of people together from different market perspectives and backgrounds. We are talking now about workstations, and in a second we will get a little bit better feel for what an X workstation is all about. DACX has been acting as an advocacy group for people with disabilities who work with computers in the workstation market. I can see from chairing the DACX group that what DACX has really been doing is not only "techy" stuff; it's not just writing software, it's not just getting down to the nuts and bolts of the computer itself. It is getting out to conferences like this, doing presentations, talking about accessibility, getting access issues represented in computer design guidelines, going to conferences that are not necessarily disability-related. Just getting the word out, people going back to their companies and saying, "hey, we need to get this stuff in our system". It is a long, long, long process, as many of you in the audience probably understand. We cannot stop advocating to get these types of things done. I see that as perhaps the most important goal of DACX in the long run. So with that brief introduction, I am going to turn it over to Mr. Will Walker, who is going to talk a little bit, I believe, about what X is all about.

Will:

I work in the X Windows group at Digital, so I am a very technical engineering-type guy who knows X pretty much inside and out. But my goal today is really to describe who uses X, why it is used, where it is used, etc. It is not a sales talk on X; it is more of a "who cares" talk: who cares about X? Everybody here uses MS Windows, I think. How many people here use X or have a concern for X? Okay, how many people here use MS-DOS or MS Windows? So most everyone here is familiar with the GUI problems. So who uses X? Last year I got myself in trouble, but I'll say it again: professionals use X. These people are in the federal government, engineering firms, finance firms, accounting firms, banking, hospitals, etc. What the professional firms care about is a stable, reliable system. If an application crashes, you don't have to turn your system off and/or reboot it. X is dependable and secure. In addition, X is becoming more pervasive.

One of the themes this week at this CSUN Conference is the international electronic super-highway. Part of that highway is the Internet. One of the backbones of the Internet is UNIX. X is the default windowing system on top of UNIX. So that is another reason why X and UNIX are becoming more and more pervasive: this big international highway that is happening now. We are realizing that the stand-alone computer solution is not an option, but that networking and systems that provide interconnection are providing the solutions that we need. X comes from the X Consortium. The X Consortium is a multi-vendor and multi-organization group of people who are very concerned about having a good windowing system on top of their operating systems. X differs from MS Windows in that it is not supplied by a single vendor. X is very open. If you are a user of X, you can FTP (that is an Internet term) to get the sources for free, compile them, and run them on your operating system. If you want to make a change, you have access to the sources for free. You can modify the sources, and you can make suggestions and send them back to the Consortium for hopeful inclusion in the next version of X. That is exactly what the DACX group is doing. We are consumers of X; we are taking the sources, modifying them, and sending them back to the Consortium for inclusion. This is different from a single-vendor solution, where if you have a problem, all you can do is make a request, and you are at the whim of a company that may or may not have time to dedicate to the problem and you.

So X is a way to get things implemented the way you want, and if you want it strongly enough, you can do it yourself. Why is it important? I already mentioned that we are getting away from the stand-alone mentality. We are getting away from proprietary, closed systems. X is a very open system. X is a distributed, hardware-independent windowing system. It runs on Digital platforms, Sun, IBM, Hewlett-Packard; it even runs on PCs, it runs on Macs, and it even runs on NT. A lot of people think NT is MS Windows. NT is really just an operating system, and the X Window System can run on top of that.

Here is the tough part. Earl and Beth are going to talk about the architecture of X, and talking about X architecture is always a very difficult thing to do, especially for me, because I am so involved with it and it makes such common sense to me. So I tried to come up with a different metaphor to help you understand X for the remainder of our presentation, okay?

X is a client-server architecture. A client can be viewed almost as a business group in a small company that produces a product, and the server can be viewed as a catalog. These people produce a product and show it in a catalog, so the catalog demonstrates all the stuff being shown by these product groups. Now, the business group can be further broken down into a product manager, a project leader, engineers, and a secretary. In X, the equivalent of the product manager is the X application. The project leader is a layer called Motif, which is a specific widget set (group of objects) that provides the feel of the graphical user interface. The engineer is something called the Xt Intrinsics layer, which provides the base of support. The project leader tells the engineer what to do, and the engineer does it. The secretary ties it all together and becomes the low-level X libraries, sometimes called Xlib. Everybody informs the secretary about what to do, and the secretary takes everything and puts it in a catalog. The catalog becomes the X server part of this metaphor. All interactions with the company, just as all input from the user and output to the display, go through the catalog or X server. How the secretary puts things into the catalog, the communication to the X server, is usually referred to as the X protocol. If somebody wants to order something from the catalog, they call the secretary.
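Will's metaphor can be restated as a small lookup table. This is just an illustrative sketch for the reader; the descriptions are paraphrased from the talk:

```python
# Will's business metaphor mapped to the X architecture layers.
# The wording of each description is paraphrased for brevity.
X_METAPHOR = {
    "product manager": "X application",
    "project leader": "Motif widget set (the look and feel)",
    "engineer": "Xt Intrinsics (the base of support)",
    "secretary": "Xlib, the low-level X libraries",
    "catalog": "X server (all user input and display output)",
}

# The "X protocol" is how the secretary files things into the catalog:
# the communication between Xlib and the X server.
for role, layer in X_METAPHOR.items():
    print(f"{role:>16}: {layer}")
```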

Just to make this really quick, then: Earl's AccessX changes to the X server are taking place at the X server level; that is the catalog in our metaphor. What we are presenting regarding mobility access is happening in the X server, or catalog. Where Beth is working is at the engineer and project leader level, the Xt Intrinsics layer and the Motif toolkit level.

Should I go further into that, or are people starting to blink? Are there any questions right now before I pass it over to Beth? I am passing it off to Beth, who is going to talk about Mercator, or access to X for people with visual impairments, which falls into the project leader and engineer part of this metaphor.

Beth:

Basically, what Mark has done is given you an introduction into what the DACX group is all about and what we are trying to do. Will has given you a feel for what X Windows is, how there are these different layers of X that we must contend with, and how we can use a metaphor to better understand their inter-relationships.

What I want to talk to you about is how this group, DACX, is focused on building solutions for screen reader access to X Windows, or access for people who are blind and visually impaired. I am going to talk about this in the context of a project that we are running called the Mercator project. Again, it is from the Georgia Institute of Technology, and to make the sponsors happy, this project has been co-funded by Georgia Tech, by Sun Microsystems, and by NASA.

We've been working with the DACX group on screen reader access. Basically, there are two problems with screen reader access. First, you have to get access to the information itself. You have to be able to find out what's on the screen, and you have to know enough about that information to provide an interface to the consumer. That is sort of what we call the "techy" problem, the technological problem. That is where the DACX group is really going to focus: what are the "hooks" that we can put into X Windows so that we can get that type of information? The second problem is what do you do with that information, what kind of interface do you create. This really gets into what GUIs are all about: how do you convey the power of a GUI, how do you convey what it does for sighted users to someone who cannot see the screen? How do you let the user know what the interface is about and what it can do for you? That is really an interface design problem. So, the first problem again is the "techy" problem. How do we get access to this information?

The goal of our project and the goal of the DACX group is to use standard X mechanisms. It doesn't do any good if we just have a homegrown system with specialized libraries that we have modified ourselves to provide access to X applications. There are many problems with a nonstandard, non-industry solution, and you are not going to find that solution when you walk into a college lab where all the undergraduate students have to use X terminals. You need something that can be available on every single platform. So, the idea all along has been to use standard X mechanisms for solutions, and where the standard X mechanisms don't exist already, the goal of the DACX group has been to specify those mechanisms and give them to the X Consortium so that they can be included in future releases.

So there are three levels of information and access control, and this corresponds directly to Will's metaphor of product managers, project leaders, engineers, and secretaries. The product manager, or X application, provides most of the infrastructure that's in an X application, while the project leader, or Motif toolkit, gives the look and feel. The engineer, or Xt Intrinsics, provides a lot of how an application behaves. This is implemented in what we call the Xt Intrinsics layer or libraries; that is part of the library of X Windows. What the DACX group has been working on is placing a number of "hooks" into these libraries that gather information about what is going on in a graphical application. So what these hooks do is, anytime the graphical application changes, anytime something appears on the screen, anytime a font is changed, anytime something changes about it, the hooks trap that information for you and then are able to send that information to the screen reader application, which can then interpret it and put that information into an auditory format or interface for the user to understand. That is where we hook into the engineer, or Xt Intrinsics.
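The hook idea Beth describes can be sketched in a few lines. This is only an illustrative Python sketch; the real hooks live inside the Xt Intrinsics libraries in C, and all names here (`HookedWidget`, `screen_reader_hook`, the event strings) are invented for illustration:

```python
# Sketch of the "hook" idea: a widget library calls every registered
# callback whenever a widget changes, so the screen reader is told
# about each update without the application doing anything special.
events_seen = []

class HookedWidget:
    """A stand-in for a toolkit widget whose changes are trapped by hooks."""
    hooks = []  # callbacks registered by the screen reader

    def __init__(self, name, text=""):
        self.name = name
        self.text = text

    def set_text(self, new_text):
        self.text = new_text
        # The hook fires on every change, just as the modified Xt
        # libraries notify the screen reader of widget updates.
        for hook in HookedWidget.hooks:
            hook(self.name, "text-changed", new_text)

def screen_reader_hook(widget, event, detail):
    # A real screen reader would update its off-screen model here.
    events_seen.append((widget, event, detail))

HookedWidget.hooks.append(screen_reader_hook)

label = HookedWidget("status-label")
label.set_text("Ready")
print(events_seen)  # the screen reader saw the change
```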

We also touch base with the secretary of the system in Will's metaphor as well. The secretary is what we call the Xlib layer, or the low-level X libraries. The secretary, or X libraries, are necessary because there are always ways to go around product managers, project leaders, and engineers, which means that there are always ways to go around the Xt Intrinsics. I think everyone familiar with access on Macintoshes or PC systems is familiar with the problem of applications that just don't behave well. With X, applications that behave poorly or bypass the hooks that we put in the Xt Intrinsics libraries can still get caught when we check in with the secretary. That is why we have one hook in the Xlib layer which says: let us know everything that is being sent to the catalog, or X server, which is, let us know everything that is being sent to the display, to the screen. The information at this level is more limited, because the secretary does not have the same knowledge that the product manager, or application, does. The product manager usually knows why he or she is doing what they want to do. The secretary is just sort of the last person in the chain that gets to carry things out. So when we get information from the secretary, or the X libraries, it is usually not full of what we call "high-level" information; it doesn't tell us as much, but it still lets us know what is going on in the system. So those are our two hooks. Just to fill out the picture, we are also using a standard X mechanism called XTest. That is the way things go directly to the catalog, if you want to put it that way.

Basically, one of the things you want to do with a screen reader for GUIs is not force the user to use the mouse if the user doesn't want to use it. The mouse, like most pointing devices, is designed for sighted folks. It just doesn't work very well, especially with the auditory speech that is part of screen readers. So, a lot of times we want to be able to fake out the application and have it think that the user moved the mouse, for example if you just need to double-click on something. But the user with a visual impairment is using the keyboard, because the keyboard makes more sense to them and it is easier for them to use. So, we use XTest to basically fake out the application. The application thinks it is receiving a mouse event, but really what the user has been doing is using the keyboard. This is very similar to what MouseKeys does in AccessDOS. I probably have confused everyone by now.
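What XTest-style input synthesis accomplishes can be sketched conceptually: the screen reader translates keystrokes into synthetic pointer events, and the application never knows a mouse wasn't used. The event names and the translation table below are invented for illustration; the real mechanism injects events through the XTest extension in C:

```python
# Conceptual sketch: map screen-reader keystrokes to fake mouse events,
# so an application written for the mouse can be driven from the keyboard.
def translate_key(key, pointer):
    """Return the synthetic pointer events a keystroke should produce."""
    if key == "Enter":
        # Fake a click at the currently selected object's position,
        # much as XTest lets you inject button press/release events.
        return [("button-press", pointer), ("button-release", pointer)]
    if key == "Down":
        # Nudge the fake pointer downward (an arbitrary step size).
        x, y = pointer
        return [("pointer-move", (x, y + 10))]
    return []  # keys with no pointer translation pass through normally

# The application just sees ordinary-looking mouse events:
events = translate_key("Enter", (120, 45))
print(events)
```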

So, this overhead is a picture of basically the type of system I've described. Here is a block diagram of the layers that we were talking about. Here is our X server, the catalog. Here are the X libraries, the Xt Intrinsics, and then the X protocol. This is at the secretary level. This is that low-level information, and as you can see in the picture, we have a hook that goes from the Xlib, or secretary, level to our screen reader. Here is the Xt Intrinsics layer; that is our engineer. We have a two-way connection with that layer, so we are able to find out what is going on in the application, and this also lets us change what's going on in the application as well. Basically, we have all these different types of hooks that feed into the screen reader environment.

What the screen reader does in X is the same old story. It processes all this information, keeps an off-screen model of the application interface, and then has different ways of sending out the information on the screen in terms of synthesized speech, nonspeech sounds, and those sorts of things. That is basically what the screen reader system looks like. What we have been doing within DACX is establishing these hooks into the secretary layer, the engineer layer, and the product manager layer. Then anyone interested can go and build a screen reader system for X Windows, and that is sort of what the DACX group has been about.

Audience: Is anybody working on this?

Beth:

In the next part of this presentation, I will tell you about the screen reader that the Georgia Tech group is working on, called Mercator. Other groups working on X screen readers include IBM and BSI, and I know that there is a European project called GUIB which is also pretty far along on a screen reader system. The people who have prototyped one are all waiting for the hooks to become standard. Part of the DACX work all along this year has been defining the hooks, getting someone to do the prototype hook development, and getting the changes accepted through the X Consortium. And the good news after all that is, as of next month, April 1994, what we call X11R6, which is the next release of the X Window System, will contain these changes. Now, there are still some parts missing, namely how we get the information from the hooks to the screen reader. That is being ironed out right now, and the goal is to have a prototype version of that in place by this June, so that people can begin to use X11R6 to build prototypes. So really, the work of the DACX group this past year has been getting industry standards in place so that people can go build these screen reader applications. Then part of the delay is waiting for these changes to propagate. Any more questions?

Audience: Do you have copies of your overheads?

Beth:

I don't have copies with me, and yes, you can get them from me; either I'll make copies here at the Conference or they can be sent to you. I do have a one-page handout, in both text and Braille, that is basic information on this type of work, but it is not as fully detailed as we would like. The other thing I'd like to mention is our FTP site, where you can go and get tons of information about the DACX group and the Mercator project. For those of you who can do FTP, it's "multimedia.cc.gatech.edu", and that is an anonymous FTP site, so you should be able to connect to it pretty easily and navigate to the subdirectory "papers/Mercator". The FTP site has almost every single paper that we have written in the past few years in this area. Any other questions?

Audience: How long will it take to get these changes to users?

Beth:

Will is better at answering this question than I am: how long does it take things to propagate in the industry? Like, when X11R6 comes out next month, how long does it take for DEC and Sun to have that on their systems?

Will:

X11R6 comes from the X Consortium next month. What happens for individual vendors like Digital, IBM, Sun, and Hewlett-Packard is that we ship it as part of our operating systems. You as a user, provided you have a machine you can FTP to MIT with, can build X11R6 yourself. Okay? I'm guessing the lag time is on average six months. What happens is the vendors want to be very careful and test their version very well to make sure everything works.

Beth:

Another nice thing about this is that if you are using applications that are dynamically linked, or you can get a version of the application which is dynamically linked, you can basically use these modified libraries and then run them with the application. When the application dynamically loads these libraries, then you can have access. So with earlier X11R4 and X11R5 applications, you should be able to do that, since that's how we test our stuff now.

I want to jump to the other question, which is what kind of interface you provide once you have all the information about an X application. Really, the question boils down to how you provide an auditory or tactile interface to a GUI application, because GUIs are very different from the ASCII interfaces that we have been using so far. Some of the things we want to make sure we provide access to, and that we think represent GUIs, are symbolic representations: you have pictures or icons in GUIs. One of the reasons sighted users like using graphical interfaces is because things look like something. You can tell what something is by what it looks like, and you don't have to be constantly reading text all the time to understand what's going on in the interface.

Another nice thing about GUIs, which is harder to provide access to, is the concept of spatial organization. This is where a lot of people talk about having lots of windows on the screen. Instead of looking at one thing all the time, I can have multiple windows on the screen, I can be looking at different files and running different applications, and I can have all of these different pieces of information available to me at once. Depending on how the environment works, if these applications are tied together, it is very easy for me to share information back and forth between them.

That leads to a third point, which is essentially an intuitive interface and multitasking: being able to jump back and forth between the different tasks that you are doing on your desktop. You want to provide intuitive navigation. I've already talked about how the mouse is problematic in this situation, and how it's a relative pointing device designed for a direct-manipulation style of interaction with a graphical interface. A lot of these properties don't map well to an auditory interface. So, how do we take care of that?

Lastly, if you are building a screen reader system, you had better provide a lot of support for user customization and control, since people have barely figured out how to make graphical interfaces intuitive for sighted folks. There is not nearly as much research on how to make auditory or tactile representations of these interfaces usable for people with visual disabilities. What I am trying to say is that users are very different and are going to have different ways that they want information represented to them, and you had better provide capabilities for users to customize their environment.

So, what are we trying to do, at least in Mercator, in terms of our screen reader interface? Our first question is really at what level you start translating the interface. You have a lot of information going on in a visual interface, down to exactly where something is located on the screen, exactly what color something is, etc. How much of that information do you pull over into your screen reader environment? What we have done with our screen reader is do the translation at the object level. That is: what are the constructs of the interface, how are people using the interface to make something happen? The way I try to explain that is this: in my laboratory we have about seven people, and we all use X Windows and we all use about the same applications. We are working with the same stuff every day. But I can walk into the lab and look at everybody's screen, and everyone's screen is going to look pretty different. Then I think about X Windows, and one of the bad things about X Windows is that you can really configure your environment. You can configure it so it looks nothing like the environment of the person who is sitting next to you. We can all use the same stuff, but all of the screens look different. All of the basic visual interfaces look completely different, and sometimes how we interact with the interface is really different. But I can walk into my lab and say, hey, I am trying to do this in FrameMaker, trying to figure out how to put something in a certain format, and they say: what you do is go up to this menu, pull this down, and select a dialog box, and you select these options and click apply, and it will do what you need to do. So they haven't told me to go to some certain X-Y coordinate on my screen and move down 3". What they told me is: go to a menu, select something, this dialog box will appear, select these other objects, and then select apply, which means make all this stuff happen, and you accomplish your task.
What they have done is talk to me in terms of the objects in my interface, in terms of the different things like tools that I can work with. That is really how we want to try to do our screen reader transformation. It's not in terms of exactly where something is located on the screen, but in terms of what makes up the interface and then how to convey that well in a screen reader.

What are some of the things we are trying to do? Again, sighted users like pictures; that is one of the reasons GUIs are so popular. Pictures are pretty hard to provide access to, but one easy thing you can do is translate a picture into sounds. So, instead of everything talking to you in a screen reader, we try to use a lot of nonspeech auditory cues to convey the different interface objects. So, for example, the type of an object tries to sound like something. For example, text areas in our screen reader interface sound like old-fashioned typewriters. So, if I am working with an interface and I reach a text area expecting my input, I will hear a little typewriter sound. That tells me what it is, and that's analogous to a sighted user being able to look at something and tell that it is a text area. Other examples are push buttons, which sound different than toggle buttons, and labels, which are text areas that you can't type into; these sound like printers, so it still conveys the notion of something being printed out to you, but it tells you that you can't type into it. That sort of thing. We also want to try to convey attributes of the object in the same manner. Things like highlighting; that is real common in a graphical interface. So is greying out objects. What greying out does for you, if a button is greyed out, is tell you that it is there but it is unavailable right now, so you can't use it. What we do in the auditory interface is muffle the sound in the screen reader, so you hear the auditory cue for a button, but it sounds slightly distant, slightly softer, basically like somebody has covered it up with a pillow. It is muffled, so it is conveying the same type of information to you.
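The auditory-icon idea can be sketched as a simple mapping from object type to sound, with the "muffled" treatment for greyed-out objects. The cue names and the mapping itself are invented for illustration; the real system plays recorded or synthesized sounds:

```python
# Sketch: map interface object types to auditory cues, and muffle the
# cue when an object is greyed out ("present but unavailable").
AUDITORY_ICONS = {
    "text-area": "typewriter",   # editable text sounds like a typewriter
    "label": "printer",          # read-only text sounds like a printer
    "push-button": "click",
    "toggle-button": "switch",
}

def cue_for(widget_type, greyed_out=False):
    """Return the auditory cue a screen reader would play for an object."""
    sound = AUDITORY_ICONS.get(widget_type, "generic")
    if greyed_out:
        # Same cue, but muffled, like a sound covered with a pillow.
        return "muffled " + sound
    return sound

print(cue_for("push-button"))                   # click
print(cue_for("push-button", greyed_out=True))  # muffled click
```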

The last thing we want to do, and we are still working on this, is conveying hidden attributes of the graphical interface. These are the kinds of things that, as opposed to highlighting, sighted users probably use without even consciously noticing. For example, how long a menu is: whether the menu contains 4 items or 20 items. That can be useful information when you are looking at a graphical interface; it tells you something about the different options that you have to select from. Now, if you are working with a fairly straightforward screen reader, it might just read you the menu items one at a time. You really don't have any idea how long this menu is, and if it is really long, you may even lose track of where you are in the menu. But one thing you can do is map all menu items to a certain range of pitch, just like a piano has. So all menus go from a very high pitch to a very low pitch. You don't have a screen reader saying there are 26 items in this menu and reading that to you every time you open it; instead, you just have this really short auditory cue that tells you a little bit about the length of something, and it also tells you your relative location by what the sounds are. So these are some of the techniques that we are trying out in the screen reader interface, to get a handle on some of the benefits of why people use graphical user interfaces to begin with.
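The pitch-mapping technique can be sketched as simple linear interpolation: the top of any menu plays at a high pitch, the bottom at a low pitch, so a short cue conveys both length and relative position. The specific frequencies here are invented example values, not anything from the Mercator system:

```python
# Sketch: map a menu item's position to a pitch, so the step size
# between items hints at menu length and the pitch itself hints at
# relative location. Frequencies are arbitrary illustrative values.
HIGH_HZ = 880.0  # pitch for the top item of any menu
LOW_HZ = 220.0   # pitch for the bottom item

def item_pitch(index, menu_length):
    """Linearly interpolate pitch from high to low across the menu."""
    if menu_length <= 1:
        return HIGH_HZ
    fraction = index / (menu_length - 1)  # 0.0 at top, 1.0 at bottom
    return HIGH_HZ - fraction * (HIGH_HZ - LOW_HZ)

# In a 4-item menu the pitch steps are large; in a 20-item menu they
# are small, so the step size itself suggests the menu's length.
print(item_pitch(0, 4))   # 880.0
print(item_pitch(3, 4))   # 220.0
```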

The next big problem is navigation. This is really hard to describe and makes a lot more sense if you just try it out. Hopefully you are going to be able to do that for the rest of the conference in the Trace exhibit, where we will hopefully have Mercator up and running. Again, we want to throw out the concept of moving 3" to the left and 1" up, or something like that. The idea of navigation is that you work through the interface in terms of the interface objects and how they relate to each other. So we take everything in the interface and map it onto a tree structure, and as you are walking the interface, what you are really doing is just going up and down certain points on that tree structure. When you start an application, you are at the top of this inverted tree structure. You can go down one level of the tree structure and you are at the main part of the interface. If you then do a search just going left to right, you will hit: this is a menu bar, this is a big text area, this is a label, this is another big text area, and this is a message bar down at the bottom. You have very quickly figured out the structure of this interface: it has five main parts to it. Then, by going down deeper into the structure, you can go into that menu bar, go down one level, and go through the main menu, file menu, edit menu, options menu, and so on. So that is the idea: you walk up and down the tree structure to get to the different parts of the interface. The good thing about this is that as you are navigating the interface, you are learning about the structure of the interface. Everything you do tells you something about how these interfaces are designed.
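The tree-walking navigation described above can be sketched like this. All names here are hypothetical; this is only an illustration of the idea, not Mercator's implementation.

```python
# Minimal sketch of walking an interface as a tree: "down" descends into
# a container's first child, "next" moves to the following sibling, and
# "up" returns to the parent.

class Widget:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

class Navigator:
    def __init__(self, root):
        self.path = [root]  # path from the application root down to here

    @property
    def current(self):
        return self.path[-1]

    def down(self):
        if self.current.children:
            self.path.append(self.current.children[0])
        return self.current.name

    def up(self):
        if len(self.path) > 1:
            self.path.pop()
        return self.current.name

    def next(self):
        if len(self.path) > 1:
            siblings = self.path[-2].children
            i = siblings.index(self.current)
            self.path[-1] = siblings[(i + 1) % len(siblings)]  # wrap around
        return self.current.name

# The five-part interface described in the talk:
app = Widget("application", [
    Widget("menu bar", [Widget("file menu"), Widget("edit menu"),
                        Widget("options menu")]),
    Widget("text area 1"),
    Widget("label"),
    Widget("text area 2"),
    Widget("message bar"),
])
```

Because every command is expressed in terms of parents, children, and siblings, each move the user makes also teaches them something about how the interface is structured.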

Another good thing about this approach is that every part of the interface is accessible to you. One of the big problems with older screen readers is that there are parts of the interface you can't get to, especially the message bars at the bottom of word processors that give you error messages. Those are sometimes hard to get to in a screen reader, but with this approach, you can get to every single part of the interface, and you can configure when it should tell you information.

So come play with the interface and give me feedback about it. There is a handout up here in front, just to let you know, which can also help you find more information about this project.

Earl:

I am going to talk about AccessX. AccessX is basically a sibling of Easy Access, AccessDOS, and Access Pack: a package of features for users with a mobility impairment. AccessX was co-developed by Digital, Sun Microsystems, and the Trace Center.

As far as usability of the AccessX interface is concerned, all the items in the interface are accessible via the keyboard using the tab or cursor keys. Additionally, all the key sequences that you've used on the Mac and PC for invoking things such as StickyKeys, MouseKeys, RepeatKeys, SlowKeys, etc., are exactly the same sequences in AccessX. So you don't have to throw away what you know and learn something new when you shift from a PC or a Mac over to UNIX, and vice versa; if you are using two systems at the same time, you are using the same key sequences.

Before I go any further, because it will kind of dictate what I am going to talk about: is everybody fairly familiar with StickyKeys, RepeatKeys, SlowKeys, MouseKeys, and ToggleKeys, or how many people here are not? Do I need to talk about any of those really quickly? Okay, good.

So we'll concentrate on the user interface by first looking at the main AccessX window. There are four windows associated with AccessX. There is the main window, which is the one that somebody will typically call up when they want to turn a feature on and off. There is the adjustment or "settings" window, which will come up next; that is where I can change some of the features. There are two status windows that let me know what state I am in for StickyKeys and MouseKeys. Finally, there is a help window where I can go to get specific information on each item.

Some quick items from the AccessX main window. The menu bar allows me to save my user settings, so when I go through and set how fast I want MouseKeys to move the pointer, I can set it and then save it, and every time I call up AccessX, it will always load my settings. The status windows allow me to call up an indicator for StickyKeys and MouseKeys. With StickyKeys, it lets me know which modifier keys are either latched or locked; for example, shift, control, alt, and, on the Sun system, meta are defined as modifier keys. What the MouseKeys status window does is let me know which mouse button is active, because the workstation can support at least three mouse buttons (sometimes many more).

The Help menu gives me information on specific items: how to use them, and what StickyKeys and the other features are for.

The Enable AccessX button is kind of a difficult item to explain. It was added to the client interface because of worry over conflicts with certain hot-key sequences, specifically the shift keys; the example that has always been used is an apparent conflict with some games. So, say you have a game where one shift key is used to fire missiles and the other is for running away. What happens when people continue to press the shift keys is they get into a state where StickyKeys is going on and off, and the system is confusing the user. What the Enable AccessX button allows is for those users to turn off all the keyboard access features of AccessX, so you don't turn any access features on accidentally. The Generate Beeps button allows you to get audio feedback to know whether or not you turned on a feature like StickyKeys or MouseKeys. Right now we are limited in that we provide only one beep or two beeps to let you know that you just turned something on or off. This may change.

The final window I will talk about is the settings window. When you select the settings button on the main window, you invoke the settings window. I didn't provide any Braille pictures of this because the presentation is so graphics oriented. What I can do is find out if I can get some Braille printed out for the pictures, and for people who would like to see what that is in a Braille mode, I can send it to them. If you are interested in this, you can leave me your card so I can send them out to you.

The Settings, or feature adjustment, window is on the next slide. Whenever I make a change to a particular setting in this window, that feature change takes place immediately. So, for instance, if I increase the mouse pointer speed to 1000 and I have MouseKeys on, when I press a key to move the pointer, it will move at the new speed instantly.

I can cancel out of the Settings window by selecting the cancel button, and all my changed settings go away. When I press the okay button, it saves my current settings, and provided I don't log out of my window system, it keeps all of those settings so that every time I call up AccessX, all of those settings stay correct. Cancel and okay also dismiss the settings window. The Settings window help button can also be used to call up a general help screen. If you want help on specific items, you go back to the main AccessX window to call up information on each one of those items.

The final slide I'm showing is just an example of a help window. This slide is the general help, so it talks about AccessX, who developed it, as well as some of the features that are available on the main feature window. AccessX is available on two different workstation platforms for people to experiment with in the Trace exhibit. That is about it for my part of the presentation.

Mark:

Here is the information if you want to contact and get involved with DACX directly. Send email to "novakme@macc.wisc.edu", which comes directly to me, and if you send a note regarding which area you might be most interested in, I'll add your name to that committee. We have various subcommittees: for visual impairments, for mobility impairments, and one looking at screen enlargement. The main DACX mailing group will notify you of meeting announcements, when we are going to get together, etc. We also have had some sub-sub groups on topics like guidelines; for example, we had a sub-sub group to put this presentation together for the conference. This was all done via e-mail again. We should probably list everyone's e-mail address.

Earl: my e-mail address is "earl.johnson@eng.sun.com"

Will: my address is "wwalker@zk3.dec.com"

Beth: my address is "beth@cc.gatech.edu"

Mark:

One other address which you might want to be made aware of is the general information FTP site at the Trace Center. I believe that is still "trace.waisman.wisc.edu". You can FTP anonymously, and then it is pretty straightforward to navigate around. Also, I believe we have many of our other Trace Center reprint documents on line. Are there any other specific questions regarding the presentation this morning?

Audience: Will AccessX be in X11R6?

Will:

AccessX was developed by Digital, Trace, and Sun, and we did it for X11R5. Digital and Sun will be releasing it with the next versions of their operating systems, which include the X11R5 server. DACX has worked closely with Silicon Graphics, who are working to develop the XKB extension for X11R6. It is a much larger extension that deals with the keyboard, and it is a logical place to put the AccessX code. So for X11R6, AccessX will be part of a larger extension called XKB. The answer is: yes, it will be there, but under a different name.

Earl:

One additional item. Even though the underlying support is changing from AccessX to XKB, the user interface will still be the same. So if you use the R5 version on a Sun or DEC and then transition to the R6 version of the window server, the user interface will look the same and the interaction will be the same. You won't have to change how you interact with the systems.

Audience: I understand you are making changes to Motif also?

Earl:

The work on Motif is in the style guide for designing applications. As far as work in the Motif toolkit itself, that actual component, they (OSF) are looking at 2.1. So we're probably looking at least a year out from when Motif 2.0 comes out.

Audience: A lot of the approaches you are using are really creative; they go beyond a lot of the approaches for other window-based access. But on the other hand, I am wondering if you are keeping in touch with the user-base issues that are being dealt with concerning Windows?

Earl:

When we are developing these products, we are not starting from scratch. We are looking to see what has been developed in Mac or PC land. So as new products come out, especially accessibility-wise, we need to evaluate them to find out what their features are, so that when we bring those solutions to the X platform, we can make sure those requirements are part of the application or the tool.

Beth:

And another point is that we are just some of the representatives from the DACX group. The DACX group is made up of members that work on Windows and Mac screen readers and PC access solutions.

Gregg:

I think you will also find some of the things you are seeing in this area migrating back in the other direction. Each time a new generation comes along, there is another bunch of good ideas, and the idea is to try to keep everyone involved together.

Will:

One of the things to mention about the approach is that we are modifying the base system itself, rather than attaching something onto it. What that means is it may take a little longer to get it out there, but once it is there, it is there to stay.

Earl:

One final thing. It is important, especially in accessibility, that we consider transportability. AccessX is a key example, where the key sequences that you use for invoking StickyKeys are the same across platforms. Doing this means we need to stay in contact with accessibility work on Windows as well as the Mac, so that we keep following the same track and the same key sequences as people go from platform to platform.

Audience: Is the screen reader work that you are doing under your funding going to result in an actual product?

Beth:

One would hope. That is the plan.

Audience: Do you have any sense as to when something like this will be available?

Beth:

Right now it is not available; it is not user friendly enough for that. Currently, our idea is to work with Georgia Tech on a one-year commercialization plan, so that we have something by the end of that one-year period, which starts in July.

Audience: Are there access ports or alternative keyboard layouts?

Will:

One of the things I am doing at Digital is alternative input devices. In addition, X as it ships from MIT allows you to modify the keyboard to do almost anything. The XKB keyboard extension which I mentioned earlier lets you take that even further. For example, you can turn any key into a caps lock key; if you wanted your "q" key to be a caps lock key, it can be. It is very flexible.

Mark:

Another thing: Trace has been working for many years in the area of computer access with the manufacturers of augmentative and assistive communication (AAC) devices. It used to be called simply the keyboard emulating interface; now we call it the GIDEI interface, or the General Input Device Emulating Interface. It is a proposed standard by which, if your AAC device uses the protocol, which is a pretty simple ASCII protocol, and can do some simple RS-232 handshaking at this point, you can access the computer. Any of these augmentative communication devices can act as keyboard or pointing device input, for example to DOS using AccessDOS. We have done the same thing with Microsoft for Windows, and we are in close contact right now working with a couple of other platforms. It is going to be very simple to do the same with X, because X allows you to inject events right in to emulate mouse and keyboard actions. So that will probably follow very shortly after X11R6 ships, and it will give you the ability to take your Light Talkers, Touch Talkers, DynaVoxes, or what have you, with either a direct cable, an infrared link, a radio link, or whatever you want, and literally interact with an X workstation using the serial port on your device.

I might note that that holds as long as it is the standard serial port. There is some work going through IEEE, if anybody is watching, and serial port capability looks as though it is going to change, along with other computer ports.

Gregg:

I guess the way to say it is that it just needs to come in through the standard serial port, whatever it is.

Mark:

Our time is about used up. I want to thank you for coming, and I encourage you to stop by the Trace booth, Booth 16, to see AccessX and Mercator.

