1993 CSUN Conference


Minutes from the DACX Meeting
Thursday, March 18th, 1993
California State University, Northridge (CSUN) Conference
Los Angeles Marriott Hotel, Los Angeles, CA.

Meeting called to order at 7:10 pm by Mark Novak. A copy of the meeting agenda, minutes from the previous DACX meeting at Closing-The-Gap, and a sign-up list were passed around to all 25 individuals in attendance (see attendance list at the end of this document).

Introductions were made by all in attendance.


Mark asked if anyone wished to add items to the agenda.

Beth wished to raise the issue of funding. Paul asked for discussion to clarify the IBM screen reader efforts under AIX which were mentioned by Dennis Mitchell in a pre-conference presentation. Jim spoke briefly on the subject of the IBM screen reader for AIX, which is not an announced product.

There was no discussion of old business.

First item under new business, Bob Scheifler briefed the group on the status of the next release of X Windows.

Bob - In terms of the MIT X release schedule, we are looking to put a release out at the end of this year. That is the current theory anyway. We are looking at an Alpha release around the middle of July, so the Beta release would be sometime "Octoberish". We have a goal to try to get something on the order of 75-80% functionality into the Alpha release. So if there are significant new chunks of stuff, we really would like to see them in the Alpha release. I think we will find that at the server internals level we don't have much of a problem, because most of it is just internals. Although Will was talking about an extension, and that extension will need to go through a review process, or it might fold into another keyboard extension that we are working on, just another piece of that overall keyboard extension rather than a separate extension. As for Beth's work on Xt, the consortium will be having an Xt meeting the week of April 19, and I can't remember whether Xt is the 19-20 or the 21-22, but that's the week that it is in. Our goal there is to try to have all the major Xt proposals sort of "on the plate" by the time of that meeting. So that puts the pressure on Beth to try to get some concrete specifics for the Xt changes done by then.

I am not sure, but I think we are talking about two major pieces of change to X. As for screen magnification, I don't know whether we (DACX) have any solutions at this point that are really viable for our (MIT) next release.

I think we are basically talking about some incremental pieces to existing standards, so the review of them will kind of just fold into the review cycle for the next revisions of those standards rather than having to go through any kind of separate review process.

Earl - Would this be a patch then Bob?

Bob - At this point I don't really expect to have any of this stuff come out as patches to our existing release. My guess is that by the time any of this stuff really stabilizes, we will be so close to our X11r6 cutover that our source tree won't look anything like your source tree, either in structure or internal code, so that generating a public X11r5 patch is going to be difficult. We don't have enough cycles to go around any more than anyone else does, so at this point my thinking is that we should be more focused on getting the right stuff into X11r6 rather than producing other patches to X11r5. Now, having said that, if we get some of this stuff in early, like particularly Will's stuff down inside the server, some of that can go out as patches to X Consortium members even if we don't turn that into public patches. It would at least mean that server vendors would start to see this stuff earlier, and give them a leg up in terms of rolling it out into their own products.

Earl - You pointed out that we really aren't too far along on screen magnification, so is that going to come after X11r6 is out? Would that be a patch type of thing?

Bob - At this point I don't even understand what the right solution is or how to ship it.

Beth - It's been rumored that X11r6 is the last release. If we don't get it in now, then forget it, we are never going to get it in. Is that true?

Bob - No, it's not really true. It is true that I will no longer be the director at the end of this year. But the X Consortium will continue, spinning out from MIT into a separate organization, and there is a new president who will be taking over. He has already been selected. He has been working with us at MIT for about a month now, so he is sort of getting his legs and trying to get the new organization up and running, and that will happen over the course of this year. But we all fully expect that the Consortium will continue on into the future. So, no, X11r6 is not the last release.

Paul - How about the commitment to accessibility? Does that go with you too?

Bob - No, I don't think that it goes with me. I don't see how it can be anything other than obvious that you have to deal with it.

Darin - Well, the problem that I am running into is that it is obvious to me and my manager, and maybe even my second line, that this is important work to be doing, but when you talk to marketing and the folks that give us the list of features which must be in a release, it doesn't show up.

Bob - I don't think that is a problem. I think that enough sensitivity has been raised within the X Consortium staff, some significant fraction of whom I think will carry over to the new X Consortium, that we will continue to see a commitment there. But I am hoping that we will have enough of it done in X11r6 to sort of establish a "beach head", so that additions to it won't seem as traumatic or like starting from scratch. You (DACX) will already have made the important inroads to continue.

Paul - The other question I had is, does it matter who submits the code? Do you have to be a consortium member, or could it be anybody?

Bob - It has to be a great coder so that I don't have to do a whole lot of re-engineering on it! No, I don't care where the code comes from. I care that it is as easy as possible for me to re-integrate into our current sources. And I care that it has the same kind of permission notice that we have on our current sources, so that we can just give the stuff away. I don't want any patent infringement coming up. Hopefully that is not an issue on the things that we are talking about for now.

Mark - There was a question raised during yesterday's pre-conference session about the patentability of some of the common solutions across the different platforms. We will have to do some looking into that. I believe the Trace Center developed most of the ideas, or I should say Dr. Vanderheiden originated most of the ideas, so I think we have a good leg up on most of that. To explain for most of you, backing up a little bit: Trace has tried to popularize, for instance, the feature of StickyKeys, which is currently available as Easy Access on the Macintosh, AccessDOS on an IBM running DOS, and the Access Pack for Windows. They all function in a similar fashion in that if you need to use them, you can just tap the shift key five times and the feature will turn itself on. So it is those types of "commonality" of access solutions across the various platforms that the Trace Center has tried to promote. I think we can do the same things in X Windows and UNIX and whatever other systems we want to work on.
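
To make the five-tap activation concrete, here is a minimal sketch in C under stated assumptions: the keycode value and function names are invented for illustration and are not taken from any of the shipping implementations Mark mentions.

    #include <stdio.h>
    #include <stdbool.h>

    #define SHIFT_KEY   50      /* illustrative keycode for a shift key */
    #define TAPS_NEEDED 5

    static int  consecutive_shift_taps = 0;
    static bool stickykeys_on          = false;

    /* Feed every key press through this filter. */
    static void on_key_press(int keycode)
    {
        if (keycode == SHIFT_KEY) {
            if (++consecutive_shift_taps == TAPS_NEEDED) {
                stickykeys_on = !stickykeys_on;   /* five taps: toggle */
                consecutive_shift_taps = 0;
                printf("StickyKeys %s\n", stickykeys_on ? "on" : "off");
            }
        } else {
            consecutive_shift_taps = 0;   /* any other key resets the count */
        }
    }

    int main(void)
    {
        for (int i = 0; i < TAPS_NEEDED; i++)
            on_key_press(SHIFT_KEY);      /* prints "StickyKeys on" */
        return 0;
    }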

Beth - I don't want to open a Pandora's box, but I recall there being some contention over whose concept the "off screen" model and screen reader and all that kind of stuff was, even though it is a pretty common sense approach.

Mark - That I am not as well versed on. Perhaps we might need to do some looking into that.

Bob - The screen reader stuff may be a problem at the upper levels where you are working on it. But in terms of the underlying mechanism about putting it into X, that is not an issue.

Peter - You are not putting in an off screen model. I think, but I am not sure, that we (BSI) were the first ones to do an off screen model. I am not sure if we were the first ones to use the term; I wasn't around then, nor can I speak for the president. However, we seem to have no trouble with IBM doing it, and we seem to have no trouble with Syntha-Voice calling theirs an off screen model, so I can't imagine that would be an issue.

Mark - If people have thoughts in that direction, those are the kind of issues we need to bring up as we are developing solutions. These are things we don't want to run up against when we get close to release.

John M. - Unless I misunderstood, I think the point being referred to was the commonality of interfaces. My interpretation of the main thrust of that particular question during the preconference was that on the Macintosh, Command+O is used for opening a file, while on Microsoft Windows, Alternate+O is used, and it was that kind of commonality that particular person was looking for, more than commonality across screen readers. Although the access toolkit commonality would be nice, I think that particular person was more concerned with actual user interface commonality, which from my perspective is probably beyond the control of anybody in this room.

Jim - There seems to be a relaxation of the rules of announcement and secrecy and confidentiality whenever it comes to an issue of adaptive technology. I can't think of any other IBM product, short of Screen Reader, that ever got shown at a show in the state that it was shown in.

Mark - I would like to have the individuals who are leading the subcommittees give us a quick report on their progress. I am going to start with Beth Mynatt from Georgia Tech, our subcommittee chair working on graphical user interface access issues.

Beth - For those of you who have come in late I do have a print copy of the protocol specification that we are working on. I don't want to backup too much and take up time but basically, I joined this committee back in October and one of the reasons we started participating in DACX is that we already had done a year's worth of research on trying to provide access to X Windows for a screen reader system.

I am just going to sort of page through this document as a way of structuring my presentation. I don't think I need to cover motivation or why this work is important, so we will go straight to the requirements, what we think is important for any type of screen reader system. Basically, these are the requirements for a screen reader system that would work off of this protocol.

The first and foremost requirement is transparency to the application. We obviously don't want to require people to make individual application modifications. A person needs to be able to grab an application off a shelf, or grab it off the net, that has never worked with a screen reader system, and get it to work. The things that the protocol should allow us to do are: to read the static state of an application interface, that is, the objects such as scroll bars that make up the interface and the state associated with them. Static state is defined as things that rarely change in an interface, although, especially in the X world, anything can change. We also need to be able to set the static state; that is maybe not as important for a screen reader, but important for other uses of this protocol. We need notification of changes in the dynamic state, and again, this is without requiring our system to be polling the protocol or something like that. That was part of the problems we ran into with our first systems. We want notifications when something goes on in the system: a dialog box pops up, labels have changed, any and all of these things are needed to update our off screen model. Also, we want to be able to trigger dynamic changes likewise back to the interface. One of the examples I gave in an earlier talk is being able to completely map mouse operations to keyboard operations, so that a person can perform a double-click mouse operation from the keyboard. The protocol should enable us to do that.
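
As a rough illustration of the request categories Beth lists, the following C sketch enumerates the kinds of requests such a protocol might expose. The names are hypothetical, invented for this summary, and are not taken from the Mercator protocol specification itself.

    #include <stdio.h>

    typedef enum {
        REQ_GET_STATIC_STATE,   /* read the (mostly) unchanging interface state  */
        REQ_SET_STATIC_STATE,   /* modify that state from the access agent       */
        REQ_NOTIFY_DYNAMIC,     /* subscribe to changes: popups, label edits     */
        REQ_TRIGGER_DYNAMIC     /* inject actions, e.g. a synthetic double click */
    } AccessRequest;

    static const char *describe(AccessRequest r)
    {
        switch (r) {
        case REQ_GET_STATIC_STATE: return "query widget tree and resources";
        case REQ_SET_STATIC_STATE: return "set resources on interface objects";
        case REQ_NOTIFY_DYNAMIC:   return "event stream feeding the off screen model";
        case REQ_TRIGGER_DYNAMIC:  return "map keyboard input to mouse operations";
        }
        return "unknown";
    }

    int main(void)
    {
        for (int r = REQ_GET_STATIC_STATE; r <= REQ_TRIGGER_DYNAMIC; r++)
            printf("%d: %s\n", r, describe((AccessRequest)r));
        return 0;
    }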

Then I have what we call meta controls, and I think the "meta" comes from the fact that I am in the academic environment, and we always have to have something with the word meta in it. Meta controls are commands to your environment, not commands back to the applications, so that is actually a simple problem; I just want to be able to capture this input. For example, a user might change the rate of the speech synthesizer or move from one application to another. I capture that input and change their environment, but I don't forward that request on to the specific application. Arbitrary connection setup is very important to us. That means that you don't have to start the application running in the Mercator environment, or whatever other screen reader environment you have. This enables a nonsighted user to go to someone else's office where they are already working with an application and say, okay, let me start up my screen reader, connect to the application, get its current state, and start working from there. It is a very important requirement, versus having to restart the application within their environment.

Some pragmatic requirements changed during our dinner conversation. What I think were required changes to Xt may now be translated into required changes to Xt and Xlib only, both of these being the underlying libraries that form the stable basis for the X Window System. This is opposed to requiring changes in the widget sets, such as the Motif widget set, which vary more widely from one year to another and where it would be difficult to maintain changes, so we want to stick to the underlying libraries.

Greg - Just a quick question. Is your goal to gain access to all applications? That's the ultimate goal, right?

Beth - Our goal is to gain access to everything Xt-based; the Xlib part is due to what we think we can't get through Xt directly. Everything Xt-based includes Motif, which is the most popular and seems to be the GUI of choice. That is not the entire X world, but it is most of it. Also important is the design of how this protocol works: the protocol would be mostly portable to other X toolkits, or to other window systems in general.

Other pragmatic requirements include potential implementation by the X Consortium. We are working through this committee and working with Bob Scheifler because we want this to be adopted by the X Consortium and become part of the general release. It is not going to do anybody any good if it is just a specialized system. I already mentioned potential non-Xt toolkits.

To catch everybody up on the design rationale: there are really three main strategies you could use for providing access to X applications. One extreme, worth noting as an example of what we don't want to do, is modifying the applications on a per-application basis. It is worth noting as an extreme because that is how you get the highest level of semantic information about what's going on in the interface: you could go and modify the application itself. Obviously that is not a very generalized strategy, and it is not going to be useful to this group.

At the other extreme, which is what our group tried the first year we were working on this, is monitoring the X protocol. It is a very standard screen reader type of technique, where you are monitoring the communication between the graphical application and the display server that is controlling the graphics drawn onto the screen, as well as gathering keyboard and mouse input. For a number of reasons this strategy was seen as not particularly robust and not particularly efficient for how we want to provide access to X applications.

Through this committee we have been able to explore a third and hopefully final strategy, which is the implementation of this protocol in the underlying libraries of the X application. So again, as an example of how this all works, to get a feel for it: the X application is executing, sending messages to draw all these buttons or menus on the screen. The underlying code associated with that, at the Xt level and at the Xlib level, is going to be contacting us and talking to us as an outside agent, saying these things have just happened. So the application itself has not been modified, but the layers underneath it are going to provide a communication channel for us to be able to build the off screen model and work with that system.

The rest of this document is not something that you would discuss at this kind of meeting, I don't think. This is the actual description of the protocol specification. For those of you who really love this kind of stuff in the X world and have experience in this, I definitely welcome your feedback as to how this should be designed. Basically, the protocol requests follow along the requirements already mentioned: being able to get the static state of an application, being able to set it, being able to be notified of and also cause dynamic changes in the application interface, and also access to controls. So that's the high level "techy" talk, and I would appreciate everybody reviewing this document and getting back to me. My phone number is on the document, and if anybody wants my email address, it is "beth@cc.gatech.edu"; email is always the best way to reach me.

Earl - Is this document electronic?

Beth - Yes it is, at "multimedia@cc.gatech.edu". You need to go into paper/mercator, and I think it's called something like Xt "protospec" or something. Not too hard to figure out. I am going to try to send out at least an electronic distribution, and hopefully also a paper distribution, next week to the folks that are on this agenda list as well as other people I have talked to at this conference, which will contain information about where you can access these documents electronically.

Earl - I see that underneath your comments you have both normal text and italicized text. Could you fill us in on the meaning of that? I know there are unanswered questions.

Beth - Basically, the document is finished when all the italics are gone. The italics are questions to ourselves and questions to our readers: should we do this, how should this be done, is this particular request necessary? This is definitely a draft, a working plan, although I now have until April 19 to take all the italics away. The majority of this protocol has been implemented via the changes to Xt. We are going to make some new changes in Xlib for things like grabbing text and grabbing pure graphics.

Earl - Was that based on your discussion earlier tonight?

Beth - Yes. Basically, Xt allows us to catch things like the creation of widgets when things are displayed on the screen, and it is also able to get most of the resources associated with the interface objects. Text is somewhat more difficult because it is handled in different ways by different widgets. Sometimes it was escaping us at the Xt level, and we were having to do protocol snooping to catch it. The danger of that is that when you do protocol snooping, you can't do late connection, which was one of our main requirements. So after discussions with Bob tonight, we are going to make some changes to Xlib to enable us to do that instead. Now I will give you a description of the demo we almost brought with us. You can connect it to the system, and we are able to dump the structure of the interface so that we can navigate through the interface using keyboard controls, where the navigation is based on the widget structure of the interface, but bypassing objects that you don't really want to hear about, like form widgets or other widgets that are just controlling geometric layout but are not really interesting to you. In the graphical presentation you are not really even aware that they are there, so in the auditory presentation you are able to bypass them. For a proof of concept we also implemented voice recognition in the system, using a product called "InCubed" by Command Corp., and that is an example of using this for separate input. The Mercator screen reader takes the voice or speech command that has been recognized, and we use that to control navigation or to initiate a switch between applications. So that is a good example of how this protocol is really generalized to include use by nontraditional input. Hooking in the auditory cues, if you have heard me talk about some of my interfaces, is also part of what we will be doing. Next quarter we will also be hooking in some software based speech synthesis, so we won't be using our old, very old DECtalk.
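
The navigation Beth describes, skipping layout-only containers, amounts to a filtered tree walk. Here is a minimal sketch in C; the Widget structure is an illustration, not the real Xt type.

    #include <stdio.h>
    #include <stdbool.h>

    typedef struct Widget {
        const char    *name;
        bool           layout_only;   /* e.g. a form widget: never announced */
        struct Widget *child;         /* first child */
        struct Widget *sibling;       /* next sibling */
    } Widget;

    /* Announce only the widgets a user cares about; layout-only
     * containers are traversed but skipped in the presentation. */
    static void speak_interesting(const Widget *w, int depth)
    {
        for (; w != NULL; w = w->sibling) {
            int child_depth = depth;
            if (!w->layout_only) {
                printf("%*s%s\n", depth * 2, "", w->name);
                child_depth = depth + 1;
            }
            speak_interesting(w->child, child_depth);
        }
    }

    int main(void)
    {
        Widget ok     = { "OK button",     false, NULL, NULL };
        Widget cancel = { "Cancel button", false, NULL, NULL };
        Widget form   = { "form",          true,  &ok,  NULL };
        ok.sibling = &cancel;

        speak_interesting(&form, 0);  /* announces the buttons, not the form */
        return 0;
    }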

Earl - How are you going to deal with some of the issues, like the latency stuff that you have identified with the software implementation of "Centigram"? Or are you just going to kind of hang with it at this point?

Beth - That we have to hang with. What we found is that with the basic running system, using keyboard input, there are no performance decreases that anybody has noticed, and we aren't using the latest workstations either. So we haven't had performance lags in that area. The only place we have noticed a performance problem is with the Centigram voice recognition, and that is because of the voice recognition system itself, not anything we are doing. You can sit there and go up, down, left, right, over, extern, those sorts of things. It's at that stage, so it's nothing too bad.

Earl - Isn't the issue when you are actually inputting text and things like that into a document? That's where you start really seeing the latencies.

Beth - No, it hasn't shown up yet. We are trying to be as lightweight as possible.

Peter - This may be a bit technical, and maybe this would be best taken off-line, but looking over the protocol it seems to be very widget oriented. What about when somebody just decides, all right, I want to paint a rectangle on the screen, or I just want to render a line into the window, and no widgets are involved?

Beth - That's where the Xlib stuff we discussed over dinner is going to come into play, when it is not widget based but just graphics. Can they just paint text wherever they want to?

Bob - Yes.

Beth - When that happens, we will have to figure out by geometry where it is supposed to be. We are going to get that with the Xlib stuff. That is going to be an addition to the protocol that's already spec'ed out.

Peter - So there shouldn't be any problem being told, more or less: here's a rectangle bounding some text, these are the font names, the font style, etc.? And given that, making the kind of inquiry that yields the offsets between each character?

Bob - You have to do a little work to do all of that.

Peter - I expect that.

Bob - What Beth and I were talking about over dinner was the most expedient way to get this information in the lightest weight way, and that is to give you a hook so that, just as Xlib would actually take the output buffers and ship them off to the server, it lets you see those output buffers, and you can "grovel" through and parse the protocol and find the drawing requests and the text requests. For font kinds of information you are going to have to look at the graphics context ID and map that back to what you think the current GC state is. So you may have to track GCs, and there are actually some Xlib hooks today you could use to track GC state changes, even if it is not through Xt's read-only GCs, so there is a mechanism there that you may be able to link to as well.
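
A rough sketch of what that buffer snooping could look like, assuming a hook (the assumed part) that hands the access agent a captured Xlib output buffer from a little-endian client. The text-drawing opcodes and the 4-byte request-length unit are from the core X11 protocol; everything else here is illustrative.

    #include <stdio.h>
    #include <stdint.h>
    #include <stddef.h>

    enum { X_PolyText8 = 74, X_PolyText16 = 75,
           X_ImageText8 = 76, X_ImageText16 = 77 };

    /* Walk the raw request stream and report the core text requests. */
    static void scan_requests(const uint8_t *buf, size_t len)
    {
        size_t off = 0;
        while (off + 4 <= len) {
            uint8_t  opcode = buf[off];
            /* request length, in 4-byte units, lives at bytes 2..3 */
            uint16_t units  = (uint16_t)(buf[off + 2] | (buf[off + 3] << 8));
            size_t   bytes  = (size_t)units * 4;
            if (bytes == 0 || off + bytes > len)
                break;                      /* malformed or truncated */
            if (opcode >= X_PolyText8 && opcode <= X_ImageText16)
                printf("text request, opcode %u, %zu bytes at offset %zu\n",
                       opcode, bytes, off);
            off += bytes;
        }
    }

    int main(void)
    {
        /* Fake buffer: a single 8-byte (2-unit) ImageText8 request. */
        uint8_t buf[] = { 76, 2, 2, 0, 'H', 'i', 0, 0 };
        scan_requests(buf, sizeof buf);
        return 0;
    }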

Peter - So you mean being notified when the state changes rather than inquiring.

Bob - Right.

Peter - In that way I can get all the font information and all the spacing information between.

Bob - Right.

Greg - Do you really want to get rid of having the pseudo-server in the system?

Beth - Yes. There are complexities in dealing with the server itself, and even in being read-only to the server, which isn't nearly as bad, and there is definitely the late connection issue. It makes it much simpler for the user to be able to set this up in their environment and work within the application.

Mark - Next I would like to introduce Earl Johnson from SUN and Will Walker from DEC, and ask them to share their progress on the issues they have been working on in the physical access area.

Earl - I am going to turn all of my stuff over to Darin, because he is the one who has implemented the solutions we have developed at SUN. One thing I would like to say is that the server, as I understand it, is broken into two layers: a device-dependent layer, which is specific to each company, and a device-independent layer, which is with the MIT X Consortium. What we have developed, while it is not the correct approach for DACX, was developed in the device-dependent layer as initial prototypes. I think it might be best for Will, as he is talking through his solutions, to address these issues. We have three prototypes that Darin will talk about.

Darin - As Earl said, I did all the work in the device-dependent layer. For the StickyKeys functionality, which basically allows, for those who are not familiar with it, modifier keys such as control, shift, and so forth to be typed serially rather than having to be typed at the same time as the key they modify, I went into the device-dependent handling code for keyboard events and created a state machine to keep track of what state all the different modifier keys are actually in, and either masked events or perhaps modified them. Basically, it modifies the event stream so that it looks like the modifier keys went down but hadn't come up. Most of the complexity of my code is actually keeping track of which key was used to turn on a particular modifier. For example, if the left shift key is used to lock the shift modifier, then when the user uses the right shift key to unlock it, you can't send the right-shift key-up, because it was the left shift key that went down. You just have to keep track of that stuff and send the appropriate key-up. That was one prototype I've done. The other one was for SlowKeys, and for that I basically modified the repeat-key handling. Repeat-key handling currently has two levels of delays: the repeat initiate, which controls how quickly the first repeat keys start being sent, and the repeat delay, which controls how quickly they are sent after that. I just added another level, which was a slow-key initiate delay; if slow-key mode was on, I suppressed the initial key-down event, and when the auto repeat came along, I sent what actually looked like the first key-down event. Those are the two prototypes I've done.
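
To make the bookkeeping Darin describes concrete, here is a minimal sketch in C under stated assumptions: the structure and the keycodes are illustrative, not SUN's actual device-dependent code.

    #include <stdio.h>
    #include <stdbool.h>

    #define NUM_MODIFIERS 8   /* shift, lock, control, mod1..mod5 */

    typedef struct {
        bool latched[NUM_MODIFIERS];       /* modifier held on user's behalf */
        int  latch_keycode[NUM_MODIFIERS]; /* which physical key latched it  */
    } StickyState;

    /* Record a modifier press and remember which key did the latching. */
    static void sticky_press(StickyState *s, int modifier, int keycode)
    {
        if (!s->latched[modifier]) {
            s->latched[modifier] = true;
            s->latch_keycode[modifier] = keycode;
        }
    }

    /* Unlatch: even if a different physical key (say, the right shift)
     * triggers the unlock, the synthetic key-up must carry the keycode
     * that originally went down. */
    static int sticky_release(StickyState *s, int modifier, int tapped_keycode)
    {
        (void)tapped_keycode;          /* not what we report upstream */
        s->latched[modifier] = false;
        return s->latch_keycode[modifier];
    }

    int main(void)
    {
        enum { SHIFT = 0, LEFT_SHIFT = 50, RIGHT_SHIFT = 62 };
        StickyState s = {0};

        sticky_press(&s, SHIFT, LEFT_SHIFT);              /* left shift latches  */
        int up = sticky_release(&s, SHIFT, RIGHT_SHIFT);  /* right shift unlocks */
        printf("synthesize key-up for keycode %d\n", up); /* 50, not 62 */
        return 0;
    }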

Bob - So this was done against an X11r5 server?

Darin - Yes. And like I said, this is a prototype that's not finished. It doesn't have the nice interface that you would like, such as displaying the states so that the user has some way to find out which modifier keys are actually locked at any given time, which is a real need before it can really be used. I found myself thoroughly confused a couple of times when I was testing it. I had no idea which modifier was in use, and there was no way to find out other than to try to unlock each one of them.

Paul - You said the stuff that you did is device dependent. So will we be able to use that in what we are doing with DACX?

Will - We may be able to use some of the code. Actually, we will be able to draw upon the experience from what I have done and what Darin has done. We can put them together and maybe make a very good solution that will work for everybody and all our platforms.

Earl - It will reside in the device-independent layer also, which is very good news. Extremely good news. I am going to give a quick definition of the prototypes that we worked on, and I will leave the definitions of ToggleKeys and MouseKeys to Will. StickyKeys, for those of you who aren't familiar with it, allows a person who has the use of only one hand, or in fact one finger or a mouth-held stick, to use the computer keyboard. StickyKeys allows a person to, say, hit the shift key and then hit an alphanumeric key and output a capital letter. You can also use other modifier keys, like control or alt; it locks these down and then allows you to hit an alphanumeric key so that function can be invoked, and that can all be done with one finger. RepeatKeys allows the user to set the delay before repeat mode is activated. It is utilized by somebody with a physical disability who has trouble pulling either the mouth-held stick or their finger up off of the keyboard. RepeatKeys gives them the time to remove their finger before you start getting this massive repeat rate going to the application. Then SlowKeys is for a person who has trouble locating a specific key.

At times I use a little plastic finger that I put on my hand, on this finger here, and at times I hit two keys at once. What SlowKeys does is allow me to pick the key that I am choosing, hold it down for a specific time before it outputs that character to the application, and then I can lift my hand off.
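
The acceptance test behind SlowKeys reduces to a single comparison. A minimal sketch in C, assuming millisecond timestamps on the press and release events; the delay value is illustrative, since the real delay would be user-configurable.

    #include <stdio.h>
    #include <stdbool.h>

    /* Accept the keystroke only if the key stayed down long enough. */
    static bool slowkeys_accept(unsigned long press_ms,
                                unsigned long release_ms,
                                unsigned long slow_delay_ms)
    {
        return release_ms - press_ms >= slow_delay_ms;
    }

    int main(void)
    {
        /* With a 500 ms delay, a 40 ms brush of the wrong key is
         * ignored, while a deliberate 600 ms press is accepted. */
        printf("%d\n", slowkeys_accept(0, 40,  500));   /* 0 */
        printf("%d\n", slowkeys_accept(0, 600, 500));   /* 1 */
        return 0;
    }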

Will - Before the Closing-The-Gap meeting we worked on several prototypes for implementing AccessDOS type features in X Windows. These ranged from a potentially highly "customizable" client, a very small client with a couple thousand lines of code, all the way to modifying the server directly. Even though I modified the server directly, it was done as a proof of concept more than a working prototype, and all of it was developed with the intent of being device independent. No matter what platform you were on, this code would work on that platform.

One thing Beth forgot to mention is that Beth, Keith Edwards, and Tom Rodriguez presented a paper at the X Technical Conference, and Mark and I co-authored a paper which was also presented at the X Technical Conference. The reason I bring that up is not just to ring our own bells, but to say that we are getting accessibility issues out of this arena and into a public arena, so that more than just the people here at a conference with a focus on disability are aware of the issues. We are getting the issues out in public and making people aware of the problem.

After discussion today with Bob, there was very, very, very good news: they (the X Consortium) are willing to put accessibility features like the AccessDOS functionality directly into the server, and all we have to do is get our butts in gear and get the thing implemented in two months. What that means is that our ideal goals will have been met if we can pull this off. It will be part of the base system; it won't be a wart that is added on. So if we can reach that goal it will be fantastic, and it will be done much earlier than expected, actually, if we can get it directly into X11r6.

Earl - Do you want to define MouseKeys and ToggleKeys?

Will - MouseKeys provides the ability to control the mouse from the keyboard, for people who cannot control a mouse but do have control of the keyboard. You can emulate the mouse from the keyboard, including button presses, dragging and dropping, and double clicks. ToggleKeys is a way to provide auditory feedback for a key that locks, like a lock key, e.g. shift lock. BounceKeys is for people who, when they press a key, will press the same key again unintentionally as soon as they release it (i.e., tremor); BounceKeys provides a way to ignore the second (extra) keystroke(s). SerialKeys allows people to connect alternative communication devices to the serial port and have those devices act as though they were keyboard or pointing input devices, or even connect with software packages to do things like on-screen keyboard scanning.
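
Of these, BounceKeys is the easiest to show in miniature. A sketch in C of the debounce filter, assuming millisecond event timestamps; the 300 ms window is an arbitrary illustrative value, since the real window would be user-configurable.

    #include <stdio.h>
    #include <stdbool.h>

    typedef struct {
        int           last_keycode;
        unsigned long last_release_ms;
        unsigned long debounce_ms;     /* user-configurable window */
    } BounceState;

    /* Deliver a press unless it repeats the last key too quickly. */
    static bool bounce_filter_press(BounceState *b, int keycode,
                                    unsigned long now_ms)
    {
        if (keycode == b->last_keycode &&
            now_ms - b->last_release_ms < b->debounce_ms)
            return false;              /* same key too soon: a "bounce" */
        return true;
    }

    static void bounce_note_release(BounceState *b, int keycode,
                                    unsigned long now_ms)
    {
        b->last_keycode    = keycode;
        b->last_release_ms = now_ms;
    }

    int main(void)
    {
        BounceState b = { -1, 0, 300 };

        printf("%d\n", bounce_filter_press(&b, 38, 1000)); /* 1: first press */
        bounce_note_release(&b, 38, 1100);
        printf("%d\n", bounce_filter_press(&b, 38, 1200)); /* 0: bounce      */
        printf("%d\n", bounce_filter_press(&b, 38, 1500)); /* 1: window over */
        return 0;
    }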

Gerhard - Is this according to the Trace Center GIDEI?

Will - Yes, we need to work with the GIDEI standard. SerialKeys can be implemented in such a way that we don't have to make modifications to the server itself; it can be implemented as an X Windows client. So we really haven't been sweating that one terribly. We have been sweating the AccessDOS type functionality more.

Bob - So, of all the mechanisms that you enumerated, which are the ones that you are expecting to do an implementation of? All of them?

Will - We expect to do an implementation of all the AccessDOS functionality except SerialKeys. Our intent is to get all of that into the server, which means SlowKeys, MouseKeys, StickyKeys, BounceKeys, ToggleKeys, and RepeatKeys.

Will - The good news is that we aren't starting from scratch. We already have prototypes, which are written in many different ways and the Trace Center already has code from AccessDOS.

Earl - This is the stuff that was highlighted in The X Resource, in the articles that Will was talking about. Both Will and Mark have done a lot of investigation into various modes of implementation, so they have looked into various aspects of implementing the code.

Beth - What did you decide about the audio for ToggleKeys? Is that going to be client-side configurable, or is that going to be beeps in the server?

Will - For audio feedback, like when you toggle ToggleKeys on or off, you may get an up or rising siren when turning on, or a down or falling siren when turning off. The problem with the X Window server is that most keyboards only have the ability to beep or click; they don't have a siren up or a siren down. To do advanced audio functions like that, you need an audio server. The way audio has been implemented by several vendors is a client-server model, very similar to the client-server model of X Windows itself. What that means is that you cannot really access, or you shouldn't really access, the audio client-server model from the X Window server. So one way to implement audio is to have your server modifications generate events that go back to a client. This means you can write an X Windows application which just sits there and listens to the state of things changing, and that X Windows application can connect to the audio server. What that will mean is that you can have a server-independent implementation. So suppose SUN decided to implement an audio server in a manner different from what Digital decided to do; the client could be tailored toward SUN or DEC, and again, it is just a client, not an entire server.
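
A minimal sketch of the client Will outlines, in C with Xlib. The event-loop calls (XOpenDisplay, XNextEvent) are standard Xlib; the AccessStateNotify event code and the play_siren routine are assumptions standing in for whatever the keyboard extension and the vendor's audio server would actually provide.

    #include <X11/Xlib.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Hypothetical: a real keyboard extension would define its own
     * event code in the extension range (64 and above). */
    #define AccessStateNotify 64

    static void play_siren(int rising)
    {
        /* Vendor-specific part: connect to SUN's or DEC's audio server
         * and play the cue; printing stands in for that here. */
        printf(rising ? "rising siren (feature on)\n"
                      : "falling siren (feature off)\n");
    }

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (dpy == NULL) {
            fprintf(stderr, "cannot open display\n");
            return EXIT_FAILURE;
        }

        for (;;) {
            XEvent ev;
            XNextEvent(dpy, &ev);       /* block until the server reports */
            if (ev.type == AccessStateNotify)
                play_siren(1);          /* decode on/off from the
                                           (hypothetical) event payload */
        }
    }

Because the vendor-specific audio work lives entirely in this small client, the server-side changes can stay identical across platforms, which is the design point Will is making.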

Paul - Will, when you say we should have this done by April, who specifically, by name, is the "we" that you are referring to?

Will - By name I can say it is me, Mark Novak, Earl Johnson, and Darin McGrew.

Paul - So you will all be working on this project?

Will - That's the beauty of DACX. Competitors such as SUN and DEC can get together for a common goal.

Paul - That brings up the next point I was going to make, which is that at the first meeting we divided up this workload among the people that were there, which is about half the number of people in this room. I am wondering if there is anybody else in here who could contribute any time or effort or money or anything else.

Jim - We have just gotten these kinds of requirements into our requirements database, and the planner is looking at it. I think what I will do for Will and Earl is make sure that they get connected with you guys, because if they are going to put resources on it, they might as well do it in concert with your people too. I don't have any programmers directly under me to do the work; I've got to get it through the X Windows planning people and may get it coded that way.

Mark - Just a reminder, you have all the email addresses now, and you can be talking right now within DACX.

Greg - Workstation vendors are going to have to deal with TTY interfaces as well as X Windows, and a lot of the issues that we are talking about with the AccessDOS functions are a lot easier to implement from the UNIX kernel, for example in the keyboard device driver. I am wondering, for example, if we at IBM go ahead and put StickyKeys in the keyboard device driver in the kernel, how is this going to affect StickyKeys in X Windows? Have you talked about issues like that?

Will - The thing is, you can turn StickyKeys off. You don't have to have it always running in the X server, so you could choose not to turn it on in the X server because you have it implemented in the kernel. The reason we (DACX) chose to implement all of this at the X Window server level is that it provides a common platform for all the vendors. When you start implementing things at the TTY level or the driver level of the kernel, it starts varying from vendor to vendor.

Greg - Do you have a strategy to implement StickyKeys in the device independent layer?

Will - Yes, that is strictly device independent, right. So to make sure it works for everybody.

Earl - Will's prototype has been in the device-independent code. SUN's was in device-dependent code simply because it was easier; it was more a proof of concept.

Before the meeting agenda was opened up, there was a short discussion of the next DACX meeting date. The group felt that a meeting at RESNA in June was too soon, and not enough of the group would be attending ACB in July, so we tentatively agreed to meet at Closing-The-Gap, in October. Mark will see that a meeting announcement is sent out.

After the agenda items were discussed, an open group discussion moved into the area of standards and how DACX as a group should be involved.

Will raised the issue of providing the hooks for developers and letting them apply and use them.

Peter stated that screen reader developers would then all be able to use the hooks.

Beth restated the user's point of view on standardization.

Greg pointed out that standards are many times fixed by users' requests, but that users should be allowed to pick and choose.

Peter suggested that maybe as a group we could form standards as to what commands should do, since in X it is very easy to apply resource files; essentially anyone's screen reader could be made to have the keyboard functionality of anyone else's screen reader, once they provided the resource file. However, another area which the group might wish to address has to do with bit patterns (i.e. icons). How about standardizing on how vendors attach names to their icons, or at least having the descriptions in the same format?

Will asked if DACX should be looking into guidelines for accessibility.

Peter stated that BSI has started just such a list.

Beth commented that various widget sets can be standardized.

Jim raised the legal issues involved, although in the accessibility field, such issues seem to be relaxed.

Darin talked about work at SUN to come up with standards for icons, which can also be a problem across different cultures, but he thought a workaround existed.

Paul asked if we couldn't just request that icons announce themselves.

Peter answered with some of the differences between the Mac and X worlds, and some of the difficulties in making changes.

Paul suggested that DACX as a group look at standards after the current deadlines for X11r6 work.

The group learned of the announcement from the UniForum Conference (going on simultaneously with CSUN in San Francisco) that several of the major workstation vendors were standardizing on the Motif GUI. This was good news in that one particular widget set could now be the focus for what Beth needs in the screen reader area. Discussion swung back to issues related to current screen reader work at IBM.

Greg talked about his work at IBM, how they are approaching X, what internal things they have been able to identify (i.e. basic screen access, read text, etc.), and what areas still need work (i.e. highlights, scroll bars, etc.). Beth offered to work more closely with IBM, since she has basic object identification code that IBM could make use of.

Jolie Mason discussed her work at Paramax (i.e. defense side of Unisys). Jolie has been successfully using command line UNIX despite a visual impairment by applying a variety of clever tricks, including the development of a Braille font (by her assistant, Nina), and offered to make their work and software available to the X Consortium.

John G. raised the question of being able to read (with a screen reader) math symbols and math equations within X.

Bob suggested some methods by which to recognize the symbols, or snoop at the appropriate level.

Peter stated that if the symbols were drawn using standard methods, a screen reader ought to be able to patch in and review and understand the symbols, but they need to be done in a standard fashion (i.e. drawn using system tools).

John G. further discussed some of the current limitations, and how things or math symbols might work (i.e. be arranged) inside a word processing application.

Beth and Bob both commented on the fact that X would not be able to assist any application which did its own line rendering or drawing, since the commands to do so would not traverse the network.

Peter asked if it would make sense to add a protocol to define things which cannot be standardized upon, perhaps a new API. For example, handwriting recognition systems may not have an intermediate format prior to display on screen.

Will asked if perhaps these items wouldn't require descriptive text.

Discussion ended without any real solution to John's original question. This may be something the group needs to address in future meetings.

One last question came from Jolie about the ability or inability to get electronic versions of the O'Reilly & Associates X Books.

Bob replied that they should be available, but suggested that interested parties contact O'Reilly & Associates at tim@ora.com or the editor, Adrian Nye at adrian@ora.com and request the books in electronic form.

Meeting adjourned at 9:00 pm

Attendance list

Mark Novak, Trace Center
Earl Johnson, SUN
Paul Fontaine, Access Solutions
Darin McGrew, SunSoft
Bob Scheifler, MIT
Peter Korn, BSI
Beth Mynatt, Georgia Tech
Will Walker, DEC
Leedy Day, DEC
John Mattioli
Jim Caldwell, IBM
Ann Ruth, Ruth Enterprises
Andrea Mano, C. Gerald Warren & Associates
Kuo Ping Yung, Johns Hopkins
Linda Giammattei Smith, GTE Govt. Systems
John Gardner, Oregon State University
Arnold Schneider, Swiss Federation of the Blind
Christian Hugentobler, Swiss Federation of the Blind
Gerhard Weber, Institut fur Informatik, Univ. of Stuttgart, Germany
Eric Bergman, SUN
Greg Pike, IBM
Jolie Mason, Paramax
Nina Mangan, Paramax
Ted Henter, Henter-Joyce
Glen Gordon, Henter-Joyce
