Update from CSUN – part 3

If you have an interest in tactile maps, as Josh Miele does, or more to the point – simply getting around and accessing information – there is a lot to explore at CSUN and Josh is hitting it all: “Service Based Approach to the Construction and Delivery of Audio Tactile Diagrams”, “Crosswatch: A System to Help VI Pedestrians Find and Traverse Crosswalks”, “Tactile Maps of Montreal Subway Stations”, and “Audio-Tactile Interactive Computing with the Livescribe Pulse Smartpen”. The Quick Guide I was relying on didn’t provide many details, so when I mentioned to Josh that Livescribe was presenting he was like “Wha? I never heard about that. I’m their man. They should be calling me”. Turns out that this was Josh’s presentation. Doh!

A smartpen is a device with a small computer in it that writes on special paper and does two things ordinary pens can’t: it captures handwriting digitally, and it captures an audio recording that is synchronized with the writing.

Josh and Steve Landau have developed a smartpen-based system for producing tables, graphs, diagrams and other graphics in an audio-tactile form. Tactile graphics can easily become over-cluttered: the more densely packed they are, the harder they are to interpret. More is less in the tactile world. Adding audio allows for much more complicated labeling than could be achieved with tactile alone. Adding a smartpen allows for even more functionality, elaborating on earlier audio-tactile technologies.

With the smartpen there is no need for calibration (letting the computer know where it is) or sheet identification (a process you have to run through each time you change a sheet). At $150, it’s about a quarter the cost of existing systems. And the smartpen stands alone – it doesn’t need a computer or host device to work. Once the software is loaded onto it you can just run around with the pen in your pocket. It’s also much more accurate than using your finger, which is a lot bigger than the tip of the pen. The active areas can therefore be much smaller, allowing much finer granularity in the information provided on a tactile figure.

This is a technology developed in Sweden by a company called Anoto. The Pulse smartpen is made in Oakland, CA by a company called Livescribe. The pen has a computer built into it. It is a mainstream product developed for sighted college students to take notes with. They take notes in a notebook that is sold with the Livescribe pen. As they’re taking notes, the pen is also recording the lecture as audio. When they go back later and tap on the notes in the notebook, it jumps to the spot in the recording when that note was written. Every page that you use with the pen is printed with a very high resolution field of black dots, and the pattern of dots is unique for every spot of every page. Because of these unique dot patterns, the sensor in the tip of the pen is able to see where you are and react accordingly, based on the program loaded into the pen. Livescribe also provides a Software Developer Kit (SDK), so that other people can write applications for the Pulse smartpen.
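The core trick is simple: once the dot pattern tells the pen its (x, y) position on a known page, an application just has to map that position to a named active region. Here’s a rough sketch of that lookup, assuming hypothetical region names and coordinates – the real Livescribe SDK works differently and is not shown here.

```python
# Illustrative sketch only: map a decoded pen position to a named
# active region on a page. Region names and coordinates are invented.
from dataclasses import dataclass


@dataclass
class Region:
    name: str
    x: float       # left edge, in arbitrary page units
    y: float       # top edge
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        return (self.x <= px < self.x + self.width
                and self.y <= py < self.y + self.height)


def region_at(regions, px, py):
    """Return the name of the region under a pen tap, or None for blank paper."""
    for r in regions:
        if r.contains(px, py):
            return r.name
    return None


# Two hypothetical cells of a periodic table page:
regions = [
    Region("hydrogen", x=0.0, y=0.0, width=1.0, height=1.0),
    Region("lithium",  x=0.0, y=1.0, width=1.0, height=1.0),
]
print(region_at(regions, 0.5, 1.5))  # prints "lithium"
```

Because the dot pattern gives the pen a precise position, the regions can be far smaller than anything a fingertip could reliably hit.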

Josh and Steve demonstrated a number of audio-tactile materials that they’ve developed:  a periodic table of the elements, a scientific calculator, a biology chart, and a sudoku.

Let’s suppose you have this audio-tactile periodic table of the elements. You can feel the raised lines and braille labels, but due to space constraints the only braille labels are V for vanadium, H for hydrogen, LI for lithium, and so on. Say you don’t remember what LI stands for. Tap it with the pen and you hear “Capital LI, Lithium, atomic number 3”. That’s already more info than could be squeezed into the one-inch square on the page. Tap it again and it says “Atomic weight is 6.94”. If you keep tapping on LI you’re going to get more information: density at STP, melting point, boiling point, etc. These are all layers of information underneath the top layer. Not only do we have all the info that a chemistry student might want, but it’s laid out on a piece of paper that fits into a standard binder.
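The “tap again for more” behavior amounts to a small state machine: repeated taps on the same element step down through its layers, and tapping a different element resets to the top layer. A minimal sketch, with the layer text partly invented for illustration:

```python
# Hypothetical data: layered spoken responses per element. Only the
# first two lithium layers come from the description above; the rest
# are placeholders.
LAYERS = {
    "LI": [
        "Capital LI, Lithium, atomic number 3",
        "Atomic weight is 6.94",
        "Density at STP ...",
        "Melting point ...",
    ],
    "H": [
        "Capital H, Hydrogen, atomic number 1",
        "Atomic weight ...",
    ],
}


class TapReader:
    """Cycle through information layers as the same element is tapped."""

    def __init__(self, layers):
        self.layers = layers
        self.last = None    # element tapped most recently
        self.depth = 0      # current layer index

    def tap(self, element):
        if element == self.last:
            # Same element again: advance to the next layer, wrapping.
            self.depth = (self.depth + 1) % len(self.layers[element])
        else:
            # New element: start back at the top layer.
            self.last, self.depth = element, 0
        return self.layers[element][self.depth]
```

So a first tap on LI speaks the name and atomic number, a second tap the atomic weight, and so on down through the layers.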

In addition to containing several layers of information, each audio-tactile diagram has a few controls. In the periodic table, you can scroll through an index by name, symbol, or atomic number. And there is a lock feature. Say you’re interested in boiling points, but boiling points are buried several layers down. You don’t want to have to drill down to the boiling point on each element. Drill down to boiling point once, tap the lock button, and now tapping anywhere else on the periodic table gives you the boiling point for that element. And very important: tapping anywhere blank on the audio-tactile shuts the thing up.
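The lock control and the blank-space mute can be sketched the same way – this is my own illustrative reading of the behavior described above, with placeholder data, not the actual implementation:

```python
# Hypothetical per-element data; the field values are placeholders.
DATA = {
    "H":  {"name": "Hydrogen, atomic number 1",
           "boiling point": "Boiling point ..."},
    "LI": {"name": "Lithium, atomic number 3",
           "boiling point": "Boiling point ..."},
}


class PeriodicTable:
    def __init__(self, data):
        self.data = data
        self.locked_field = None  # e.g. "boiling point" once locked

    def lock(self, field):
        """Lock onto one field so every tap jumps straight to it."""
        self.locked_field = field

    def unlock(self):
        self.locked_field = None

    def tap(self, element):
        if element is None:
            # Blank area of the page: stop speaking.
            return ""
        if self.locked_field:
            return self.data[element][self.locked_field]
        return self.data[element]["name"]
```

After `lock("boiling point")`, a tap on any element goes directly to that layer instead of starting from the top.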

They also demoed a scientific calculator. Every button is labeled in braille and large print, and they’re grouped nicely on the page to give you the idea that “these are the financial functions, these are the statistical functions”. (It may be the only accessible scientific calculator with financial functions.) The trigonometric functions are separated out from the logarithmic functions and so forth. It’s a fraction of the cost of a conventional accessible scientific calculator. If you don’t know what a button is or does, tap on the help strip to find out. You can do this at any point in a calculation without screwing up the calculation and without losing your orientation.
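That last property – help that never disturbs the calculation in progress – is a design choice worth making explicit: the help lookup is read-only with respect to the calculator’s state. A toy sketch, with invented button names and help text:

```python
# Hypothetical help text; the real calculator's buttons and wording
# are not documented here.
HELP = {
    "sin": "Sine of an angle",
    "npv": "Net present value",
}


class Calculator:
    def __init__(self):
        self.pending = []  # tokens of the calculation entered so far

    def press(self, token):
        self.pending.append(token)

    def help(self, button):
        # Read-only: consults HELP but never touches self.pending,
        # so a mid-calculation help query loses nothing.
        return HELP.get(button, "Unknown button")
```

You can press "2", "+", ask what `sin` does, and the pending "2 +" is still sitting there exactly as you left it.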

After hitting the textbooks sometimes you need a break, so they put a bit of effort into developing an audio-tactile sudoku board. This example was a six-by-six grid. When you load a game, numbers and blanks are assigned to all 36 squares, but visually the board is a blank grid. As a sighted person, I can tell you that playing blind sudoku is a real mind bender, but there are a significant number of blind sudoku players who will have no trouble with this.

This idea has applications beyond the ones demonstrated. One of the other applications they are exploring for this tool is BART system and station maps, a project under contract with the LightHouse in San Francisco to map all 43 BART stations. It’s important to note they’re not just doing a map of the system, but doing station maps, because you want to know not just where the trains go, but how to get in, out of, and around the stations. To date, the only way to do that is trial and error, so you have to schedule a couple extra hours to get to that job interview, or you need to go through each unfamiliar station with an assistant to help you get oriented. The idea of these station maps, which will be smartpen-enabled, is to give three views: street, concourse, and platform. Use the pen to touch different things on the map to find out “this is a taxi stand, this is a bus stop, these are the stairs going down to the concourse level, this is the fare gate, this is the escalator going down to the platform”, etc. They can even program in schedule or route-planning information. There is the hope that this is going to be exciting enough for transit systems around the country to say “Hey, we want one of them there talking maps too.”