                                Tactile Graphics
                          Charles E. Hallenbeck, Ph.D.
                                  KanSys, Inc.
                                 chuckh@idir.net
             From 1967 to 1969 my title was "Senior Post Doctoral Fellow
        in Applied Mathematics and Computer Science" at Washington
        University in St. Louis. I spent the first year learning to
        program and operate the IBM 1401 computer, with its 1403 printer,
        which could easily be persuaded to print braille. The 1403 was an
        impact printer, striking its hammers with some force against the
        paper. If the paper were cushioned from behind with a length of
        elastic and the printer printed only periods and blank spaces in
        the right places, one got a reasonable facsimile of braille on
        the back side of the page.
             The periods could be printed in 132 positions per line, and
        using an eight lines per inch setting, 88 lines fit on each page.
        Printing "period period blank, period period blank, period period
        blank, etcetera" across a line, then printing three such lines
        and skipping a line, three more such lines and skipping another,
        etcetera, gave you a page of full braille cells. To produce
        braille text, the software had to format a line of 44 cells
        (132/3), a page of 22 lines (88/4), and had to write from right
        to left, just as one does with a slate and stylus.
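              The cell-and-line arithmetic above can be sketched in a few
         lines of Python (a modern reconstruction for illustration, not
         the original 1401 code):

```python
# Fill a 132 x 88 character page with full braille cells:
# 132 columns / 3 per cell  = 44 cells per line,
# 88 print lines / 4 per cell row = 22 braille lines.
COLS, ROWS = 132, 88
CELLS_PER_LINE = COLS // 3   # 44
LINES_PER_PAGE = ROWS // 4   # 22

def full_cell_page():
    dot_row = ".. " * CELLS_PER_LINE   # "period period blank" across the line
    page = []
    for _ in range(LINES_PER_PAGE):
        page.extend([dot_row] * 3)     # three such lines form a row of cells
        page.append(" " * COLS)        # then skip a line
    return page
```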
             Why print only periods? Why print only braille cells? What
        if the page were regarded as a plane of 132 by 88 locations where
        periods, hyphens, slashes, asterisks, or anything else might be
        printed? How many such characters could be told apart by touch?
        Did it help if they were printed in isolation or grouped together
        to cover a solid area? Since my earlier training could not be
        completely shaken off, I experimented with producing bar graphs
        representing psychological test profiles (remember the WAIS and
        the MMPI?). The bars were scaled to fill the page, outlined, then
        completely filled in. Adjacent bars had to use different filler
        characters, or had to be separated by a line of blanks or special
        border characters. Tactile graphics were on their way.
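              The bar-graph scheme might have looked something like this
         (a hypothetical sketch; the function and the filler choices are
         illustrative, not the original code):

```python
# Hypothetical sketch: bars filled with different characters so that
# adjacent bars can be told apart by touch.
FILLERS = [".", "-", "/", "*"]

def bar_rows(values, width=44, height=10):
    """Return text rows (top row first) of a tactile bar graph."""
    scale = height / max(values)
    heights = [round(v * scale) for v in values]
    bar_w = width // len(values)
    rows = []
    for level in range(height, 0, -1):          # top row first
        row = ""
        for i, h in enumerate(heights):
            ch = FILLERS[i % len(FILLERS)]      # adjacent bars differ
            row += ch * bar_w if h >= level else " " * bar_w
        rows.append(row)
    return rows
```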
             Late in that first post-doc year, my mentor (Ted Sterling)
        tried to drag me away from my wonderful 1401. He wanted me to
        work with the more modern IBM 360 and its PL/I language. I was
         miserable. One day, while attending a lecture, lightning struck.
        The lecture was about the "Four Color Map" problem in
        mathematics, which was then unsolved. The problem is to prove
        that no more than four colors would ever be necessary to color
        the various regions of a map in such a way that no two adjacent
        regions share the same color. A checkerboard plainly needs only
        two colors; more complex surfaces cannot get away with two, and
        might need three or four. The lecturer claimed that no plane
        surface could be devised that needed more than four, although no
        one had been able to prove or disprove that claim. I am told it
        has since been proved.
             If the idea of colors could be applied to textures on a
        tactile surface, then no more than four different textures would
        be needed to distinguish adjacent regions on a tactile map. The
        blank space and the period were obvious choices for two textures,
        and if only two others were needed, then surely two easily
        recognizable shapes could be found: probably the hyphen and the
        slash, or maybe the digit one or the lower case l. With great
        effort I waited until the lecture was over, and then rushed off
        to spend the next year writing MAPSYS, a PL/I program with
        braille display, designed to let the user generate a "picture"
        which would automatically be "colored" with suitable tactile
        qualities. MAPSYS took input from punch cards, as most programs
        did in the 1960's, and produced braille pictures on a suitably
        cushioned impact printer. I was thrilled, my mentor was happy,
        and soon the second post-doc year was over and I had to go back
        to work for a living.
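              The "coloring" step at the heart of MAPSYS can be sketched
         as a greedy assignment of four textures (a modern reconstruction
         under stated assumptions, not the original PL/I algorithm):

```python
# Assign each map region one of four fill textures so that no two
# adjacent regions share one. (Greedy assignment is a simplification;
# for planar maps the Four Color Theorem guarantees four suffice.)
TEXTURES = [" ", ".", "-", "/"]

def texture_regions(adjacency):
    """adjacency: dict mapping each region to the set of its neighbors."""
    assigned = {}
    for region in adjacency:
        used = {assigned[n] for n in adjacency[region] if n in assigned}
        assigned[region] = next(t for t in TEXTURES if t not in used)
    return assigned
```

         A checkerboard-like strip of three regions, for instance, comes
         out needing only two of the four textures.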
             I accepted a faculty position at the University of Kansas in
        1969, from which I retired in 1994. I learned that while their
        main computer was a General Electric 635, a major time-sharing
        pioneer, they also had an IBM 1401 and would not mind my
        occasional use of its 1403 printer. We worked out a way to
         request "special forms" when submitting a print job to the 635,
         so that braille output could be obtained as a routine operating
        procedure. Then I discovered that PL/I was not supported at K.U.
        and MAPSYS was in deep trouble.
             The solution was made possible by a research grant from the
        Vocational Rehabilitation agency in Washington, to promote what
        we then called "Computerized Tactography" as occupational aids
        for the blind (at least  for myself). We hired Noel Runyan, a
        talented young electrical enginnering student from the University
        of New Mexico, who happened to be blind, to translate our PL/I
        code into Fortran for the Kansas GE system. Upgrading from PL/I
        to Fortran? Progress is sometimes made by one step backward. The
        results of this research were published in the Research Bulletin
        of the American Foundation for the Blind.
              Not all steps backward lead to progress, however. The 1401
        was soon abandoned, the 1403 printer gave way to higher speed
        non-impact electrostatic printers, and braille quickly went out
        of style at Kansas. Our attention during the 1970's turned to
        speech access and culminated in our first talking desktop system
        in 1977, with speech access to the campus time sharing computer
         in 1978. However, the long-term effect of those Camelot days was
        hard to erase. The modern era of computerized tactography began
        in 1982 when Tim Cranmer designed a graphics mode into the
        Cranmer Modified Perkins Brailler, and other manufacturers soon
        followed suit. Nowadays we have the Personal Data Systems PicTac
        program by Noel Runyan, and our KanSys Inc. LowRez program, which
        owe their inspiration, if not their algorithms, to those heady
        days nearly 30 years ago.
