                            CHAPTER 7
                                
                    PSYCHOLOGICAL PROPERTIES
                      OF TOUCH PERCEPTION

1.     INTRODUCTION

2.     PASSIVE TOUCH, STUDIES OF SEPARATE TOUCH
       SENSATIONS

3.     ORGANISATION OF THE HAPTIC SYSTEM
  The Skeletal System
  The Neural System

4.     DIFFERENCES BETWEEN PASSIVE AND ACTIVE TOUCH

5.     THE THEORIES OF J.J. GIBSON
  Perceptual Meaning
  Information Pickup
  Verbal Meaning
  Evaluation of J.J. Gibson's Contribution

6.     SOME EARLY INVESTIGATIONS INTO BRAILLE READING
       BEHAVIOUR

7.     PERIPHERAL MECHANISMS
  Structure of the Glabrous Skin of the Human Hand
  Receptive Fields

8.     PSYCHOLOGICAL STUDIES OF PERIPHERAL MECHANISMS
  Roughness Discrimination
  Pressure, Vibration, and Shear

9.     CENTRAL MECHANISMS
  Parts of the Brain Involved in Touch Perception

10.    PSYCHOPHYSICAL STUDIES OF CENTRAL MECHANISMS
  Hemisphere Asymmetry
  Convergence
  Memory

11.    THE INTERLOCKING STRUCTURE OF VARIABLES
  
    1.   INTRODUCTION
  
  This short introduction to the psychophysiology of touch perception as it
  relates to using the braille code is included because an understanding
  of the functions involved in braille reading is incomplete without it.  For
  many years educational psychologists and teachers investigated reading
  habits of blind children, and the studies of Ashcroft (1960) and Nolan and
  Kederis (1969), for example, showed the effects of different reading
  situations such as dot numerosity and the use of contractions in different
  parts of the words.  Investigations since approximately 1970 have
  brought a greater understanding of the peripheral and central
  mechanisms of neural input, which can be applied to the complex
  procedures involved in braille reading.  However, as so often happens,
  this new knowledge leads to the realisation that so much more is still
  waiting to be discovered about tactual reading.  Many investigations are
  being made, but direct application to braille reading still needs
  considerable expansion.
  
  To many the code has proved difficult to learn and to use for several
  reasons.  Amongst these are problems of perception, and the many rules
  made necessary because only 63 configurations are possible using
  Braille's 3x2 matrix.  The same sign may have to represent up to 8
  different meanings according to position within the matrix, within the word
  and when used for punctuation.  As a result of such difficulties it is a very
  slow medium to use: the average braille reading rate, in words per
  minute, is only about a third of that for visual reading.  The past 50
  years have seen a succession of attempts to alleviate this situation by
  means of observation, experiment, and training in endeavours to improve
  the accuracy, comprehension and rate of braille reading performance. 
  The more important of these efforts will be discussed.
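  The 63-sign limit mentioned above follows from simple combinatorics:
  each of the six dot positions in the braille cell is either raised or flat,
  giving 2^6 = 64 possible patterns, of which one (the all-flat cell) serves
  as the space.  A minimal sketch of this count, purely for illustration:

```python
from itertools import product

# Each of the six dot positions in a braille cell is raised (1) or flat (0).
patterns = list(product((0, 1), repeat=6))   # 2**6 = 64 possible cells
signs = [p for p in patterns if any(p)]      # exclude the empty cell (the space)

print(len(patterns))  # 64
print(len(signs))     # 63
```

  The shortage of distinct signs relative to the demands of literary,
  mathematical, and punctuation notation is what forces one sign to carry
  several meanings, distinguished only by context.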
  
  To be of most help these studies need to be read in the light of an
  understanding of touch perception as it applies to the reading of braille. 
  Much has been written about the psychophysiology of visual and auditory
  systems, occasionally using the touch modality for contrast, but little was
  understood about the latter until the writings of Katz in 1925.  His
  work was not translated into English in its entirety and so remained
  comparatively little known.  It was therefore all the more impressive when the works of
  J.J. Gibson (1962; 1966, published in Britain, 1968) became known in
  America and Britain.  In a personal comment on Katz' monograph "Der
  Aufbau der Tastwelt", Gibson said "I owe more to it than I have
  recognised recently" (Krueger, 1982), but such a comment does not
  detract from J.J. Gibson's work for all research should be carried out with
  due recognition of what has already been discovered.  Since then there
  have been several studies on different, specialised aspects of touch
  perception.  Much work still needs to be carried out in these areas, but
  meanwhile the present knowledge needs to be linked up with what is
  known concerning the braille code and reading in braille.
  
  2.   PASSIVE TOUCH, STUDIES OF SEPARATE TOUCH
       SENSATIONS
       
  "For a long time, two assumptions have been made about the senses,
  first that they are the only sources of knowledge about the world, and
  second that they are the channels for special qualities of experience"
  (Gibson, 1968, p.47), yet as late as 1973 Taylor, Lederman, and Gibson
  noted that "It is remarkable how little is known about perception by touch
  after more than a century of experimental psychology" (Krueger, 1982,
  p.4).  Reasons that have contributed to this lack of knowledge include:
  1. there is no single sensory centre for the touch modality as there
     is for vision, hearing, taste, and smell;
  2. psychologists have carried out many experiments concerning
     vision and hearing, sometimes using touch perception as a
     contrast, and this has tended to lead to an underestimation of the
     possibilities of touch perception when other modalities,
     particularly vision, are missing.

  For a century from approximately 1830, tactual perception was studied
  as cutaneous sensitivity.  Parts of the skin were probed in efforts to
  determine reactions such as awareness of pain or cold, and it therefore
  seemed reasonable to suggest that each sensation corresponded to a
  nerve ending which, when excited, would convey the information to the
  brain.  An attempt to list the mosaic of sensations proved impossible
  because parameters of the sensations were difficult to define.  For
  example, "It was argued that temperature was a different quality from
  touch, and that pain also was different" (Gibson, 1968, p.98), that warmth
  must be separated from cold and that pressure differed from prickly pain. 
  Early in this century it was thought that touch could be divided into 5
  senses of pressure, warmth, cold, pain, and kinesthesis, which is an
  awareness of movement (ibid., p.98).  All these enquiries were confined
  to the study of passive touch rather than active touch.
  
  Revesz (1950) wrote "the sighted organize space mainly in terms of
  external spatial co-ordinates.  The congenitally totally blind rely instead
  on haptic (touch and movement) space" (Millar, 1994, p.19).  This seems
  highly probable and will be discussed later in the chapter.
  
  3. ORGANISATION OF THE HAPTIC SYSTEM  
  
  The Skeletal System
  Katz suggested that the hand should be regarded as the organ of touch
  (Krueger, 1982, p.17) because of its versatility.  Its movements make
  possible the perception of tactual qualities, the manipulation of objects,
  and it is constantly in use in a variety of ways.  More specifically, Gibson
  wrote (1962, p.479) "When the hand is feeling an object the movement
  or angle of each joint from the first phalanx of each finger up to the
  shoulder and backbone makes its contribution".  That is, the skeleton is
  an organized system simultaneously and successively linked via joints
  and tendons to the central nervous system.  It is "not a collection of
  sensations, but structured perception" (Gibson, 1968, p.118).  In 1968
  (p.110) Gibson included an illustration showing the innervation of the
  muscle of the upper arm resulting in the movement of the arm at the
  elbow.  It would seem reasonable to infer that a similar mechanism must
  be responsible for the flex and stretch of the muscles in the fingers when
  reading braille.
  
  The Neural System
  The neural system is also arranged hierarchically comprising the
  peripheral nerves connected with the spinal column and thence to the
  brain.  Cranial nerves are involved with vision, hearing, tasting, and
  smelling, all within the region of the head, but the touch mechanism is
  more complex and wide-spread.  The receptive units, being affected by
  mechanical energy, are termed mechanoreceptors.  They occur all over
  the body (ibid., p.108), in and below the skin, in joints and connecting
  ligaments between bones, in the muscles and tendons and also wrapped
  around blood vessels.  The afferent (ingoing) nerves lead to nerve
  centres and some continue as far as the brain, and efferent (outgoing)
  nerves from the brain or nerve centres connect with muscles and joints
  responsible for movement (ibid., p.5).  All of these parts are mobile.  For
  example, the skin is deformed as it passes over a surface, the joints
  rotate in their sockets and muscles are contractile (ibid., p.118).  Gibson
  (1968, p.5) regarded the system as a series of active neural loops which
  are "ways of seeking and extracting information about the environment
  from the flowing array of ambient energy" that seem to function at
  different levels.  From Gibson's list (ibid., p.37) the following are
  extracted as being most involved in touch perception:
  (a)  lower proprioceptive systems responsible for posture and
       equilibrium linked with gravity;
  (b)  higher proprioceptive systems, which are muscular and in which
       the receptors are probably excited by tension and probably
       register effort but not movement;
  (c)  articular, in which the receptors are in the joints and possibly the
       tendons;
  (d)  cutaneous, in which the receptors are in the skin and perhaps also
       in most body tissue.

  It is remarkable how the articular and nervous systems co-ordinate so
  perfectly that we are generally unaware of their activities.
  
  4. DIFFERENCES BETWEEN PASSIVE AND ACTIVE
  TOUCH
  
  Katz, a German psychologist, wrote "Der Aufbau der Tastwelt" (The
  World of Touch) in 1925, but only parts of it were translated into English. 
  Meanwhile, J.J. Gibson, who agreed with many of Katz' theories,
  published his own radical ideas about perception in 1962 and 1966.
  (The latter work was published in Britain two years later, and
  references here are to this edition.)  Both psychologists emphasised
  movement being necessary for active touch to take place.  In touching,
  concentration can be of two kinds:
  1. passive touch - the impression made on the skin, which involves
     the excitation of nerve endings, known as receptors, resulting in
     a sensation;
  2. active touch - which involves the movement necessary for seeking
     information by nerve endings which Gibson termed
     mechanoreceptors.  The stimulus energy produced originates not
     with the object perceived, braille text for example, but with the
     pressure of the reader's finger pad on the braille symbol followed
     by movement to acquire yet more information.  Katz wrote "the full
     richness of the palpable world is opened up to the touch only
     through movement" (Krueger, 1982, p.52) and Gibson later wrote
     "learning can be a process of detection and differentiation" (1968,
     p.52), that is, the stimulation is obtained, not imposed.

  5. THE THEORIES OF J.J. GIBSON
  
  Perceptual Meaning
  Gibson believed that the information that is detected via neural loops has
  intrinsic qualities which never vary; for example, a hard surface may
  always be recognised as such and a ball is always a spherical shape. 
  He realised that there was insufficient evidence to be certain about how
  these invariances from the external world could get into the nervous
  system and that much more research was needed.  He stated that the
  sources of stimulation are in the environment, but the actual stimuli are
  "patterns and transformations of energy at the receptors" (1968, p.28). 
  He compared active awareness to tentacles or feelers and that "the
  function of the brain when looped into its perceptual organs is not to
  decode signals, nor to interpret messages, nor to accept images. 
  Instead, the entire neural loop, including the brain, is involved in seeking
  and extracting information" (ibid., p.5).  He went even further from
  orthodox thinking by suggesting that "perception requires neither memory
  of past events nor inference from sense impressions".  We are left
  wondering how learning takes place, how it is remembered and how the
  information is retrieved.
  
  Information Pickup
  The input from the external world is limitless so there must be some
  mechanism of selection.  Gibson suggested that the perceptual
  system uses attention to explore and select, seeking clarity, and
  ignores unwanted or unprocessed information.  Perception also involves
  learning by association (Gibson, p.273).  This ability to select and refine
  improves with use and with growth.  Information that is adjacent can be
  detected within the span of attention, but a problem arises when
  information is successive beyond this span and therefore involves time. 
  It would seem that the one involves only perceiving and the other both
  perceiving and remembering.  The latter is often referred to as "short-term memory", but according to Gibson "learning does not depend on
  memory at all, at least not on the re-arousal of traces or remembering of
  the past".  Instead, he thought (ibid., p.262) that "the development of this
  attunement, or education of attention, depends on past experience but
  not on the storage of past experiences".  In other words, instead of
  dependence on memory, recurring attention will result not only in
  recognition and the detection of finer details, but also in an increased
  span of attention (ibid., p.270).
  
  Verbal Meaning
  So far the references above have been to perceptual meaning where a
  stimulus extracts invariant information from the environment leading via
  resonance in neural loops to a percept of the environment.  For learning
  to be possible there must also be indirect responses to sources produced
  by thought, and by responses to other people, either in speech or in the
  written word.  This is second hand information about the world. Man has
  invented symbolic speech which has to be learned before communication
  can take place.  Two stages are necessary, a knowledge of the coded
  signal and also what it represents.  For verbal meaning the item referred
  to, by social convention, is given a symbol or word and by association
  this leads to thought.
  
  Evaluation of J.J. Gibson's Contribution
  Gibson's theories seem to suggest that experiences are built on as they
  occur, so that by repetition they become clearer.  He believed that
  exteroception and proprioception work in conjunction in the same neural
  loops and referred (ibid., p.284) to "calibration" of the ranges of inputs
  from different perceptual organs, and that this might be learning of a
  higher order.  The higher order invariants go direct to the brain.
  
  Together with Katz, the main contribution made by Gibson to an
  understanding of touch perception was his belief that touch can be active
  as well as passive.  For approximately a century before his work became
  known touch perception was thought of in terms of sensations produced
  by cutaneous stimulation, and this led to the supposition that there were
  separate nerve endings in the skin running direct to the brain.  His work
  opened the way for psychologists to progress to the discovery of new
  facts about touch perception.  Some investigations were concerned with
  the peripheral mechanisms of touch perception such as pressure,
  vibration and shear force leading to further information on discrimination
  of roughness (Lederman, 1982), while others specialized in aspects of
  the central mechanism of the nervous system, such as asymmetry
  (Hermelin and O'Connor, 1971), memory, convergence, and cross modal
  function (Millar, 1994).  Both parts of the neural system are
  interdependent, and a greater knowledge of their functions should lead
  to a better understanding of the processes that underlie behaviour in
  braille reading.  Where appropriate, some of the major braille studies will
  be included.  Many of the latter have been carried out during the past 3
  decades and give little or no indication of the underlying psychology of
  touch perception.
  
  6. SOME EARLY INVESTIGATIONS INTO BRAILLE READING
     BEHAVIOUR
       
  It is an interesting facet of the use of braille that there is a large body of
  experimental work, known to educational psychologists, some of which
  was carried out before the work of Katz and Gibson became more
  generally known (Burklen, 1922, translated into English in 1932; Holland
  and Eatman, 1933; Holland, 1934; Fertsch, 1946, 1947; Kusajima, 1961). 
  These investigations mainly involved various aspects of braille reading
  behaviour, basically with the hope that such knowledge would help
  reading achievement.  There was a substantial increase in the number
  of investigations during the 1960s and 1970s in America when grants were
  more readily available, and because the reading of braille is a multimodal
  activity, this continues to be an area where there is scope for further
  investigation.  There must be some reason or reasons why the braille
  code has needed several reconsiderations whereas such is not the case
  for alphabet forms used in visual reading.  In general, it is true to say that
  most of the investigations connected with braille reading until the 1970s are
  geared towards the improvement of reading rate, which is measured in
  terms of words read per minute (w.p.m.).  Part 1 has shown the constant
  need for revisions of the code to be made in endeavours to make the
  code easier to read and to use.  Certain changes have been necessary
  from time to time for several reasons, viz. attempts to choose or alter the
  contracted versions of certain groups of letters in endeavours to help
  recognition, simplification of the many necessary rules, and an overview
  from time to time to keep braille usage compatible between countries
  using the same language.
  
  Burklen, a German psychologist, contemporary with Katz, published a
  study on touch reading by blind people (1922), but it was not translated
  into English until 1932.  His observations comprised the first detailed
  research on braille reading behaviour since the less scientific, but
  nevertheless worthy attempts to study embossed reading, carried out
  under the auspices of the American Association of Workers for the Blind
  from 1907 through 1913.  Burklen's studies covered such aspects as the
  characteristics of symbols, pressure, the use of left and right hands and
  speed of reading.  His work was not replicated because the braille
  reading was tested under rather artificial conditions.  For example, he
  used nail heads for braille dots, and later ones made of tin to save
  continual replacement of embossed paper because of the dots becoming
  pressed by constant use.  In addition, each student wore a
  "tastschreiber" on the reading finger which may have caused some
  discomfiture.  The device was bent round the reading finger and
  extended beyond the embossed material to smoked paper on which
  finger movements were recorded.  Even so, Burklen's work provided the
  fillip which encouraged further experimental studies on braille usage.
  
  Holland and Eatman (1933) compared the silent reading habits of good
  and poor readers in school grades 3 through 11.  These basic general
  observations were needed before more detailed examinations of reading
  habits could be carried out, hopefully leading to some means of
  improving performance, particularly in the field of rate of reading.  Moving
  pictures were obtained by mounting a camera above the subject's hands
  to photograph the fingers as they moved along the line of braille, and by
  means of a projecting device the records were later superimposed upon
  the material read.  Some of the time-consuming complexities of braille
  reading are indicated by the following list of information obtained: the
  total number of exposures per line, the average number of braille cells
  read by the left and right hand independently, the time taken by the
  subjects at the beginning and end of each line, the number of regressive
  movements made when facing difficulties of interpretation, and the time
  taken in making "return sweeps" to find the beginning of the next line. 
  More detailed reference will be made to hand use in the section on
  asymmetry later in this chapter.
  
  Holland (1934) next investigated the relation of pressure to the rate of
  reading by good and poor readers using a device for measuring
  pressure, a timing mechanism and a kymograph which recorded results
  on smoked paper.  The apparatus was not intrusive to the reading
  situation.  The results showed (ibid., p.17) that fast readers tended to use
  less pressure than slow readers, the amount of pressure varies within a
  given line, and poor readers showed a tendency to increase the amount
  of pressure as they read from the beginning to the end of a given
  paragraph.  The cause of these variations by slow readers was regarded
  as being largely due to difficulties in interpretation of meaning.  Holland
  regarded the study as "an hypothesis rather than an absolute truth" (ibid.,
  p.17) because of the small number of participating students.
  
  Similar general observations of braille reading to those of Burklen were
  carried out by Kusajima in Japan in 1961.  He recorded observations on
  the movements of the reading finger, and the function of the
  accompanying finger of the other hand.  As in Burklen's experiments, the
  students wore a tactual recorder on the reading finger.  In addition, he
  compared the differences between visual and touch reading.  He
  replicated his experiments with further detail in 1974.
  
  It is noticeable that because so little was understood about touch
  perception before 1960, all these early braille experiments observed
  reading behaviour and the knowledge was important, but there was little
  psychological explanation given on why such behaviour occurred.  In
  many instances the link still has to be made.
  
  7. PERIPHERAL MECHANISMS    
  
  Structure of the Glabrous Skin of the Human Hand
  During the 1970s investigations were carried out to determine in detail
  what contribution is made by the neural units in human fingers.  In this
  area the finger-pad is thicker than other cutaneous surfaces of the body
  (Quilliam, 1978, p.5).  Quilliam also gave the following information (ibid.,
  p.12).  The skin consists of several cellular layers, and the surface has
  ridges, well-known because their impressions are the finger prints used
  for legal purposes.  Sweat glands occur along the upper surfaces of the
  ridges in greater profusion than anywhere else on the body surface, and
  because the ridges are arranged parallel to each other, the "channelling
  effect" distributes the sweat evenly over the fingers.  Elsewhere on the
  body shearing force when applied to the skin results in wrinkling, but in
  the fingers (and toes) the outer and inner layers of the skin are attached,
  and it is possible that the sweat glands between the layers also help to
  bind the surfaces together.  Fat cells also contribute to make the firm,
  cushioned surface typical of the finger-pads.  These factors combined
  with the presence of an accumulation of sensory units concerned with
  temperature, pain, and touch perception, demonstrate the high quality of
  fingers as sensing mechanisms.  The implications for braille reading are
  obvious.
  
  Practically speaking, sensory units concerned with temperature also play
  a part in this activity.  From comments made by blind children and blind
  colleagues, it is known that braille dots cannot be successfully sensed
  when fingers are cold, and likewise, braille reading becomes difficult
  when fingers become hot and sweaty.  A cold surface also impedes
  reading, for it "feels smoother than neutral or warm ones" (Lederman,
  1982, p.141).
  
  Knibestöl and Vallbo (1970) were the first to demonstrate that there are
  4 main types of mechanoreceptors in the glabrous skin of the human
  hand (Vallbo and Johansson, 1978, p.32).  So far the function of these
  four types of nerve endings is not fully understood, though there have
  been several tentative suggestions (Vallbo and Johansson, 1978, p.33,
  p.36; Lederman, 1982, 143-145).
  
  Receptive Fields
  Receptors respond to mechanical energy, on/off units firing bursts of
  impulses at the beginning and end of excitation.  Two types of
  measurements can be made.  Mechanoreceptors can be rapid or slow
  responding and also vary according to the size of their receptive field. 
  Rapidly adapting units with small receptive fields are appropriate for
  braille reading, and indeed, the finger pads have been shown to be rich
  in these particular units (Vallbo and Johansson, 1978, p.44).  The second
  type of measurement registers their sequence of responses resulting
  from sustained indentation.  These investigators commented (p.48) that
  "particularly striking is the very high density of these two unit types at the
  finger tips, indicating that this is a skin area with outstanding qualities for
  tactile spatial analysis".  It would seem that together with the versatile
  movements of the hand as a whole, movements in this part of the body
  are well adapted to tactile activity.
  
  8. PSYCHOLOGICAL STUDIES OF PERIPHERAL
     MECHANISMS
       
  Roughness Discrimination
  Because braille consists of raised patterns of dots it is possible to think
  of the reading finger moving along a line of characters in a continuum of
  changing textures.  Lederman (1982, p.131) wrote, "the perception of
  texture may be thought of as a microcosm of the entire spectrum of
  perceptual activities", and (ibid., p.135) "texture perception by touch still
  remains relatively unexplored to-day".
  
  Roughness is an aspect of texture discrimination and this fact was used
  by Nolan and Morris (1965) in the realm of braille reading.  They
  published a test which was intended to show the development of the
  ability of young blind children "to utilize the tactual receptors and hands
  in a co-ordinated fashion", this being critical for the reading process.  The
  test included comparisons between sandpaper of different grades of grit. 
  No relation was found between the ability to discriminate degrees of
  roughness and chronological age, but the ability was positively associated
  with level of grade assignment.  An important finding was that growth in this
  ability appeared to level off after Grade 3.  The test was used in schools
  in America to predict likely ability in the use of braille for reading and
  writing, and was one of the tests used by Nolan and Kederis (1969, p.88)
  when selecting subjects for testing "the influence of number of dots and
  position of dots on recognition thresholds for braille words".  Sandpaper
  shows an unregulated mass of texture compared with the more regular
  positions of dots and spaces which make up braille configurations.
  
  Pressure, Vibration, and Shear
  It has already been shown that pressure applied to a surface results in
  a passive impression, but that movement in active touch is dynamic and
  includes vibration.  Katz was more interested in surface structure rather
  than its shape (Krueger, 1982, p.41) and during investigations of
  movement considered that "the vibration sense represents temporal
  holism.  The hand as a unitary organ ... represents spatial holism" (ibid.,
  16-17).  That is, both time and space are involved.  Put another way,
  three aspects may be recognised: the two surfaces involved, finger
  force, and speed of movement.
  
  Vibrations are set up in the skin when movement takes place.  As the
  finger traverses a line of braille, the skin is squashed upwards towards
  the deep-seated part of the finger-pad and sideways as a result of lateral
  movement.  Quilliam (1978, p.12) suggested that when this occurs a
  secondary type of vibration may also be present caused by movement
  across the ridges and spaces on the cutaneous surface, thus producing
  multiple exposure to the stimulus (ibid., p.12).  However, the
  clarity of perception is blurred to a certain extent by the "oiling" by the
  sweat glands whose function affects the shearing action between the
  finger and surface being explored.  Katz (Lederman, 1982, p.133)
  showed that by smearing the sensing finger(s) with collodion perception
  was improved, and Lederman (ibid., p.138) obtained a similar result when
  thin paper was placed between the finger and the rough surface.  Both
  methods would neutralise the effect of the layer of sweat.  The degree of
  callus on elderly or work-roughened fingers would have the same effect.
  
  Shear force, speed of movement, already studied by Holland (1934), and
  the effect of temperature may also be involved.  Lederman included
  these aspects when she made a series of systematic studies on
  roughness discrimination using "aluminium plates with linear gratings of
  rectangular cross section cut into the surface" (Lederman, 1982, p.136). 
  Concerning the surface of the plates, results showed that the ratio of
  groove to ridge width does not affect perceived roughness and neither
  does the fundamental spatial frequency of the stimulus grating.  Finger
  force proved to be the second most influential factor, perceived
  roughness increasing with increases in force applied perpendicular to the
  surface; and for hand speed, perceived roughness decreased slightly
  with increasing speed, but this was negligible relative to groove width
  and finger force effects.  Lederman (ibid., 136-137) argued that if the
  effect of hand speed was negligible, the actual movements of the skin are
  unimportant, and therefore the temporal pulse frequency of skin
  displacement plays no role.
  
  Lederman herself questioned whether such results could be used in
  comparison when other types of surface are involved (ibid., p.142).  This
  aspect must surely be taken into account before comparison can be
  made with the sensing of a braille surface, for a metal surface with slits
  and ridges will give a very different "feel" from the use of plastic or paper
  surface covered with domed dots.  For example, in the one the finger will
  slightly penetrate the slits, whereas in the other it will tend to slightly fold
  over the protuberances.
  
  It has been said here that roughness is a part of the structure of texture. 
  For braille, texture has added meaning, and therefore this aspect will be
  addressed in the section on central mechanisms.
  
  9. CENTRAL MECHANISMS
  
  Parts of the Brain Involved in Touch Perception
  When the neural inputs from peripheral regions involving touch reach the
  brain they join the cerebellum which lies under, and towards the back of
  the brain.  Overlying the cerebellum and anterior to it is the mid-brain and
  overlying that is the cerebral cortex.  The latter, consisting of much
  convoluted soft tissue, and therefore giving a much enlarged total
  surface, is divided into 2 hemispheres connected by fibres which are
  together known as the corpus callosum.  From the cerebellum the
  information is linked with a network of intercommunications between
  different, specialized regions of the brain.
  
  Each of the hemispheres consists of the occipital lobe at the back, the
  parietal lobe further forward with the temporal lobe beneath, and anterior
  to these is the frontal lobe.  When the neural impulses reach the corpus
  callosum most of them diverge to opposite sides of the brain, so that
  information from the left side of the body is controlled by the right
  hemisphere and the left hemisphere deals with information from the right
  side of the body.  There is considerable specialization in different parts
  of the brain.  Sherrick and Craig (1982, 60-63) showed that the touch
  sensitivity in monkeys is located in the parietal lobes of both
  hemispheres, and Millar has stated that in humans touch information is
  represented in the anterior part of the parietal cortex with spatial coding
represented in the posterior part of the parietal lobe (ibid., p.53).
  
  10.     PSYCHOPHYSICAL STUDIES OF CENTRAL
  MECHANISMS
  
  Hemisphere Asymmetry
Some of the information concerning the relative functioning of the two
  hemispheres has been gained from observations of malfunctioning due
  to illness, operations, gunshot wounds and the like.  For example, it was
  found that when the left hemisphere was damaged, speech was
  sometimes affected, although "some aspects of language are also
  represented in the right hemisphere".  In a similar way it has been found
  that the right hemisphere is involved in space recognition, yet not
  exclusively so.  Concerning activity in the hemispheres, Millar (1994,
  p.58) wrote, "The systems do not simply duplicate each other; they are
  sufficiently specialized to provide additional functions, as well as forming
  a basis for fail-safe multiple representation".
  
  Before 1971, investigations concerning the best hand or hands for
  reading braille had proved inconclusive (American Association of
  Workers for the Blind, 1913, both hands; Burklen, 1932, left hand;
Fertsch, 1947, right hand).  Teachers were, therefore, inclined to leave
  preference to the readers.  Hermelin and O'Connor (1971) were the first
  to apply a psychological explanation for these different findings.  They
  quoted Kimura (1971) who had shown that numbers of dots shown
  visually were identified better by the left than the right visual field, that is,
  by the right hemisphere.  Conversely, letters were identified better by the
  right visual field, that is by the left hemisphere (Hermelin and O'Connor,
  1971A).  As braille reading involves both dots and representations for
  letters, that is, space as well as language, Hermelin and O'Connor
  questioned which hemisphere and therefore, which hand, was best for
  reading punctiform characters (ibid., 1971A).
  
Fourteen children, aged between 8 and 10 years, read separately with the
index and centre fingers of the left and right hands respectively.  The
centre-finger reading had been included to obviate the practice factor. 
  Results showed that left-handed reading (right hemisphere - space) was
  significantly faster than right-handed reading (left hemisphere -
  language).  This fact suggested that "the brain treats input such as braille
  reading material, as spatially arranged items, to be more efficiently
  analysed by the right hemisphere before or while verbal coding of the
  material occurs in the left".  A similar experiment was carried out with
  adults (ibid., 1971B).  No difference in speed was found between the use
  of each hand but fewer mistakes were found in reading with the left
middle finger.  It was therefore assumed by teachers that pupils who read
with the left hand had an advantage over those who used right-handed
reading.  Further experimental work concerning right-, left-, or two-handed
  reading of braille is explored in Chapter 8.
  
  Convergence
  So far only a general account has been given of the parts of the brain
  involved in touch perception.  Input from the peripheral regions is co-ordinated in the cerebellum and sent on to specific regions, mostly in the
  cerebral cortex, where it is encoded, and responses when required are
sent to the motor output neural systems.  The paradigm of separate
specialised inputs as the sole means of coding must be rejected as too
simple an explanation.  For example, it has been
  seen how information concerning space and word recognition combine
  to help determine hand use when reading braille.  Major specialised
  regions of the brain have been recognised (e.g. Longman, 1982, p.656,
  diagram) and these are linked up by a complicated network of neural
pathways; indeed Millar (1994, p.54) has hypothesised that as the
cross-modal functioning of the brain becomes better understood, a finer,
more inter-relating network of pathways is likely to be discovered, and the
information centres will be recognised as sub-divided into smaller, more
specialised units.
  
  It could be thought that for those lacking the sense of sight all the
  enrichment that comes via the visual pathways is lost.  This impression
  may be enhanced by the fact that many psychologists have used
  comparisons of those without vision as controls, in order to find out more
  about visual conditions, thus strengthening the negative side of the
deprivation.  Millar (1994, p.84) suggests that being without the
information coming via the visual pathways not only causes greater
  dependence upon the remaining senses, as would be expected, but that
  extra neural links may, in consequence, be formed between the
  specialised areas.
  
Information will usually be selected from several inputs.  For example,
when moving across a letter U, some of the inputs which combine for
recognition of this braille configuration may include the number and
position of the dots, the outline shape of the letter, and the likelihood of
it being U because the previous letter was Q.  Sometimes the information
  from more than one specific area will result in redundancy, making the
  impression stronger, and "one condition in which correlated information
  from another source facilitates recognition, is when perception from one
of the sources is difficult, or clarity is reduced" (ibid., p.43).  Millar (1994,
p.15) suggests that this cross-modal functioning can lead to a partial
overlap of information, that is, "the sense modalities are sources of
specialised, but complementary and convergent information". 
Since this partial overlapping information comes from differently
specialised inputs, theoretically this partial redundancy should lead not
only to more accurate input, but even to "improved tactual recognition of
otherwise difficult patterns" (ibid., p.44).
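The way in which partially redundant cues from differently specialised
sources can strengthen recognition may be illustrated with a small
computational sketch.  The following Python fragment is purely illustrative
and not drawn from any of the studies cited here; the cue names and all
numerical values are hypothetical, and the fusion rule (multiplying
independent cue likelihoods, naive-Bayes style) is simply one convenient
way of modelling convergent information.

```python
# Illustrative sketch only: cue combination for recognising a braille letter.
# Cue names and all probabilities below are hypothetical, chosen to mimic
# the example in the text (degraded dot pattern, noisy outline shape, and
# the contextual cue that the previous letter was Q).

def fuse(cue_likelihoods):
    """Multiply independent cue likelihoods for each candidate letter
    and normalise the result to a probability distribution."""
    candidates = cue_likelihoods[0].keys()
    scores = {}
    for letter in candidates:
        p = 1.0
        for cue in cue_likelihoods:
            p *= cue[letter]
        scores[letter] = p
    total = sum(scores.values())
    return {letter: score / total for letter, score in scores.items()}

# The dot-pattern cue alone only weakly favours U over V:
dots = {"U": 0.55, "V": 0.45}
# The outline-shape cue is also noisy:
shape = {"U": 0.6, "V": 0.4}
# The context cue: after Q, the letter U is far more likely:
context = {"U": 0.95, "V": 0.05}

fused = fuse([dots, shape, context])
```

On these made-up numbers, no single cue identifies the letter with
confidence, yet the fused estimate favours U overwhelmingly, echoing the
point that correlated information from another source facilitates
recognition precisely when perception from one source is difficult.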
  
Warren (1982, p.123), though writing in general terms, offers an opinion
that may well apply to braille reading: "... there is much research to be
done on the encoding and retention of haptically gained information.  As
with most developmental research, careful attention will have to be paid
to the comparability of experimental tasks across age groups.  Converging
  operations, in which the same issue is approached from several
  experimental paradigms, will be necessary before firm conclusions can
  be reached".
  
  Memory
  Contrary to early beliefs, there is no one centre of the brain labelled
  "memory".  J.J. Gibson (1968, p.264) wrote "... a kind of memory in a new
  sense of the term is definitely required if we are to explain not
  apprehension over time, but repeated apprehension over time.  For the
  fact is that an observer learns with experience to isolate more subtle
  invariants during transformation and to establish more exactly the
permanent features of an array".  Memory is part of the whole process of
input from peripheral regions, and encoding (that is, organising the
information so that it is available for synthesising with other input) makes
retrieval possible when required.
  
  Problems arise when attempts are made to discover more about haptic
  memory, together with speech, and, at a higher level of thought.  Warren
  asked the questions, how is the information encoded, and what is stored,
  suggesting four possible approaches (Warren, 1982, 118-122):
1. asking for haptic reports;
2. instructions given in an attempt to influence coding of haptically
gained information;
3. experimental interference with coding during retention period;
4. comparison of performance by groups that are assumed, a priori,
to differ in ability to perform a certain kind of coding.

Warren (ibid., p.122) considered the third option was promising, but had
  not yet been sufficiently tested to give a good indication of its potential. 
Even so, Millar (1974) assumed that this programme might give results
for blind children's reading.  First, in 1974, she tested subjects aged
  approximately 10 years using four duplicated sets of three-dimensional
  nonsense shapes; the interval activity involved unfilled delay, verbal
  distractor, movement distractor and movement rehearsal respectively. 
  The last condition used finger tracing of the shapes from memory.  Both
  blind and sighted children were included in the samples.  It was argued
  from the results that "tactile short-term memory involves both decay of
  tactile impressions with time, and interference by additional activities with
  a longer-term process" (ibid., p.263).  Further tests were carried out in
  1985 involving braille letters, which are described in the next chapter. 
  Both experimental interference and instructions given in an attempt to
  influence coding were involved.
  
  11.     THE INTERLOCKING STRUCTURE OF VARIABLES
  
  So far the mechanisms of perception in both the peripheral and central
  regions have been described, but what makes the mechanisms work? 
A car cannot drive itself; its driver needs intelligence and ability, but
these qualities are still not enough.  A driver has to learn
  how to use the mechanism before competence can be established, and
  so it is with learning how to recognise braille characters and use them in
  words and sentences.  For children, there is the added complication
  concerning the rate of development at different stages during the
  learning process.  These aspects all interact and need to be understood
  for the most successful teaching and learning situations.
  
  The use of standardized tests is one way to monitor progress.  The
  Williams Intelligence Test for children with defective vision (1956) has
  been proved to have a satisfactory overall test/retest correlation over a
  two-year period (Tobin, 1994, p.40).  Gomulicki (1961) investigated the
  basic learning capacities of blind children, and his tests included
  observations of manipulative ability and tactile discrimination.  In each
  case (p.22 and p.24 respectively) intelligence played a significant role. 
  In his concluding remarks concerning all the investigations he stated
  (ibid., p.52) "... the net correlation between intelligence and performance
  is nevertheless significant at a higher level for the blind than for the
  sighted".  Nolan and Kederis (1969, p.44) considered that "mental ability
  is a limiting factor" and suggested (ibid., p.48) that "for students whose
  IQ is below 85 braille is an extremely inefficient medium of
  communication and that the necessity of mastering it may constitute an
  additional handicap".  Tobin (1971, p.52), when considering some
  teaching and psychological variables, wrote "the correlational part of the
  study was aimed at uncovering something of the interlocking structure of
  organismic and personality variables associated with success in learning
  braille" and as an integral part of this interlocking, has stated (personal
  comment) that the correlation between intelligence and braille is higher
  than the correlation between intelligence and print reading.
  
  Aspects of braille reading can be confidently assessed using
  standardized tests such as those devised or adapted by Tooze (1962),
  Lorimer, J. (1962), and Lorimer, J. (1977).  These tests are all referred
  to in more detail in the next chapter.  They are all used individually and
  scores gained may be compared with the norms provided for each age
  group.  Such tests give general markers of progress achieved and some
  also have diagnostic value.  However, for unstandardized but more
  detailed information on specific aspects, such as hand use and strategies
  used in character recognition, it is necessary to become familiar with
  investigations which have been carried out by educational psychologists
  over the years.  A selection of the more important of these is presented
  and reviewed in Chapter 8.
  
  The American Association of Workers for the Blind made the first studies
  of how braille reading is carried out (1907-1913), and from then until the
  present day most investigators have been aware of the necessity for
  comparing the achievements of different age groups and differing
  abilities and linking this information with stages of development.  It would
  seem to the writer that more information is needed to show in detail the
  effects of learning without the major sense of sight; how the individual
  copes with this deficit; and, more specifically in relation to the use of
braille, the effect that age of onset has on individual progress.  "It should
  be obvious, then, that there are some important unanswered questions
  in the development of haptic perception.  The questions are both
  theoretical and practical.  Vigorous focal research is needed to answer
  them" (Warren, 1982, p.126).
  
  