Burroughs' Future in Electronics

MRS. MARY HAWES*
NMAA** 1959 Proceedings - Data Processing
Burroughs' future is electronics.
In fact, we don't have to wait for the future; Burroughs' present is
      electronics. We have so many developments in progress that I could easily
      spend the next hour, my allotted time, describing some of the very
      interesting phases of our work. I could even limit myself to those
      involved with data processing equipment without ever touching on some of
      the problems with which I believe you are most vitally concerned. What I
      should like to do before discussing my favorite subject of data processing
      and the use of electronic computers, is to take you with me for the next
      five or ten minutes during which time we will forget our present problems.
You can even forget you are in this room if you wish. You see, I
      had the privilege of having a math professor who used to take us on such
      jaunts in which he would describe how various businesses, industries and
      government agencies were using mathematical tools about which we were
      studying or which might be under development. It was great fun and
      furthermore, it broadened our horizon and helped us to realize that our
      problems, though large in themselves, were part of a much larger group of
      problems. Slightly more than ten years ago today,
      the guiding minds of Burroughs recognized that Burroughs' future lay in
      electronics. They started then setting up the research laboratories and
      obtaining the facilities which they knew would be required if we were to
      develop the electronic tools which business, industry and our national
      defense would need. I am well aware that most of you are in
      business or industry. Before you set up a mental block against my
      including "national defense" so much in my introductory remarks,
      I would like to emphasize the very important role it plays in the overall
scheme. The most important requirement of our national defense is that we
      have equipment that will do a specified job in no more than a given
      elapsed time, with the utmost in reliability; and that such equipment be
ready for this job no later than a given date. In other words, the cost
of not being able to do a job at all, or of not being able
to do it in the specified time, is of much greater significance than
      the cost in dollars and cents of doing the job. Business and industry come
      into the picture when we have the equipment that can do the specified job
      for no more than "x" dollars and cents. Perhaps now you can
      understand why it is that various developments are first made available to
      our national defense. There may be a considerable time lag before they are
      offered to you. A great amount of additional research, including the
      development of different techniques, usually must be spent to make some of
these tools practical for your consideration.

All of you know something about the very large data
      processing national defense projects known as the Sage and Atlas programs.
      The Sage project is the Air Force radar and data processing network
      developed to protect us against air attack. Burroughs has been principally
      concerned with the problem of developing data processing and transmitting
equipment for this system. Again, with the large scale data processing
      equipment for the guidance of the Atlas missile, first at Cape Canaveral
      and now around the nation, Burroughs is playing a leading role in
      developing computers which can control the path of a missile even after it
      has been fired. Furthermore, since the control comes from a program,
      improved guidance mechanics can be incorporated into the system. To give
      you some idea of the size of these programs, Burroughs contracts on these
two programs alone amount to something in the neighborhood of $220,000,000. From the very large electronic data processing systems
      we can go to the very small and compact computing systems for submarines
      and aircraft, and to electric timing devices which could be used in the
      warhead of a missile. We are doing a great amount of work on
      miniaturization, sub-miniaturization and micro-miniaturization.
      Miniaturization has been explained as the process by which a lot of little
pieces spread over a lot of square inches are all
      crammed together in the smallest possible sealed container. For example, a
      one cubic inch plastic block may contain as many parts and pieces as the
      average portable radio. Yes, I know you are already asking yourself, how
      do you get inside when something goes wrong? The answer is, you don't. To
      begin with, it isn't necessary to "get inside" very often; and
      if you should, you throw the offending unit away and replace it with
another. Life tests run on some of these units indicate you should expect
      to replace an offending unit every ten years or more. We are carrying on some very interesting
work in the area of memory devices. Even though the
      results of some of our research using some of these memories on a small
      scale seem fantastic even to us, we realize it will probably be
      considerable time before we can make them available to you in the size you
      will require. To come a little closer to home, I am sure most of you
have heard of the work we are doing in the area of electrostatic
      printing. A prototype Whippet Printer was built for the Signal Corps which
      prints 3,000 words per minute. However, the electrostatic technique makes
      possible the printing of alpha-numeric information at speeds up to 30,000
      characters per second. A great amount of research has gone into developing
      techniques by which printed information, such as that recorded on bank
      checks with magnetic ink, can be automatically read by electronic devices.
      The ability to automatically read this information together with the
      ability to control the rapid movement of pieces of paper varying in size,
      permits us to sort bank checks at speeds of 1,500 checks per minute. This
      development is a major key in making practical the use of electronic data
      processing systems for banks, including the area of commercial
      bookkeeping. The value of research has always been recognized by
Burroughs. Many of the projects initiated in the
      laboratory today will not reach you for two, five or even ten years. Last
      year, almost 60% of our sales were for products not known ten years ago.
      It is expected that an even larger percentage of our income ten years from
      today will be from products and developments now in the laboratory or not
      yet on the drawing board. But enough of the future and near
      future. I have not begun to cover the many projects in which Burroughs is
      currently participating, nor have I even mentioned any of our electronic
      computer systems with which you may have worked in the past or with which
      you are currently working. In fact, from this point on, any
      reference to equipment will merely be to help put across a point. I wish
      to discuss with you what we believe to be some of the problems you have
      encountered or will encounter in your data processing using electronic
      computer systems. It was in 1886 that William Seward
      Burroughs developed the first practical adding machine. Burroughs still
      makes adding machines but I dare say there is as little resemblance
      between the appearance and anatomy of that adding machine and the one our
      salesman offers you today, as there is between the personnel who used the
equipment in 1886 and today's user. However, the reason for
      the development of such a tool is not so different. Business and industry
      recognized that more accurate control was necessary if their businesses
      were to grow and prosper. In the early days of electronic data
      processing computers, a great amount of effort went into proving that we
      could use them effectively. Having put data processing applications on
      these electronic computer systems and also having gone into companies and
      seen the same jobs being performed principally by young ladies, I have
      never ceased to be amazed. I am certain that it would take me a long time
      to be as dexterous as some of these young ladies are as they sort pieces
      of paper or insert and withdraw cards from a file. I have even seen row
      after row of persons operating desk calculators with one hand and
      recording results with the other. However, it is also very alarming to
      realize that the greatest speed and accuracy is obtained when their
      actions become automatic and all but bypass the thinking mechanism of the
      human being. I well recall one of our early studies
      in which we were asked if we could process 50,000 transactions per hour
      against a master file, creating an up-to-date master file together with a
      comprehensive tabulation of the results. We were finally permitted to
      actually put the problem on the computer and demonstrated we could process
117,000 such transactions per hour, more than twice as many as required. We
      were immediately asked if we could introduce a large rate table, which was
      in the same sequence as the file, and compute various required values as
      we processed the transactions. The procedure was altered, with the result
      that our rate of processing transactions now dropped to approximately
      105,000 per hour. Although we had been given a week to incorporate this
      change, we were able to do so and check out the altered procedure in 8
      clock hours, including one hour of computer time. I shall never forget the
      meeting which followed, for it was then I was told, "Mary, it is far
more important and significant that
you were able to incorporate such a major change in
      procedure in so short a time, than it was that you could process the
      transactions at the rate of 117,000 per hour rather than 50,000 per
      hour." Normally about 30 persons would have been involved with this
      phase of their data processing because of the large volumes of data
      involved. The length of the training period required in effecting a major
      change in procedure, including the correction of the errors made en route,
could easily have taken a major part of a year. I might add that one of the
major headaches in effecting new procedures where a number of persons are
involved comes with the way exceptions are handled. And as you know so
      well, data processing problems sometimes seem to be one exception followed
      by another exception. Of course, it isn't quite this bad. But it must be
      remembered that most of our data processing is concerned with
      "SERVICE" of one type or another and "SERVICE" is very
      closely allied to human beings. Where human beings are involved, you will
      always have exceptions. Thus, the most important feature of electronic
computer systems is the ability to change.
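The procedure just described is, at bottom, the classic sequential master-file update: transactions and master records, both sorted on the same key, are merged in a single pass, with the in-sequence rate table consulted as each transaction is applied. A minimal sketch in modern Python, purely to show the shape of the pass (the record layouts and the rate computation are hypothetical illustrations, not the actual Burroughs procedure):

# A sketch of a one-pass sequential master-file update. All three inputs
# are assumed sorted on the same key, and every transaction is assumed to
# match some master record (both simplifications).
def update_master(master, transactions, rates):
    txns = iter(transactions)
    txn = next(txns, None)
    for key, balance in master:
        # Apply every transaction whose key matches this master record.
        while txn is not None and txn[0] == key:
            balance += txn[1] * rates.get(key, 1.0)  # compute the required value
            txn = next(txns, None)
        yield (key, balance)  # write the up-to-date master record

old_master = [(1, 100.0), (2, 250.0), (3, 40.0)]
day_txns = [(1, 10.0), (3, -5.0), (3, 2.0)]   # same sequence as the file
rate_table = {1: 1.05, 3: 1.00}               # the in-sequence rate table
print(list(update_master(old_master, day_txns, rate_table)))
# -> [(1, 110.5), (2, 250.0), (3, 37.0)]

Because the whole pass is driven by a few lines of procedure, introducing the rate table is a small, checkable change rather than the retraining of thirty clerks, which is precisely the point of the story.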
Let us carry this idea a bit farther and say that one of the prerequisites of data processing is the ability to change. Business
is never stagnant; it never remains the same for long. A rather striking
      example of this that comes to mind is the company that was making foam
rubber sofa pillows and mattresses. They had a very successful business and
      decided to automate their production line to speed up their production and
      reduce their expenses. In a relatively short period of time, their
      sales of sofa pillows zoomed to new heights, while the sales of mattresses
      decreased slightly. Production was adjusted; the same amount of raw foam
      material was used but a much larger percentage went into sofa pillows. The
      only difficulty came near the end of the fixed production line where
      pillows were coming at such a rate that it was impossible for the girls to
      handle them fast enough in the space available. A sea of pillows soon
      developed. You can imagine the results of having built a production line
      without full realization of the need to change. In this case they brought
      in more women and set up an inefficient but workable system to stem the
tide until they were able to cope with their "changing" world. Somewhere, in a large percentage of the articles you
      read today on electronic data processing, you will find the term
      "systems approach." This is not just a catch phrase; it has a
      very real and deep significance. I am tempted to say that the need for a
      "systems approach" is no greater today than it was ten years
      ago. This would not be true. The world in which we live today is not the
      same as it was ten years ago. The requirements in business and industry
      have altered in proportion to our personal lives. Once it was sufficient
to produce a better mousetrap. Today you must also merchandise it more
      cleverly and be prepared to service it more effectively than does your
competitor. However, I do believe, looking backward, that the need for
seeing your requirements as a whole has always been present. But when you
      go into electronic data processing, you are face to face with making many
decisions in advance. It sometimes appears that this requirement to plan
"What will we do if ---?" has forced you to examine your
problems more closely than you have for
a long time. So many of these "little" questions are confounded
      with company policy. When you realize that every time a certain exception
      presents itself, it will be processed in the same prescribed manner, you
      find yourself giving it more thought than you might have on a
      "one-of-a-kind, every-so-often theory." This scrutiny at a detail level can be
      the eye opener. Once you have started the analysis, it is not too long
      before you realize that the large savings in paper work management come at
      the systems level. You also realize that now you have a tool which not
      only lends itself to a systems approach but which can produce a far more
      effective piece of work if a systems approach is used. I have not heard of
      a single complaint from a company that has spent a large sum of money
      looking into its data processing procedures and requirements as a part of
      an electronic data processing system evaluation, since they have found
      that money was well spent whether they decided in favor of or against the
acquisition of such a system. Not so long ago, I read of a company that had
      decided in favor of an electronic data processing system. They also
      decided not to wait until the electronic equipment was installed to
      incorporate their overall systems approach into their methods and
      procedures. They estimated the savings resulting from changed procedures
      would not only pay for the study but also for the first year's rental on
      the new system. It is my contention, and I have said it in front of many
      an audience, that the majority of users of electronic data processing know
      more about their problems, at both the overall and the detail levels, at
the end of their first year of operation than they had ever thought possible. In
      the majority of cases, the computer has been used for procedures not
      possible with manual methods or with tabulating equipment. The results
      have been extremely interesting. It has been estimated that between 65%
      and 85% of the time spent in getting a data processing problem ready for a
      computer is spent in the definition of the problem. I have experienced
      running a "realistic" demonstration analysis on a
      "system," having this system undergo six major revisions during
      preparation for actual installation (meaning we now incorporated the
      various exceptions and idiosyncrasies which had not been defined as part
      of the original problem), and one more major revision after installation.
      I should add that this system naturally enough included a file maintenance
      job and that the creation of the master file from a composite of several
      card files, turned out to be almost as difficult a job as the data
      processing system once the master file existed. In attempting to
      explain why the problem definition seems to be so involved, it is my
      opinion that the difficulty lies in the fact that so many different people
      know only a small fraction of the job. In some instances, pieces are
      missing; in other instances, functions overlap and may even be handled
      slightly differently depending upon who does it.
      Data processing is never simple. The only job that is really simple is the
      one you haven't yet tackled. I well recall the first file maintenance
      system we developed as a demonstration problem for an electronic data
      processor. It was a beautiful system; and it worked fine on paper. It was
one of those "straightforward" life insurance jobs where
we took care of everything since we allowed every field
to vary. It took us several man-years to do the job and naturally, having
      "received the problem," we didn't have to work too closely with
      the people who knew the problem more intimately. Even today I almost blush
      at our innocence and ignorance. The processing procedures were all right
      as far as they went, but they didn't go far enough. We had attempted to be
      very sophisticated and achieve the utmost as far as computer usage was
      concerned. However, the problems we created for the operator were
      fantastic. We had not faced up to the control and general flow of
      information or paper work within the company whatsoever. In other words,
      we had taken a very real application and put it on an electronic data
      processor without looking at the entire system or how we were to control
      it in a day-to-day environment. We had a group of procedures each of which
      was fine in itself; but we had no system and very little control. Time is so very important today. There never seems
      to be enough of it. As a result of too little time, of attempting to make
      use of a new tool before the ground has been prepared, a number of
      electronic data processing systems have been installed prematurely. This
      may have resulted in an expensive lesson but it may also have helped us to
      learn our lesson earlier and better than we might otherwise have done. A
greater number of persons have found their problems to be larger and more
complicated than have found them smaller and simpler. Also they have found that as
      the complications rise, the cost also rises. On the other side of the
ledger, they have discovered the potential of their new-found tool for the
      solution of problems they could not have tackled otherwise. But perhaps
      most important of all, they have found that by using an electronic data
      processing system, they obtain control to a degree not possible
      heretofore. We talk about management reports, about management by
      exception. We talk of maximizing profits and minimizing costs. We dream
      about the time it will be possible to have the month's financial report
      available on the first, or at most the second of the following month; or
furthermore, for the Manager to be able to dial into a computer from his
      desk, and ask for certain detail figures on a particular part of that
      report. He might even ask for an analysis of the data related to this
      particular part of the system over say the past 3 or 6 months. He might
      ask that the system analyze what the results would have been if "such and such" had happened 6 months ago, 3
      months ago, or predict what would be the result of incorporating a
      particular change assuming everything else will follow its current trend.
      There is no question but that being able to obtain this type of
      information when you want it, is worth a very great deal. How many dollars
      and cents? No one has been willing to estimate it for me. I was talking to an industrialist less than a year ago
      about what he thought the computer climate might be four years in the
      future. He forecasted that his company would rate priorities to computer
applications as:

A. Design Applications--20%
B. Optimization Problems--40%
C. Data Processing Applications--40%
He went on to say that the reason for the priorities
      was that the design problems just had to get on the computer. The savings
accruing from computer usage for these types of problems, in time and in
dollars and cents, meant life blood to his company. In the area of
      optimization problems, he told of a relatively recent experience in which
      they had put a given problem on one of the faster computers with a large
      memory. Never before had it been possible to analyze such a large amount
      of related information; what was more, the results from this particular
      problem were available to management before they made their decision
      relative to the problem area. The story went on. Management took time to
evaluate what their decision would have been had this information not been
      available, in contrast to what their decision actually was. The saving to
      the company was a low order six digit dollar figure. They place data
      processing applications last only because the others must be placed first. The number of computing systems being sold for data
      processing applications is steadily increasing. The amount of paper work
      is still increasing in volume while the number of available competent
      clerks is decreasing. But still more important than the requirement to
      process paper work is the requirement for more accurate, up-to-date, and
      relevant information to form a better basis for management decisions. I have talked in very general terms about certain overall data
processing requirements including:

1. Approach your problem as a whole.
2. Allow for changes in procedures.
 3. Prepare in advance for exceptions.
 4. Do not overlook operational environmental requirements.
 5. Recognize the need for management information and control.
 6. Make your tools more effective through research.
To me, the two greatest problem areas for data
      processing applications lie with input-output and with the development of
      automatic programming techniques that will permit us to do the job at hand
effectively and efficiently. You are hard-headed businessmen who want all
that I have mentioned and then some. You want answers to all those
problems which have been bothering you for the past umpteen months but
      for which you see no solution; in some cases you haven't been able to
formulate the problem but you want the answer nevertheless. What is more,
      you want all of this, faster and more accurately than you have been
      getting your work done at a price less than you are currently paying. The
      only trouble with this is that too many of you are sitting back with a
      "show me" or a "watch and wait" attitude. For many years there were very few who felt that data
      processing type problems would lend themselves to automatic coding
      techniques. This was due primarily to the fact that there are seldom any
      two problems which are identical. Not only are the problems different, but
      the sub-sections are different. However, the approach to many problems is
      similar and the structure of the sub-sections is similar. The solution
lies in the use of what we call
generators rather than the fixed subroutines which are
      common in the mathematical type problems. In using a generator, you start
with a structure of skeletal coding and build a
      particular subroutine to fit a particular set of specifications or
      parameters. You might say you make use of generators to produce
      "custom coding." It is interesting to note the difference in the reason
It is interesting to note the difference in the reason for the development of automatic coding techniques for mathematical type
      problems as opposed to the data processing type problems. Automatic
      techniques for the mathematical type problem were of direct benefit to the
      person doing the programming as a labor saving device. The programmer
      found he could use the same section of coding, with only a few address
      modifications, in another problem or many problems. He found it more
      interesting to let the computer incorporate these sections or subroutines
      not only because the computer could do it faster and more accurately than
      he, but also the programmer was now free to tackle new problems. From
      here, the next step was to devise methods by which the programmer could
      state his problem in general mathematical terms to eliminate still more
      coding. The form in which the problem was stated incorporated, to a
certain extent, the idiosyncrasies of the particular
      computer for which the compiler was written and also of the individuals
      who developed the compiler. This necessitated restating the same problem
      in a slightly different form for different computers. Especially was this
      true when the computers were developed by different manufacturers. All of
      you know of the efforts which have been expended recently toward defining
      a common algebraic language so that it will be possible to state a
      mathematical problem so that it can be understood not only by many
      individuals, but also by many different computers. For the data processing type problems, first came some
      generalized routines such as sorts, merges, and report writers. These were
      separate routines in which the parameters were specified so that various
      addresses and switches could be modified by the computer to make the
      resultant routine work on data consistent with the specified parameters.
      These are generators and were the forerunners of the generators which are
now used to develop subroutines or larger segments of routines.
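The flavor of those parameterized routines is easy to show in modern terms: one stock merge serves any record layout once the position of the key field is supplied as a parameter. A small Python sketch (the fixed-format sample records are hypothetical):

# A generalized merge: the key columns are parameters, so one stock
# routine serves many record layouts. Both inputs must already be
# sorted on the same key field.
import heapq

def generalized_merge(file_a, file_b, key_start, key_end):
    return heapq.merge(file_a, file_b,
                       key=lambda rec: rec[key_start:key_end])

branch_a = ["0001 DEPOSIT", "0007 CHECK  "]
branch_b = ["0003 CHECK  ", "0009 DEPOSIT"]
for rec in generalized_merge(branch_a, branch_b, 0, 4):
    print(rec)   # records come out interleaved in key order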
It was recognized at a relatively early date that the major difficulty
      encountered with data processing problems was with handling of information
      or the flow of information in and out of a data processing system.
Furthermore, the greatest percentage of inefficiency
      lay in this area. From the operational standpoint, the greatest amount of
      difficulty lay in the same area since the input-output equipment, being
      electro-mechanical, was more subject to down time than were the electronic
      circuits. Coupled with all of this was the necessity to incorporate
      restart points not only because of the length of time a particular problem
      might take, but also for recovery in the event of malfunction on the part
of the data processing system.
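The restart-point idea itself is simple: at intervals, record how far the run has progressed, together with any running totals, so that a malfunction costs only the work done since the last such point. A minimal sketch, assuming a hypothetical checkpoint file and state layout:

# Restart points for a long batch run: position and totals are saved at
# intervals so the run can resume after a malfunction instead of
# starting over (file name and state layout are hypothetical).
import json, os

CHECKPOINT = "run.checkpoint"

def load_state():
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)    # resume from the last restart point
    return {"next_record": 0, "total": 0.0}

def save_state(state):
    with open(CHECKPOINT, "w") as f:
        json.dump(state, f)        # lay down a restart point

def run(records, checkpoint_every=1000):
    state = load_state()
    for i in range(state["next_record"], len(records)):
        state["total"] += records[i]   # stands in for the real processing
        if (i + 1) % checkpoint_every == 0:
            state["next_record"] = i + 1
            save_state(state)
    return state["total"]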
Towering above all these considerations was the fact that the same routines will be run by different operators and
      that changes must be incorporated from time to time within these routines,
      most often by someone other than the original programmer. Furthermore, the
work must be accomplished on time. Paychecks must be delivered on time.
Vendors must be paid within prescribed periods of time.
      Orders must be filled on time. Therefore, if changes are to be applied,
they must be applied correctly. Not only must the corrections be applied
      correctly within one routine, but in all routines where the change is
      reflected. Out of this comes the realization that in data processing
applications, the emphasis must be placed on the entire system and upon
      its maintenance. Much time, analysis and experimentation have been spent
      in an effort to determine the best way to represent a data processing
system taking into account the difficulties associated with its:

1. Definition
2. Solution
 3. Operation
 4. Maintenance
 5. Revision.
 We at Burroughs are now on our way toward developing
      automatic coding tools based upon the use of multi-level flow charts with
      problem definition in English words, algebraic expressions and
      mathematical equations. Operational and procedural standards that
      contribute toward good systems organization are further incorporated by
      use of a library of subroutines, generators and data descriptions. The
      entire automatic approach is biased to favor revision and maintenance
rather than initial solution.
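Of the ingredients just mentioned, the data description is perhaps the easiest to picture: a record's layout is declared once, in one shared place, and every routine or generator interprets the file through it. A minimal sketch, with a wholly hypothetical payroll layout:

# A shared data description: the layout is declared once and all
# routines parse records through it (field names, column positions and
# the sample record are hypothetical).
EMPLOYEE_RECORD = [
    # (field name, first column, column after last)
    ("emp_no",  0,  5),
    ("name",    5, 25),
    ("dept",   25, 29),
    ("gross",  29, 37),
]

def parse(record, description):
    # Turn one fixed-format record into named fields via the description.
    return {name: record[a:b].strip() for name, a, b in description}

line = "00042John Smith          ENG 00123.45"
print(parse(line, EMPLOYEE_RECORD))
# -> {'emp_no': '00042', 'name': 'John Smith', 'dept': 'ENG', 'gross': '00123.45'}

When the layout changes, it changes in one place, which is part of what makes revision and maintenance tractable.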
Let me take a few minutes to attempt to indicate some of the problems we faced, and which contributed to our approach.
Overall, an automatic data processing system must recognize that human
      beings initiate the action, human beings control the action, and human
      beings also make use of the end product. Furthermore, the abilities and
      characteristics of the various types of individuals who are part of the
      system must neither be overestimated nor underestimated if they are to be
      able to turn in a "best" performance. Procedures must be kept
      simple to use; if complicated or sophisticated techniques are necessary,
      they must be internal. Emphasis must also be placed on economical
operation as well as on most effective results. It has been estimated that about 65% to 85% of
the cost of getting a data processing problem ready for a computer lies in
the definition of the problem. It is also one area that does not lend
itself to the compression of time. If you do not do sufficient planning
      and flow charting initially, you add time at an exponential rate later.
      One of the reasons for the complexities involved in defining the problem
lies with the fact that a properly designed data
      processing problem normally cuts across department lines. In different
      groups, practices differ, language differs and occasionally similar
      overlapping functions are processed somewhat differently. Since many
      persons collaborate in defining the problem, the language used must be
understood by all. Not only must all parts of the problem work, but they
must work together as a unit. The easiest place to locate a logical error
      or weakness is at this time when the problem is laid out in flow chart
      form. First is the general flow chart which includes how it ties into the
      company organization; second is the process flow chart which presents the
systems organization of the
problem itself, in terms of its various parts; third
      are the detail flow charts for each part. This detail flow chart is
      organized according to functions which are to be performed and more
      oriented toward the problem than toward the computer. The
      "language" used to define the problem must be understood by the
person who knows the problem. We find all too often that when procedures are
      presented verbally, one is so busy following the speaker that he is not
      free to explore alternate paths which may occur to him. On the other hand,
      if he does explore, he loses some information which is being presented and
      that is not good. Broad written paragraphs usually leave material out
      because of the voluminous writing which is required. The detail flow chart
      organized by function, not so detailed as instruction level, together with
the larger view flow charts seem to answer the problem with regard to:

1. Ability to obtain overall logic as well as detail logic
2. Ability to examine
3. Detail without too much writing
4. Standard language makes its meaning clear and concise.

Now if we add a fifth and sixth advantage:

5. Easy to translate to machine code
6. Easy to maintain and revise
 we really may have an excellent start toward reducing some of our major
headaches. It has been said many times that the greatest advantage of
      an electronic computer is not its ability to do a job faster but rather to
      do it better and also retain the ability to change and improve as we
develop better procedures and techniques. We must not lose sight of the desirability of being able to process
      our problems on more than one computer. In the early days we recognized
      this by seeing that it was possible to write a tape file on one tape unit
      and read it from any other connected to the same data processor. We soon
      recognized the necessity to read or write tapes on different computers of
      the same model as a requirement for back-up. This was enlarged to include
      different data processing systems by the same manufacturers as the
      installation grew in size. Now we recognize the necessity to be able to
      easily take the same data processing problem and put both it and its
      associated data on different systems. It is possible in a centralized
      operation that the central computing system is fed by several different
      smaller computing systems each of which is doing "practically"
the same job. It would be desirable both from the information
      and from the cost standpoint to have the problem defined once and
      only once with the variations added. Again, let me emphasize the maintenance and revision
      problems. It has been estimated by a number of users of large scale
equipment that as much as 25% of programming talent may be used in taking
      care of production runs. This consists of supervising the operation,
      updating the system and also making changes that would improve the system.
      This need to change easily and correctly exists long before an operation
is in production. The chances
are excellent that at least one major and several minor
      changes will be made in a system after it is "frozen." These
changes are not unwarranted; therefore, we need to recognize this as a
      pattern and be able to cope with it. During the last few minutes, I have attempted to point
out just a few of the difficulties you have
      encountered or will soon encounter. They may be
      "old hat" to you because the same type of problem cropped up in
      the last system you were responsible for. With
electronic speeds, they just crop up over a shorter period of time. Burroughs is doing research which allows it
to produce not only equipment which will better meet your requirements of
capability, reliability and cost, but also programming tools so you can use
that equipment more effectively and efficiently. We
      recognize that you in the field have the problems which you alone can
solve. We are attempting to give you the best
      tools possible so that together we can contribute to better business,
industry and government management and thus continue as a
leader of the world of tomorrow.

*Mrs. Mary Hawes is Senior Product Planning Analyst with the Burroughs
Corporation - ElectroData Division.
**Editors Note: Who Was the NMAA? In the following paragraphs there is a brief
overview of the NMAA, which became DPMA (my era!) and then
evolved into AITP. --Ed Sharpe, Archivist for SMECC

History of AITP

The year was 1951. Harry S. Truman was President of the
United States. A 3-bedroom home cost $9,000.00. A new Ford listed for
      $1,480.00; postage was $.03; and a loaf of bread cost $.16. Joe DiMaggio
      retired from baseball; I Love Lucy premiered; and peace talks began in
      Korea. In Chicago, a group of machine accountants got together
      and decided that the future was only beginning for the TAB machines they
      were operating. They were members of a local group called the Machine
Accountants Association (MAA). The technology was new, something few
people understood, and managing this new technology was a skill that even
      fewer people possessed. The machine accountants recognized the need to
      form a professional support group, a national association, to address the
      growing issues of this new technology. Thus on December 26, 1951, after a
      constitutional convention was held in Chicago, the State of Illinois
      granted a charter and the National Machine Accountants Association (NMAA)
      was founded. Groups from Houston, Columbus, Wabash Valley, the Twin
      Cities, Penn-Del, and 22 others were the first to join NMAA. Robert L.
      Jenal, systems manager for Toni Company, was elected the first
      International President at the 1952 First Annual Convention in
      Minneapolis. In 1960, the association sponsored a meeting of
      educators and businessmen with the purpose of establishing the Certificate
      in Data Processing (CDP) professional examination program. The first CDP
      exam was held in 1962 in New York. 1962 was also the year that the
      association leaders recognized the changing nature of information
      processing techniques brought about by the introduction of the computer.
      Thus, the members decided in 1962 to adopt a more progressive name, the
      Data Processing Management Association (DPMA), to reflect the changing
      industry. Always striving to promote the continued education of
      the members, the leadership of DPMA created the Registered Business
      Programmer (RBP) examination in 1970. Both the CDP and the RBP exams were
      given annually under the rules established by the Certification Council,
      at test centers in colleges and universities across North America.
      Eventually, DPMA decided to help establish the Institute for the
      Certification of Computer Professionals (ICCP) to stimulate more
      widespread interest and industry acceptance of the examinations. ICCP
      began administering the CDP program in early 1974. The association has always acknowledged the
      contributions of prominent professionals within the Information Technology
      field. Beginning in 1969 with the creation of the annual Computer Sciences
      Man-of-the-Year Award for outstanding contributions to the information
      processing industry, DPMA has established a long-standing tradition of
      honoring IT professionals from every aspect of the industry. This
      prestigious award was renamed the Distinguished Information Sciences Award
      in 1980 and is awarded every year at the Annual Meeting of the Members. As the industry has evolved, so has the association.
      Starting as the NMAA, evolving into the DPMA, and then into our current
      evolution in 1996 of the Association of INFORMATION TECHNOLOGY
      PROFESSIONALS (AITP), the association has kept pace with the changing
      needs and interests of our members. AITP members span every level of the
IT industry from mainframe systems, to micro systems, to PC-based LAN and
      WAN systems, to virtual systems and the internet. AITP has special niches
      created that cater to the special interests of our members. Our members
      are found in every facet of society as well. They're in colleges and
      universities; banking; industry; retail; the armed forces; local, state
      and federal governments; hospitals; etc. Copyright © 1998 Association of INFORMATION TECHNOLOGY
PROFESSIONALS.

For further information, or to become a member of AITP, go to: http://www.aitp.org/