The following text was written and submitted by Robert L. Glass, 1952 Culver-Stockton College summa cum laude graduate, together with his son David Glass, David's wife Julie Nakamura Glass, and his grandsons-in-law Mercury Springberry and Jon Philips.

Recollections of What it Means to be a Computing Professional
This is an unusual family story: recollections of the computing software field told by three generations of one family. It spans some 70+ years and therefore encompasses much of the history of that field. It is told from the point of view not of computing theorists, but of computing practitioners.

In a way, it is an astonishing story. From the field's primitive beginnings (you'll be surprised at just how primitive the early days were!), it brings you up to the present. This article will, I hope, be of most interest to people preparing to enter the field professionally. But it may interest everyone else as well, because progress in computing has been so rapid in the last few decades, and because computing is such an essential part of all of our lives these days.

Robert

When I graduated from college, as you will learn below, there was no computing field – because computers essentially didn't exist! And there was no way to prepare academically for a computing career, because there was no such field and there were – nowhere in the world – any such academic courses. You didn't even think about preparing for a computing career, since there was really no such thing. If you had been able to think about it, you probably would have majored in Math – or perhaps Music (because creating a musical score is much like writing a computer program, in the sense that musicians performing a score are analogous to a computer executing a program)!

I majored in Math. I wasn’t trying to prepare for a computing career, because I wasn’t aware there was such a thing.

As time passed and I matured along with a maturing computing field, I had a couple of children. In the meantime, computers had come into being, and I had become a software professional, and darned if my son David didn't follow in my newfound footsteps and also move into the computing field. And darned if he didn't marry Julie Nakamura, who was also heading for the computing field. And when the time came for them to have children – Amy, Pam, and Mari – darned if they didn't grow up to marry computing professionals as well. So, by now, there are three generations of us computing professionals in my family. I suspect there are few if any other families who can say the same thing.

What follows are a few recollections emerging from those three generations. First my own, which will strike you as more primitive than interesting. Then those of my son and daughter-in-law, which I suspect will still strike you as ancient. And then, perhaps most interesting to you nowadays, the recollections of my grandsons-in-law. I may or may not convince all those kids (now adults!) to contribute to this article, but I will at least have a representative of each generation of computing professionals!

Computing in the 1950s, as seen through the eyes of me, Robert L. Glass

I graduated from Culver-Stockton way back in 1952. Back then there were few computers anywhere in the world. The ones that did exist weren't laptop things; they were massive room-filling devices that couldn't function without roomfuls of air conditioning, and most of them belonged to the defense industry. In fact, back then, there were no computing courses in colleges or universities anywhere in the world.

The point I am trying to make is that the field you are about to embark on, computing students, is astonishingly N E W.

The first computing course I took at university was in fact at the graduate school I attended after C-S, the University of Wisconsin. The faculty member who taught the course had no computer at the U of W to use; his experience came instead from the summers he had spent at the Los Alamos nuclear laboratory in New Mexico, working on the atomic bomb.

And, as I finished my graduate education and moved into the US defense industry at North American Aviation, I began my computing career operating not a computer but a desk calculator.  Computers were still, then in the mid-1950s, not in common use even where they were most needed.

In fact, back in those days (the mid to late 1950s) the origins of the data processing, business-focused side of computing were very different from the scientific side. Business data processing people used IBM punched card equipment and told the equipment what to do by wiring a board that was placed in a control panel at the side of the machine. For example, one common board was the 80/80 board, which caused the equipment to make an 80-column copy of a deck of cards. Meanwhile, the scientific folks were using manually operated desk calculators. The two origins, eventually to merge into one computing world, had little or nothing in common back in those days.

Even when computers arrived and the punched card equipment and desk calculators began to hit the waste bins, data processing used decimal computers while the scientific folks used binary computers, which as it turned out were more efficient and faster. As the fields evolved and eventually joined, those decimal computers were phased out and all the applications were done using binary. Interestingly, back in the days of those decimal computers, "word length" was determined by the programmer, who defined fields with a "set word mark" command. Binary computers, of course, had a fixed word length – originally 36 bits and eventually 32.

It was not uncommon for political issues to arise as the two "computing" fields gradually evolved into one. For one thing, the question arose as to which computer manufacturer would get the order for the new-fangled computing equipment. The most common answer to that question was "IBM," but a religious schism developed over that decision. IBM was a "safe" choice – if your decision somehow failed, you had made the most common choice, the one most others made, and no one could fault you. Other, more daring managers chose a different path, going the "anything but IBM" route to such alternative manufacturers as Univac and Control Data.

The religious wars were so intense that one company where I worked decided on a software strategy that would leave it free to choose any computer manufacturer. A free-thinking manager and a daring, brilliant technologist at the Aerojet-General Corporation elected to construct their own operating system – they called it "Nimble" – and their own compiler for the language NELIAC, procured from the US Navy, so as not to depend on the software provided by the computer manufacturer (most commonly, as I said, IBM). They set about the complex task of building support for such an independent and freeing system. The story of that effort, which eventually failed, is told in [Annals 1991].

One other interesting anecdote emerged from those political early-day battles.  At the Boeing Company, the data processing people had been using an IBM 7080 decimal computer for their applications, and the scientific people had chosen a Univac 1107 for theirs.  The decision was made to merge the two organizations – a common decision in those days – and as part of that decision the IBM machine was to be phased out.

Programs could be written for the 7080 in one of two modes – in a "macro language," or strictly in the machine language of the 7080. Data processing management had decreed that the machine language not be used. When I was hired at Boeing, I was given the assignment of building a translator that would allow those macro 7080 programs to be executed on the 1107. The task, as it turned out, was simple enough, and in a few months my team of three had a translator functioning, one that could successfully translate the sample macro programs we constructed. But when we tried it on "real" programs, it turned out that the data processing programmers had ignored the management directive and used many machine language instructions. The translator we had developed, as it turned out, was worthless. (These stories are developed more thoroughly in [InBegin 2020].)

Early on, and prior to all of these political and religious wars, the computer I did use – in the late 1950s at North American Aviation – was an IBM Card-Programmed Calculator, known as a CPC, which helped us with our aircraft dimensioning calculations (my organization at the time was "Master Dimensions," which computed and tracked all of the important dimensions of each aircraft part that NAA manufactured). You are going to find this hard to believe, but there was no stored program in a CPC; stored programs didn't exist in such machines back then. The program was instead on punched cards, and it executed as the cards were fed in through the card reader! That is, the computer operated, believe it or not, at card reader speed! Ponder, if you will, how one could perform iterative computing on such a device (answer: we simply fed a copy of the iteration cards into the card reader as many times as we hoped would be necessary)!

Things moved swiftly in the computing field back in those days. In a year or two the CPC became obsolete, and we got from IBM a model 650 computer, in which the program – believe it or not, and it was hard for us back then to believe it – resided in the computer's memory. The field advanced astonishingly rapidly back in those days, and the computers operated vastly faster.

In fact, the advances came so rapidly that we had to rewrite all of our programs every year or two to fit the new model computers coming along, none of which were compatible with what we were using before. For us "scientific" programmers, that meant a whole series of computers known as the IBM 700 series, which were then obsoleted in 1965 by the brand-new IBM 360 concept – again with no compatibility with the older IBM models. You spent a lot of months, back then, rewriting old programs to make them usable on the new devices.

The 360, as it turned out, was IBM’s final answer to that data processing vs. scientific dilemma. The machine contained sufficient instructions to allow it to perform for either kind of application.

I still remember that the IBM 360, such a new concept, was presented by IBM in a rented, magnificent movie theatre in the city where I lived, and we who attended that presentation were astonished at how many new kinds of people attended. I remember nuns in habits and police and sheriff's officers in their uniforms, and I wondered what it was those folks would need an IBM 360 for.

1980s computing, as seen through the eyes of my son David Glass and his wife Julie Nakamura Glass

David 
I was sitting in the audience at a national computer conference, an accomplished programmer in my own right, listening to my father give the keynote speech. I knew times were changing when he described the age of the industry as that of a first-generation technology. Of course, this was news to me… he was either telling me he was not really my father, or the industry had grown!

The funny thing is, other than first grade, when I got hauled in front of my teacher for cheating on a math test (I peeked at Suzi's paper to get the only two right on the page), I have always loved math. As my math skills grew, so did the applied math of computers. I knew that my dad worked with computers, and every so often he brought home a magical deck of computer punch cards to write notes on and play with. Though he often worked on secret projects that he couldn't talk about at home, I knew he had a pretty cool gig going on. As I continued with my education, very simple computers – essentially calculators that you had to program with tiny punch cards – were introduced. I found them easy and fun. By the time I reached college, a Math/Computer Science degree was perfect for me.

Looking back, the industry has continued to evolve. When I first started, I was keypunching card decks to submit to the computer, or toggling machine code into the front panel. The mainframe computer, with its refrigerated room and elevated floor (to hide all the cabling), was giving way to the latest technology: the mini-computer! Still substantially larger than computers of today, and with less overall power than the average smartphone, Digital Equipment Corporation's VAX 11/780 was such a leap forward it changed everything. Although the computer was still in the refrigerated room, we now had computer monitors. We could actually type programs directly into the computer and run them… right after we compiled them, linked them with external modules, and set them to execute.

Suddenly we had virtual memory and no longer had to worry about paging strategies or real memory limitations: the operating system brought a program into core memory to execute, then seamlessly paged it back out to disk. The computer would run more than one program at a time! We were using mutual exclusion semaphores to eliminate race conditions and build multi-user systems. Of course, the output of these programs was almost always text, the same as the input.

I remember one project where I had to patch into the operating system of the VAX, because a customer wanted to use a WORM (Write Once Read Many) optical disk (read: a prehistoric CD) as a regular magnetic disk drive. To do this project, I had to read the proprietary operating system's source code from the provided microfiche, then determine how to patch into it without crashing the system. Then came the hard part: making an optical disk act as though it could be read and written multiple times like a regular disk. Of course, the simplest of computers today can do that with CDs and DVDs, but at that time, working in machine language, it was bleeding edge. Trust me.

As a systems programmer, I spent a lot of time programming in the language of the processor, well below that of a typical high-level compiled language. Working at the processor level meant dealing directly with the processor's on-board registers, its processing unit, external data latches and more. As the systems got faster and smarter, floating point processors were mated with standard processors to create even higher speed systems.
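
For anyone who never had to juggle these things, here is a tiny illustrative sketch – in modern Python, certainly not the VAX-era machine code we actually wrote – of what a mutual exclusion lock buys you: it serializes updates to shared data so that concurrent users cannot interleave their read-modify-write steps and corrupt the result.

    import threading

    balance = 0                      # shared state touched by every concurrent "user"
    balance_lock = threading.Lock()  # the mutual exclusion semaphore

    def deposit(amount: int, times: int) -> None:
        """Each thread repeatedly updates the shared balance."""
        global balance
        for _ in range(times):
            with balance_lock:       # without this lock, the read-modify-write below
                balance += amount    # could interleave with other threads and lose updates

    threads = [threading.Thread(target=deposit, args=(1, 100_000)) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    print(balance)  # 400000 every time with the lock; without it, anything can happen

The principle is exactly what we relied on back then; only the notation has gotten friendlier.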

Floating Point Systems was one of the first companies to do this mating of floating-point and standard processors – but keep in mind these were still at least kitchen-cabinet-sized devices. They were designed to process vectors of data, making them a natural fit for new technologies like digital signal processing in audio (from underwater sonar systems to music synthesizers) and the new technology of graphics. Digital imaging systems like medical CAT and MRI scanners became smaller, faster, and higher in resolution.

Programming these devices at the machine level, you could take advantage of the parallel nature of the processors by "folding" a program inside of itself so that all the parallel processing units were used. This trick involved writing code to process a vector in such a way that, within any loop, multiple elements of the vector were being processed at once. As a programmer, once the main algorithm was in place, it was a whole other layer of work to get it folded so that all the processing units were running 100% of the time.
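
The spirit of the trick, in a deliberately simplified sketch – modern Python rather than the array-processor machine code we really wrote, and only pretending that the four statements in each pass could run on four parallel multiplier units – looks something like this:

    def scale_naive(vec, k):
        # One multiply "in flight" per loop pass: most functional units sit idle.
        out = [0.0] * len(vec)
        for i in range(len(vec)):
            out[i] = vec[i] * k
        return out

    def scale_folded(vec, k):
        # "Folded" version: four independent element operations per pass, so a
        # machine with four parallel multipliers could keep them all busy.
        n = len(vec)
        out = [0.0] * n
        i = 0
        while i + 4 <= n:
            out[i]     = vec[i]     * k
            out[i + 1] = vec[i + 1] * k
            out[i + 2] = vec[i + 2] * k
            out[i + 3] = vec[i + 3] * k
            i += 4
        while i < n:                 # clean up any leftover elements
            out[i] = vec[i] * k
            i += 1
        return out

In Python both versions run one step at a time, of course; on the real hardware, the folded form was the difference between the processor loafing and running flat out.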

Of course, more and more parallelism was built into processors, and they shrank smaller and smaller. The Intel i860 and i960 processors were some of the first on-chip parallel systems. Programming those devices certainly separated the pros from the newbies. It was a great time to be a programmer!

Graphics and video were fast becoming the new frontier. Graphical user interfaces from NeXT, Apple, and other companies turned programming on its ear. While I never really clicked with that (pun intended!), we found ourselves involved in creating new video compression technologies. At the time, ISO (the International Organization for Standardization) was defining video and photo compression standards that would create a whole new world. JPEG, MPEG, and H.264 were leading-edge technologies that allowed high-speed video to be compressed for digital transmission. My company was developing a software-only version that ran on PCs, while the big companies were creating hardware-based systems.

While it was a technology race, my little company of high-powered programmers continued to produce software that did all that the hardware guys could do – and on a PC at that! We could send 2-way video over a standard modem phone line at a video frame rate. We could play MPEG movies on a PC. Big companies were beginning to take notice of our little group. I had fun with the tech. I received a patent for video email, even though I kind of pissed off a magazine editor or two by filling their inboxes with video messages. I even had an impromptu on-air interview with a national radio personality who was intrigued by the video messages he would receive during his show. It was about this time that I found myself sitting in the auditorium during my father's speech, musing about the changes in the industry. Of course, by now the industry has changed one or two times over and I am completely obsolete. But when my daughter shares an ultrasound of my new grandson, or I stream a movie online, I feel a small sense of pride, knowing I was there near this tech's beginning and I contributed to the future.

Julie

My days of computing seem a lifetime ago. I first became interested in learning computer programming when I decided to pursue a science degree and realized that I would need programming to advance in the data-driven field of science. I was introduced to computer programming classes in 1978 at Western Washington University. I wrote my algorithms out on paper, typed them onto punch cards in the computer lab, and banded them securely together, because if one card was accidentally out of sequence, a failed outcome was guaranteed. Next, the program was submitted to the computer center, and I would have to wait several hours to see whether I had successfully programmed the assignment. During the next year we got interactive terminals in the computer lab, which made programs faster to write and easier to edit and submit. We were extremely excited when punch cards became a thing of the past. There were more men than women in the computing classes, but a fair number of women were learning computer programming as well.

My first job was with a company that manufactured telecommunication computers. Previously, a company would have one dedicated machine for sending messages via Telex, TWX, or DDD (Direct Distance Dial, i.e., the phone line). The company I worked for manufactured computer terminals that combined Telex (60 bps), TWX (110 bps), and DDD (110, 300, or 1200 bps) sending and receiving capability in one computer rather than three different systems. When customers requested custom applications to meet their specific needs, my job was to create the custom software and to implement and integrate it on their computer. Our smaller computer systems used a keyboard programming language, and that custom-application programming was mostly done by a handful of women. We also had a larger system with an assembly-like language, and a mix of women and men programmed those applications. What I enjoyed most about working in that area was the problem-solving challenge and programming a successful solution for our customers.

I had the opportunity to travel around the country installing and troubleshooting hardware and software. I also went to Venezuela to teach a group of technicians our proprietary keyboard language and basic troubleshooting procedures – a unique chance to experience a different country and culture.

It amazes me how quickly technology changes, as evidenced by the perpetual stream of new products being promoted. Technology is never stagnant; it is always moving forward.

Computing NOW, as seen through the eyes of Mercury Springberry, who married my granddaughter Mari Springberry, and Jon Philips, who married my granddaughter Pam Glass-Philips

Mercury

My journey into computing began when my mother got a new iMac G3, just after they first came out. It was my family's first desktop computer. I spent hours and hours on it, building art projects with clipart and surfing what existed of the internet at the time. My love of computers continued through my adolescence, and when it came time to pick a major, I decided on electrical engineering because it combined my love of computers with my love of math.

I graduated with an electrical engineering degree in 2013 from Washington State University. During my senior project, I was the team member who took on all the software requirements for our custom System on a Chip (SoC). I had to customize a Linux kernel for our board, which involved a good amount of coding. I found I really enjoyed the work.

I was never able to find work as an electrical engineer. I ended up doing odd jobs until the pandemic, when I found myself unemployed. Remembering how much I enjoyed coding, I took the opportunity to attend the University of Washington coding boot camp, and after graduating I got a job at Cognizant, a large IT contractor. I was placed at Excellus, a healthcare company that serves upstate New York.

My job was to be the onshore lead for a team mostly based in India. The team's job was to make enhancements to Excellus's website. A few example projects include using UPS's public API to verify member addresses and prevent typos (which had been causing mail to be misdelivered), building a batch job to send recently enrolled members a welcome email, and overhauling the Enrollment and Billing help tool used by brokers and employers to contact Excellus's customer service representatives. The team also handled all manner of minor updates to the site. The onshore lead role comes with a long list of responsibilities. You manage the team's work, handle code contention with the other Excellus web teams, help team members with technical challenges, address testing concerns, and manage deployment of recently written code. On the business side, you handle all reporting to the relevant stakeholders at both Cognizant and Excellus and help with requirements gathering.
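
The address-verification idea is simple at heart. The sketch below is purely illustrative – the endpoint, field names, and response shape are hypothetical stand-ins, not the real UPS or Excellus interfaces – but it captures the flow: send the member-entered address to a validation service, and if a standardized version comes back that differs, offer it as a correction before the typo ever reaches the mail stream.

    import requests

    # Hypothetical endpoint standing in for a real address-validation API.
    VALIDATE_URL = "https://api.example.com/address/validate"

    def suggest_correction(address: dict) -> dict | None:
        """Return a standardized address if the service's version differs from
        what the member typed; otherwise return None."""
        resp = requests.post(VALIDATE_URL, json=address, timeout=10)
        resp.raise_for_status()
        standardized = resp.json().get("standardized")  # hypothetical response field
        if standardized and standardized != address:
            return standardized  # e.g. "123 Mian St" corrected to "123 Main St"
        return None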

I could never have imagined the effort required to build and maintain large-scale web applications. Some days the amount of stress involved in coordinating with other teams, supporting production defects, and meeting tight deadlines feels like more than I signed up for. That said, I still love building things on computers. The satisfaction I get now from completing a project is the same feeling I got as a kid, and it's what keeps me moving forward in my career.

Jon
From a young age, I was the kid who took apart household electronics just to peer inside and understand how they worked, then carefully put them back together. I was my family's tech support, guiding them through the complexities of our home electronics as if I were a native speaker translating a foreign language.

As I got older, my fascination with the digital world grew. I went to school at the Lake Washington Institute of Technology in 2010, earning my BA in Web Development and Multimedia Design. It was there that I discovered the captivating intersection of computer hardware, code, and design. Computers became not just tools for work, but canvases for creativity and imagination. It was there that I got my first taste of game development, a field that seamlessly wove together art, storytelling, and technology into an interactive tapestry. I knew this was where I wanted to be, but it was a tough industry to break into.

My entry into the tech industry began in 2010 at Apple, where my role as a Macintosh Technician let me hone my problem-solving skills, understand the consumer's perspective, and learn conflict resolution and people skills that I only later realized would be invaluable. Over five years, I witnessed firsthand the rapid evolution of technology and its impact on everyday life.

In 2016, my career took a turn when I joined Amazon. Starting as a support engineer, I was excited to join such a rapidly expanding company, one that seemingly had endless opportunities in every sector. I quickly grew into the roles of program manager and then technical program manager within the Business Intelligence team. Over nearly eight years, I managed projects and programs and became a scrum master, while learning the intricacies of data engineering and the strategic importance of business intelligence. My daily work involved managing developer resources, planning sprints, and ensuring that our data science endeavors aligned with the company's objectives.

After a while, I started to get an itch to grow outside of my role, try something new, and explore roles in the video game industry – something I am passionate about. In 2023 I got a job at Bungie (known for games like Halo and Destiny) as a Technical Program Manager for Machine Learning and Analytics Engineering. It was an opportunity I couldn't pass up. In my view, machine learning and generative AI stand at the forefront of the next great technological revolution. These technologies are not just tools; they are the architects of a future where the boundary between the digital and physical worlds becomes increasingly blurred.

At Bungie, I have a ringside view of how these technologies are revolutionizing game development. But the true revolution of machine learning and generative AI lies in its accessibility. With cloud computing and open-source software, what was once the domain of tech giants is now available to startups and individual innovators. It's democratizing creation, empowering a wave of entrepreneurs and creators to bring their visions to life.

My current role draws on all my previous experiences. It's a dynamic environment that requires vigilance and adaptability, ensuring that our engineering efforts contribute to immersive gaming experiences for millions of players worldwide.

My path has not been a straight line – it's not been easy or very predictable – but I think it showcases the professional landscape of technology today, which is a mosaic of experiences, with career shifts being the norm rather than the exception. These transitions aren't indicative of restlessness but rather of a search for challenge, growth, and the chance to make a meaningful impact. Effective communication has been the cornerstone throughout my career, serving as the key to navigating change and fostering collaboration. In an industry where change is the only constant, the ability to articulate complex ideas and unite teams toward a common vision is invaluable.

REFERENCES
InBegin 2020 – In the Beginning 2.0: Recollections of Software Pioneers (book), Developer.*, 2020.
Annals 1991 – "The Project That Failed That Succeeded," Annals of the History of Computing, Vol. 13, No. 1, 1991.