Note: This was an assignment I created in Fall 2022 for the class CSCI-INFO 5871-5919, HCC Survey and Synthesis: Foundations and Trajectories, taught by Dr. Leysia Palen.


Grudin writes that two fields of research gave rise to HCI: (1) making human use of tools more efficient and (2) finding ways to represent and distribute information more effectively. The vision was for everyone to be able to become ‘hands-on users of computers.’ Grace Hopper also supported this work, wanting to ‘free mathematicians.’ As I explored the Grudin paper and thought of Hopper, I also wanted to bring in key dates in the Earth and environmental science information space that have benefited from the HCI field.

Over the course of the semester, I added notes from works we read that seemed particularly relevant to my understanding of HCI. One particularly interesting trend I returned to over and over through the course was the idea of AI winters and summers; I checked against Figure 1 below to see whether a given work happened during an AI winter or an AI summer, so I am adding it to this decadal reference document to come back to as needed.

Figure 1: AI Winters and Summers; Source: Dr. Leysia Palen, INFO 5919 Lecture 1, Fall 2021

1960s – Treat Humans like Machines

HCI was heavily informed by military funding in the 1960s. 1962 saw the formation of the Information Processing Techniques Office (IPTO) of ARPA. It had visionaries like Licklider, Sutherland, Engelbart, and Nelson, who saw what would be possible 40 years later but did not yet have the technical capacity to make it so. Sutherland and Engelbart both pushed what was possible in prototypes and demos. Licklider published ‘Man-Computer Symbiosis’ in 1960. Sutherland’s 1963 dissertation, Sketchpad, seemed to prototype Licklider’s vision; he then took over for Licklider at IPTO in 1964 and funded Engelbart. This collaboration led to the launch of ARPANET in 1969, the precursor to the internet.

In the 1960s, computers were not accessible to everyday people; the lowest-paid workers, seen as the least skilled, were the ones who actually had hands-on access to implement programs. The transition into the 1960s from the 1940s and 1950s saw the emphasis shift from human factors alone to augmentation and interactivity, with an AI winter in the early 1960s, though most of the decade was spent in an AI summer. Linking and retrieval were imagined early and first implemented with microfilm.

It is interesting to see how domains collide and divide. Libraries in the U.S. were underfunded, focused on training librarians, and served everyone, while libraries in Europe were well funded and served specialists. U.S. libraries implemented the rigid Dewey Decimal System for organization, and the first library school opened at the University of Chicago in 1926. Librarianship made little mention of information technology. Information science became a discipline in the 1960s, with a conference held at Georgia Tech. The division between information science and librarianship seems important, as it carries through to today.

In Earth science, much was happening in the 1960s, primarily in Boulder. The University Corporation for Atmospheric Research (UCAR) and the National Center for Atmospheric Research (NCAR) were founded in 1959 and 1960, respectively, to support large-scale atmospheric research. The Cooperative Institute for Research in Environmental Sciences (CIRES) was founded in 1967 as a partnership between NOAA and CU Boulder. In addition to this activity, the International Council for Science’s Committee on Data for Science and Technology (CODATA) was formed in 1966 (CODATA@45Years) to coordinate the ‘data deluge’ facing an increasing number of scientific researchers.

1970s

The 1970s were an AI winter, HCI was again on the rise, and falling prices made computing more accessible to businesses and government. Xerox PARC was established, and so was HUSAT. These two organizations covered both discretionary (PARC) and nondiscretionary (HUSAT) research, and they overlapped with human factors, particularly around workstations like the Xerox Alto. The 1970s still had differentiated roles for operators, analysts, and managers. With the rise of computers in offices, the decade saw interest in email (1971) and file sharing (1973), but networked computing was still a ways off. Management Science began publishing an information systems column in 1967, and MIS Quarterly started in 1977. With this increased use, HCI began to need sociotechnical approaches because users could be difficult or resistant. Computer graphics exploded. By the end of the 1970s, Hiltz and Turoff published The Network Nation, with a vision of a computer in every house.

The 1970s saw information science become more normalized across universities, with the first information science Ph.D. program at the University of Pittsburgh in 1970; a department followed in 1973. Georgia Tech renamed its school Information and Computer Science in 1970.

In 1970, both NOAA and the EPA were formed as U.S. government agencies. That year NASA also got the green light to build Landsat-1, the first satellite in the program, and it launched in 1972. The 1970s also saw software developed to analyze NASA remote sensing data on university mainframes. In the mid-1970s, the U.S. government attempted a large-scale data processing project that failed but informed later user research.

Notes from my Dad: 

The first one I saw was in 1975 in college, and it was a three-register HP desktop. Then I went to York Tech in 1977, where we used keypunch and the college operated as a remote entry site – the actual computer was in Columbia on the USC campus. York Tech got a SOL32 desktop computer before I left – still fairly bulky and only 32K of RAM. Then I went to work at Springs Mills from 1978–81, where they had two big IBM 360s – again a cards, tape, and disk drive environment.

Gerhard Fischer enters the research scene in HCC here, exposed to Logo and Papert in the 1970s. Papert emphasized how computers were becoming part of everyday life, giving children the power to be ‘active builders of their own knowledge structures.’

By the end of the 1970s, we are beginning the first wave of HCI, and with each wave the question is: could computing be more? Waves are cumulative and still value what came before, so that scholarship stays in front of the tech development… maybe?

1980s – First Wave of HCI

Starting in the 1980s, we began to see HCI fork, from the Xerox Star onward, to include many different fields. One great example is from November 1980: the experiential artwork ‘Hole in Space’ by Kit Galloway and Sherrie Rabinowitz, self-described ‘telecommunications artists.’ Their piece was a life-size, live video and audio feed installed in two places, Century City in LA and Lincoln Center in New York City. It ran for two hours on each of three days, November 13–15, 1980. With this work, Rabinowitz and Galloway began to crack open (fork) HCC, seeing beyond the traditional computer scientist or psychologist researcher to include art as research.

The Xerox Star and Apple Lisa came to market in the 1980s. Prices continued to drop, less expensive machines like the Star and Lisa came into the office, and then IBM debuted the PC and minicomputers were phased out. With software and GUIs exploding, computers became accessible to individuals. While the Xerox Star wasn’t commercially successful, Smith et al. created the desktop metaphor, icons that are still in use today, windows, and direct representations of pages that made computers accessible to everyday people. This led to Office Information Systems (OIS) alongside Management Information Systems (MIS). The 1980s were about office automation and increased discretionary use, though there was debate over whether office computer use was truly discretionary, because workers did not choose their tools. ‘Automation’ also wasn’t a term liked by office workers. Journals and SIGs emerged in the 1980s. On a personal note, my parents had their first computer at home in 1987, when my dad started working at IBM.

There seems to be a rise of psychologists who liked tech here, and enthusiasm for experiments was high (“text editors are the new white rats of HCI”). CHI was predated by SIGSOC, the SIG on Social and Behavioral Science Computing, and was very well received; after the first conference it was renamed SIGCHI, with the first CHI conference in 1983. They focused on novice use and missed skilled use. In Europe, the focus was on in-house development and use. CHI was initially disconnected from earlier work: it wanted to differentiate itself and split from human factors, and most cognitive psychologists were not familiar with the HF literature. CHI had new journals, and the focus shifted from human factors to user-centered design, welcoming design and other disciplines beyond engineering. In the late 1980s, there was a rise of collaborative work, and CSCW formed. This area is also where participatory design and ethnography rose, influenced by Scandinavian researchers.

Card, Moran, and Newell used GOMS and cognitive psychology to begin bringing experimental structure to understanding HCI. They saw both computers and people as information processors and brought cognitive science to the field. By the end of the 1980s, Larkin and Simon (Chapter 46, HCC Remix) “bring forward the big idea that cognition involves internal and external representations.” In 1984, the CU Institute of Cognitive Science had enrichment funds to establish a collaborative relationship with Computer Science, with an initial focus on AI. Clayton Lewis and Gerhard Fischer were hired.
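To make the GOMS idea concrete, here is a minimal sketch of the Keystroke-Level Model (KLM), the simplified GOMS variant from Card, Moran, and Newell, which predicts task time by summing per-operator times. The operator values below are the commonly cited textbook numbers for a skilled typist, and the task itself is a hypothetical example of mine, not from the original sources.

```python
# A minimal Keystroke-Level Model (KLM) sketch, the simplified GOMS variant.
# Operator times (seconds) are the commonly cited values; treat as illustrative.
OPERATOR_TIMES = {
    "K": 0.20,  # press a key (skilled typist)
    "P": 1.10,  # point at a target with the mouse
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mentally prepare for the next step
}

def klm_time(sequence: str) -> float:
    """Predict task time by summing operator times for a sequence like 'MHPK'."""
    return sum(OPERATOR_TIMES[op] for op in sequence)

# Hypothetical task: think (M), reach for the mouse (H), point at a menu (P),
# click (K), then type a four-letter command (KKKK).
print(f"Predicted time: {klm_time('MHPK' + 'KKKK'):.2f} s")  # -> 3.85 s
```

The point of the model was exactly this information-processor framing: if people are modeled as executing discrete perceptual, motor, and cognitive operators, interface designs can be compared analytically before anyone runs a user study.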

In her 1987 book, Plans and Situated Actions, Lucy Suchman helped CHI “turn to the social.” This period, from the late 1980s through the mid-1990s, saw methods and perspectives from sociology and anthropology become widely adopted in HCI (Rooksby, 2013).

Finally, it is amazing to me that Shoshana Zuboff, a social psychologist, authored the book In the Age of the Smart Machine in 1988. She coined the term ‘informating’ as a companion to ‘automating’: the information that emerges from the collection of data.

Universities continued to add programs with ‘information’ in their titles, and the Association for Library and Information Science Education (ALISE) adopted that name in the early 1980s. In the Earth and environmental science space, NSF funded the creation of Unidata in 1983, and UNAVCO and IRIS in 1984, to provide cyberinfrastructure for the weather/atmosphere, GPS, and seismology communities, respectively. From 1980 to 1986, NASA piloted data systems studies as a precursor to the Earth Observing System, approved by the U.S. Congress in 1990. One other note is the rise of supercomputing centers around the U.S. starting in the mid-1980s; the ACM/IEEE Computer Society Supercomputing conference started in 1988.

1980s AI Summer: In 1982, Japan established the Institute for New Generation Computer Technology and focused on AI. This spurred an international rise in AI again. AI and CHI could have been closer, but AI researchers didn’t want to accommodate novice users, so they ended up moving toward HF&E and more hardcore engineering.

1990s – 2nd Wave of HCI – Whole Humans, Groups of Humans

The 1990s were another AI winter and saw the growth of CSCW, collaboration, and group decision support systems (GDSS). OA/OIS disappeared by 1995 as CSCW took over, and ACM Transactions on Office Information Systems became ACM Transactions on Information Systems. Participatory design and ethnography also became a larger part of CSCW, due in part to Lucy Suchman, who managed a lab at Xerox PARC. Ubiquitous computing was being imagined by Weiser and realized with location-based systems like the Active Badge system from Want and colleagues, bringing human factors and HCI together. The Active Badge system seems to be where I first see humans truly considered as part of the system, with designs that solve their problems rather than cool tech looking for a solution. HCI and psychology continued to change, and many built on Larkin and Simon’s work; this is a precursor to second-wave work. We start to see programming and design for many types and groups of users beyond software engineers. Schmidt and Bannon wrote a seminal piece about CSCW, and Orlikowski studied software adoption, expanding the focus from the individual to the organization.

Universities continued to evolve information science programs, and many dropped ‘library’ from their names. Computer science was embraced by CHI, and the programming languages R and Python both first appeared in the early 1990s. Domain researchers were starting to consider using computers as part of their workflows, as described by Star and Ruhleder in their 1996 paper on the Worm Community System. The biggest advancement of the 1990s was the arrival of the internet: in 1995 it was on university campuses, and by 1999, $200M in funding had led to widespread adoption. At CU, the Center for LifeLong Learning and Design (L3D) formed in 1994, with Gerhard Fischer as director.

With Congressional approval of the EOS mission in 1990 to better understand climate, the ‘NASA Earth System Enterprise’ was formed to build the Earth Observing System Data and Information System (EOSDIS). In addition, 12 Distributed Active Archive Centers (DAACs), located in specific congressional districts, were funded to archive domain-specific NASA data. By the mid-1990s, the research community felt left out of the EOSDIS process, and a National Academy of Sciences study reviewed EOSDIS. This led NASA to fund 24 university projects focused primarily on research and application uses of NASA data; these projects, together with the 12 DAACs, formed the Earth Science Information Partners (ESIP) in 1998. In addition to ESIP, the Open Geospatial Consortium, the body responsible for standards relating to geospatial data access, formed in 1994. Google was founded in 1998, and the first Open Source Conference (OSCON) was held in 1999. Researchers were also teaching themselves to program, and in 1998 Software Carpentry launched to give researchers the initial skills needed to do data analysis.

2000s  

Jackson, Gillespie, and Payette (2014) note that by the late 1990s and early 2000s, CSCW had pushed beyond its initial ‘traditional workplace grounding and toward an increasingly “social” direction.’

By the 2000s, we completely lost the separation of roles seen in the 1970s, and managers were readily using computers as part of their work. By 2000, ten universities had programs with just ‘Information’ in the name, and that number continues to grow. CHI began to include design, and CSCW expanded to include Web 2.0 and the rise of social media and near-real-time communication technology. Hollan, Hutchins, and Kirsh (2000) provided a new theory of distributed cognition to support this networked, collaborative paradigm shift, along with a research framework that brings cognitive ethnography together with experiments to support iterative design work. We start to see an AI summer rise, and NSF funding was primarily geared toward AI-related work under the umbrella of HCI, including interest in brain-computer interaction.

At CU in 2001, Bill Coleman, a businessman and co-founder of BEA, pledged $250 million to CU, in part because of the work happening at the Center for L3D, to support improving the quality of life for people with cognitive disabilities. Unfortunately, the stock market’s tech bubble burst before the pledge was realized (see Figure 2). In the early to mid-2000s, several centers formed on campus for CS + *, where * is creativity or engineering, including ATLAS and the Discovery Learning Center. Project EPIC formed in 2009, and “crisis informatics examines how networked digital technology—particularly the social media-featured technologies of the 2000s and beyond” (cite).

Figure 2: Stock market index showing both the 2001 crash and the 2008 crash.

We also start to see the rise of cyberinfrastructure with the 2003 Blue Ribbon report and the 2007 Edwards et al. workshop report. In 2007, NSF initiated the DataNet program, which aimed to create “a set of exemplar national and global data research infrastructure organizations (dubbed DataNet Partners) that provide unique opportunities to communities of researchers to advance science and/or engineering research and learning.” There were Earth and environmental science specific DataNet projects, such as DataONE and the Data Conservancy. With NASA’s build-out of Earth science data and information systems in the 1990s, combined with these projects, the American Geophysical Union created a new section, at the request of its president, in 2006: the Earth and Space Science Informatics section.

On the technology side, Keyhole, the precursor to Google Earth, launched in 2001; AWS launched in July 2002; and the use of persistent identifiers like DOIs for data took hold at the end of the 2000s. In 2001, IPython was created by Fernando Pérez while he was a doctoral student at CU Boulder; together with Brian Granger, also a CU Boulder alum, it formed the basis for Jupyter notebooks and the accessible programming that is now changing the way we teach and do natural science. The 2000s also saw the term ‘open’ become more common: open science was coined in 1998 by Steve Mann, and several declarations around open access to research built on this in the early 2000s. The Creative Commons licenses were established in 2002 to lower the barrier to sharing while still receiving credit.

2010s – 3rd Wave of HCI – Social Dynamics, Non-work Activities

Community context – tech is adopted into richer community and cultural context (not sure where this note came from, but I like the sentiment 😊 )

The 2010s have seen embedded and ubiquitous computers, complete with digital natives: children who have never known a world without FaceTime and computers everywhere. In the 2000s we were just learning how to use mobile phones; by the end of this decade many people have only mobile phones, and in developing countries mobile phones have leapfrogged traditional landlines. With this ubiquity, we are realizing the dark side of technology that Ted Nelson alluded to in 1974: privacy harms, data breaches, mis- and disinformation spreading rapidly, and social media amplifying behavior in ways we don’t understand.

In 2010, the Obama White House had strong initiatives to make government transparent and to share scientific data openly. In 2011, NSF began to mandate data management plans as part of funding proposals. In 2013, the Obama White House published the OSTP memo on ‘Increasing Access to the Results of Federally Funded Scientific Research,’ which stated that “the direct results of federally funded scientific research are made available to and useful for the public, industry, and the scientific community. Such results include peer-reviewed publications and digital data.” From this point, new collaborative initiatives followed: NSF’s EarthCube (2011); FORCE11 (2011), supporting the scholarly communication community; the Research Data Alliance (2013), an international community of practice funded in part by NSF; and the FAIR Principles (Wilkinson et al., 2016). We have also seen a push from societies like the American Geophysical Union to include data sharing in their programming, and the European Geosciences Union has moved all of its publishing to open access.

Data-intensive science is on the rise, with more and more researchers across domains learning to leverage computational power to do new research. Data science programs are spreading across universities, and they seem somewhat separate from iSchools and libraries. In 2010, the University of Washington created the eScience Institute, and in 2014 the Moore-Sloan initiative awarded three universities (UW, UC Berkeley, and NYU) five years of funding to create the Moore-Sloan Data Science Environments (http://msdse.org/), which ended in 2019.

In 2017, the Information Science department formed at CU, led by Dr. Leysia Palen, with 12 faculty, 12 affiliated faculty, and 12 initial graduate students. Palen also continued her work on crisis informatics, with an interesting paper on ‘crisis informating’ closing out the decade, now that we have social networks.

2020s – Fourth Wave – Activism, Critical Theory, and Ethics

As we enter the 2020s, ethics is front and center: we can, technically, but should we, from a moral and ethical perspective? Mis- and disinformation are on the rise, and social media platforms seem to favor the angry. The work of the Palen Lab, led by Leysia Palen and Lindsay Diamond, on understanding the different voices present in anti-vaccine mis- and disinformation on Twitter, and the work of Amy Voida’s group, led by Shiva Darian, on understanding the data rhetoric used in blogs and social media by political nonprofits on both sides of the aisle, are creating a foundation for understanding the complex sociotechnical implications of these platforms. Gerhard Fischer brings up quality of life and design trade-offs. Casey Fiesler, Bryan Semaan, and others are seeking to understand, critique, and create ethical, moral, just, and equitable sociotechnical systems by drawing on critical perspectives.

It’s an interesting time to enter HCI as a new scholar. Recently, Soden, Ribes, Avle, and Sutherland wrote a paper entitled “Time for Historicism in CSCW: An Invitation.” The authors were motivated by CSCW tending to take a “presentist and forward looking approach, which prevents historical grounding.” The paper identified three ways a historicist approach could support CSCW: (1) inspecting the technical and its relationship to how groups relate to each other over time; (2) examining historical traces while watching for missing voices; and (3) using the old as inspiration for new design. This class has grounded my early research in the history of HCI; I agree with the authors that this grounding is essential to moving forward, and I appreciate the context I now have for placing work into this timeline.

References: 

Grudin, J. (2009). AI and HCI: Two fields divided by a common focus. AI Magazine, Winter, 48–57. https://www.microsoft.com/en-us/research/publication/ai-and-hci-two-fields-divided-by-a-common-focus/

Jackson, S. J., Gillespie, T., & Payette, S. (2014). The policy knot: Re-integrating policy, practice and design in cscw studies of social computing. Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing, 588–602. https://doi.org/10.1145/2531602.2531674

Rooksby, J. (2013). Wild in the laboratory: A discussion of plans and situated actions. ACM Transactions on Computer-Human Interaction, 20(3), 19:1–19:17. https://doi.org/10.1145/2491500.2491507

Soden, R., Ribes, D., Avle, S., & Sutherland, W. (2021). Time for historicism in CSCW: An invitation. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW2), 1–18. https://doi.org/10.1145/3479603