Humboldt-Universität zu Berlin - Berlin School of Library and Information Science

Projects and Activities

Amerika Haus Archiv (Prof. Dr. Elke Greifeneder, Vera Hillebrand)

2018-2019 The aim of the Amerika Haus Archiv 2 project is to revise the structure and content of the digital library of the Amerika Haus Archive and to relaunch it. The Amerika Haus Archive contains parts of the Amerika Haus Berlin library's material, including the Amerika Dienst, which was published by the American Embassy between 1950 and 1994. The Amerika Dienst served to provide Germans with information about the USA after the Second World War. The range of topics covered by the publications includes politics, business, science, society and culture. The majority of the documents are unique and valuable for all those interested in recent American-German history. In particular, the project examines whether and how older digital libraries - which were primarily concerned with the digitization and publication of collections and which did not involve users in the construction of the site - can be designed in a user-oriented way through a relaunch and how much gain in user-friendliness a relaunch offers.

 

Information seeking and avoidance (Prof. Greifeneder, Kirsten Schlebbe, Vera Hillebrand)

We have started a project on information avoidance and are currently coding the data. In our field, information avoidance is most frequently approached through a focus on how people cope with information overload. Studying information avoidance is difficult for several reasons: people are rarely aware that they avoid information; they may avoid information because of sensitive topics (such as not reading bills when one has debts); or they may not remember when they avoided information. All of these reasons make it difficult to collect data on information avoidance, yet it is one of the most critical facets of the digital transformation. We are now working with a dataset collected in classes over five years of teaching. The aim of this research project is to develop a framework for information avoidance triggers.

 

Information behavior studies in natural information environments (Prof. Greifeneder)

Data validity in digital information environments was a driving force of Prof. Greifeneder’s dissertation research, and the results emphasized that it is critical for our field to know (a) how to collect valid data in online or natural environments, and (b) how to interpret these data without reaching wrong conclusions. As a follow‐up to her dissertation, she published an article in JASIST on the effects of distraction on task completion scores, reflecting the need to carefully reconsider how we interpret our standard indicators if we apply them in new research contexts, i.e. in online or real life contexts.

“From this research, it is clear that the seemingly straightforward equation—if potential distraction exists, then people become distracted—is not true. What is true is that researchers cannot control a test situation in a natural environment (or choose to refrain from controlling it, because they want to keep the setting realistic). This means that they need to have information that lets them retrace the situation as much as possible. The danger of data collection in a natural environment is not that events might occur, but that researchers know nothing about them.”

(Greifeneder 2016, p. 2869)

In 2015, she published a conference paper on outliers, which describes in detail the outliers of a study that took place in users' real lives. The paper argues for more careful handling of outliers, in contrast to the common practice in our field, where outliers are frequently either not reported at all, or only their number is reported together with a note that they were removed from the data set, without further detail or discussion of why some data do not follow the normal distribution.

In 2014 she conducted qualitative online interviews using Adobe Connect with users of the Danish virtual research environment LARM.fm, in order to uncover their contexts and needs when using such an environment. In addition to the qualitative interviews, she experimented with screen sharing and ran short usability tests remotely on users’ own screens. This approach allowed her to get a much better insight into the users’ behavior and their interactions within the virtual research environment.

 

Standards for data reuse when studying digital behavior (Prof. Greifeneder et al.)

In 2015 the head of the research department at Humboldt‐Universität zu Berlin signed a cooperation agreement with the publisher Elsevier, allowing Elsevier to conduct user studies on a new academic social networking tool with researchers from Humboldt. Part of the cooperation was an agreement that the data would be co-analyzed by a researcher from Humboldt‐Universität zu Berlin. The data set consisted of 110 unique qualitative interviews with 81 individuals, conducted in three rounds over a period of nine months. Participants came from four countries (United Kingdom, Germany, United States, and Singapore) and represented all levels of seniority, including a large number of senior researchers. The size of the sample far exceeded the qualitative data sets that an individual researcher could normally collect. Second, we did not conduct the interviews ourselves, but had to work with questions that were asked by others. Library and information scientists strongly advocate for research data reuse, but we rarely see actual reuse.

 

Working with qualitative data where we as researchers could not probe further raised a number of interesting questions about data reuse. We have published one article in JASIST using this data set, in which our UK colleague Sheila Pontis took the lead in the data analysis, and a second in the Journal of Documentation, for which Prof. Greifeneder coded all the interviews and did the analysis. In both cases, reviewers criticized us for not having conducted the interviews ourselves and raised strong concerns about the validity of the data on these grounds. This research demonstrated that qualitative data can be reused, and in this case made research possible that our research team (Sheila Pontis and Ann Blandford from University College London, and Prof. Greifeneder and Kirsten Schlebbe) could not otherwise have afforded. Further work is needed, however, on the impact of the reuse of qualitative data on data validity, and there ought to be a discussion about how reviewers should deal with publications that reuse data. A challenging task for the next few years will be to find an acceptable middle road between repeatedly facing criticism for not having collected the data set oneself, and giving carte blanche to researchers who reuse unacceptable research designs.

 

Level of detail in literature reviews (Prof. Greifeneder, Kirsten Schlebbe)

A third research project on standards examines the level of detail reported about other studies in literature reviews. We analyzed a large set of publications on the use of academic social networks according to the method the respective authors used to collect data on this usage, looking at how prior studies had recruited participants as well as the size and diversity of their samples. We then took a second look at the literature reviews of these publications and examined in what detail other researchers report on methods and samples. Our analysis shows that the level of detail in the literature reviews we examined was rather low: if study parameters were mentioned at all, the authors described at most one or two. In other cases, information about the research design or sample characteristics of the cited studies was missing completely. Even when studies directly compared their findings with previous studies, essential information was lacking. The paper was published as a preliminary results paper at the iConference 2017 and was awarded runner-up for best preliminary paper.
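The coding approach described above can be thought of as a simple tally: for each study cited in a literature review, record which parameters of that study the review actually reports. The sketch below illustrates this with an entirely hypothetical coding sheet and parameter list (the actual coding scheme of the project was more elaborate).

```python
# Hypothetical coding sheet: for each cited study mentioned in a
# literature review, which parameters of the original study does
# the citing review report? All data below is invented.
PARAMETERS = ["method", "sample_size", "recruitment", "sample_diversity"]

coded_reviews = [
    {"cited_study": "A", "reported": {"method"}},
    {"cited_study": "B", "reported": {"method", "sample_size"}},
    {"cited_study": "C", "reported": set()},
]

# Tally how often each parameter is reported across all coded citations.
for p in PARAMETERS:
    n = sum(1 for r in coded_reviews if p in r["reported"])
    print(f"{p}: reported in {n}/{len(coded_reviews)} citations")
```

A tally like this makes the "at most one or two parameters" finding directly countable across a large set of citations.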

 

Research quality in conference reviews (Prof. Greifeneder, Prof. Bogers, University of Aalborg)

One way to promote standards for methods and data quality in our field is to study review data that show how other researchers define what acceptable and good use is. This project examines data from reviews. In 2015, our Danish collaborator Prof. Toine Bogers and we received access from the iSchools executive committee, and IRB approval from Humboldt‐Universität zu Berlin, to analyze all of the reviewing data from the iConferences 2014 and 2015. This data set includes 745 individual authors and reviewers and 1,265 reviews. Among other research questions, we wanted to determine whether the reviewer scores, which are intended to summarize the review, were representative of the written reviews; we therefore compared official review scores with a manual coding of the review text. Review texts were coded according to how they represented the review components and the tone of the review. Overall, we observed that review scores were often not representative of the written reviews. The review category “Soundness”, which describes the quality of the research design and of the analysis, was of special interest to us. This study was so well received that we were granted access to the iConference review data for the years 2016 to 2019, as well as the same data for the ASIS&T conferences 2017 to 2019. We are currently in the analysis phase for those data.
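The score-versus-text comparison can be sketched as follows: map each official score to the tone one would expect from it, then count the reviews whose manually coded tone diverges from that expectation. The data, the score scale, and the crude score-to-tone mapping below are hypothetical illustrations, not the project's actual coding scheme.

```python
# Hypothetical data: official review scores (1-5 scale assumed here)
# paired with a manual tone coding of the review text.
reviews = [
    {"score": 4, "tone": "negative"},   # positive score, negative text
    {"score": 2, "tone": "negative"},
    {"score": 5, "tone": "positive"},
    {"score": 3, "tone": "mixed"},
]

def expected_tone(score):
    # Crude mapping assumed purely for illustration.
    if score >= 4:
        return "positive"
    if score <= 2:
        return "negative"
    return "mixed"

# A mismatch means the score does not represent the written review.
mismatches = [r for r in reviews if expected_tone(r["score"]) != r["tone"]]
rate = len(mismatches) / len(reviews)
print(f"{rate:.0%} of reviews diverge from their score")  # prints "25% ..."
```

The divergence rate is the kind of summary figure such an analysis can report per review category (e.g. "Soundness").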

 

Standards for an Information Science Lab (Prof. Greifeneder, Mareen Reichardt)

The research project that spanned the longest period during Prof. Greifeneder's junior professorship was the design and construction of an Information Science laboratory for the Berlin School. As part of her newly appointed professorship, she negotiated resources to build a laboratory that would fulfill the requirements for conducting both information behavior and information retrieval studies. The iLab was officially opened in April 2016. In addition to the laboratory itself, we created a website showcasing what the iLab offers, and a booking system for the rooms and technology. While several iSchools have laboratories, few offer detailed information about what they have and, more importantly, why particular software, devices or furniture were bought. In short: a scholarly account of what an Information Science laboratory should look like was completely missing. Together with a student assistant, we studied the multitude of methods applied in Information Science to determine what design settings they require. For example, focus groups require a friendly, open atmosphere, tables and comfortable chairs for the participants, a recording system that properly captures the voices of everyone in the group, and, if possible, a camera. Many standard retrieval tests require not so much a comfortable atmosphere as a distraction-free environment in which researchers can test several participants at separate tables at the same time. Eye tracking studies, as a third example, require good lighting conditions, height-adjustable tables, chairs without armrests, and external screens, ideally with a screen size of 28 inches. We compiled a catalog of requirements for an Information Science lab and validated it by talking to researchers at iSchools that already have laboratories, and by visiting laboratories in Psychology and Computer Science departments as well as selected laboratories at user experience companies. First results have been published.

Based on this catalog of requirements, we designed the two available rooms. The larger room, 50 m², serves as the main room for user studies. It follows the idea of being as flexible as possible and as stable as needed. It has a built-in room microphone, a video camera, and a sideboard; all other equipment can be moved into the room depending on the aim of the study. We use carpets of various sizes to create spaces within the room, and different types of chairs (meeting chairs, swivel chairs and armchairs). The smaller room, 30 m², serves as the control room. It has two desktop computers, one of which can be used to monitor recordings in the other room, control the audio, and shift the video camera's focus. There is a one-way observation window between the two rooms, so that researchers can observe participants in the larger room unobtrusively. Photos of the iLab are available on the website. We bought a large selection of devices: tablets, a smartphone and a smartwatch, a mobile and a head-mounted eye tracker, audio recorders, laptops and smaller tools. Students and researchers at the Berlin School can borrow all devices for free, because we want to enable our students to conduct high-quality studies, and having the right tools is frequently a first barrier. Between its opening in May 2016 and December 2018, devices and moderator tools were borrowed over 130 times and the room was booked for research purposes more than 70 times.

 


Master's thesis: Information Behavior of unaccompanied minor refugees in consideration of the role and use of smartphones. (Leyla Dewitz)

This project was led by our former student assistant Leyla Dewitz, who in 2016 conducted interviews with unaccompanied minor refugees. She conducted the interviews in a refugee camp in southern Germany and used a participatory design method. While there is general consensus that we need to know more about the information behavior of refugees in order to improve integration, this project clearly demonstrated that we have to rethink the use of our methods with participants who cannot read or write; a participant group that, to our knowledge, has been excluded from information behavior studies so far. Asking them to draw on a map how they traveled from Syria to Germany may put them in a defensive position in interviews, because the young refugees may be unable to complete the task.

 

Dissertation: Using Reverse Image Lookup (RIL) and EXIF embedded metadata techniques to track the reuse of digital images. (Michelle Reilly)

This research focuses on the following questions: What obstacles prevent digital image repository practitioners from using Reverse Image Lookup (RIL) and EXIF embedded metadata techniques to track the reuse of digital images across the web, and what common practices among users can facilitate the tracking of their use of digital image content over the web? The project also examines what adjustments in practice information professionals can implement to more effectively discover and trace the use of digital objects over the web using RIL and EXIF.
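To make the EXIF side of these questions concrete, the sketch below shows the kind of embedded metadata involved: it hand-builds a minimal little-endian TIFF/EXIF byte blob containing an Artist tag (0x013B), then reads that tag back out. Both helper functions are hypothetical illustrations, not part of the dissertation; real tracking workflows would read such provenance tags from image files with a library such as Pillow or ExifTool.

```python
import struct

def build_tiff_with_artist(artist):
    # Hypothetical helper: builds a minimal little-endian TIFF blob
    # whose single IFD holds the ASCII Artist tag (0x013B).
    data = artist.encode("ascii") + b"\x00"         # ASCII values are NUL-terminated
    header = b"II" + struct.pack("<HI", 42, 8)      # byte order, magic 42, IFD0 offset
    value_offset = 8 + 2 + 12 + 4                   # header + entry count + entry + next-IFD ptr
    entry = struct.pack("<HHII", 0x013B, 2, len(data), value_offset)
    ifd = struct.pack("<H", 1) + entry + struct.pack("<I", 0)  # one entry, no next IFD
    return header + ifd + data

def read_artist(blob):
    # Minimal reader for the same structure: walks IFD0 and returns the
    # Artist tag, a provenance field an RIL workflow could match on.
    if blob[:2] != b"II":                           # only little-endian handled here
        return None
    magic, ifd_offset = struct.unpack_from("<HI", blob, 2)
    if magic != 42:
        return None
    (count,) = struct.unpack_from("<H", blob, ifd_offset)
    pos = ifd_offset + 2
    for _ in range(count):
        tag, typ, n, value = struct.unpack_from("<HHII", blob, pos)
        if tag == 0x013B and typ == 2:              # Artist, ASCII type
            # Values longer than 4 bytes live at an offset; short ones are inline.
            raw = blob[value:value + n] if n > 4 else struct.pack("<I", value)[:n]
            return raw.rstrip(b"\x00").decode("ascii")
        pos += 12
    return None

blob = build_tiff_with_artist("University Digital Repository")
print(read_artist(blob))  # prints "University Digital Repository"
```

The catch the dissertation points at is that such tags are trivially stripped when images are re-saved or re-hosted, which is one reason RIL and EXIF tracking are studied together.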

 

Master's thesis: The Promotion of Information Literacy by Public Libraries in Germany. Current Skillset for Teaching Librarians.

This thesis project was conducted by our former student assistant Sina Menzel. In her research, she concentrated explicitly on public libraries, asking what skillset is required for the successful promotion of information literacy by Teaching Librarians. Through methodological triangulation of a content analysis, interviews and observations, a catalogue of 67 skills was obtained, 15 of which were identified as especially significant. One of the main findings is the value of lobbying activities carried out by Teaching Librarians in German public libraries, especially for the acquisition of new users. Moreover, the field of work was found to be highly influenced by innovation; creativity and open-mindedness therefore play a significant role. The findings of this master's thesis give valid insights into the status quo in German public libraries, which may be of service to committees for information literacy. Furthermore, the catalogue of skills provides a basis for future adaptations of training curricula for Teaching Librarians.