
Humboldt-Universität zu Berlin - Institut für Bibliotheks- und Informationswissenschaft

Projects and Activities

Amerika Haus Archiv (Prof. Dr. Elke Greifeneder, Vera Hillebrand)

2018-2019. The goal of the Amerika Haus Archiv 2 project is a structural and substantive revision, followed by a relaunch, of the digital library of the Amerika Haus archive. The archive contains parts of the holdings of the library of the Amerika Haus Berlin, including the Amerika Dienst, which was published by the American Embassy between 1950 and 1994. The Amerika Dienst served to provide Germans with information about the USA after the Second World War. The publications cover topics in politics, economics, science, society, and culture. Most of the documents are unique and valuable to anyone interested in recent American-German history. In particular, the project examines whether and how older digital libraries - which focused primarily on digitizing and publishing holdings, without involving users in building the site - can be made user-oriented through a relaunch, and how much usability a relaunch actually gains.

 

Information seeking and avoidance (Prof. Dr. Elke Greifeneder, Kirsten Schlebbe, Vera Hillebrand)

We started a project on information avoidance and are currently coding the data. In our field, information avoidance is most frequently approached through the lens of how people cope with information overload. Studying information avoidance is difficult for several reasons: people are rarely aware that they avoid information; they may avoid information because the topic is sensitive (such as not reading bills when one has debts); or they do not remember when they avoided information. All of these reasons make it hard to collect data on information avoidance, yet it is one of the most critical facets of the digital transformation. We are now working with a dataset collected in classes over five years of teaching. The aim of this research project is to develop a framework for information avoidance triggers.

 

Information behavior studies in natural information environments (Prof. Dr. Elke Greifeneder)

Data validity in digital information environments was a driving force of Prof. Greifeneder’s dissertation research, and the results emphasized that it is critical for our field to know (a) how to collect valid data in online or natural environments, and (b) how to interpret these data without reaching wrong conclusions. As a follow-up to her dissertation, she published an article in JASIST on the effects of distraction on task completion scores, reflecting the need to carefully reconsider how we interpret our standard indicators when we apply them in new research contexts, i.e., in online or real-life settings.

“From this research, it is clear that the seemingly straightforward equation—if potential distraction exists, then people become distracted—is not true. What is true is that researchers cannot control a test situation in a natural environment (or choose to refrain from controlling it, because they want to keep the setting realistic). This means that they need to have information that lets them retrace the situation as much as possible. The danger of data collection in a natural environment is not that events might occur, but that researchers know nothing about them.”

(Greifeneder 2016, p. 2869)

In 2015, she published a conference paper on outliers, which describes in detail the outliers of a study conducted in users’ real-life settings. The paper argues for a more careful handling of outliers than is currently common in our field, where outliers are frequently either not reported at all, or only their number and the fact that they were eliminated from the data set is reported, without further detail or discussion of why some data do not follow the normal distribution.

In 2014, she conducted qualitative online interviews via Adobe Connect with users of the Danish virtual research environment LARM.fm, in order to uncover their contexts and needs when using such an environment. In addition to the qualitative interviews, she experimented with screen sharing and ran short remote usability tests on users’ own screens. This approach provided much better insight into users’ behavior and their interactions within the virtual research environment.

 

Standards for data reuse when studying digital behavior (Prof. Dr. Elke Greifeneder et al.)

In 2015, the head of the research department at Humboldt-Universität zu Berlin signed a cooperation agreement with the publisher Elsevier, allowing Elsevier to conduct user studies on a new academic social networking tool with researchers from Humboldt. Part of the agreement was that the data would be co-analyzed by a researcher from Humboldt-Universität zu Berlin. The data set consisted of 110 unique qualitative interviews with 81 individuals, conducted in three rounds over a period of nine months. Participants came from four countries (United Kingdom, Germany, United States, and Singapore) and from all levels of seniority, including a large number of senior researchers. The sample was far larger than any qualitative data set we could normally collect as individual researchers. Moreover, we had not conducted the interviews ourselves, but had to work with questions that had been asked by others. Library and information scientists strongly advocate for research data reuse, but we rarely see actual reuse.

 

Working with qualitative data where we as researchers could not probe further raised a number of interesting questions about data reuse. We published one article in JASIST using this data set, for which our UK colleague Sheila Pontis took the lead in the data analysis, and a second in the Journal of Documentation, for which Prof. Greifeneder coded all the interviews and carried out the analysis. In both cases, reviewers criticized us for not having conducted the interviews ourselves and, on that basis, raised strong concerns about the validity of the data. This research demonstrated that qualitative data can be reused; in this case, reuse made research possible that our team (Sheila Pontis and Ann Blandford from University College London, and Prof. Greifeneder and Kirsten Schlebbe) could not have afforded financially. Further work is needed, however, on how the reuse of qualitative data affects data validity, and there ought to be a discussion of how reviewers should deal with publications that reuse data. A challenging task for the next few years will be to find an acceptable middle road between repeatedly facing criticism for not having collected the data set oneself and giving carte blanche to researchers who reuse data from unacceptable research designs.

 

Level of detail in literature reviews (Prof. Dr. Elke Greifeneder, Kirsten Schlebbe)

A third research project on standards examines the level of detail with which literature reviews report on other studies. We analyzed a large set of publications on the use of academic social networks according to the method the respective authors used to collect usage data, looking at how prior studies had recruited participants as well as at the size and diversity of their samples. We then took a second look at the literature reviews of these publications and examined in what detail other researchers reported on methods and samples. Our analysis shows that the level of detail in the literature reviews we examined was rather low: if study parameters were mentioned at all, the authors described at most one or two. In other cases, information about the research design or the sample characteristics of the cited studies was missing completely. Even when studies directly compared their findings with previous studies, essential information was lacking. The paper was published as a preliminary paper at iConference 2017 and was runner-up for the best preliminary paper award.

 

Research quality in conference reviews (Prof. Dr. Elke Greifeneder, Prof. Toine Bogers, PhD (University of Aalborg))

One way to promote standards for methods and data quality in our field is to study review data, which show how other researchers define what acceptable and good practice is. This project examines data from reviews. In 2015, our Danish collaborator Prof. Toine Bogers and we received permission from the iSchools executive committee, and IRB approval from Humboldt-Universität zu Berlin, to analyze all of the reviewing data from the iConferences 2014 and 2015. This data set includes 745 individual authors and reviewers and 1,265 reviews. Among other research questions, we wanted to determine whether the reviewer scores, which are intended to summarize the review, were representative of the written reviews; we therefore compared official review scores with a manual coding of the review texts. Review texts were coded according to how they represented the review components and the tone of the review. Overall, we observed that the review scores were often not representative of the written reviews. The review category “Soundness”, which describes the quality of the research design and of the analysis, was of special interest to us. This study was so well received that we were granted access to the iConference review data for the years 2016 to 2019, as well as the same data for the ASIS&T conferences 2017 to 2019. We are currently analyzing those data.
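The comparison step can be illustrated with a minimal sketch. All review data below are invented, and the score-to-tone mapping is an assumption for illustration, not the study's actual coding scheme:

```python
# Invented illustration: checking whether a review's official score
# matches the tone coded from its review text.

def expected_tone(score):
    # crude assumed mapping: high scores should read positive, low negative
    return "positive" if score >= 4 else "negative" if score <= 2 else "neutral"

# (official score, tone coded from the review text) -- made-up pairs
reviews = [(5, "positive"), (4, "negative"), (2, "negative"),
           (1, "positive"), (3, "neutral")]

matches = sum(expected_tone(score) == tone for score, tone in reviews)
agreement = matches / len(reviews)
print(f"{agreement:.0%} of scores are representative of the coded tone")
```

In the actual study, the coding covered the review components as well as the tone; this sketch only shows the final agreement calculation.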

 

Standards for an Information Science Lab (Prof. Dr. Elke Greifeneder, Mareen Reichardt)

The research project that spanned the longest period of Prof. Greifeneder's junior professorship was the design and construction of an Information Science laboratory for the Berlin School. As part of her appointment, she negotiated resources to build a laboratory that would meet the requirements for conducting both information behavior and information retrieval studies. The iLab was officially opened in April 2016. In addition to the laboratory itself, we created a website showcasing what the iLab offers, as well as a booking system for the rooms and the technology. While several iSchools have laboratories, few offer detailed information about what they have and, more importantly, why particular software, devices, or furniture was bought. In short, a scholarly account of what an Information Science laboratory should look like was completely missing. Together with a student assistant, we studied the multitude of methods applied in Information Science to see what design settings they require. For example, focus groups require a friendly, open atmosphere, tables and comfortable chairs for the participants, a recording system that properly captures the voice of everyone in the group, and, if possible, a camera. Many standard retrieval tests require not so much a comfortable atmosphere as a distraction-free environment where researchers can test several participants at separate tables at the same time. Eye tracking studies, as a third example, require good lighting conditions, height-adjustable tables, chairs without armrests, and external screens, ideally with a screen size of 28 inches. We compiled a catalog of requirements for an Information Science lab and validated it by talking to researchers at iSchools that already have laboratories, and by visiting laboratories in Psychology and Computer Science departments as well as selected laboratories at user experience companies. First results are published here.

Based on this catalog of requirements, we designed the two available rooms. The larger room, 50 m², serves as the main room for user studies. It follows the principle of being as flexible as possible and as stable as needed: it has a built-in room microphone, a video camera, and a sideboard, and all other equipment can be moved into the room depending on the aim of the study. We use carpets of various sizes to create spaces within the room and different kinds of chairs (meeting chairs, swivel chairs, and armchairs). The smaller room, 30 m², serves as the control room. It has two desktop computers, one of which can be used to monitor recordings in the other room, control the audio, and shift the video camera's focus. A one-way observation window between the two rooms lets researchers observe participants in the larger room unobtrusively. Photos of the iLab are available on the website. We bought a wide range of devices: tablets, a smartphone and a smartwatch, a mobile and a head-mounted eye tracker, audio recorders, laptops, and smaller tools. Students and researchers at the Berlin School can borrow all devices for free, because we want to enable our students to conduct high-quality studies, and having the right tools is frequently a first barrier. Between the opening in 2016 and December 2018, devices and moderator tools were borrowed over 130 times and the rooms were booked for research purposes more than 70 times.

 

Master's thesis: Information Behavior of unaccompanied minor refugees in consideration of the role and use of smartphones (Leyla Dewitz)

This project was led by our former student assistant Leyla Dewitz, who in 2016 conducted interviews with unaccompanied minor refugees. She conducted the interviews in a refugee camp in southern Germany using a participatory design method. While there is general consensus that we need to know more about the information behavior of refugees in order to improve integration, this project clearly demonstrated that we have to rethink our methods when working with participants who cannot read or write; a participant group that, to our knowledge, had so far been excluded from information behavior studies. Asking them to draw on a map how they traveled from Syria to Germany may put them on the defensive in interviews, because the young refugees may be unable to complete the task.

 

Dissertation: Using Reverse Image Lookup (RIL) and EXIF embedded metadata techniques to track the reuse of digital images (Michelle Reilly)

This research focuses on the following questions: What obstacles prevent digital image repository practitioners from using Reverse Image Lookup (RIL) and EXIF embedded metadata techniques to track the reuse of digital images across the web? What common practices among users can facilitate the tracking of their use of digital image content over the web? The project also examines what adjustments information professionals can make in order to discover and trace the use of digital objects over the web more effectively using RIL and EXIF.

 
Bachelor's thesis: A Comparative Analysis of Image Content, Description, and Hashtags Using the Example of the Eating Disorder Recovery Process on Instagram (Paulina Bressel)

The goal of this Bachelor's thesis was to provide an overview of the use of social media during the recovery process from eating disorders. To this end, a hashtag analysis of publicly shared data on Instagram was conducted over a period of two weeks, and the collected posts were then analyzed qualitatively. The aim was to identify relationships between the post components image, description, and hashtag, and to define the notion of recovery in relation to social media use. On this basis, the probability that a hashtag analysis yields topic-relevant results was examined.
Compared with psychological studies on eating disorder recovery (McNamara & Parsons, 2016; Bardone-Cone, Hunt & Watson, 2018), the results of this study largely matched expectations. Posts on Instagram are used to share personal experiences and stories, as well as a means of getting in touch with other affected individuals. The strong sense of community around this topic is particularly striking, despite the lack of supporting structures on Instagram (Chancellor, Pater, Clear, Gilbert & De Choudhury, 2016).
The analysis of the relationships between the post components showed that topic-relevant results can be found with a probability of 69%, although, as expected, this depends heavily on the hashtags used.
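The relevance calculation behind such a hashtag analysis can be sketched as follows. The hashtags and coding decisions below are invented for illustration and are not the thesis data:

```python
# Invented sketch: share of collected posts coded as topic-relevant,
# per hashtag and overall. The thesis reported roughly 69% overall.
from collections import Counter

# (hashtag, coded as topic-relevant?) -- made-up coding results
coded_posts = [("#recovery", True), ("#recovery", True), ("#recovery", False),
               ("#edrecovery", True), ("#edrecovery", True),
               ("#selfcare", False), ("#selfcare", False), ("#selfcare", True)]

totals, relevant = Counter(), Counter()
for tag, is_relevant in coded_posts:
    totals[tag] += 1
    relevant[tag] += is_relevant  # bool counts as 0 or 1

for tag in totals:
    print(tag, f"{relevant[tag] / totals[tag]:.0%}")

overall = sum(relevant.values()) / len(coded_posts)
print("overall", f"{overall:.0%}")
```

The per-hashtag rates make visible how strongly the overall probability depends on which hashtags are included in the sample.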

 
Master's thesis: The Promotion of Information Literacy by Public Libraries in Germany. Current Requirements for Teaching Librarians. (Sina Menzel)

This thesis project was carried out by our former student assistant Sina Menzel. Its goal was to create a catalog of requirements for the successful teaching of information literacy in public libraries. Through the triangulation of three methods (content analysis, interviews, observations), 67 competencies were identified, from which 15 core competencies were separated out using various indicators. One result was evidence of the importance of the advocacy work that teaching librarians perform for public libraries, above all in attracting new users. Another is that the work of teaching librarians thrives on innovation: creativity and openness to new ideas, especially with regard to event formats and digital media, therefore rank highly among the core competencies. The results allow conclusions about the status quo of information literacy teaching in public libraries, which had not previously been studied in this form. They can serve as a basis for the work of library committees on information literacy and should be taken into account in designing the curricula of library degree and training programs.