PRODUCT & SOLUTION

Design and Evaluation of a COVID-19 Literature Search Engine that Supports Collaborative Information Seeking 

Why this topic?

The COVID-19 pandemic has changed the way clinical/medical researchers work together. Yet no existing literature search product offers a collaborative search feature. In addition, researchers need to find the latest COVID-19-related literature. So how might we develop a product that supports research teamwork?

TOPIC

+ Information Science

+ Online Teamwork

From a design perspective

MY ROLE

+ Researcher

METHOD

+ Research Method: Prototyping, semi-structured interview, survey, SUS scale, heuristic evaluation, usability test (convenience sampling)

+ Data Analysis Method: Qualitative coding (open coding - focused coding - thematic coding), statistical calculation (mean & standard deviation), affinity grouping

TOOL

+ Data Collection Tool: Google Forms, WebEx audio and screen recording, Miro

+ Data Analysis Tool: NVivo, Excel, Qcamap.org

Publication

(In progress) MedInfo 2021

Introduction of the topic and domain | Literature review

Collaborative Information Seeking (CIS) Can Help Clinical/Medical Research Teams

Expected Outcome : Information Query System (IQS) interaction and user interface design.

Research Method : A user study that captures the big picture of online teamwork, instead of the traditional measurement of algorithm efficiency.

Research question

How might a user interface design for the literature collection and review task bridge perceptual gaps between clinical/medical research team members?

Q1: How can CIS inform the interaction and user interface design to help bridge the perceptual gap between team members?

Q1-1: What is the literature collection and review process like in a clinical/medical team?
Q1-2: What roles are involved in clinical/medical literature collection and review?
Q1-3: How can providing awareness improve innovation efficiency?

Q2: How might we improve the proposed prototype system?

Q2-1: What are the usability issues?
Q2-2: How might we fix them?

method

An Iterative Process: Prototyping - Evaluation

Data Collection

Design: prototype 0

The Best Guess at a Search Engine Infrastructure

Drafting : 

Wireframes and Design : Bootstrap Studio (HTML framework + CSS)

Implementation : Clickable prototype with a back-end algorithm

evaluation: prototype 0

Heuristic Evaluation: To Fix Minor Usability Issues

Aim : To keep participants from focusing on less important issues in the later lab evaluation, I decided to run a heuristic evaluation on the implemented prototype 0, which already contained the back-end algorithm.

Recruitment : Design experts with an educational background in DAAP, N=3

Steps : 

Individually find as many usability problems as possible

Group brainstorm : share problems and propose design ideas

The design representative talks with the development team to set development priorities

Online Session : Miro + WebEx

Data Analysis : Affinity grouping according to a severity criterion

Design: prototype 1

Minor but Important Issues Fixed

The key findings are:

  • The gap between the design deliverable and the front-end implementation.
  • Purely technical issues.
  • Neglect of general human reactions.
  • Insufficient consideration of usage scenarios.
  • The dilemma of introducing the specialized features of the IQS.

Final Decision : High priority: minor but important usability issues. Low priority: functions related to CIS

Design :

Implementation : 

evaluation: prototype 1

Lab Evaluation: A Combined User Study

Aim : To learn from users and reduce recruitment difficulty

Recruitment : Researchers in the College of Medicine: professors, associate professors, graduate research assistants, and a librarian fellow. N=12.

Steps : 

Background Survey

Usability test : Realistic Tasks + SUS Scale

Semi-structured Interview

Tools : Google Forms, WebEx, WebEx audio and screen recording

evaluation: prototype 1

Lab Evaluation: Background Survey

Aim : 

  • Collect demographic information for the lab evaluation
  • Collect search behaviors and collaboration behaviors
  • Provide supportive evidence, given the limited sample size

Steps : 

Demographic info

Search behavior

Collaboration behavior

Data Analysis : Statistical charts

Results : Top choices reported. Because of the small sample size, these serve mainly as a reference for the later lab evaluation results (a sketch of the tallying and charting appears below).
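
As a rough illustration of this analysis step, here is a minimal Python sketch of how multiple-choice survey answers exported from Google Forms could be tallied and charted (assuming matplotlib is available); the question, answer options, and counts below are placeholders I invented, not the actual survey instrument or results.

```python
# Minimal sketch: tally multiple-choice survey answers and chart the top choices.
# The answers below are illustrative placeholders, not study data.
from collections import Counter
import matplotlib.pyplot as plt

# e.g., answers to "Which literature search engine do you use most often?"
answers = ["PubMed", "PubMed", "Google Scholar", "PubMed", "Scopus",
           "Google Scholar", "PubMed"]

counts = Counter(answers)
labels, values = zip(*counts.most_common())  # sorted from most to least chosen

plt.bar(labels, values)
plt.title("Most-used search engines (placeholder data)")
plt.ylabel("Number of participants")
plt.show()
```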

evaluation: prototype 1

Lab Evaluation (Usability Test): Realistic Tasks + SUS Scale

Aim : Learn usability issues from the user's perspective

Steps : 

PubMed search: search behavior observation

Search with the IQS system: keyword+term, ID+term, ID+concept

Semi-structured interview: usability from the user's perspective

Usability scale: SUS

Data Analysis : 
Data of realistic tasks : usually assessed by task completion; in this study, qualitative coding (open coding + focused coding)
Data of SUS scale : mean and standard deviation calculation (see the sketch below)
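
To make the SUS analysis step concrete, here is a minimal Python sketch of the standard SUS scoring (odd items scored as response - 1, even items as 5 - response, the sum multiplied by 2.5) followed by the mean and standard deviation; the responses listed are placeholders, not the study data.

```python
# Minimal sketch: convert raw 10-item SUS responses (1-5 Likert) into 0-100
# scores, then compute the mean and standard deviation across participants.
from statistics import mean, stdev

def sus_score(items):
    """items: the 10 Likert responses (1-5) in questionnaire order."""
    assert len(items) == 10
    odd = sum(items[i] - 1 for i in range(0, 10, 2))   # positively worded items
    even = sum(5 - items[i] for i in range(1, 10, 2))  # negatively worded items
    return (odd + even) * 2.5                          # rescale to 0-100

# Placeholder responses (not real participants).
responses = [
    [4, 2, 4, 1, 5, 2, 4, 2, 4, 2],
    [5, 1, 4, 2, 4, 2, 5, 1, 4, 2],
]
scores = [sus_score(r) for r in responses]
print(f"mean = {mean(scores):.1f}, sd = {stdev(scores):.1f}")
```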

Online Session : 

Findings from realistic tasks : 

  • The user’s mental model 
    The understanding of the IQS from a user perspective
  • Malfunctions 
    Visibility of system status
    Match between the system and the real world
    User control and freedom
  • Low Readability
    Recognition rather than recall
    Aesthetic and minimalist design
    Visibility of system status
  • System errors
    A simple collection of implementation problems that occurred

Result from SUS scale : 

  • A SUS score above the commonly cited average of 68 (the current IQS is doing OK and could be improved)
  • An emerging categorization: group A vs. group B (helpful for the following sections)

design: A part of Prototype 2

Based on Usability Test Results

Login page

Landing page

Searching page

Q&A page

Q&A details

exploratory research: prototype 2

Lab Evaluation: Semi-structured Interview

Aim : Collect factual experiences as exploratory data

Data Analysis : Thematic coding (open coding + focused coding)

Steps : 

Semi-structured interview: facts, attitudes, reasoning, and opinions

Exit interview: questions and snowball recruitment

Finding 1 : Categorization of tea-sonas and personas. There are two types of teams and two types of members.

Finding 2 : The tea-sonas and personas that helped me understand the users.

Finding 3 : Experience maps that show the big picture of the users' current activities and reveal opportunities.

Finding 4 : A journey map that lays out the design ideas.

Future Direction & Limitation

How Might I Do Better?

Limitation

  • Recruitment
    Far more male than female participants; more female participants could be recruited in the future
    Small sample size, expected to cover about 85% of the problems (see the formula after this list)
  • Design of the Background Survey
    Misunderstanding of the qualification question (N=10 or 12)
  • Balance of Time and Resources
    Only two rounds of prototyping and evaluation; more could be run in the future
  • Marketing Considerations
    It would be better to have data on marketing-related factors
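
For reference, the "about 85%" figure comes from the commonly cited usability-testing model of Nielsen and Landauer: if each participant uncovers a given problem with probability $\lambda$ (about 0.31 in their published data, not a value measured in this project), the expected share of problems found with $n$ participants is

\[
\text{ProblemsFound}(n) = 1 - (1 - \lambda)^{n},
\]

which is roughly 85% at $n = 5$ and higher for this study's larger sample.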

Future Direction (Research)

  • The Tension Between Individual Contribution and Team Efficiency
    Efficiency involves redundancy and novelty, and individual versus team efficiency. This topic cannot be studied without an established CIS system.
  • High Learning Cost of Using CIS Systems
    Design pattern
  • Reflection on Heuristic Evaluation Criteria
    The heuristic evaluation criteria were established in 1990 and revisited in 2007

Future Direction (Project)

  • Visual Based Communication
    For example, communication on protein structures
  • Future Collaborative Functions
    Search by topic, cross-bubble opportunities
  • Trust
    Privacy and transparency

Reflections

  • Web Design Size
    I used 1680px × 1050px as the canvas size for the IQS website, simply because I could easily preview the design on my own screen. The ideal size would be 1920px × 1080px, made responsive to other sizes afterward. Since the IQS is mainly used on desktop, I haven't made it responsive to smaller screens such as smartphones.