Getting started with quallmer

The quallmer package helps qualitative researchers use large language models for tasks such as coding, annotation, and thematic analysis. It is user-friendly and does not require extensive programming knowledge, making it accessible to researchers from various backgrounds.

Our tutorials provide a brief introduction to the quallmer package, which is designed to facilitate the use of large language models (LLMs) for qualitative research tasks. The package relies on the ellmer package for LLM interactions, providing a seamless interface for working with different LLM providers. For more information on ellmer and the LLM providers it supports, please refer to its documentation.

Basic usage

The quallmer package is written for use in R. Please make sure you have recent versions of R and RStudio installed on your computer. If you are new to R and RStudio, instats offers a great, free-of-charge 1.5-hour introduction to both.

To get started with quallmer, you first need to install the package from GitHub.

# If you don't have pak installed yet, uncomment and run the following line:
# install.packages("pak")
# Then, install quallmer using pak:
pak::pak("quallmer/quallmer")

Then, you can load the package and begin using its functions.

library(quallmer)
#> Loading required package: ellmer
#> Warning: package 'ellmer' was built under R version 4.5.2

The quallmer workflow

The typical quallmer workflow consists of five steps:

  1. Define your codebook with qlm_codebook()
  2. Code your data with qlm_code()
  3. Replicate with different settings using qlm_replicate()
  4. Compare results with qlm_compare()
  5. Document everything with qlm_trail()

For a hands-on introduction with code examples, see The quallmer workflow.
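To give a feel for how these steps fit together, here is a minimal sketch in R. The function names are those listed above, but the argument names and return values shown are assumptions for illustration only, not the package's documented signatures; consult the help pages (e.g. `?qlm_code`) for the actual interface.

```r
library(quallmer)

# Some example data: short open-ended survey responses
texts <- c(
  "I really enjoyed the course and learned a lot.",
  "The workload was overwhelming and the pacing too fast."
)

# 1. Define a codebook (assumed form: named descriptions of each code)
codebook <- qlm_codebook(
  positive = "Statement expresses a favourable view",
  negative = "Statement expresses an unfavourable view"
)

# 2. Code the data with an LLM
coded <- qlm_code(texts, codebook = codebook)

# 3. Replicate the coding run, e.g. with a different model or temperature
runs <- qlm_replicate(coded, times = 3)

# 4. Compare the replications (e.g. agreement between runs)
qlm_compare(runs)

# 5. Document the session in an audit trail
qlm_trail(coded)
```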

Setting up LLM access

Before using large language models, you need to set up access to an LLM provider:

  1. Signing up for an OpenAI API key: Obtain an API key from OpenAI to use models like GPT-4o.

  2. Working with an open-source Ollama model: Use open-source models locally with Ollama.

The quallmer package supports multiple LLM providers through the ellmer package. For more information, see the ellmer documentation.
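Because quallmer builds on ellmer, provider setup follows ellmer's conventions. The sketch below shows the two options listed above; `chat_openai()` and `chat_ollama()` are ellmer functions, while the model names are examples you may substitute.

```r
# Option 1: OpenAI. ellmer reads the key from the OPENAI_API_KEY
# environment variable; setting it in your .Renviron file is safer
# than putting it in a script.
Sys.setenv(OPENAI_API_KEY = "sk-...")  # placeholder, use your own key
chat <- ellmer::chat_openai(model = "gpt-4o")

# Option 2: a local open-source model via Ollama. This requires the
# Ollama application to be installed and the model pulled beforehand,
# e.g. by running `ollama pull llama3.2` in a terminal.
chat <- ellmer::chat_ollama(model = "llama3.2")
```

Whichever provider you choose, the resulting chat object is what ellmer (and, through it, quallmer) uses to talk to the model.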