
Generative AI & Academic Use

Evaluating AI Outputs

In addition to evaluating the tools you use with the ROBOT tool, it's crucial to assess the material that AI tools produce in response to your prompts. These tools are probabilistic and often produce hallucinations: outputs that sound confident and informative but are fabricated and can be wrong. In libraries, we regularly see these tools invent citations for publications that look real but don't exist.

Questions to Ask

Here are some questions to consider when evaluating output from generative AI tools:

  • Date: When was the information created? Has it been updated?
  • Authority: Who created the information? What is their authority and what are their credentials? What is their point of view? What possible biases might they have?
  • Purpose: What is the information source’s purpose? Why was it created? Who is the intended audience?
  • Documentation: What sources are cited in this information? If none, is there another way to verify the information?

You might notice that these questions are difficult (or sometimes even impossible) to answer when using generative AI tools. You will have to decide how this affects whether and how you use the information you get from these tools.

From: Boston College Libraries, Generative AI Guide.

Fact-Checking AI


Fact-Check AI from UCSB Library on Vimeo

Access Fact-Check AI Descriptive Transcript