How to identify LLM application hallucinations?


Namaste and Welcome to Build It Yourself.

Hallucination is a major challenge when building LLM-based applications.

What if we could identify when our LLM application has hallucinated?
And not just hallucination: what if we could also detect sensitive information disclosure and other security issues?

In this article, we will do exactly that with the help of Giskard. So, let’s dive in.

If you are a senior IT professional (with 10+ years of experience) looking to learn AI + LLM in simple language, check out the courses and other details – https://www.aimletc.com/online-instructor-led-ai-llm-coaching-for-it-technical-professionals/

Identify LLM application hallucination and other security vulnerabilities

LLM applications are vulnerable, and many things can go wrong with them. For example:

  • An LLM application can hallucinate or provide misinformation
  • It can disclose sensitive or proprietary information
  • Attackers can use prompt injection to change the application’s behaviour
  • The LLM application’s service can be disrupted
  • and more

In this article, we will scan an LLM application and check whether it suffers from any of the above vulnerabilities. We will be using Giskard to do so.

You can find the code notebook here – https://github.com/tayaln/Evaluating-LLM-Giskard

To use it, download the notebook, upload it to your Google Drive, and open it as a Google Colab file.
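The heart of the notebook is Giskard’s scan. Here is a minimal sketch of how it fits together, based on Giskard’s documented Python API; the `answer_question` function is a stand-in for your own application logic:

    import pandas as pd
    import giskard

    # Stand-in for your own application logic,
    # e.g. a RAG pipeline or a direct OpenAI chat call
    def answer_question(question: str) -> str:
        return "This is where your application's answer would go."

    # Giskard calls this with a DataFrame and expects one answer per row
    def model_predict(df: pd.DataFrame) -> list:
        return [answer_question(q) for q in df["question"]]

    # Wrap the application so Giskard knows what it does
    giskard_model = giskard.Model(
        model=model_predict,
        model_type="text_generation",
        name="Question-answering application",
        description="Answers user questions about <your domain>",
        feature_names=["question"],
    )

    # Run the LLM scan: it probes for hallucination, prompt injection,
    # sensitive information disclosure, harmful content, and more
    scan_results = giskard.scan(giskard_model)

    # Save the findings as an HTML report (it also renders inline in Colab)
    scan_results.to_html("scan_report.html")

One detail worth noting: Giskard uses the name and description you provide to generate domain-relevant probes, so describe what your application is actually supposed to do.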

Pre-requisites

– An open mind to learn new things

– An OpenAI account

– An OpenAI API key (please note: the entire scan might cost you around $3)
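Giskard’s LLM scan itself calls an OpenAI model to generate probes and judge your application’s answers, which is where the roughly $3 comes from. A minimal setup sketch (the key value is a placeholder for your own key):

    import os

    # Giskard picks up the OpenAI key from the environment;
    # "sk-..." is a placeholder, paste your own key here
    os.environ["OPENAI_API_KEY"] = "sk-..."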

Concepts we discussed in this article

  1. Giskard
  2. Hallucination
  3. LLM security

Connect with me on LinkedIn – https://www.linkedin.com/in/nikhileshtayal/

Here is how I learned AI as a non-technical person in 4 months for free.

Let’s learn to build a basic AI/ML model in 4 minutes (Part 1)

Happy learning!

Share it with your senior IT friends and colleagues
Nikhilesh Tayal