
Studying with the help of AI


Studying can often be a passive experience. You read through pages of text and notes, maybe highlight some sections, and hope the information sticks. But there’s a gap between reading and actually understanding the content.

Reading through a 50-page PDF doesn’t tell you whether you actually understand the material. You might feel like you’ve absorbed the information, but without active recall and testing, most of it won’t stick.

And even if you take an active role, you'd need to craft your own study material by hand. That takes far too long and may not even produce effective material.

What if we could bridge that gap with AI? What if you could take your study material and use it in an active, engaging way?

Let’s walk through it!

MVP: Quiz generation #

Let’s start with a simple end-to-end version of this where we focus only on generating a quiz from a document.

So let’s break down what we need:

  1. User uploads a document
  2. We extract the text from the document
  3. The user requests to generate study material from the text (in this case, a quiz)
  4. We generate the quiz questions
  5. The user takes the quiz
  6. The user receives feedback on their answers
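To make these steps concrete, here's a rough sketch of how they might wire together in a FastAPI endpoint (the names are illustrative; FileProcessor and the quiz generation call are fleshed out in the sections below):

from fastapi import FastAPI, UploadFile, HTTPException

app = FastAPI()

@app.post("/quiz")
async def create_quiz(file: UploadFile):
    processor = FileProcessor()
    # Steps 1-2: validate the upload and extract its text
    if not processor.validate_file(file):
        raise HTTPException(status_code=400, detail="Unsupported or oversized file")
    text = processor.extract_text(file)
    # Steps 3-4: generate quiz questions from the extracted text
    return generate_quiz(text)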

User document to text #

Before we can generate questions, we need to extract readable text from input documents.

We don’t have to do anything fancy here. We can use a simple file upload form with a drag-and-drop interface.

We can stick to plain text and PDF documents for now. We'll enforce this on the backend and keep file sizes to a reasonable limit.

Finally, we’ll extract the text from the document.

File processor implementation
from fastapi import UploadFile

# Only plain text and PDF uploads are accepted
SUPPORTED_TYPES = {"application/pdf", "text/plain"}

class UnsupportedFileTypeError(Exception):
    pass

class FileProcessor:
    def extract_text(self, file: UploadFile) -> str:
        # Route extraction based on the uploaded file's MIME type
        if file.content_type == "application/pdf":
            return self._extract_pdf_text(file)
        elif file.content_type == "text/plain":
            return self._extract_text_content(file)
        else:
            raise UnsupportedFileTypeError(f"Unsupported file type: {file.content_type}")

    def validate_file(self, file: UploadFile) -> bool:
        # Size limit (5 MB) and MIME type validation
        return file.size <= 5_000_000 and file.content_type in SUPPORTED_TYPES
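The two private extraction helpers aren't shown above. Here's one possible implementation, assuming pypdf handles the PDF parsing (the post doesn't name a library):

from fastapi import UploadFile
from pypdf import PdfReader  # assumption: any PDF text-extraction library works

class FileProcessor:
    # ...validation and dispatch as above...

    def _extract_pdf_text(self, file: UploadFile) -> str:
        # UploadFile wraps a file-like object that PdfReader can read directly
        reader = PdfReader(file.file)
        return "\n".join(page.extract_text() or "" for page in reader.pages)

    def _extract_text_content(self, file: UploadFile) -> str:
        # Plain text: read the raw bytes and decode, tolerating odd characters
        return file.file.read().decode("utf-8", errors="replace")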

Question generation via LLM #

The heart of the system is a way to transform raw text into structured quiz questions. Rather than trying to build our own question generation model, we’ll steer an existing LLM instead.

Modern LLMs are exceptionally good at this task. Let’s use a third-party LLM provider for now. We’re going to prompt it with a set of instructions, feed the extracted text, and have it generate a JSON response.

Core prompt structure
prompt = """
You are an educational quiz generator that creates high-quality multiple-choice questions.

Guidelines:
- Generate exactly 5 questions per request
- Focus on key concepts and understanding from the provided text
- Create 4 plausible answer options (A, B, C, D)
- Provide clear explanations for correct answers
- Ensure questions are appropriate for college-level students
- Avoid trick questions or ambiguous phrasing
- Questions should test comprehension, not memorization
- Make incorrect options plausible but clearly wrong to someone who understands the material

Response Format:
Return your response as a JSON object with this exact structure:
{
  "questions": [
    {
      "question": "Clear, well-formed question text",
      "options": ["A) First option", "B) Second option", "C) Third option", "D) Fourth option"],
      "correct_answer": "B",
      "explanation": "Detailed explanation of why this answer is correct"
    }
  ]
}

Ensure the JSON is valid and properly formatted.
"""

Taking the quiz with feedback #

The quiz itself is simple multiple choice: users answer each question and receive immediate feedback on their answers.

After the user completes a quiz, the system presents the following:

  • Overall score and breakdown
  • Which questions you got wrong and why
  • Explanations for all answers, not just incorrect ones
  • The ability to take another quiz with the same material
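Since each generated question already carries its correct answer and an explanation, grading can happen entirely on our side. A minimal sketch of what that scoring and feedback could look like (the result shape here is illustrative, not taken from the post):

def grade_quiz(questions: list[dict], user_answers: list[str]) -> dict:
    # Compare each chosen letter against the stored correct_answer
    results = []
    for question, answer in zip(questions, user_answers):
        correct = answer == question["correct_answer"]
        results.append({
            "question": question["question"],
            "your_answer": answer,
            "correct_answer": question["correct_answer"],
            "correct": correct,
            "explanation": question["explanation"],  # shown for every question
        })
    score = sum(r["correct"] for r in results)
    return {"score": f"{score}/{len(results)}", "breakdown": results}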

This immediate feedback loop is crucial for learning. You can identify knowledge gaps right away rather than discovering them during an exam.

Why this is effective #

Unlike static study aids or generic quiz platforms, this system adapts to whatever material you’re studying with minimal effort. Upload a biology textbook chapter, get biology questions. Switch to a computer science paper, get technical questions appropriate to that content.

The LLM’s ability to understand context and generate relevant questions means you’re always getting targeted practice on the concepts that matter for your specific material. The explanations help reinforce correct understanding, not just identify right answers.

Most importantly, the entire process takes seconds instead of hours. This removes friction from the study process - you can generate a quick quiz during a study break or create multiple quizzes for spaced repetition practice.

Demo #

Check out the live demo at: https://study-buddy-frontend-1vqc.onrender.com

Building blocks for future features #

We built a pretty simple MVP, but the architecture sets up natural extensions.

We could add flashcard generation, study summaries, concept mapping, and a lot more!