Let's build a chat-with-PDF app.

Back story

I recently did a hackathon and made Marill, a chat-with-PDF app. Here is how I implemented it.

Caution - I built this project in just 8 hours and made some poor architecture choices, gluing everything together. It is not meant for production use; it's just a basic example of how something like this could be built.

Tech Stack
  1. Next.js
  2. TypeScript
  3. Uploadthing/Cloudflare R2
  4. Supabase (PostgreSQL)
  5. Drizzle ORM
  6. pdf-parse-fork package
  7. OpenAI API
Flow of Program
  1. User logs in.
  2. User uploads a PDF from the client side; it is sent to object storage (Uploadthing).
  3. Uploadthing returns a file key and URL, which are stored in the Supabase database (PostgreSQL) using Drizzle ORM.
  4. The server then downloads the PDF from that URL.
  5. The PDF is parsed with the pdf-parse-fork package to extract its text content.
  6. The text content is added to the database and to a React global context state.
  7. The PDF is deleted from the server.
  8. The user asks a question on the client; the question plus the text content from state is sent to the OpenAI API through a server action.
  9. The answer is then streamed back to the client.
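One practical wrinkle in step 8: the whole extracted text is sent with every question, and gpt-3.5-turbo has a limited context window. A hypothetical helper (not part of the original project) that caps the PDF text to a character budget before it goes into the prompt could look like this:

```typescript
// Hypothetical helper (not in the original repo): cap the PDF text so the
// prompt stays within the model's context window. Roughly 4 characters per
// token is a common rule of thumb for English text.
export function capPdfText(content: string, maxChars = 12000): string {
  if (content.length <= maxChars) return content
  // keep the beginning of the document and mark where it was cut
  return content.slice(0, maxChars) + '\n[...truncated...]'
}
```

A production app would chunk the PDF and retrieve only relevant passages instead, but a hard cap is enough for a hackathon-scale demo.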

Here is the GitHub repository if anyone wants to go through the code.

Some code snippets I want to walk through
// schema.ts
import { pgTable, serial, text, varchar } from 'drizzle-orm/pg-core'

export const users = pgTable('users', {
  id: serial('id').primaryKey(),
  email: text('email').unique().notNull(),
  name: varchar('name')
})

export const pdf = pgTable('pdf', {
  id: serial('id').primaryKey(),
  name: text('name').notNull(),
  url: text('url').notNull(),
  key: text('key').notNull(),
  content: text('content'),
  userId: serial('user_id').references(() => users.id)
  // filePath: text("file_path").notNull(),
})

export const chats = pgTable('chats', {
  id: serial('id').primaryKey(),
  createdAt: text('created_at').notNull(),
  userId: serial('user_id').references(() => users.id),
  pdfId: serial('pdf_id').references(() => pdf.id)
})

export const messages = pgTable('messages', {
  id: serial('id').primaryKey(),
  content: text('content').notNull(),
  createdAt: text('created_at').notNull(),
  chatId: serial('chat_id').references(() => chats.id)
})

Here every pdf row has a userId for the user who uploaded it, the Uploadthing URL, the extracted content, and a unique key.

[!NOTE] Due to lack of time I was unable to wire up the chats and messages tables, but this is a good starting schema for anyone building something similar.
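The action file below imports db from @/lib/db/client, which isn't shown in the snippets. Assuming the postgres-js driver (one of several drivers Drizzle supports for Supabase), a minimal client setup would look roughly like this:

```typescript
// lib/db/client.ts — a minimal sketch, assuming the postgres-js driver.
// DATABASE_URL is the Supabase connection string from the project settings.
import { drizzle } from 'drizzle-orm/postgres-js'
import postgres from 'postgres'

const queryClient = postgres(process.env.DATABASE_URL!)
export const db = drizzle(queryClient)
```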

'use server'

import fetch from 'node-fetch'
import fs from 'fs'
import pdfParse from 'pdf-parse-fork'
import path from 'path'
import { fileURLToPath } from 'url'
import { dirname } from 'path'
import { db } from '@/lib/db/client'
import { pdf } from '@/lib/db/schema'
import OpenAI from 'openai'

const __filename = fileURLToPath(import.meta.url)
const __dirname = dirname(__filename)

export async function addFiletoDB(name: string, url: string, key: string) {
  const content = await doPDF(url, 'temp.pdf')
  await db.insert(pdf).values({
    name,
    url,
    key,
    content,
    userId: 1 // hard-coded for the hackathon; should come from the session
  })
  return content
}

export async function doPDF(url: string, file_name: string) {
  return await readPDF(url, file_name)
}

async function downloadPDF(url: string, destination: string) {
  try {
    const response = await fetch(url)
    const buffer = await response.buffer()
    fs.writeFileSync(destination, buffer)
    console.log('PDF downloaded successfully')
  } catch (error) {
    console.error('Error downloading PDF:', error)
  }
}

export async function readPDF(url: string, file_name: string) {
  const filePath = path.resolve(__dirname, file_name)
  await downloadPDF(url, filePath)
  if (!fs.existsSync(filePath)) {
    console.error('File not found:', filePath)
    return ''
  }
  const data = fs.readFileSync(filePath)
  const content = await pdfParse(data)
  fs.unlinkSync(filePath) // clean up the temporary file (step 7 of the flow)
  return content.text
}

export async function chatwithOpenAI(content: string, question: string) {
  const openai = new OpenAI({
    apiKey: process.env.OPENAI_API_KEY
  })
  const response = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo',
    messages: [
      {
        role: 'system',
        content: `You are a teacher answering questions about a PDF. Here is the PDF as text: ${content}`
      },
      {
        role: 'user',
        content: question
      }
    ],
    temperature: 0.5,
    max_tokens: 400,
    top_p: 1
  })
  return response.choices[0].message.content
}

This is the action.ts file; the whole logic of the application lives here.
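One thing to note: chatwithOpenAI above returns the whole answer at once, while step 9 of the flow describes streaming. A rough sketch of a streaming variant (an assumption on my part, not the code from the repo) using the OpenAI SDK's stream: true option:

```typescript
// Sketch only: streaming variant of chatwithOpenAI. With stream: true each
// chunk carries an incremental delta; a real app would forward these chunks
// to the client (e.g. via a ReadableStream) instead of concatenating them
// on the server as this sketch does.
import OpenAI from 'openai'

export async function chatwithOpenAIStream(content: string, question: string) {
  const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY })
  const stream = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo',
    stream: true,
    messages: [
      {
        role: 'system',
        content: `You are a teacher answering questions about a PDF. Here is the PDF as text: ${content}`
      },
      { role: 'user', content: question }
    ],
    temperature: 0.5,
    max_tokens: 400
  })
  let answer = ''
  for await (const chunk of stream) {
    answer += chunk.choices[0]?.delta?.content ?? ''
  }
  return answer
}
```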

© anurag