You can use Llama Parse as a content processor in RAGChat. To process your data with Llama Parse, first initialize RAGChat with your chosen model, then add context with Llama Parse set as the processor:

import { RAGChat, upstash } from "@upstash/rag-chat";

const ragChat = new RAGChat({
  model: upstash("meta-llama/Meta-Llama-3-8B-Instruct"),
});

// Download the Hacker News front page and save it locally
const fileSource = "./hackernews.html";
const response = await fetch("https://news.ycombinator.com/");
// Bun.write is Bun-specific; in Node.js, use fs.promises.writeFile instead
await Bun.write(fileSource, await response.text());

// Process the file with Llama Parse and store the result under a namespace
await ragChat.context.add({
  options: {
    namespace: "llama-parse-upstash",
  },
  fileSource,
  processor: {
    name: "llama-parse",
    options: { apiKey: process.env.LLAMA_CLOUD_API_KEY },
  },
});

// Query the indexed content; with streaming disabled, the answer is returned as a string
const result = await ragChat.chat("What is the second story on hacker news?", {
  streaming: false,
  namespace: "llama-parse-upstash",
});
console.log(result.output);