`UPSTASH_VECTOR_REST_URL` and `UPSTASH_VECTOR_REST_TOKEN` and paste them into our `.env` file.
Create a `.env` file in your project directory and add the following content:
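A sketch of what the file might contain, with placeholders standing in for your own keys (the `OPENAI_API_KEY` entry is an assumption here, since we'll use OpenAI as the LLM below):

```
# Upstash Vector credentials from the Upstash console
UPSTASH_VECTOR_REST_URL=<YOUR_UPSTASH_VECTOR_REST_URL>
UPSTASH_VECTOR_REST_TOKEN=<YOUR_UPSTASH_VECTOR_REST_TOKEN>

# API keys for the LLM and the document parser
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
LLAMA_CLOUD_API_KEY=<YOUR_LLAMA_CLOUD_API_KEY>
```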
To get a `LLAMA_CLOUD_API_KEY`, you can follow the instructions in the LlamaCloud documentation.
The document we'll work with in this example is `global_warming.txt`.
We'll use `UpstashVectorStore` to create an index and query the content, with OpenAI as the language model that interprets the data and answers questions about the document. You can use any other LLM supported by LlamaIndex as well.
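A minimal sketch of this step, assuming the file sits at the project root, is parsed with LlamaParse (which is what the `LLAMA_CLOUD_API_KEY` is for), and that the sample question is only an illustration:

```python
import os

from dotenv import load_dotenv
from llama_index.core import StorageContext, VectorStoreIndex
from llama_index.vector_stores.upstash import UpstashVectorStore
from llama_parse import LlamaParse

# Load the Upstash, OpenAI, and LlamaCloud keys from the .env file
load_dotenv()

# Parse the sample document into LlamaIndex Document objects
# (LlamaParse picks up LLAMA_CLOUD_API_KEY from the environment;
# the file path is an assumption for this sketch)
documents = LlamaParse(result_type="text").load_data("./global_warming.txt")

# Connect to the Upstash Vector index using the credentials from .env
vector_store = UpstashVectorStore(
    url=os.environ["UPSTASH_VECTOR_REST_URL"],
    token=os.environ["UPSTASH_VECTOR_REST_TOKEN"],
)

# Embed the document and store the vectors in Upstash Vector
storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

# Query the index; LlamaIndex uses OpenAI as the default LLM,
# reading OPENAI_API_KEY from the environment
query_engine = index.as_query_engine()
response = query_engine.query("What are the main causes of global warming?")
print(response)
```

Running the script prints a natural-language answer grounded in the indexed document. If you'd rather use a different LLM, LlamaIndex lets you configure one globally via `Settings.llm` from `llama_index.core`.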