79297924

Date: 2024-12-20 17:07:32
Score: 1

You can try embedding data with metadata: include summarization keys as tags and the document source in the metadata for each chunk. For example:
Route: a basic "route idea" key for the chunk.
Source: origin of the document.
Tags: key information extracted from the document, e.g. [politics, health, billing, tool].
These keys can be generated by the LLM itself while creating the embeddings.
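As a rough sketch of that metadata layout (Python, with a simple keyword matcher standing in for the LLM call that would normally extract the tags, and a `Chunk` type and `build_chunk` helper invented here for illustration):

```python
from dataclasses import dataclass, field

KNOWN_TAGS = ["politics", "health", "billing", "tool"]

def generate_tags(text: str) -> list[str]:
    # Stub: a real pipeline would prompt the LLM to extract these
    # tags while creating the embeddings.
    return [t for t in KNOWN_TAGS if t in text.lower()]

@dataclass
class Chunk:
    text: str
    embedding: list[float]          # produced by your embedding model
    metadata: dict = field(default_factory=dict)

def build_chunk(text: str, source: str, embed) -> Chunk:
    tags = generate_tags(text)
    return Chunk(
        text=text,
        embedding=embed(text),
        metadata={
            "route": tags[:1] or ["general"],  # basic "route idea" key
            "source": source,                  # origin of the document
            "tags": tags,                      # key info from the document
        },
    )
```

Each chunk then carries everything the retrieval step needs for filtering, regardless of which vector store you use.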

Filtering using metadata: while retrieving chunks from the vector store, use the metadata (e.g., tags, source) to filter the results effectively.

You can then chain prompts to handle the user input:
Prompt 1: Identify the route tags based on the query.
Prompt 2: Use these tags to filter or re-rank the chunks retrieved from the vector store.
Prompt 3: Combine the system prompt with the retrieved context to generate the final response with the LLM.
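The three steps above can be sketched as plain functions (again a minimal sketch: the tag identification in step 1 is stubbed with keyword matching where a real system would call the LLM, and the chunk shape and function names are assumptions, not a specific library's API):

```python
KNOWN_TAGS = ["politics", "health", "billing", "tool"]

def identify_route_tags(query: str) -> list[str]:
    # Prompt 1 (stubbed): in practice, ask the LLM which route
    # tags apply to the user's query.
    return [t for t in KNOWN_TAGS if t in query.lower()]

def filter_and_rerank(chunks: list[dict], tags: list[str], top_k: int = 3) -> list[dict]:
    # Prompt 2 as code: keep chunks sharing at least one tag with
    # the query, ranked by tag overlap.
    scored = [(len(set(c["metadata"]["tags"]) & set(tags)), c) for c in chunks]
    matching = [c for score, c in sorted(scored, key=lambda s: -s[0]) if score > 0]
    return matching[:top_k]

def build_final_prompt(system_prompt: str, context_chunks: list[dict], query: str) -> str:
    # Prompt 3: combine the system prompt with the filtered context
    # before sending everything to the LLM.
    context = "\n".join(c["text"] for c in context_chunks)
    return f"{system_prompt}\n\nContext:\n{context}\n\nQuestion: {query}"
```

The re-ranking here is deliberately simple (tag overlap count); you could also pass the filtered set back to the LLM for a smarter re-rank.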

Let me know if you've already implemented an approach that works for you, or if you're still stuck. If needed, we can collaborate to refine this approach further.

Posted by: himanhsu rajpurohit