pw.xpacks.llm.prompts

pw.xpacks.llm.prompts.prompt_short_qa(query, docs, additional_rules='')

Generate a RAG prompt with the given context.

Intended for eliciting short, concise answers. Given a question and a list of context documents, it generates a prompt to be sent to the LLM and suggests specific formatting for yes/no questions and dates.

  • Parameters
    • query (str) – Question or prompt to be answered.
    • docs (list[Json] | list[str]) – List of documents to be passed to the LLM as context. pw.Json can wrap a dict, a string, or any other type to serve as a document.
    • additional_rules (str) – Optional string with additional instructions or information for the LLM.
  • Returns
    Prompt containing question and relevant docs.
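
A minimal usage sketch, mirroring the prompt_qa example below; the question, document and additional rule are illustrative:

import pandas as pd
import pathway as pw
from pathway.xpacks.llm import prompts

t = pw.debug.table_from_pandas(pd.DataFrame([{"question": "Is Pathway low-latency?"}]))
docs = [{"text": "Pathway is a high-throughput, low-latency data processing framework."}]
t_with_docs = t.select(*pw.this, docs=docs)
# The third argument is the optional additional_rules string; here it nudges the model towards a one-word answer.
r = t_with_docs.select(
    prompt=prompts.prompt_short_qa(pw.this.question, pw.this.docs, "Answer with a single word.")
)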

pw.xpacks.llm.prompts.prompt_qa(query, docs, information_not_found_response='No information found.', additional_rules='')

Generate a RAG prompt with the given context.

Given a question and a list of context documents, generates a prompt to be sent to the LLM.

  • Parameters
    • query (str) – Question or prompt to be answered.
    • docs (list[Json] | list[str]) – List of documents to be passed to the LLM as context. pw.Json can wrap a dict, a string, or any other type to serve as a document.
    • information_not_found_response – Response the LLM should generate when the answer cannot be inferred from the given documents.
    • additional_rules (str) – Optional string with additional instructions or information for the LLM.
  • Returns
    Prompt containing question and relevant docs.
Example:

import pandas as pd
import pathway as pw
from pathway.xpacks.llm import prompts

# Single-row table holding the question to answer.
t = pw.debug.table_from_pandas(pd.DataFrame([{"question": "What is RAG?"}]))
# Context documents attached to the row (dicts are handled as pw.Json).
docs = [
    {"text": "Pathway is a high-throughput, low-latency data processing framework that handles live data & streaming for you."},
    {"text": "RAG stands for Retrieval Augmented Generation."},
]
t_with_docs = t.select(*pw.this, docs=docs)
# Build the RAG prompt from the question and its context documents.
r = t_with_docs.select(prompt=prompts.prompt_qa(pw.this.question, pw.this.docs))
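
To inspect the generated prompt during development, the result can be materialized with Pathway's debug utilities, for example:

pw.debug.compute_and_print(r)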

pw.xpacks.llm.prompts.prompt_summarize(text_list)

Generate a summarization prompt with the list of texts.

  • Parameters
    text_list (list[str]) – List of text documents.
  • Returns
    Prompt instructing the LLM to summarize the given texts.
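
A minimal sketch, assuming the texts arrive one per row and are collected into a single list-valued column with the pw.reducers.tuple reducer; column names are illustrative:

import pandas as pd
import pathway as pw
from pathway.xpacks.llm import prompts

t = pw.debug.table_from_pandas(pd.DataFrame([
    {"text": "Pathway is a high-throughput, low-latency data processing framework."},
    {"text": "RAG stands for Retrieval Augmented Generation."},
]))
# Collect all rows into one list of texts, then build the summarization prompt.
texts = t.reduce(text_list=pw.reducers.tuple(t.text))
r = texts.select(prompt=prompts.prompt_summarize(pw.this.text_list))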

pw.xpacks.llm.prompts.prompt_query_rewrite_hyde(query)

Generate a prompt for query rewriting using the HyDE (Hypothetical Document Embeddings) technique.

HyDE asks the LLM to write a hypothetical passage that answers the query; embedding that passage instead of the raw query often retrieves more relevant documents.

  • Parameters
    query (str) – Original search query or user prompt.
  • Returns
    Prompt instructing the LLM to rewrite the query following the HyDE approach.
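
A minimal sketch; the query is illustrative, and the generated prompt would typically be sent to an LLM chat model whose hypothetical answer is then embedded for retrieval:

import pandas as pd
import pathway as pw
from pathway.xpacks.llm import prompts

queries = pw.debug.table_from_pandas(pd.DataFrame([{"query": "What is RAG?"}]))
# Each row gets a HyDE-style rewriting prompt for its query.
r = queries.select(prompt=prompts.prompt_query_rewrite_hyde(pw.this.query))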

pw.xpacks.llm.prompts.prompt_query_rewrite(query, *additional_args)

Generate a prompt for query rewriting.

Prompt function for generating and augmenting index search queries with important names, entities, and information from the given input. The prompt asks the LLM for three transformed queries, concatenated with commas, to improve search performance.

  • Parameters
    • query (str) – Original search query or user prompt.
    • additional_args (str) – Additional information that may help the LLM in generating the rewritten query.
  • Returns
    Prompt instructing the LLM to rewrite and augment the query.
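
A minimal sketch showing extra context passed through additional_args; the user_context column and its content are illustrative:

import pandas as pd
import pathway as pw
from pathway.xpacks.llm import prompts

queries = pw.debug.table_from_pandas(
    pd.DataFrame([{"query": "What is RAG?", "user_context": "Questions come from a data engineering team."}])
)
# Extra columns after the query are forwarded to the prompt as additional context.
r = queries.select(
    prompt=prompts.prompt_query_rewrite(pw.this.query, pw.this.user_context)
)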