[LangChain Series 11] Prompt templates: composition and assembly
Quick overview of this article:
- Combining multiple prompt templates
- Assembling a single prompt template
In everyday business development, we often extract common modules into independent components and then combine them as needed. The same idea applies to LLM application development: by making shared prompt templates independent, they can be reused more easily, which reduces duplicated code and keeps the logic simple.
LangChain provides two ways to combine prompt templates:
1. Combine multiple prompt templates.
2. Assemble multiple parts into a prompt template.
01 Combining multiple prompt templates
LangChain provides PipelinePromptTemplate to combine multiple prompt templates. A PipelinePromptTemplate consists of two parts:
- Final prompt template: the template into which the others are composed.
- Pipeline prompts: a list in which each item is a pair of a name and a prompt template.
In the code below, full_prompt is the final prompt template and input_prompts is the list of templates to be combined; the templates in input_prompts are substituted into full_prompt.
```python
from langchain.prompts.pipeline import PipelinePromptTemplate
from langchain.prompts.prompt import PromptTemplate

full_template = """{introduction}

{example}

{start}"""
full_prompt = PromptTemplate.from_template(full_template)

introduction_template = """You are impersonating {person}."""
introduction_prompt = PromptTemplate.from_template(introduction_template)

example_template = """Here's an example of an interaction:

Q: {example_q}
A: {example_a}"""
example_prompt = PromptTemplate.from_template(example_template)

start_template = """Now, do this for real!

Q: {input}
A:"""
start_prompt = PromptTemplate.from_template(start_template)

input_prompts = [
    ("introduction", introduction_prompt),
    ("example", example_prompt),
    ("start", start_prompt),
]
pipeline_prompt = PipelinePromptTemplate(
    final_prompt=full_prompt, pipeline_prompts=input_prompts
)
print(pipeline_prompt.input_variables)
```
Output result:
['example_a', 'person', 'example_q', 'input']
Execute the following code:
```python
print(pipeline_prompt.format(
    person="Elon Musk",
    example_q="What's your favorite car?",
    example_a="Tesla",
    input="What's your favorite social media site?",
))
```
Output result:
You are impersonating Elon Musk.

Here's an example of an interaction:

Q: What's your favorite car?
A: Tesla

Now, do this for real!

Q: What's your favorite social media site?
A:
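Conceptually, a pipeline prompt formats each named sub-template first and then substitutes the results into the final template. The following is a minimal pure-Python sketch of that two-stage mechanism (the `pipeline_format` helper is made up for illustration; it is not LangChain's actual implementation):

```python
import string


def pipeline_format(final_template, named_templates, **kwargs):
    """Format each named sub-template, then substitute the results
    into the final template (a toy stand-in for PipelinePromptTemplate)."""
    rendered = {}
    for name, template in named_templates:
        # Pick out only the variables this sub-template actually uses.
        needed = {
            field for _, field, _, _ in string.Formatter().parse(template)
            if field is not None
        }
        rendered[name] = template.format(**{k: kwargs[k] for k in needed})
    # Second stage: fill the final template with the rendered pieces.
    return final_template.format(**rendered)


result = pipeline_format(
    "{introduction}\n{start}",
    [("introduction", "You are impersonating {person}."),
     ("start", "Q: {input}\nA:")],
    person="Elon Musk",
    input="What's your favorite social media site?",
)
print(result)
```

This also explains the `input_variables` output above: it is the union of the sub-templates' variables, while the pipeline names (`introduction`, `example`, `start`) are consumed internally and never exposed.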
02 Single prompt template assembly
Single prompt template assembly means assembling multiple parts into one complete prompt template; generally speaking, strings and prompt templates are joined into a new prompt template. The following introduces the assembly of two kinds of template, the string prompt template and the chat prompt template, through two code examples.
String prompt template
In the following code, a string prompt template and two strings are concatenated using +.
```python
from langchain.prompts import PromptTemplate

prompt = (
    PromptTemplate.from_template("Tell me a joke about {topic}")
    + ", make it funny"
    + "\n\nand in {language}"
)
print(prompt)
```
Output result:
PromptTemplate(input_variables=['language', 'topic'], output_parser=None, partial_variables={}, template='Tell me a joke about {topic}, make it funny\n\nand in {language}', template_format='f-string', validate_template=True)
Execute code:
print(prompt.format(topic="sports", language="spanish"))
Output result:
'Tell me a joke about sports, make it funny\n\nand in spanish'
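The `+` support comes from operator overloading on the template class: adding a string or another template produces a new template whose variables are the union of both sides'. A simplified pure-Python sketch of the idea (the `Template` class here is a toy, not LangChain's code):

```python
import string


class Template:
    """Toy prompt template supporting `+` with strings and other templates."""

    def __init__(self, template):
        self.template = template
        # Collect the {placeholders} appearing in the template string.
        self.input_variables = sorted({
            field for _, field, _, _ in string.Formatter().parse(template)
            if field is not None
        })

    def __add__(self, other):
        # A plain string or another Template on the right yields a new,
        # concatenated Template.
        other_template = other.template if isinstance(other, Template) else other
        return Template(self.template + other_template)

    def format(self, **kwargs):
        return self.template.format(**kwargs)


prompt = (
    Template("Tell me a joke about {topic}")
    + ", make it funny"
    + "\n\nand in {language}"
)
print(prompt.input_variables)
print(prompt.format(topic="sports", language="spanish"))
```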
Likewise, we can use this assembled prompt in LLMChain.
```python
from langchain.chat_models import ChatOpenAI
from langchain.chains import LLMChain

model = ChatOpenAI(openai_api_key="xxx")
chain = LLMChain(llm=model, prompt=prompt)
chain.run(topic="sports", language="spanish")
```
Executing the code returns the model's response, a joke in Spanish.

Chat prompt template

In the following code, Messages and a string in a chat prompt are assembled using + to form a new prompt template. Not only Messages can be assembled, but also MessagePromptTemplates, provided the variables in the MessagePromptTemplate are assigned first.

```python
from langchain.prompts import ChatPromptTemplate, HumanMessagePromptTemplate
from langchain.schema import HumanMessage, AIMessage, SystemMessage

prompt = SystemMessage(content="You are a nice pirate")
new_prompt = (
    prompt + HumanMessage(content="hi") + AIMessage(content="what?") + "{input}"
)
print(new_prompt.format_messages(input="i said hi"))
```
Output result:
[SystemMessage(content='You are a nice pirate', additional_kwargs={}),
 HumanMessage(content='hi', additional_kwargs={}, example=False),
 AIMessage(content='what?', additional_kwargs={}, example=False),
 HumanMessage(content='i said hi', additional_kwargs={}, example=False)]
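Note how the bare string "{input}" became a HumanMessage in the output: in chat prompt assembly, a plain string on the right of + is treated as a human-message template. A pure-Python sketch of that behavior (toy classes for illustration, not LangChain's implementation):

```python
class ChatTemplate:
    """Toy chat prompt: a list of (role, content_template) pairs."""

    def __init__(self, messages):
        self.messages = list(messages)

    def __add__(self, other):
        # A bare string is promoted to a human-message template.
        if isinstance(other, str):
            other = [("human", other)]
        elif isinstance(other, ChatTemplate):
            other = other.messages
        return ChatTemplate(self.messages + other)

    def format_messages(self, **kwargs):
        # Fill the {placeholders} in every message's content.
        return [(role, content.format(**kwargs)) for role, content in self.messages]


new_prompt = (
    ChatTemplate([("system", "You are a nice pirate")])
    + ChatTemplate([("human", "hi"), ("ai", "what?")])
    + "{input}"
)
print(new_prompt.format_messages(input="i said hi"))
```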
Likewise, it can be used in LLMChain:
```python
from langchain.chat_models import ChatOpenAI
from langchain.chains import LLMChain

model = ChatOpenAI(openai_api_key="xxx")
chain = LLMChain(llm=model, prompt=new_prompt)
chain.run("i said hi")
```
Execute the code and output the result:
'Oh, hello! How can I assist you today?'
Summary of this article
This article introduced the combination and assembly of prompt templates: you can combine multiple prompt templates with PipelinePromptTemplate, or assemble a single prompt template from strings, Messages, and templates using +.
For more articles, follow the official WeChat account: Dabai loves mountain climbing.