SportsBuddy is already in good shape. However, one immediate concern is its limited knowledge of current sports events. Try this out by asking SportsBuddy what it knows about Jamaica’s participation in the 2024 Summer Olympics.
rag_chain.invoke(
    "What does the retrieved context say about Jamaica in the 2024 Olympics?"
)
You’ll get something along the lines of:
“The retrieved context does not provide any specific information about Jamaica’s participation or events at the 2024 Olympics. It primarily discusses the bidding process and controversies surrounding the Games. Therefore, I don’t know the answer regarding Jamaica’s involvement in the 2024 Olympics.”
First, you’re going to need data from multiple sources. It might interest you to know that there are tools created for retrieving data from Wikipedia. You’ll use one of these instead of the generic web loader.
Start by installing the Wikipedia dependency in your notebook. Run the command below in a new cell:
pip install wikipedia
Back in your notebook, identify the cell where the WebBaseLoader is imported. Add the code below to import the WikipediaLoader:
from langchain_community.document_loaders import WikipediaLoader
Now, update your documents to include data from the WikipediaLoader. This gathers more specific information about countries’ participation in the Summer Olympics. Replace docs = loader.load() with the following:
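Here’s a minimal sketch of that replacement, not a definitive implementation: the Wikipedia query string and the load_max_docs limit are assumptions, so adjust them to cover the topics you want SportsBuddy to know about.

# Keep the existing web documents and append Wikipedia articles.
# The query and load_max_docs values are placeholder examples.
web_docs = loader.load()
wiki_docs = WikipediaLoader(
    query="Jamaica at the 2024 Summer Olympics",
    load_max_docs=5,
).load()
docs = web_docs + wiki_docs

With the combined documents split and indexed again, SportsBuddy should have enough context to answer the Jamaica question on the next run.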
Currently, SportsBuddy lacks memory of past conversations. When asked a follow-up question, it simply indicates that it doesn’t know. To address this, introduce a memory store and enhance the prompt to incorporate previous messages.
Start by updating your prompts. You need to write a new prompt that uses the LLM and your previous conversation:
from langchain.chains import create_history_aware_retriever
from langchain_core.prompts import MessagesPlaceholder
from langchain_core.prompts import ChatPromptTemplate
contextualize_q_system_prompt = (
    "Given a chat history and the latest user question "
    "which might reference context in the chat history, "
    "formulate a standalone question which can be understood "
    "without the chat history. Do NOT answer the question, "
    "just reformulate it if needed and otherwise return it as is."
)

contextualize_q_prompt = ChatPromptTemplate.from_messages(
    [
        ("system", contextualize_q_system_prompt),
        MessagesPlaceholder("chat_history"),
        ("human", "{input}"),
    ]
)

history_aware_retriever = create_history_aware_retriever(
    llm, retriever, contextualize_q_prompt
)
The core structure of your existing code remains intact, but the new abstractions handle the orchestration of prompts and questions that enables conversations with a contextual perspective. You can now revise your initial prompt, excluding the {question} placeholder, because the newly configured prompts will be combined to ask the question. Add the following code:
from langchain.chains import create_retrieval_chain
from langchain.chains.combine_documents import create_stuff_documents_chain
system_prompt = (
    "You are an assistant for question-answering tasks. "
    "Use the following pieces of retrieved context to answer "
    "the question. If you don't know the answer, say that you "
    "don't know. Use three sentences maximum and keep the "
    "answer concise."
    "\n\n"
    "{context}"
)

qa_prompt = ChatPromptTemplate.from_messages(
    [
        ("system", system_prompt),
        MessagesPlaceholder("chat_history"),
        ("human", "{input}"),
    ]
)

question_answer_chain = create_stuff_documents_chain(llm, qa_prompt)
rag_chain = create_retrieval_chain(
    history_aware_retriever, question_answer_chain
)
Next, you need a place to keep your chat sessions. In a full-fledged app, you’re free to use a database or an in-memory storage pool, but for now, you can get by with a simple key/value pair using Python’s dictionary type. The key can be a session ID that uniquely identifies each conversation. In a production app, you would have different session keys for conversations between different users.
from langchain_core.chat_history import BaseChatMessageHistory
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_community.chat_message_histories import ChatMessageHistory
store = {}

def get_session_history(session_id: str) -> BaseChatMessageHistory:
    if session_id not in store:
        store[session_id] = ChatMessageHistory()
    return store[session_id]

conversational_rag_chain = RunnableWithMessageHistory(
    rag_chain,
    get_session_history,
    input_messages_key="input",
    history_messages_key="chat_history",
    output_messages_key="answer",
)
All set. Ask your question:
conversational_rag_chain.invoke(
    {
        "input": "What does the retrieved context say about Jamaica "
        "in the 2024 Olympics?"
    },
    config={
        "configurable": {"session_id": "sports-buddy-session"}
    },
)["answer"]
The major difference here is that you provide a session ID to identify a specific conversation context. This time, the response draws on the newly added data:
“…delegation to the previous Olympics in 2016, with 56 athletes.”
For the first question posed to the system, the response might mirror the one received previously. However, follow it up with another question, and see how SportsBuddy responds:
conversational_rag_chain.invoke(
    {"input": "Is it their eighteenth appearance as an independent state?"},
    config={"configurable": {"session_id": "sports-buddy-session"}},
)["answer"]
A possible response looks as follows:
“Yes, it is Jamaica’s eighteenth Summer Olympic appearance as an independent state.”
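If you’d like to confirm that the memory store is doing its job, you can peek at the ChatMessageHistory kept for the session. Here’s a quick check, assuming the store and get_session_history defined earlier:

# Print every message recorded under this session ID; a brand-new
# session ID would start with an empty history.
for message in get_session_history("sports-buddy-session").messages:
    print(f"{message.type}: {message.content[:80]}")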
There you have it: SportsBuddy, a conversational sports AI assistant. Continue to this lesson’s concluding segment.