The Chatbot Contract
This article explains the AI Chatbot contract, which interacts with an LLM node (a node that is part of the chain) and serves as the backend for our Chatbot website.
Chatbot Interface
Following is the Chatbot WIDL (Weilliptic Interface Definition Language) file:
interface Chatbot {
query func thread_list() -> list<string>;
query func thread_details(id: u32) -> result<list<tuple<string, string>>, string>;
query func list_models() -> result<list<string>, string>;
query func chat_query(id: u32, query_str: string) -> result<byteStream, string>; // returns response as a stream
mutate func append_response(id: u32, query_str: string, answer: string, title: option<string>) -> result<(), string>;
mutate func new_thread(model: string) -> u32;
mutate func update_thread_title(id: u32, new_title: string) -> result<(), string>;
}
Let's go over the methods one by one:
- thread_list: Returns all chat thread titles for the current user.
- thread_details: Returns the complete conversation history for the given thread id, as a list of (Query, Response) pairs.
- list_models: Lists the LLM models available on the LLM node.
- chat_query:
  - Core chat functionality.
  - Sends the user's query to the LLM along with the thread's conversation context, returning the response in streamed chunks.
  - Taken together, the chunks make up the LLM's full response, delivered as a byte stream.
  - For a brand-new thread (no prior context), the prompt also asks the LLM to generate a short title for the thread.
- append_response:
  - To be called right after chat_query to persist the new (Query, Response) pair, and optionally the generated title, in the contract state.
  - This is what makes conversations persistent.
- new_thread: Creates a new thread with the given LLM model and returns the thread id.
- update_thread_title: Persists the new_title of a thread into the contract state.
The above description should be enough for interacting with the chatbot contract.
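To make the expected call sequence concrete, below is an illustrative sketch written against the Rust trait from the implementation section that follows. It only shows the order of calls; in practice the Chatbot website issues these as queries and mutations against the deployed contract, and the function name, question, and placeholder answer here are invented for the example.
// Illustrative call order only; not part of the contract itself.
async fn example_flow(chatbot: &mut ChatbotContractState) -> Result<(), String> {
    // 1. Pick one of the models exposed by the LLM node and open a thread.
    let model = chatbot
        .list_models()
        .await?
        .first()
        .cloned()
        .ok_or("no models available")?;
    let id = chatbot.new_thread(model).await;

    // 2. Ask a question; the response arrives as a stream of chunks.
    let question = "What is a Merkle tree?".to_owned();
    let _stream = chatbot.chat_query(id, question.clone()).await?;

    // 3. The caller aggregates the streamed chunks into a single string
    //    (placeholder below) and persists the finished turn.
    let answer = "<aggregated LLM response>".to_owned();
    chatbot.append_response(id, question, answer, None).await?;

    // 4. Read the conversation history back.
    let _history = chatbot.thread_details(id).await?;
    Ok(())
}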
Now, let's dive into the implementation.
Contract Implementation
The contract is written in Rust. The central piece of the implementation is the ask_llm function provided by weil_rs::Runtime in our Rust contract SDK. A method that uses this function must return either a ByteStream or a Result<ByteStream, String>.
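As a minimal sketch of that requirement (the method name ask and the hard-coded model string are placeholders; the real contract threads the model through from its state, and the macros and types come from the imports in the listing below), such a method looks like:
// Inside a #[smart_contract] impl block: a streaming query method that
// forwards a single prompt to the LLM node.
#[query(stream)]
async fn ask(&self, prompt: String) -> Result<weil_rs::collections::streaming::ByteStream, String> {
    // "some-model" is a placeholder; list_llm_models() reports the real options.
    let mut request = ChatRequest::new("some-model".to_owned());
    request.add_message(Role::User, prompt);
    Runtime::ask_llm(request).map_err(|err| err.to_string())
}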
Let's set up the basics first, i.e. the WIDL-generated bindings, imports, and helper structs and functions. To get acquainted with boilerplate contract code generation, please refer to Create a Basic Applet.
use serde::{Deserialize, Serialize};
use wadk_utils::llm::llm_request::{ChatRequest, Role};
use weil_macros::{constructor, mutate, query, smart_contract, WeilType};
use weil_rs::{
collections::{map::WeilMap, streaming, vec::WeilVec, WeilId},
runtime::Runtime,
};
trait Chatbot {
fn new() -> Result<Self, String>
where
Self: Sized;
async fn thread_list(&self) -> Vec<String>;
async fn thread_details(&self, id: u32) -> Result<Vec<(String, String)>, String>;
async fn list_models(&self) -> Result<Vec<String>, String>;
async fn chat_query(
&self,
id: u32,
query_str: String,
) -> Result<weil_rs::collections::streaming::ByteStream, String>;
async fn append_response(
&mut self,
id: u32,
query_str: String,
answer: String,
title: Option<String>,
) -> Result<(), String>;
async fn new_thread(&mut self, model: String) -> u32;
async fn update_thread_title(&mut self, id: u32, new_title: String) -> Result<(), String>;
}
#[derive(Serialize, Deserialize, WeilType)]
struct ThreadObj {
title: String,
question_answer: WeilVec<QuestionAnswer>,
model: String,
}
#[derive(Serialize, Deserialize, WeilType)]
struct QuestionAnswer {
query: String,
response: String,
}
#[derive(Serialize, Deserialize, WeilType)]
pub struct ChatbotContractState {
addr_to_thread: WeilMap<String, WeilVec<ThreadObj>>,
weil_count: u32,
}
impl ChatbotContractState {
/// Returns the conversation history and model for a specific thread
///
/// Returns a tuple containing:
/// - Vector of (query, response) pairs
/// - Model string used by the thread
///
/// # Errors
/// - If the address has no threads
/// - If the thread ID is invalid
fn thread_details_(
&self,
addr: &String,
id: u32,
) -> Result<(Vec<(String, String)>, String), String> {
let threads_opt = self.addr_to_thread.get(addr);
let threads = match threads_opt {
None => {
return Err("Provided Address does not have any threads".to_owned());
}
Some(threads) => threads,
};
let thread_obj_opt = threads.get(id as usize);
let thread_obj = match thread_obj_opt {
None => {
return Err("Invalid Thread Id".to_owned());
}
Some(thread_obj) => thread_obj,
};
let mut thread_details = Vec::new();
for query_resp in thread_obj.question_answer.iter() {
thread_details.push((query_resp.query, query_resp.response));
}
Ok((thread_details, thread_obj.model))
}
/// Builds the context string for LLM by combining previous conversations
///
/// Flow:
/// 1. Retrieves thread for given address and ID
/// 2. Formats all previous query-response pairs into a single context string
/// 3. Returns formatted context or error if thread/address not found
///
/// # Errors
/// - If the provided address has no associated threads
/// - If the thread ID is invalid for the given address
fn build_context(&self, addr: &String, id: u32) -> Result<String, String> {
let threads_opt = self.addr_to_thread.get(addr);
let threads = match threads_opt {
None => {
return Err("Provided Address does not have any threads".to_owned());
}
Some(threads) => threads,
};
let thread_obj_opt = threads.get(id as usize);
let thread_obj = match thread_obj_opt {
None => {
return Err("Invalid Thread Id".to_owned());
}
Some(thread_obj) => thread_obj,
};
let mut context_str = "".to_owned();
for query_resp in thread_obj.question_answer.iter() {
context_str.push_str(&format!(
"Past Query:{}, Corresponding Response:{}\n",
query_resp.query, query_resp.response
));
}
Ok(context_str)
}
/// Helper function to retrieve a thread object by address and ID
///
/// # Errors
/// - If the address has no threads
/// - If the thread ID is invalid
fn get_thread_obj(&self, addr: &String, id: u32) -> Result<ThreadObj, String> {
let threads_opt = self.addr_to_thread.get(addr);
let threads = match threads_opt {
None => {
return Err("Provided Address does not have any threads".to_owned());
}
Some(threads) => threads,
};
let thread_obj_opt = threads.get(id as usize);
let thread_obj = match thread_obj_opt {
None => {
return Err("Invalid Thread Id".to_owned());
}
Some(thread_obj) => thread_obj,
};
Ok(thread_obj)
}
}
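For illustration, with two earlier turns in a thread (the queries and responses below are invented), build_context produces a context string of the form:
Past Query:What is a Merkle tree?, Corresponding Response:A Merkle tree is a hash-based tree structure...
Past Query:Where is it used?, Corresponding Response:It is used to verify data integrity efficiently...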
For WeilCollections (such as WeilMap, WeilVec), please refer to Collections.
- The contract state (ChatbotContractState) contains:
  - a mapping from each address to that user's list of ThreadObj, and
  - weil_count, which supplies a fresh WeilId whenever a new WeilVec is created for a thread.
- The ThreadObj struct represents a thread and comprises a title, the model it uses, and a vector of query-response pairs.
And now, the main logic of the contract:
#[smart_contract]
impl Chatbot for ChatbotContractState {
#[constructor]
fn new() -> Result<Self, String>
where
Self: Sized,
{
Ok(ChatbotContractState {
addr_to_thread: WeilMap::new(WeilId(1)),
weil_count: 2,
})
}
/// Returns all thread titles for the current user
///
/// # Implementation
/// - Gets the current user's address using Runtime::sender()
/// - Retrieves their thread list from addr_to_thread map
/// - Returns empty vector if user has no threads
/// - Otherwise collects and returns all thread titles
#[query]
async fn thread_list(&self) -> Vec<String> {
let addr = Runtime::sender();
let mut title_vec = Vec::new();
let threads_opt = self.addr_to_thread.get(&addr);
match threads_opt {
None => Vec::new(),
Some(threads) => {
for t in threads.iter() {
title_vec.push(t.title);
}
title_vec
}
}
}
/// Returns the conversation history for a specific thread
///
/// # Implementation
/// - Uses thread_details_ helper to get conversation history
///
/// # Errors
/// - If thread_details_ returns an error (invalid ID or no threads)
#[query]
async fn thread_details(&self, id: u32) -> Result<Vec<(String, String)>, String> {
let addr = Runtime::sender();
Ok(self.thread_details_(&addr, id)?.0)
}
/// Returns the list of model identifiers that can be used
/// when creating new threads
#[query]
async fn list_models(&self) -> Result<Vec<String>, String> {
Ok(Runtime::list_llm_models().map_err(|err| err.to_string())?)
}
/// Processes a new chat query using the thread's associated LLM
///
/// # Implementation
/// 1. Gets thread's model and builds context from conversation history
/// 2. Creates ChatRequest with context and user's query
/// 3. Sends request to LLM via Runtime
/// 4. Returns streaming response from LLM
///
/// # Errors
/// - Returns error string if thread not found
/// - Returns error string if LLM request fails
#[query(stream)]
async fn chat_query(
&self,
id: u32,
query_str: String,
) -> Result<weil_rs::collections::streaming::ByteStream, String> {
let addr = Runtime::sender();
let model = self.get_thread_obj(&addr, id)?.model;
let context_str = self.build_context(&addr, id)?;
let mut prompt = format!("{context_str}, respond to this {query_str}, along with this, ");
// when the thread is new, there is no prior context, so the context string is empty
if context_str.is_empty() {
prompt.push_str("Generate a short title for the prompt in the begining of the response, in the format: {<title>}");
}
let mut chat_request = ChatRequest::new(model);
chat_request.add_message(Role::User, prompt);
let llm_response = Runtime::ask_llm(chat_request).map_err(|err| err.to_string())?;
Ok(llm_response)
}
/// Stores a query-response pair in the thread's conversation history
///
/// # Implementation
/// 1. Retrieves or creates thread vector for current user
/// 2. Gets or initializes thread object at given ID
/// 3. Appends new question-answer pair to thread history
///
/// # Notes
/// - Creates new thread with default title if ID doesn't exist
/// - Automatically initializes storage for new users
#[mutate]
async fn append_response(
&mut self,
id: u32,
query_str: String,
answer: String,
title: Option<String>,
) -> Result<(), String> {
let addr = Runtime::sender();
let index = id as usize;
let Some(threads) = self.addr_to_thread.get(&addr) else {
return Err("No Thread present for Address".to_owned());
};
let mut thread_obj = match threads.get(index) {
None => {
return Err("No thread present for the given id".to_owned());
}
Some(thread_obj) => thread_obj,
};
if let Some(title) = title {
thread_obj.title = title;
}
thread_obj.question_answer.push(QuestionAnswer {
query: query_str,
response: answer,
});
// won't fail because get already succeeded for the same index
threads.set(id as usize, thread_obj).unwrap();
self.addr_to_thread.insert(addr, threads);
Ok(())
}
/// Creates a new chat thread for the current user
///
/// # Implementation
/// 1. Gets/creates thread vector for current user's address
/// 2. Initializes new thread with:
/// - "New Chat" as default title
/// - Empty conversation history
/// - Specified LLM model
/// 3. Returns ID of new thread
///
/// # Notes
/// - Automatically creates storage for new users
/// - Thread IDs are sequential per user
#[mutate]
async fn new_thread(&mut self, model: String) -> u32 {
let addr = Runtime::sender();
let threads = self.addr_to_thread.get(&addr);
let mut thread_vec = match threads {
// if no value exists for the given addr key, then create an entry in the map
None => {
let thread_vec = WeilVec::new(WeilId(self.weil_count));
self.weil_count += 1;
thread_vec
}
Some(x) => x,
};
thread_vec.push(ThreadObj {
title: "New Chat".to_owned(),
question_answer: WeilVec::new(WeilId(self.weil_count)),
model: model.clone(),
});
let id = (thread_vec.len() - 1) as u32;
Runtime::debug_log(&format!("setting thread with {:?}", model));
self.addr_to_thread.insert(addr.clone(), thread_vec);
self.weil_count += 1;
id
}
/// Updates the title of an existing thread
///
/// # Implementation
/// 1. Retrieves thread object for current user
/// 2. Updates title field
/// 3. Saves modified thread back to storage
///
/// # Errors
/// - If thread ID is invalid
/// - If user has no threads
#[mutate]
async fn update_thread_title(&mut self, id: u32, new_title: String) -> Result<(), String> {
let addr = Runtime::sender();
let Some(threads) = self.addr_to_thread.get(&addr) else {
return Err("No Thread present for Address".to_owned());
};
let mut thread_obj = match threads.get(id as usize) {
None => {
return Err("No thread present for the given id".to_owned());
}
Some(thread_obj) => thread_obj,
};
thread_obj.title = new_title;
// won't fail because get already succeeded for the same id
threads.set(id as usize, thread_obj).unwrap();
self.addr_to_thread.insert(addr, threads);
Ok(())
}
}
The method that leans on the LLM is chat_query (list_models also talks to the LLM node, but only to enumerate the available models). Runtime::ask_llm returns a ByteStream wrapped in a Result, and since a streamed response gives a better UX, chat_query hands that ByteStream straight back to the caller. Title generation does not need streaming: the title is only 3-5 words (ideally, assuming the selected LLM model follows the instruction), so it is simply requested inside the first chat_query prompt in the {<title>} format shown above; the caller can aggregate the streamed chunks into a single string, extract the title from that prefix, and persist it through append_response's optional title parameter.
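As a hedged sketch of that caller-side step (the helper name and exact parsing are illustrative only, not part of the contract), extracting the title from the aggregated first response could look like:
/// Splits an aggregated first response of the form "{<title>} rest of answer"
/// into (Some(title), answer); returns (None, response) when no title prefix is present.
fn split_title(response: &str) -> (Option<String>, String) {
    if let Some(stripped) = response.strip_prefix('{') {
        if let Some(end) = stripped.find('}') {
            let title = stripped[..end].trim().to_owned();
            let answer = stripped[end + 1..].trim_start().to_owned();
            return (Some(title), answer);
        }
    }
    (None, response.to_owned())
}
The resulting Option<String> maps directly onto the title parameter of append_response, and update_thread_title can later overwrite it if the user renames the thread.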
For more such explanations, please check out the Docs.