Create your first MCP Server
Model Context Protocol (MCP) servers enable AI models to interact with external tools and data sources in a standardized way. On the WeilChain platform, you can create MCP servers as applets that expose functions as tools for AI models to use.
This tutorial will guide you through creating an MCP server applet that provides arithmetic operations for AI models.
What is an MCP Server?
An MCP server is a standardized interface that allows AI models to access external functionality through well-defined tools. In the WeilChain ecosystem, MCP servers are deployed as applets that:
- Expose functions as AI-callable tools
- Provide structured descriptions for each tool
- Handle parameter validation and execution
- Return results in a format AI models can understand
The key advantage of MCP servers on WeilChain is that they run as decentralized applets, ensuring reliability, transparency, and auditability.
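To make this concrete, the sketch below illustrates the kind of exchange a tool call involves: the model picks a tool by name, supplies arguments, and the applet returns a structured result the model can read back. The payload shapes here are illustrative only; the exact wire format is handled by the WeilChain runtime and the generated bindings.

use serde_json::json;

fn main() {
    // Illustrative tool call a model might produce after reading the
    // tool descriptions exposed by the MCP server (shape is assumed here).
    let tool_call = json!({
        "name": "add",
        "arguments": { "x": 2, "y": 3 }
    });

    // The applet executes the matching function and returns a structured
    // result that the model folds back into its answer.
    let result = json!({ "result": 5 });

    println!("call: {tool_call}");
    println!("result: {result}");
}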
Prerequisites
This tutorial assumes that:
- You have completed the How-to install the WIDL compiler and extension guide
- You have Rust installed and configured for WebAssembly development
- You are familiar with basic Rust programming concepts
Project Setup
First, create a new Rust project for your MCP server:
cargo new --lib arithmetic
cd arithmetic
Interface Specification with @mcp Annotation
Create a file arithmetic.widl with the following contents:
@mcp
interface Arithmetic {
    // adds two numbers
    query func add(
        // the first number
        x: int,
        // the number to add with the first one
        y: int) -> int;

    // multiply two numbers
    query func multiply(
        // the first number
        x: int,
        // the number to multiply with the first one
        y: int) -> int;
}
The @mcp annotation is crucial here: it tells the WIDL compiler to generate MCP-compatible bindings. Notice several important aspects:
Function Comments as Tool Descriptions
The comments above each function and parameter become the tool descriptions that AI models use to understand when and how to call your tools. These descriptions are critical because:
- AI Tool Selection: The AI model reads these descriptions to determine which tool to use for a given task
- Parameter Understanding: Parameter descriptions help the AI provide correct arguments
- Context Awareness: Well-written descriptions help the AI understand the tool's purpose and limitations
Generate Server Bindings
Generate the MCP server bindings using the WIDL compiler:
widl generate arithmetic.widl server rust
This creates a bindings.rs file. The generated code includes a special tools() method that returns a JSON description of available tools:
fn tools(&self) -> String {
    r#"[
        {
            "type": "function",
            "function": {
                "name": "add",
                "description": "adds two numbers\n",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "x": {
                            "type": "number",
                            "description": "the first number\n"
                        },
                        "y": {
                            "type": "number",
                            "description": "the number to add with the first one\n"
                        }
                    },
                    "required": [
                        "x",
                        "y"
                    ]
                }
            }
        },
        {
            "type": "function",
            "function": {
                "name": "multiply",
                "description": "multiply two numbers\n",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "x": {
                            "type": "number",
                            "description": "the first number\n"
                        },
                        "y": {
                            "type": "number",
                            "description": "the number to multiply with the first one\n"
                        }
                    },
                    "required": [
                        "x",
                        "y"
                    ]
                }
            }
        }
    ]"#.to_string()
}
This JSON follows the standard function calling format, making it compatible with various AI models and frameworks.
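As a rough mental model of how such a payload is consumed, the hypothetical dispatcher below routes a function-calling request to the matching operation. This is a sketch for illustration only; on WeilChain the generated bindings and the runtime perform the actual routing for you.

use serde_json::Value;

// Hypothetical dispatcher: maps a function-calling payload onto the
// arithmetic operations. Illustrative only; the generated bindings do
// the real dispatch inside the applet.
fn dispatch(tool_call: &Value) -> Option<i64> {
    let name = tool_call["function"]["name"].as_str()?;
    let args = &tool_call["function"]["arguments"];
    let (x, y) = (args["x"].as_i64()?, args["y"].as_i64()?);
    match name {
        "add" => Some(x + y),
        "multiply" => Some(x * y),
        _ => None, // unknown tool name
    }
}

fn main() {
    let call: Value = serde_json::from_str(
        r#"{"function": {"name": "multiply", "arguments": {"x": 6, "y": 7}}}"#,
    )
    .unwrap();
    println!("{:?}", dispatch(&call)); // prints Some(42)
}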
Implement the Applet Logic
Copy the generated bindings to your main library file:
cat bindings.rs > src/lib.rs
rm bindings.rs
Update your Cargo.toml to include the necessary dependencies:
[package]
name = "arithmetic"
version = "0.1.0"
edition = "2021"
[dependencies]
weil_rs = { path = "../../../contract-sdk/rust/weil_rs/" }
weil_macros = { path = "../../../contract-sdk/rust/weil_rs/weil_macros" }
weil_contracts = { path = "../../../contract-sdk/rust/weil_rs/weil_contracts" }
wadk-utils = { path = "../../../wadk-utils" }
anyhow = "1.0.97"
serde = { version = "1.0.219", features = ["derive", "rc"] }
serde_json = { version = "1.0.140", features = ["raw_value"] }
[lib]
crate-type = ["cdylib"]
Now implement the applet logic in src/lib.rs. Replace the unimplemented functions with:
#[query]
async fn add(&self, x: i32, y: i32) -> i32 {
    x + y
}

#[query]
async fn multiply(&self, x: i32, y: i32) -> i32 {
    x * y
}
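Before targeting WebAssembly, you can sanity-check the logic with an ordinary cargo test. The struct name below is assumed for illustration; in your project the type comes from the generated bindings in src/lib.rs, and the generated methods are async.

// Minimal sanity check mirroring the applet logic. `Arithmetic` is an
// assumed placeholder name; substitute the type your generated bindings define.
struct Arithmetic;

impl Arithmetic {
    fn add(&self, x: i32, y: i32) -> i32 {
        x + y
    }

    fn multiply(&self, x: i32, y: i32) -> i32 {
        x * y
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn arithmetic_works() {
        let applet = Arithmetic;
        assert_eq!(applet.add(2, 3), 5);
        assert_eq!(applet.multiply(4, 5), 20);
    }
}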
Compile the MCP Server
Build your MCP server applet:
cargo build --target wasm32-unknown-unknown --release
You should now have a file target/wasm32-unknown-unknown/release/arithmetic.wasm containing your MCP server applet.
Deploy Your MCP Server
To deploy your MCP server applet to the WeilChain platform, follow the detailed instructions in the Deploy an Applet on WeilChain tutorial.
Copy the applet ID of the deployed applet; you will need it when registering this MCP server on Icarus.
Next Steps
Once your MCP server is deployed, you can:
- Add it to Icarus, the on-chain AI chatbot
- Interact with the applet using natural language
In the next tutorial, Register and use an MCP server, we'll show you how to do both.