Add human feedback using Worker Bindings
This guide explains how to provide human feedback for AI Gateway evaluations using Worker bindings.
1. Run an AI Evaluation
Start by sending a prompt to the AI model through your AI Gateway.
```js
const resp = await env.AI.run(
  "@cf/meta/llama-3.1-8b-instruct",
  {
    prompt: "tell me a joke",
  },
  {
    gateway: {
      id: "my-gateway",
    },
  },
);

const myLogId = env.AI.aiGatewayLogId;
```
Let the user interact with or evaluate the AI response. This interaction will inform the feedback you send back to the AI Gateway.
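As a rough sketch of how this might fit together, the hypothetical Worker handler below runs the model and returns the log ID to the client so a later thumbs-up or thumbs-down can reference it. It assumes an AI binding named `AI` on your `Env` and a gateway called `my-gateway`; adjust these to match your own configuration.

```ts
// Sketch only: an Env that includes the Workers AI binding configured in wrangler.
interface Env {
  AI: Ai;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // Run the prompt through the gateway so the request is logged.
    const resp = await env.AI.run(
      "@cf/meta/llama-3.1-8b-instruct",
      { prompt: "tell me a joke" },
      { gateway: { id: "my-gateway" } },
    );

    // The log ID is available on the binding after the run completes.
    const logId = env.AI.aiGatewayLogId;

    // Hand both the model output and the log ID back to the client,
    // so the client can send feedback for this specific log later.
    return Response.json({ response: resp, logId });
  },
};
```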
2. Send Human Feedback
Use the patchLog() method to provide feedback for the AI evaluation.
```js
await env.AI.gateway("my-gateway").patchLog(myLogId, {
  feedback: 1, // all fields are optional; set the values that fit your use case
  score: 100,
  metadata: {
    user: "123", // optional metadata to provide additional context
  },
});
```
Feedback parameters explanation
- feedback: either -1 for negative or 1 for positive; 0 is considered not evaluated.
- score: a number between 0 and 100.
- metadata: an object containing additional contextual information.
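As an illustration only, the payload could be modeled in TypeScript roughly as below. The field names follow the parameters above; the exact types shipped with the Workers runtime may differ.

```ts
// Illustrative shape of the patchLog() payload; every field is optional.
interface AiGatewayLogFeedback {
  feedback?: -1 | 0 | 1;             // -1 negative, 1 positive, 0 not evaluated
  score?: number;                     // between 0 and 100
  metadata?: Record<string, unknown>; // additional context, e.g. { user: "123" }
}
```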
patchLog: Send Feedback
The patchLog method allows you to send feedback, score, and metadata for a specific log ID. All object properties are optional, so you can include any combination of the parameters:
```js
gateway.patchLog("my-log-id", {
  feedback: 1,
  score: 100,
  metadata: {
    user: "123",
  },
});
```
Returns: Promise<void> (make sure to await the request).
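For example, a hypothetical helper that records a thumbs-up or thumbs-down from the client might look like the sketch below; the `sendFeedback` name and the error handling are assumptions, not part of the API.

```ts
// Sketch: record user feedback for a previously returned log ID.
async function sendFeedback(env: Env, logId: string, positive: boolean): Promise<void> {
  try {
    // Await the call so failures surface here rather than being dropped.
    await env.AI.gateway("my-gateway").patchLog(logId, {
      feedback: positive ? 1 : -1,
    });
  } catch (err) {
    // Log and swallow the error so a feedback failure does not break the request path.
    console.error("Failed to record feedback", err);
  }
}
```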