Agent testing
The Testing page lets you see how your agent would respond to any comment without affecting live behavior. Use it to verify your settings, test edge cases, and build confidence before going live.
How testing works
1. Select a post: Click "Select a Post", browse your connected pages, and choose a post to test against
2. Enter a comment: Type a test comment in the text area, or click "Import comment" to load a real comment from your pages
3. Click "Test Comment": Your agent processes the comment using your current settings
4. Review results: See the full analysis, confidence scores, and generated reply
Understanding test results
After running a test, you'll see several result sections:
Analysis
Shows how your agent interpreted the comment:
- Intent pill: The detected intent — question, complaint, praise, feedback, spam, etc.
- Sentiment pill: The emotional tone — positive, negative, or neutral
- Will Reply / Will Skip indicator: Whether your agent would respond to this comment
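Conceptually, the analysis is a small record with these three fields. The sketch below is purely illustrative; the field names and values are assumptions, not the product's actual schema:

```python
from dataclasses import dataclass

@dataclass
class AnalysisResult:
    """Illustrative shape of the analysis shown after a test run.

    Field names are assumptions for explanation only, not the
    product's real data model.
    """
    intent: str       # e.g. "question", "complaint", "praise", "spam"
    sentiment: str    # "positive", "negative", or "neutral"
    will_reply: bool  # whether the agent would respond to this comment

# Example: a neutral question the agent decides to answer
result = AnalysisResult(intent="question", sentiment="neutral", will_reply=True)
```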
Confidence Metrics
Two scores that measure how well your agent handled the comment:
- Understanding: How well the AI understood the comment's intent and context (0–100%)
- Reply Confidence: How confident the AI is in the quality and relevance of its reply (0–100%)
Scores are color-coded: green for high confidence, amber for medium, and red for low.
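The color bands amount to a simple threshold mapping. The cutoff values below are illustrative assumptions; the documentation only states that green means high, amber means medium, and red means low:

```python
def confidence_color(score: float) -> str:
    """Map a 0-100 confidence score to a display color band.

    The 80/50 cutoffs are assumed for illustration; the actual
    thresholds used by the product are not documented here.
    """
    if score >= 80:
        return "green"
    if score >= 50:
        return "amber"
    return "red"
```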
Auto-Publish Decision (hybrid mode only)
If you're using "High confidence" publishing mode, this section shows whether the reply would be auto-published or held as a draft based on your confidence threshold from the Replies page.
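In essence, the decision is a single comparison against your configured threshold. A minimal sketch, assuming the threshold is a 0-100 value like the confidence score:

```python
def auto_publish_decision(reply_confidence: float, threshold: float) -> str:
    """Hybrid-mode decision sketch: publish when reply confidence
    meets the threshold configured on the Replies page, otherwise
    hold the reply as a draft for review.
    """
    return "auto-publish" if reply_confidence >= threshold else "hold as draft"
```

For example, with a threshold of 85, a reply scored at 92 would be auto-published, while one scored at 70 would be held as a draft.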
Moderation Result
Shows whether the comment would be hidden or approved based on your current moderation settings. This reflects the rules configured on the Moderation page.
Automation Result
If you have automations configured, this shows which automation (if any) matched the comment, along with the automation name and its confidence percentage.
Conversation Preview
Shows the generated reply text and the reasoning behind it. If no reply was generated, this section explains why (e.g., the comment was too short, off-topic, or matched a skip condition).
Importing real comments
Click "Import comment to test" to browse actual comments from your connected pages and load one into the test field. This is useful for:
- Testing against real-world scenarios your agent will encounter
- Reproducing a specific situation where your agent didn't respond as expected
- Validating settings changes against comments you've already seen
Best practices for testing
Start with real examples
- Import actual comments from your pages
- Test against various types of comments (questions, complaints, praise)
- Try comments in different languages if your audience is multilingual
Test edge cases
- Very short comments (emojis, single words)
- Long, detailed comments
- Comments with links or attachments
- Off-topic or inappropriate comments
Iterative improvement
- Test with current settings to establish a baseline
- Make small changes to your settings
- Test the same comments again to see the difference
- Refine until you're satisfied with the responses
FAQ: Testing questions
"Do tests affect my live agent?"
No. Testing is completely isolated from your live agent. No replies are published, no comments are moderated, and no data is changed.
"What do the confidence percentages mean?"
- Understanding measures how well the AI grasped what the commenter was asking or saying. Low scores may indicate vague or ambiguous comments.
- Reply Confidence measures how sure the AI is that its response is helpful and relevant. Low scores may indicate the agent lacks knowledge to answer well — consider adding to your Knowledge Base.
"Why didn't my agent generate a reply?"
Common reasons include:
- The comment is too short or lacks substance (emoji-only, single words)
- The comment doesn't match your "Respond on" criteria (e.g., the agent only responds to ads)
- Response Frequency is set to "When relevant" and the AI determined a reply wouldn't add value
- The comment was flagged by moderation settings
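The skip checks above can be sketched as a sequence of guards, evaluated in order. This is a hypothetical illustration of the logic, not the product's actual implementation; the length heuristic and check order are assumptions:

```python
from typing import Optional

def would_skip(comment: str, matches_respond_on: bool,
               relevant: bool, flagged_by_moderation: bool) -> Optional[str]:
    """Return a skip reason, or None if the agent would reply.

    A hypothetical sketch of the skip conditions listed above;
    the 3-character minimum is an assumed stand-in for the
    "too short / emoji-only" check.
    """
    if len(comment.strip()) < 3:
        return "too short"
    if not matches_respond_on:
        return "does not match 'Respond on' criteria"
    if not relevant:
        return "reply would not add value"
    if flagged_by_moderation:
        return "flagged by moderation"
    return None
```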
"Can I test the same comment multiple times?"
Yes, you can test the same comment repeatedly with different settings to compare results.