Call Quality
Aila's Call Quality system helps you maintain consistent standards across all calls through AI-powered quality checklists, automatic scoring, and risk assessment.
Overview
The Call Quality dashboard provides comprehensive monitoring of call quality and compliance across your organization. Use it to:
- Monitor standards - Ensure all calls meet your quality criteria
- Identify risks - Catch compliance issues and problem calls
- Track trends - See quality improvements over time
- Coach effectively - Find specific calls needing attention
Quality Checklists
What are Quality Checklists?
Quality checklists define the standards every call should meet. Aila automatically evaluates each call against your checklist criteria using AI.
Checklist Components
Each checklist contains:
Criteria - Individual quality standards to evaluate:
- "Greeted customer professionally"
- "Used customer's name at least once"
- "Asked discovery questions"
- "Handled objections effectively"
- "Set clear next steps"
Scoring - How each criterion is evaluated:
- Pass/Fail - Binary yes/no evaluation
- Points - Weighted scoring (0-10 points per criterion)
- Critical - Must-pass items (instant fail if missed)
Thresholds - Overall quality levels:
- High Quality - 90%+ score
- Acceptable - 70-89% score
- Needs Improvement - 50-69% score
- Poor - Below 50% score
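As a rough sketch, the tier thresholds above amount to a simple score-to-label mapping. The function name `quality_tier` is illustrative only, not part of Aila's API:

```python
def quality_tier(score: float) -> str:
    """Map an overall quality score (0-100) to a quality tier.

    Thresholds mirror the levels listed above: 90+ High Quality,
    70-89 Acceptable, 50-69 Needs Improvement, below 50 Poor.
    """
    if score >= 90:
        return "High Quality"
    if score >= 70:
        return "Acceptable"
    if score >= 50:
        return "Needs Improvement"
    return "Poor"
```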
Creating Quality Checklists
See Settings > Call Scoring for detailed instructions on:
- Creating custom checklists
- Defining quality criteria
- Setting up scoring rules
- Assigning checklists to call types
Call Quality Dashboard
Dashboard Overview
The Call Quality page shows:
Summary Metrics:
- Total Calls Analyzed - Calls evaluated this period
- Average Quality Score - Overall quality rating
- High-Risk Calls - Calls failing critical criteria
- Compliance Rate - Percentage meeting standards
Quality Distribution:
- Chart showing score distribution
- Trend lines for quality over time
- Comparison to previous periods
- Team-level breakdowns
Risk Categories:
- Compliance violations
- Customer satisfaction issues
- Policy violations
- Training needs
Filtering Quality Data
Filter the dashboard by:
- Date range - Specific time periods
- Team members - Individual or group performance
- Quality score - High, medium, low quality calls
- Risk level - High, medium, low risk
- Call type - Inbound, outbound, specific purposes
- Checklist template - Specific quality standards
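Conceptually, these filters combine with AND logic: a call appears only if it matches every filter you set. A minimal sketch, assuming call records are dicts with hypothetical `date`, `rep`, `score`, `risk`, and `type` fields:

```python
from datetime import date

def filter_calls(calls, start=None, end=None, team_member=None,
                 min_score=None, max_score=None, risk=None, call_type=None):
    """Apply optional dashboard-style filters to a list of call records.

    Every filter left as None is skipped; set filters must all match.
    """
    out = []
    for call in calls:
        if start is not None and call["date"] < start:
            continue
        if end is not None and call["date"] > end:
            continue
        if team_member is not None and call["rep"] != team_member:
            continue
        if min_score is not None and call["score"] < min_score:
            continue
        if max_score is not None and call["score"] > max_score:
            continue
        if risk is not None and call["risk"] != risk:
            continue
        if call_type is not None and call["type"] != call_type:
            continue
        out.append(call)
    return out
```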
Quality Scores
How Scoring Works
For each call, Aila:
1. Transcribes the call
2. Analyzes content against checklist criteria
3. Evaluates each criterion (pass/fail or point value)
4. Calculates overall score
5. Flags any high-risk items
6. Generates improvement suggestions
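Steps 3 and 4 can be sketched as code. This is a minimal illustration of weighted scoring with must-pass items, assuming per-criterion pass/fail results; the `Criterion` type and instant-fail-to-zero behavior are assumptions for the sketch, not Aila's documented internals:

```python
from dataclasses import dataclass

@dataclass
class Criterion:
    name: str
    points: int          # weight: 0-10 points per criterion
    critical: bool = False

def score_call(results: dict[str, bool], criteria: list[Criterion]) -> dict:
    """Combine per-criterion pass/fail results into an overall score.

    A failed critical criterion forces a 0% score and is flagged,
    mirroring the 'instant fail if missed' rule described above.
    """
    earned = sum(c.points for c in criteria if results.get(c.name, False))
    total = sum(c.points for c in criteria)
    critical_failures = [c.name for c in criteria
                         if c.critical and not results.get(c.name, False)]
    score = 0.0 if critical_failures else round(100 * earned / total, 1)
    return {"score": score, "critical_failures": critical_failures}
```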
Score Breakdown
View detailed scoring for any call:
Overall Score - Total quality rating (0-100%)
Criterion-Level Results:
- ✓ Passed criteria (green)
- ✗ Failed criteria (red)
- AI reasoning for each evaluation
- Timestamp references to transcript
Critical Items:
- Highlighted separately
- Automatic flagging for review
- Required for quality approval
Manual Score Review
Override AI scores when needed:
1. Open call quality view
2. Click Review Score
3. Adjust individual criterion results
4. Add notes explaining override
5. Save updated score
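An override keeps an audit trail: the adjusted result plus who changed it, when, and why. A sketch of what that record might look like (field names and the `override_criterion` helper are hypothetical):

```python
from datetime import datetime, timezone

def override_criterion(call, criterion, passed, note, reviewer):
    """Record a manual override of an AI criterion result, with a note.

    Updates the criterion result in place and appends an audit entry.
    """
    call["results"][criterion] = passed
    call.setdefault("overrides", []).append({
        "criterion": criterion,
        "passed": passed,
        "note": note,                                   # why the AI was wrong
        "reviewer": reviewer,
        "at": datetime.now(timezone.utc).isoformat(),
    })
    return call
```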
Risk Assessment
High-Risk Call Detection
Aila automatically flags calls with:
Compliance Issues:
- Missing required disclosures
- Inappropriate language
- Regulatory violations
- Policy breaches
Quality Concerns:
- Very low scores
- Failed critical criteria
- Customer dissatisfaction indicators
- Escalation signals
Business Risks:
- Commitments made without authority
- Pricing errors
- Misrepresentation
- Legal concerns
Risk Levels
High Risk - Immediate attention needed:
- Compliance violations
- Customer escalations
- Legal issues
- Failed critical criteria
Medium Risk - Should be reviewed:
- Below-standard quality
- Missing best practices
- Training opportunities
- Process violations
Low Risk - Monitor:
- Minor quality issues
- Inconsistent adherence
- Improvement areas
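In effect, a call takes the highest risk level that any of its flags triggers. A minimal sketch of that precedence rule, with flag names invented for illustration:

```python
# Hypothetical flag names grouped by the risk levels described above.
HIGH_RISK = {"compliance_violation", "customer_escalation",
             "legal_issue", "failed_critical"}
MEDIUM_RISK = {"below_standard_quality", "missing_best_practice",
               "process_violation"}

def risk_level(flags: set[str]) -> str:
    """Assign the highest applicable risk level from a call's flags."""
    if flags & HIGH_RISK:
        return "high"
    if flags & MEDIUM_RISK:
        return "medium"
    return "low"
```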
Risk Management Workflow
1. Detection - AI flags potential issues
2. Review - Manager evaluates flagged calls
3. Action - Coaching, retraining, or escalation
4. Track - Monitor resolution and trends
5. Prevent - Update training and processes
Quality Monitoring Workflow
Daily Quality Review
Morning Routine:
- Check overnight calls
- Review high-risk flagged calls
- Address critical issues immediately
- Schedule coaching for low scores
End of Day:
- Review all calls from today
- Verify quality scores
- Leave coaching feedback
- Update quality trends
Weekly Quality Analysis
- Trends - Review quality score trends
- Patterns - Identify common issues
- Coaching - Schedule sessions for recurring problems
- Training - Update materials based on findings
- Standards - Adjust checklists if needed
Monthly Quality Reporting
Generate comprehensive reports:
- Quality score averages by team member
- Trend analysis (improving/declining)
- Common failure points
- Training needs identified
- Compliance status
- Recommended actions
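The first line item, per-team-member averages, is a simple aggregation. A sketch assuming exported call records carry `rep` and `score` fields:

```python
from collections import defaultdict
from statistics import mean

def quality_by_rep(calls):
    """Average quality score per team member, for a monthly report."""
    buckets = defaultdict(list)
    for call in calls:
        buckets[call["rep"]].append(call["score"])
    return {rep: round(mean(scores), 1) for rep, scores in buckets.items()}
```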
Quality Improvement
Identifying Training Needs
Use quality data to find improvement opportunities:
Individual Level:
- Review specific team member's low-scoring calls
- Identify consistent failure patterns
- Create targeted coaching plans
- Track improvement over time
Team Level:
- Analyze organization-wide quality trends
- Find commonly failed criteria
- Update training programs
- Revise processes if needed
Coaching from Quality Data
1. Filter for low-quality calls
2. Review specific failures
3. Find patterns in the individual's performance
4. Prepare coaching with call examples
5. Track improvement after coaching
Continuous Improvement
Regular Review Cycle:
- Weekly - Review quality trends
- Monthly - Analyze patterns and update training
- Quarterly - Revise quality standards
- Annually - Comprehensive program review
Feedback Loop:
- Quality data → Coaching → Improved performance → Updated standards
Quality Reports
Available Reports
Team Quality Report:
- Average scores by team member
- Quality distribution
- Trend lines
- Comparison rankings
Compliance Report:
- Pass/fail rates for critical criteria
- Risk assessment summary
- Violations by type
- Remediation tracking
Trend Analysis:
- Quality over time
- Improvement trajectories
- Seasonal patterns
- Before/after training impact
Exporting Quality Data
1. Navigate to Call Quality dashboard
2. Apply desired filters
3. Click Export Report
4. Choose format (PDF, CSV, Excel)
5. Select metrics to include
6. Download or schedule recurring reports
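If you post-process exports in your own scripts, the CSV option is the easiest to work with. A sketch of reproducing a "select metrics to include" export from filtered call records (field names are illustrative):

```python
import csv
from io import StringIO

def export_report_csv(calls, fields=("rep", "score", "risk")):
    """Write only the selected metric columns for each call to CSV text.

    Extra keys on a call record are ignored, mimicking a metric picker.
    """
    buf = StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(fields), extrasaction="ignore")
    writer.writeheader()
    writer.writerows(calls)
    return buf.getvalue()
```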
Best Practices
Setting Quality Standards
Start Simple:
- Begin with 5-7 core criteria
- Focus on most important standards
- Add complexity gradually
- Test with real calls before full rollout
Make Criteria Specific:
- ✗ "Provide good service"
- ✓ "Greet customer by name within first 30 seconds"
- ✓ "Ask at least 3 discovery questions"
- ✓ "Confirm next steps before ending call"
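One test of specificity: a criterion should be concrete enough to write down as structured data, with its weight and criticality explicit. A hypothetical checklist definition using the good examples above:

```python
# Hypothetical checklist definition: each criterion is specific,
# measurable, and carries an explicit weight and critical flag.
checklist = [
    {"name": "Greet customer by name within first 30 seconds",
     "points": 5, "critical": False},
    {"name": "Ask at least 3 discovery questions",
     "points": 10, "critical": False},
    {"name": "Confirm next steps before ending call",
     "points": 5, "critical": True},
]
```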
Balance Standards:
- Mix critical must-haves with nice-to-haves
- Include both process and outcome criteria
- Consider different call types
- Align with business goals
Regular Calibration
Weekly Calibration:
- Review AI scores vs. manager assessments
- Discuss edge cases as team
- Refine criteria definitions
- Update AI training if needed
Monthly Review:
- Are criteria still relevant?
- Do weights reflect importance?
- Are thresholds appropriate?
- Should any criteria be added/removed?
Using Quality Data Effectively
Don't Over-Focus on Scores:
- Scores are indicators, not absolute truth
- Context matters
- Use for trends, not individual judgment
- Combine with qualitative feedback
Positive Reinforcement:
- Recognize high-quality calls
- Share excellent examples
- Celebrate improvements
- Use library for positive examples
Coaching, Not Punishment:
- Use data to help, not criticize
- Focus on specific, actionable feedback
- Provide examples and alternatives
- Track improvement over time
Troubleshooting
Inaccurate Scores
AI Misses Context:
- Add more specific criteria definitions
- Include examples in criteria descriptions
- Calibrate with manual reviews
- Override incorrect scores with notes
Consistently Wrong on Specific Criterion:
- Revise criterion wording
- Add clarifying examples
- Check if criterion is measurable
- Consider removing if not AI-evaluable
No Quality Scores Showing
- Verify checklist template is assigned
- Check call processing completion
- Ensure checklist is active
- Review call routing rules
Scores Don't Match Expectations
- Review actual call transcript
- Check criterion definitions
- Calibrate scoring thresholds
- Discuss with team for consensus
Next Steps
- Settings > Call Scoring - Create and manage checklists
- Coaching - Use quality data for coaching
- Performance - Track quality trends across team
- Calls - Review individual call quality scores