Guidelines for Core
Key Performance Indicators
Interim Report on Primary Service Channels
September 2004
Executive Summary
The development of key performance indicators (KPIs) for the Government of Canada (GoC) became a priority as Canada's Government Online (GOL) initiative matured from 1998 through 2004. The rapid development of the Internet channel as a means of providing effective public service delivery created an appetite for revolutionary change in all types of service delivery. Prior to GOL, large-scale improvements to service delivery were confined to specific government programs and services, and interdepartmental projects were rare. The advent of the Internet, and the preference of Canadians to access government services on-line, has created cutting-edge opportunities for change in delivering services to Canadians.
In the past three years, dozens of interdepartmental initiatives have taken hold and have helped to foster citizen-centred service delivery. As more and more business improvement opportunities were conceived, it became clear that the Government of Canada needed clear communication for analytical decision making. Many departments have made significant investments in performance management and made progress towards the disciplined decision-making characteristic of the world's best corporations. Nevertheless, differences in terminology, definitions, usage, data collection, and performance frameworks were quickly identified as limiting the ability to monitor and affect enterprise-level performance.
The genesis of the Core KPI project was the GoC's Telephony Service Working Group, an interdepartmental group of GoC call centre managers and executives that came together to share best practices, establish consistent service standards, and generally improve the capabilities of GoC call centre operations. In 2003, this working group quickly identified, and provided precise definitions of, common KPIs.
Coincident with this achievement, the Treasury Board of Canada Secretariat developed a modernized approach to the management of public sector organizations and programs called the Management Accountability Framework (MAF). This comprehensive set of tools, standards, and processes provided an over-arching framework for the Core KPI project. The operational nature of KPIs strongly supported the MAF and provided direct information to two of the primary MAF categories: stewardship and citizen-focused service.
In 2003, as the GoC's Internet channel rapidly matured and the first significant transactional capability came online, new interdepartmental working committees were formed to deal with the complexities of multi-service, multi-channel delivery alternatives. Internet gateways and clusters rapidly evolved; this helped organize services in parallel with client segments and life events. This has created opportunities to effect corresponding changes in how GoC services are delivered
in-person and by mail. By 2004, there was a clear need to establish common core KPIs and to create a working environment in which to develop further a common performance language.
The Core KPI project brought together numerous government managers who are experts in delivering services to Canadians, visitors, and businesses. Managers with operational responsibility for call and mail processing centres, Internet sites, and in-person locations were engaged in several meetings to identify the KPIs that provide maximum management value.
The result of these meetings was a small set of channel-specific core KPIs that reflect specific MAF themes. These KPIs will be required for a variety of reporting requirements, Treasury Board submissions, and ongoing reviews. Additional operational KPIs were identified that are recommended by Treasury Board (but not required) as effective indicators that provide strong operational benefits to service delivery organizations.
The Core KPI project is not complete. There is an ongoing requirement for implementation, improvement, and additions as the GoC service delivery strategy evolves. Perhaps the most important and lasting benefit is the networking of the best performance management people in the GoC. These experts continue to develop new techniques and identify improvements to ensure that Canada remains one of the world leaders in public sector service delivery. That position clearly improves our competitive position in the twenty-first century.
Record of Changes
Version Date Summary of Changes
V 0.9 August 30, 2004 First draft for formal review
V 1.0 Sept. 30, 2004 Minor edits
Detailed Description of Changes from Previous Version
Acknowledgements
Project Authority: Victor Abele, Director, Service Strategy, CIOB, Treasury Board Secretariat, Canada

Project Analyst: Phillip Massolin, Analyst, Service Strategy, CIOB, Treasury Board Secretariat, Canada
Author: Dan Scharf, Equasion Business Technologies
Contributors: Daryl Sommers, Colin Smith, Reina Gribovsky, Dolores Lindsay, Daniel Tremblay, Kyle Toppazzini, Marg Ogden
Web Content: Morris Miller
Service Improvement - Guidelines for Key Performance Indicators
_______________________________________________________
Table of Contents
1.0 INTRODUCTION
Citizens are faced with a greater choice of channels than ever before to access government services (in-person, phone, Internet, and mail), creating corresponding challenges for organizations to manage service delivery across all channels.
Key Performance Indicators (KPIs) are increasingly used by the private and public sectors to measure progress towards organizational goals using a defined set of quantifiable measures. For the GoC, KPIs are becoming an essential part of achieving Management Accountability Framework (MAF) compliance.
Once approved, the KPI framework will constitute a key element of departments' annual monitoring. KPIs can be navigated by:
- Channel: each channel includes standard metrics (KPIs) for managing performance; or
- Management Accountability Framework category.
A series of government-wide workshops identified the requirement for a consistent approach to measuring service delivery performance across the GoC. Workshop results can be assessed to help create baseline frameworks for channel measurement and to provide input into the process.
Key Performance Indicators for Service Delivery Channels
2.0 DEFINING THE CHANNELS
Phone Service is the preferred service channel for most Canadians. Primary modes of interaction within this channel are:
- Interactive Voice Response (IVR), which provides self-service, routing, and broadcast services;
- Agent-based services, the most highly valued mode of service for citizens today.
Internet Service: most surveys indicate that this is the preferred channel of the future. Modes of interaction include:
- Self-service via online transactions, search, and website navigation;
- E-mail, which provides delayed support, both automated and agent-authored;
- Online chat technologies, which provide real-time, agent-assisted Internet service delivery.
Mail: primary indicators suggest that this channel (the paper channel) is decreasing in popularity. Surveys in the past few years indicate that citizens will increase their use of more timely and interactive channels. Mail is sent using three methods (analogous to the modes of interaction in the other channels):
- Fax: instant transmission via facsimile devices over phone or broadband networks;
- Courier: expedited delivery via priority parcel carriers (within 48 hours);
- Regular Mail: via regular postal services (3 to 10 days).
In-Person Service: Canada's extensive network of local offices provides a significant proportion of all government service delivery, using primarily queued and appointment-based service models. Some in-person points of service also offer assisted Internet and telephone services through kiosks, publicly available computers, and public phones. There are four service modes for the In-Person Service channel:
- Queued: a managed, multi-agent counter office, which often has a reception counter to triage visitors to the correct counter and answer simple questions;
- Scheduled: significant volumes of in-person services are provided on a pre-scheduled, one-on-one basis;
- Outreach: several service delivery organizations schedule seminars and training sessions in communities throughout Canada;
3.0 MANAGEMENT ACCOUNTABILITY FRAMEWORK
The Government of Canada has instituted a consistent management framework for its programs and services. Comprehensive information on the Management Accountability Framework (MAF) can be found on Treasury Board's website (reference: http://www.tbs-sct.gc.ca/maf-crg/maf-crg_e.asp). MAF provides deputy heads and all public service managers with a list of management expectations that reflect the different elements of current management responsibilities. Key Performance Indicators for Service Delivery are grouped within the MAF categories (Policy and Programs, People, Citizen-Focused Service, Risk Management, Stewardship, Accountability) as shown in the following diagram.
The majority of service delivery indicators relate to the operational nature of the Stewardship category. Additional indicators measure progress towards objectives under the Citizen-Focused Service and People categories. The Accountability category provides checklists and processes for establishing effective service level agreements. Specific assessment tools are used for the Policy and Programs and Risk Management categories.
4.0 SERVICE STANDARDS
The Service Improvement Initiative defines specific guidelines for all departments and agencies to establish and publish service standards for citizens, international visitors, and businesses using GoC services and programs.
The Citizens First Survey provides specific information on client service expectations through a formal, comprehensive survey. Trends over the past five years indicate shifts in these expectations that give government service delivery managers effective direction for prioritizing service improvement initiatives.
The overarching goal is to achieve a 10% increase in client satisfaction by 2005. Departments are required to set standards and measure progress towards this goal using primary criteria such as:
- Timeliness: the time required to receive a service or product;
- Access: how accessible the service or ordering process is to the client;
- Outcome: whether the client received what was needed;
- Satisfaction: overall client satisfaction with the service/product request.

Common high-level service standards include:
- Average speed to answer: 5 minutes;
- Expected answer by e-mail: next business day;
- Queue time for in-person services: 15 minutes.
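Standards like these lend themselves to simple automated checks. A minimal sketch in Python follows; the thresholds come from the standards above, but the dictionary keys and the measured values are illustrative assumptions, not part of the report:

```python
# Published high-level service standards (thresholds from the report;
# key names and measured values below are illustrative assumptions).
standards = {
    "phone_answer_minutes": 5,       # average speed to answer
    "email_reply_business_days": 1,  # answer expected by next business day
    "in_person_queue_minutes": 15,   # queue time for in-person service
}

measured = {
    "phone_answer_minutes": 4.2,
    "email_reply_business_days": 1,
    "in_person_queue_minutes": 18,
}

# A standard is met when the measured value is at or below the threshold.
standards_met = {name: measured[name] <= limit for name, limit in standards.items()}
```

A reporting cycle would feed `measured` from channel systems (ACD, mail logs, ticketing) and flag any standard where the comparison fails.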
5.0 ACCOUNTABILITY AND KEY PERFORMANCE MEASURES
As specified in MAF, service delivery channels must employ clear accountability frameworks. Most frequently, departments and agencies use a Service Level Agreement (SLA), both for internally resourced services and for third-party and partnership teams. The following table presents the minimum set of components which must be included in a Government of Canada SLA, which provides the foundation for service delivery to citizens, visitors, and businesses.
COMPONENT: DESCRIPTION

Service Level Agreement Name: The name of the SLA; particularly useful when a single SLA is used across multiple service offerings.

Service Description: The details of the service the government intends to provide and the benefits users can expect to receive.

Service Criticality Level: Normally identified in a Service Catalogue based on already-defined metrics. This level of criticality should be based primarily on the service users' requirements.

Service Channels: Identifies the channels through which the service is available (e.g. telephone, mail, in-person, Internet) and the appropriate contact information for each channel.

Primary Service Provider: The department or agency that is primarily responsible for the service.

Service Partner Providers: Other partner departments that provide support to a Primary Service Provider for a service; e.g. GTIS provides the Internet server to Health Canada.

Pledge: Provides the details of the quality of service a client can expect. This is frequently time-based; e.g. "Passports will be processed in X number of days."

Delivery Targets: Describe the key aspects of the service provided, such as access, timeliness, and accuracy.

Dates: The effective start and end dates of the agreement. A review date must also be identified so that performance measurements can be made and the SLA can be adjusted, or action taken to improve its performance.
Costs: Identifies a cost for the service (even when user fees are not required) to ensure that users understand and form realistic expectations about services offered by the federal government.

Complaint and Redress: Provides the service user with mechanisms to resolve their concerns, for example when the SLA has not been met.

Scope: What is covered by the agreement and what is not.

Responsibilities: Those of the Service Provider, the Partners, and the User.

Service Hours: Service availability, e.g. 24x7. Service hours should provide maximum cost-effective access for the service user. Public holidays must be identified, as well as the hours for each channel.

Throughput: Describes the anticipated volumes and timing of activities within a specific service; e.g. UI applications: Sep-May = 100,000, Jun-Aug = 50,000. This is important so that any performance issues caused by excessive throughput outside the terms of the SLA can be identified.

Change Management: Identifies the policies surrounding any changes that will affect the service provided to the user. For example, if UIC benefits are going to be mailed out every second month instead of every month, how will the change be managed to ensure that expectations are met?

Security and Privacy: Identifies inter-departmental policies on the sharing of user information for various services. Organizations must comply with PIPEDA and Treasury Board policies.

Service Reporting and Reviewing: Specifies the content, frequency, and distribution of service reports, and the frequency of service review meetings.

Performance Incentives/Penalties: Identifies any agreement regarding financial incentives or penalties based upon performance against service levels. Penalty clauses can create a barrier to partnership if unfairly invoked on a technicality, and can make service providers and partners unwilling to admit mistakes for fear of the penalties being imposed.
6.0 POLICY AND PROGRAMS
Policy and Programs in this MAF context refers to relevant lines of activity within departments and agencies. Departmental reports are the primary reporting tool used to document the overall policy effectiveness of specific programs. Readers should consult the MAF, relevant Treasury Board policies, and the Program Activity Architecture (PAA) for information and guidance in this category.
In the fall/winter of 2004-05, we will be consulting with the service community with a view to developing a more comprehensive service policy. This work will allow us to formalize the approach to KPIs, which is currently in draft form.
7.0 RISK MANAGEMENT
The Risk Management category of MAF specifies a checklist for departmental management to establish comprehensive and transparent identification of risks, tolerances, mitigation strategies, and effective communication approaches. For detailed information, readers should refer to the MAF.
8.0 KEY PERFORMANCE INDICATORS - Phone Channel
MAF CATEGORY: CITIZEN FOCUSED SERVICE
Metrics for Access
KPI: Call Access
Description: Percentage of calls presented that get into the ACD.
Objective: Measures overall service capacity from ACD to agent.
Definition: (Calls Answered + Calls Abandoned) divided by Calls Presented. Alternatively, busy signals generated by the switch divided by total calls received in the reporting period.
Derivation: ACD
Suggested benchmark / Range: 40% to 60%
Status: Proposed as a Core KPI
KPI: Caller Access
Description: Percentage of unique callers who attempt and successfully access service.
Objective: Basic volume measure. Determines service level by counting unserviced callers; removes repeat callers from the accessibility measure.
Definition: Total unique phone numbers completed divided by total unique phone numbers attempted.
Suggested benchmark / Range: 80% to 85%
Status: Proposed as a Core KPI
KPI: Abandoned Calls
Description: Percentage of calls that are abandoned while in queue due to prolonged delay waiting for service, typically for a live agent.
Objective: Key measure of overall service level.
Definition: (Number of calls abandoned within the agent queue + IVR abandons before success markers) divided by (total calls answered + total calls abandoned).
Derivation: ACD
Suggested benchmark / Range: 10% to 15%
Status: Proposed as a Core KPI
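The access metrics above are simple ratios over reporting-period counts. A sketch of how they might be computed from ACD totals; the function names are illustrative, and the counts are assumed to come from standard ACD reports:

```python
def call_access(answered: int, abandoned: int, presented: int) -> float:
    """Call Access: share of presented calls that got into the ACD."""
    return (answered + abandoned) / presented

def abandon_rate(queue_abandons: int, ivr_abandons: int, answered: int) -> float:
    """Abandoned Calls: abandons (agent queue + pre-success IVR) as a
    share of all answered plus abandoned calls."""
    abandoned = queue_abandons + ivr_abandons
    return abandoned / (answered + abandoned)
```

For example, 800 answered and 100 abandoned out of 1,500 presented gives a Call Access of 0.60, at the low end of the suggested 40% to 60% range.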
Metrics for Delay
KPI: Average Speed to Answer (ASA)
Description: The average delay while in queue before connecting to an agent, expressed in seconds.
Objective: Primary indicator of caller satisfaction.
Definition: The total number of seconds from ACD queuing of the call to agent acceptance, divided by total agent calls.
Derivation: Measured by ACD
Status: Proposed as a Core KPI
KPI: Service Level
Description: Percentage of calls that reach an agent, or are abandoned, within a specified time threshold.
Objective: This measure is required in order to set and publish telephone service standards.
Definition: (Calls answered within threshold + calls abandoned within threshold) divided by (total calls answered + total calls abandoned).
Derivation: Measured by ACD
Status: Proposed and required for phone service management
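The two delay metrics reduce to a quotient and a threshold count. A sketch under the assumption that per-call wait times (in seconds) are available from the ACD; names are illustrative:

```python
def average_speed_to_answer(total_queue_seconds: float, agent_calls: int) -> float:
    """ASA: total seconds from ACD queuing to agent acceptance / agent calls."""
    return total_queue_seconds / agent_calls

def service_level(answer_waits, abandon_waits, threshold: float) -> float:
    """Share of calls answered or abandoned within the threshold (seconds)."""
    within = sum(w <= threshold for w in answer_waits) + \
             sum(w <= threshold for w in abandon_waits)
    return within / (len(answer_waits) + len(abandon_waits))
```

A published standard such as "80% of calls within 20 seconds" would then be checked as `service_level(...) >= 0.80` with `threshold=20`.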
Metrics for Quality
KPI: Answer Accuracy
Description: Consistency of IVR and agent answers.
Objective: To ensure program integrity.
Definition: Local quality scorecard assessed by call monitoring and/or mystery-shopper approaches.
Derivation: (# of calls answered in IVR terminated at success markers + # of agent calls resulting in success status) multiplied by the accuracy evaluation ratio
Status: Proposed as a Core KPI
KPI: Professionalism
Description: Encompasses a range of soft skills that govern the approach to delivering accurate information and reliable services.
Objective: Identifies and reinforces effective communication.
Definition: Best measured through the use of a mystery-shopper program that places specific planned calls to the call centre through a measurement organization. Can also be measured by exit surveys performed immediately after call completion.
Status: Recommended as an operational measure.
Metrics for Client Satisfaction
KPI: Client Satisfaction Level
Description: Application of the Common Measurement Tool (CMT) to assess and benchmark client satisfaction.
Objective: Primary indicator of client perception of service quality; measured repeatedly, trends provide strong evidence of service improvement levels.
Derivation: The multi-channel survey tool will be used (at a minimum) to determine the core measures relevant to the telephone channel.
Status: Proposed as a Core KPI
KPI: Service Complaints
Description: Count and categorization of complaints received through all channels concerning the Phone channel.
Objective: Primary indicator of service quality, particularly when measured over time.
Derivation: Counted by an incident tracking system. Complaints received through other channels must be added to the total.
Status: Recommended as a Core KPI, but not currently feasible, as most GoC organizations do not integrate service feedback information.
MAF CATEGORY: STEWARDSHIP
Metrics for Agent Utilization
KPI: Cost per Call
Description: The total operational cost of the call centre over the reporting period divided by total calls handled during the reporting period.
Objective: Provides a high-level indication and trend of overall service performance.
Definition: Will require further working group consultation.
Status: Recommended as a Core KPI
KPI: Agent Capacity
Description: The anticipated number of hours of agent time available for telephone service for each full-time equivalent (FTE).
Objective: Ensures that agent resources are dedicated to required functions.
Status: Recommended as an operational measure
KPI: Resource Allocation
Description: A management indicator assessing the FTEs allocated to service delivery.
Objective: Measures effective use of channel resources.
Definition: Locally defined
Status: Recommended as an operational measure
KPI: Agent Adherence
Description: An assessment of telephone agents' adherence to schedule and availability during anticipated service periods.
Objective: Contributes to resourcing effectiveness.
Definition: Calculated as total agent login time divided by scheduled work time.
Status: Recommended as an operational measure
KPI: Agent Occupancy
Description: The percentage of agent time spent in direct service, including talk and wrap-up time.
Objective: Ensures accurate resourcing levels to achieve target service levels.
Definition: (Talk time + after-call wrap-up time) divided by total agent login time over the measured period.
Suggested benchmark / Range: 85%
Status: Recommended as an operational measure
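The stewardship metrics above (cost per call, adherence, occupancy) are all period ratios. A sketch, with illustrative function names and inputs assumed to come from financial and workforce management systems:

```python
def cost_per_call(total_operating_cost: float, calls_handled: int) -> float:
    """Cost per Call: reporting-period operating cost / calls handled."""
    return total_operating_cost / calls_handled

def agent_adherence(login_hours: float, scheduled_hours: float) -> float:
    """Agent Adherence: total agent login time / scheduled work time."""
    return login_hours / scheduled_hours

def agent_occupancy(talk_hours: float, wrap_hours: float, login_hours: float) -> float:
    """Agent Occupancy: (talk + after-call wrap-up) / total login time."""
    return (talk_hours + wrap_hours) / login_hours
```

An occupancy of 0.75 against the suggested 85% benchmark would indicate spare agent capacity in the measured period.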
Metrics for Service Effectiveness
KPI: First Call Resolution
Description: The degree to which client needs are met without further referral or call-back within a designated time interval.
Objective: Minimize cost and maximize client satisfaction.
Definition: Number of single, non-abandoned calls by unique phone number within a 48-hour period.
Status: Recommended as a Core KPI
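One way to operationalize this definition (a single non-abandoned call per unique number within 48 hours) is to scan a call log and count numbers with no follow-up call inside the window. The log format below is an assumption for illustration, not a prescribed feed:

```python
def first_call_resolution_rate(completed_calls, window_hours: float = 48.0) -> float:
    """completed_calls: (phone_number, hours_since_period_start) pairs for
    non-abandoned calls. A number counts as resolved on first call when it
    placed no follow-up call within the window."""
    by_number = {}
    for number, t in completed_calls:
        by_number.setdefault(number, []).append(t)
    resolved = 0
    for times in by_number.values():
        times.sort()
        # resolved if it was the only call, or the next call fell outside the window
        if len(times) == 1 or times[1] - times[0] > window_hours:
            resolved += 1
    return resolved / len(by_number)
```

A caller who phones back 10 hours later is not resolved on first call; one who calls back after a week is.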
KPI: Accurate Referral
Description: A redirect to the correct service for resolution of the client's need (which may be to a higher service tier or to a separate organization/jurisdiction providing the service).
Objective: Measures the key caller criterion of no more than two transfers.
Definition: Will require further working group participation.
Status: Not recommended; not technically feasible at this time.
Metrics for Use of Technology
KPI: Call Avoidance
Description: A call that quickly exits the system after an introductory message or bulletin that provides the desired answer for a substantial portion of the calling population, e.g. related to an immediate but temporary service outage.
Objective: Measures the utility of IVR bulletins to answer high-volume inquiries.
Definition: Calls terminated at a specific IVR marker after the bulletin.
Status: Proposed as a Core KPI
KPI: Calls Answered Successfully by IVR
Description: A call that terminates in the IVR tree after a success marker.
Objective: Measures the utility of the IVR response tree to provide self-service answers; an important indicator of IVR utility and a secondary indicator of client satisfaction.
Definition: Calls terminated at all IVR success markers.
Status: Proposed as a Core KPI
Metrics for Channel Take-up
KPI: Calls
Description: Total calls received.
Objective: Measures overall service demand.
Definition: Number of calls received at the switch. Note that this will include repeat callers who are refused at the switch.
Status: Proposed as a Core KPI
KPI: Callers
Description: Unique callers.
Objective: Measures service demand more accurately.
Definition: Unique phone numbers dialing the service.
Status: Proposed as a Core KPI
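The distinction between the two take-up measures is total contacts versus distinct contacts. A minimal sketch, assuming a switch log that records one phone number per received call (the log format and phone numbers are illustrative):

```python
def channel_take_up(switch_log):
    """switch_log: one originating phone number per call received at the switch.
    Returns (total calls, unique callers)."""
    return len(switch_log), len(set(switch_log))
```

The gap between the two figures indicates repeat calling, which inflates the raw call count as a demand measure.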
MAF CATEGORY: PEOPLE
At publishing time, KPIs for the MAF People category had not yet been proposed to the working group for review. Some examples of KPIs that might be suitable for this MAF category include:
Total Months Staff on Strength, Average Months on Strength per Agent: A measure of the total experience level of the agents within the call centre. Monitoring this over time provides a measure of the impact of staff turnover.
Staff Turnover Ratio: A measure of the churn rate within the agent team. Provides a secondary indicator of call centre health and often correlates with overall customer satisfaction levels.
Agent Coaching Ratio: Number of hours of one-on-one coaching time per agent. Helps measure the utilization of call centre supervisor time as well as the investment in agent skill improvement.
Training Days/Agent: Total training days delivered during the measurement period divided by the number of agents. Training is required for program/service delivery, for technology, and for the development of skills related to professionalism and customer interaction.
Further discussion with departments and agencies will be conducted to identifyeffective KPIs under this MAF category.
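The candidate People indicators above are straightforward ratios over HR data. A sketch, with illustrative function names and inputs assumed to come from workforce records:

```python
def average_months_on_strength(months_per_agent) -> float:
    """Average experience level: mean months on strength across agents."""
    return sum(months_per_agent) / len(months_per_agent)

def staff_turnover_ratio(departures: int, average_headcount: float) -> float:
    """Churn rate: departures over the period / average headcount."""
    return departures / average_headcount

def training_days_per_agent(total_training_days: float, agents: int) -> float:
    """Training investment per agent over the measurement period."""
    return total_training_days / agents
```

Tracked quarter over quarter, a falling average months on strength alongside a rising turnover ratio would flag the staff-churn risk the text describes.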
9.0 KEY PERFORMANCE INDICATORS - In-Person Channel
MAF CATEGORY: CITIZEN FOCUSED SERVICE
Metrics for Access
KPI: Visitor Access
Description: Count of visitors who either (a) are serviced at agent stations or (b) obtain self-service through in-location computers; or, depending on the service model and facility, the count of visitors entering the facility.
Objective: Basic volume measure.
Definition: Total visitors entering the facility over the measurement period.
Suggested benchmark / Range: TBD
Status: Proposed as a Core KPI. Tracked by all operations.
KPI: Visitors Serviced
Description: Ratio of visitors receiving agent service to total visitors. Provides an indication of the utilization of self-service capabilities and of overall operational capacity.
Definition: Total agent-visitor services divided by total visits.
Status: Recommended as an operational measure.
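This ratio can be computed directly from the two counts named in the definition; the function name is illustrative, and the counts are assumed to come from visit-tracking systems:

```python
def visitors_serviced_ratio(agent_visitor_services: int, total_visits: int) -> float:
    """Share of visits that resulted in agent service; the remainder
    approximates self-service take-up plus visitors who left unserved."""
    return agent_visitor_services / total_visits
```

A declining ratio over time, with stable total visits, would suggest growing use of the self-service facilities.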
Metrics for Delay
KPI: Average Wait Time (AWT)
Description: The average delay from time of entering the facility to introduction at an agent station.
Objective: Primary indicator of visitor satisfaction.
Definition: The total number of minutes from pulling of the service ticket to service.
Derivation: Measured by the service management system
Status: Recommended as an operational KPI. Measured by all queued service operations; not trackable within the retail service model.
KPI: Service Level
Description: Percentage of visitors that reach an agent within the target wait time.
Objective: This measure is required in order to set and publish in-person service standards.
Definition: Visitors served within threshold divided by total visitors serviced.
Derivation: Measured by the service ticketing system
Status: Recommended as an operational KPI. Measured by all queued service models; not relevant to the retail service model.
Metrics for Quality
KPI: Answer Accuracy
Description: Reliability of agent answers.
Objective: To ensure program integrity.
Definition: Local quality scorecard assessed by supervisor monitoring and/or mystery-shopper approaches and/or exit surveys.
Derivation: # of visitors answered successfully by agents
Status: Under review. May be impractical in several service models.
KPI: Professionalism
Description: Encompasses a range of soft skills that govern the approach to delivering accurate information and reliable services.
Objective: Identifies and reinforces effective communication.
Definition: Best measured through the use of a mystery-shopper program that uses specific planned visits to the service centre by a measurement organization. Can also be measured by exit surveys, either conducted by staff or at self-service computers.
Status: Under review. May be impractical in several service models.
KPI: Transaction Duration Variability
Description: For operations providing specific transaction services, analysis of the variance of transaction duration correlates strongly with application accuracy.
Objective: Assess process consistency across agents.
Status: Proposed to the working team. Applicable only to some operations; possible as a recommended operational KPI.
KPI: Critical Error Rate
Description: Some operations monitor application/transaction errors (typically the omission of required information) that require additional interactions with clients.
Objective: Assessment of pre-visit instructions to clients and/or reception desk triage procedures.
Status: Proposed as an operational measure. Applicability to be reviewed.
Metrics for Client Satisfaction
KPI: Client Satisfaction Level
Description: Application of the Common Measurement Tool (CMT) to assess and benchmark client satisfaction.
Objective: Primary indicator of client perception of service quality; measured repeatedly, trends provide strong evidence of service improvement levels.
Derivation: The multi-channel survey tool will be used (at a minimum) to establish the client satisfaction level for in-person services.
Status: Recommended as a Core KPI.
KPI: Service Complaints
Description: Count and categorization of complaints received through all channels concerning the In-Person channel.
Objective: Primary indicator of service quality, particularly when measured over time.
Definition: Total service complaints received during the reporting period divided by the total number of visits.
Derivation: Counted by an incident tracking system. Complaints received through other channels must be added to the total.
Status: Recommended as a Core KPI. Caveat: members noted that current systems do not support the collection and categorization of service complaints received through a wide variety of channels (e.g. Minister's correspondence, general e-mails, complaints at the end of a successful service phone call).
MAF CATEGORY: STEWARDSHIP
Metrics for Agent Utilization
KPI: Cost per Contact
Description: Total labour costs divided by total service requests.
Objective: Provides a snapshot of current operational efficiency, specifically related to agent resources.
Definition: TBD
Status: Recommended as a Core KPI. Definition of labour cost to be determined.
KPI: Agent Capacity
Description: The anticipated number of hours of agent time available for counter service for each agent.
Objective: Ensures that agent resources are dedicated to required service functions.
Status: Recommended as operational KPI for queued service models.
KPI: Resource Allocation
Description: A management indicator assessing allocated agent positions to service delivery.
Objective: Measures effective use of channel resources.
Definition: Locally defined
Status: Recommended as operational KPI for queued service models.
KPI: Agent Adherence
Description: An assessment of service agent adherence to schedule and availability during anticipated service periods.
Objective: Contributes to resourcing effectiveness.
Definition: Calculated as total agent login time divided by scheduled work time.
Status: Recommended as operational KPI for queued service models.
KPI: Agent Occupancy
Description: The percentage of agent time spent in direct service, including talk and wrap-up time.
Objective: Ensures accurate resourcing levels to achieve target service levels.
Definition: (Talk time + after-visit wrap-up time) divided by total agent login time over the measured period.
Suggested benchmark / Range: TBD
Status: Recommended as operational KPI for queued service models.
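The adherence and occupancy definitions above are straightforward ratios. A minimal Python sketch with illustrative minute counts (not taken from the report):

```python
def agent_adherence(login_minutes, scheduled_minutes):
    """Agent Adherence: total agent login time divided by scheduled work time."""
    return login_minutes / scheduled_minutes

def agent_occupancy(talk_minutes, wrapup_minutes, login_minutes):
    """Agent Occupancy: (talk time + wrap-up time) / total agent login time."""
    return (talk_minutes + wrapup_minutes) / login_minutes

# An agent scheduled for 7.5 hours who logged in for 7 hours,
# with 5 hours of direct service and 1 hour of wrap-up:
print(round(agent_adherence(420, 450), 3))      # 0.933
print(round(agent_occupancy(300, 60, 420), 3))  # 0.857
```

Both values are fractions of 1.0; reporting them as percentages is a local presentation choice.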
Metrics for Service Effectiveness
Working group is asked to contribute suggestions for KPIs in this theme.
KPI: Turn Around Time
Description: The average time to transaction completion (i.e. receipt by client), expressed as a percentage of target time.
Objective: Measures response time to the client; a primary indicator of customer satisfaction.
Definition:
Status: Under review.
Metrics for Use of Technology
KPI: Self-Service Ratio
Description: Visitors to the service office who access self-service computers.
Objective: Measures the utility of computer facilities within the service office.
Definition: Count of computer accesses divided by total visitors during the measurement period.
Status: Proposed as Core KPI.
Metrics for Channel Take-up
KPI: Visitors
Description: Total visitors entering the office.
Objective: Measures overall service demand.
Definition: See ACCESS measure.
Status: Proposed as Core KPI.
MAF CATEGORY: PEOPLE
Total Months Staff on Strength, Average Months on Strength per Agent: A measure of the total experience level of the agents/staff within the service centre. Monitoring this over time provides a measure of the impact of staff turnover.

Staff Turnover Ratio: A measure of the churn rate within the agent team. Provides a secondary indicator of service centre health; it often correlates to overall customer satisfaction levels.

Agent Coaching Ratio: Number of hours of one-on-one coaching time per agent. Helps measure the utilization of service centre supervisor time as well as the investment in agent skill improvement.

Training Days/Agent: Total training days delivered during the measurement period divided by the number of agents. Training is required for program/service delivery, for technology, and for the development of skills related to professionalism and customer interaction.
Further discussion with departments and agencies will be conducted to identifyeffective KPIs under this MAF category.
10.0 KEY PERFORMANCE INDICATORS - Internet Channel
The Canadian Gateways team has published a definitive report on Internet measurement identifying the suitability and meaning of specific web measures (for example, hits versus visits). Readers are asked to review this document (see Appendix B).
MAF CATEGORY: CITIZEN FOCUSED SERVICE
Metrics for Access
In the Internet Channel, the access theme includes measures concerning the availability of the site to potential site visitors. There are two primary components to site availability:
a) How easily can site visitors locate the site through search engines, links from other sites, or via publication of the URL through other channels such as the phone and mail?
b) Is the site available to site visitors once it has been located?
Other qualitative characteristics contributing to access include compliance with W3C Accessibility Standards to ensure the site is fully inclusive and available to persons with disabilities.
KPI: Search Engine Ranking
Description: Relevance ranking weighted by the distribution of site visitors who entered the site through commercial search engines. The metric assumes that a high search engine rank provides maximum accessibility to those visitors who access the site via search.
Objective: Measures overall site access through search engines.
Definition: Sum of (relevance ranking multiplied by search engine referring count) divided by total search engine referrals.
Derivation: Relevance rank from the top five referring search engines, using a visitor-representative sample of search terms.
Suggested benchmark / Range:
Status: Proposed as Core KPI
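The weighted-rank formula above is a referral-weighted average of relevance ranks. A minimal Python sketch; engine names, ranks and referral counts are illustrative only:

```python
def search_engine_ranking(engines):
    """Weighted relevance rank: sum(rank * referrals) / total referrals.

    `engines` maps an engine name to (relevance_rank, referral_count);
    the mapping shape is an assumption for illustration.
    """
    total_referrals = sum(count for _, count in engines.values())
    weighted = sum(rank * count for rank, count in engines.values())
    return weighted / total_referrals

engines = {
    "engine_a": (1, 700),  # ranked first for a representative search term
    "engine_b": (3, 200),
    "engine_c": (8, 100),
}
print(search_engine_ranking(engines))  # (700 + 600 + 800) / 1000 = 2.1
```

Lower values are better here, since a relevance rank of 1 means the site appears first in the results list.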
KPI: Direct Access Ratio
Description: Percentage of visits that access the site directly via a known URL, relative to total visits. This metric assumes that visits accessing the site directly are either typing or pasting a URL from another source (e.g. a brochure) or have bookmarked the site as a result of repeated visits.
Objective: Assessment of site memorability through a known URL or bookmarking.
Definition: Visits arriving at any page in the site without a referring URL associated with the visit.
Derivation: Web traffic statistics counting visits arriving at the site without a referring URL.
Status: Proposed as a Core KPI
KPI: Server Availability Percentage
Description: Total available server hours over total planned server hours during the reporting period.
Objective: Indicative of overall Internet service capacity.
Definition: Sum of total available server hours, less scheduled maintenance hours, divided by total planned server hours.
Derivation: Server/operating system logs.
Status: Proposed as Core KPI
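The availability definition above can be sketched in a few lines of Python. The sample numbers are illustrative; whether planned hours already exclude maintenance windows is a local convention to confirm:

```python
def server_availability(available_hours, maintenance_hours, planned_hours):
    """Availability = (available hours - scheduled maintenance) / planned hours.

    Follows the definition in the KPI above; treat the handling of
    maintenance windows as an assumption to verify locally.
    """
    return (available_hours - maintenance_hours) / planned_hours

# A 30-day month: 720 planned hours, 4 hours of scheduled maintenance,
# and 714 hours during which the servers were actually reachable.
print(round(server_availability(714, 4, 720), 4))  # 0.9861
```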
KPI: Referral Percentage
Description: Percentage of total visits arriving at the site from planned referral sites. This KPI can be further broken down by specific sites: e.g. GoC Gateways, other GoC sites, other jurisdictions, etc.
Objective: Measures another access route to the site; can be used to adjust access strategies.
Definition: Total visits arriving from specified websites divided by total visits.
Status: Proposed as Core KPI
KPI: Conversion Rate
Description: Rate at which visitors initiate transactions and reach the submit page.
Objective: Key measure of overall service level and visitor satisfaction.
Definition: Total visits reaching submit pages divided by total visits viewing transaction start pages.
Derivation: Web monitoring package.
Suggested benchmark / Range:
Status: Proposed as Core KPI
KPI: Abandonment Rate
Description: Rate at which visitors initiate transactions but do not reach the submit page, plus visitors exiting the site from non-content pages.
Objective: Key measure of overall service level.
Definition: Visits with unsatisfactory exit pages divided by total visits.
Derivation: Web traffic statistics.
Suggested benchmark / Range:
Status: Proposed as Operational Measure
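The two rate definitions above are complementary ratios over visit counts. A minimal sketch with illustrative counts:

```python
def conversion_rate(submit_visits, start_visits):
    """Conversion Rate: visits reaching the submit page divided by
    visits that viewed a transaction start page."""
    return submit_visits / start_visits

def abandonment_rate(abandoned_visits, total_visits):
    """Abandonment Rate: visits with an unsatisfactory exit page
    divided by total visits."""
    return abandoned_visits / total_visits

print(conversion_rate(150, 200))   # 0.75
print(abandonment_rate(50, 1000))  # 0.05
```

Note the denominators differ: conversion is measured against transaction starters only, while abandonment is measured against all visits, so the two figures are not simple complements of each other.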
Metrics for Delay
KPI: Average Visit Duration
Description: The average duration of a visit. This metric can provide some indication of visitor need. However, as more transactions are put online, statistics for visit duration may need to be separated by type of visit (e.g. transactional, browse, search).
Objective: Assessment of site stickiness (the overall relevance of site content and transactions to visitors' requirements).
Definition: Total elapsed seconds from site entry at any page to site exit, for all visits, divided by the number of visits.
Derivation: Measured by web traffic software.
Status: Recommended as an operational measure.
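The duration definition above can be sketched directly from per-visit entry and exit timestamps. The timestamps here are illustrative:

```python
from datetime import datetime

def average_visit_duration(visits):
    """Average Visit Duration: total elapsed seconds from site entry to
    site exit, summed over all visits, divided by the number of visits.

    `visits` is a list of (entry, exit) datetime pairs; in practice this
    data would come from web traffic software.
    """
    total_seconds = sum((exit_ - entry).total_seconds() for entry, exit_ in visits)
    return total_seconds / len(visits)

visits = [
    (datetime(2004, 9, 1, 10, 0, 0), datetime(2004, 9, 1, 10, 4, 0)),   # 240 s
    (datetime(2004, 9, 1, 11, 0, 0), datetime(2004, 9, 1, 11, 1, 30)),  # 90 s
]
print(average_visit_duration(visits))  # (240 + 90) / 2 = 165.0
```

Segmenting by visit type, as the Description suggests, would simply mean running this over separate lists of transactional, browse and search visits.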
Metrics for Quality
KPI: Site Error Messages
Description: Capture of all computer-identified error conditions, such as "page not found" messages, invalid links, transaction aborts, etc.
Objective: Improves overall site quality and response.
Definition: Total error page views divided by total visits.
Derivation: Web activity tracking.
Status: Recommended as Core KPI
KPI: Internet Channel Feedback
Description: Total criticisms, complaints and compliments, categorized into effective topics, received through all sources. The working group recognizes that this is difficult to track today, but also that it is of high value.
Objective: Contributes to program integrity.
Definition: Count of complaints by topic over the reporting period.
Derivation: E-mail, phone incident tracking system, ministerial correspondence system.
Status: Recommended as an operational measure, but not currently used within most departments.
KPI: Professionalism
Description: Encompasses a range of soft skills that govern the approach to delivering accurate information and reliable services.
Objective: Identifies and reinforces effective web design and web authoring skills.
Definition: Best measured through the use of focus groups and independent testing organizations. Some input may be available from Media Metrics. Quality of e-mail responses, where implemented, can be verified by an E-mail Response Management System (number of QA corrections, etc.).
Status: Proposed as Core KPI but must be further developed by the working group.
Metrics for Client Satisfaction
KPI: Client Satisfaction Level
Description: Application of the Common Measurement Tool (CMT) to assess and benchmark client satisfaction.
Objective: Primary indicator of client perception of service quality; measured repeatedly, trends provide strong evidence of service improvement levels.
Derivation: The multi-channel survey tool will be used (at a minimum) to determine the core measures relevant to the Internet channel. As well, periodic exit surveys should be conducted upon site exit. Timeliness of e-mail response can also be measured.
Status: Proposed as Core KPI
MAF CATEGORY: STEWARDSHIP
KPI: Cost per Visit, Cost per Visitor
Description: The total operational cost of the site over the reporting period divided by total visits/visitors handled during the reporting period.
Objective: Provides a high-level indication and trend of overall service performance.
Definition: Will require further working group consultation.
Status: Recommended as Core KPI
Metrics for Agent Utilization
The following four measures can be tracked for agent-assisted calls concerning the Internet channel and for all messages/e-mails submitted through the Internet site. All are recommended as Operational Measures.
KPI: Agent Capacity
Description: The anticipated number of hours of agent time available for service for each full-time equivalent (FTE).
Objective: Ensures that agent resources are dedicated to required functions.
KPI: Resource Allocation
Description: A management indicator assessing allocated FTEs to service delivery.
Objective: Measures effective use of channel resources.
Definition: Locally defined
KPI: Agent Adherence
Description: An assessment of service agent adherence to schedule and availability during anticipated service periods.
Objective: Contributes to resourcing effectiveness.
Definition: Calculated as total agent login time divided by scheduled work time.
KPI: Agent Occupancy
Description: The percentage of agent time spent in direct service, including talk and wrap-up time.
Objective: Ensures accurate resourcing levels to achieve target service levels.
Definition: (Talk time + after-call wrap-up time) divided by total agent login time over the measured period.
Suggested benchmark / Range: 85%
Metrics for Service Effectiveness
KPI: First Visit Resolution
Description: Unique visitors over an x-day period who exited the site from success content pages.
Objective: Minimize cost and maximize client satisfaction.
Definition: Number of single unique visits within an x-day period that exited the site from specific success (i.e. answer found) pages.
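The definition above counts visitors who needed only one visit and left from a designated "answer found" page. A minimal sketch, assuming a simple log of (visitor, exit page) records; the page names and log shape are illustrative, not a prescribed schema:

```python
from collections import Counter

# Illustrative set of pages designated as "success" (answer found) pages.
SUCCESS_PAGES = {"/answer-found", "/confirmation"}

def first_visit_resolution(visits):
    """First Visit Resolution: unique visitors with exactly one visit in
    the x-day period whose exit page is a designated success page.

    `visits` is a list of (visitor_id, exit_page) records for the period.
    """
    visit_counts = Counter(visitor for visitor, _ in visits)
    return sum(
        1
        for visitor, exit_page in visits
        if visit_counts[visitor] == 1 and exit_page in SUCCESS_PAGES
    )

visits = [
    ("v1", "/answer-found"),  # resolved on first visit
    ("v2", "/search"),        # single visit, but not a success page
    ("v3", "/answer-found"),
    ("v3", "/confirmation"),  # repeat visitor, excluded
]
print(first_visit_resolution(visits))  # 1
```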
Metrics for Use of Technology
As the Internet Channel is used to provide self-service through technology, this theme is not applicable within the channel.
Metrics for Channel Take-up
Web channel take-up data is used in comparison with other channels to determine the impact of web site changes.
KPI: Visits
Description: Total site visits accepted.
Objective: Measures overall service demand.
Definition: Number of visit sessions initiated by web servers.
Status: Proposed as Core KPI
KPI: Visitors
Description: Unique visitors.
Objective: Measures service demand accurately.
Definition: Unique visitors counted either through registration/login processes or via cookies.
Status: Proposed as Core KPI
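The Visits and Visitors KPIs above differ only in deduplication: visits count sessions, visitors count distinct identifiers (cookie or login). A minimal sketch with illustrative cookie IDs:

```python
def visits_and_visitors(sessions):
    """Visits = total sessions; Visitors = unique cookie/login identifiers.

    `sessions` is a list with one visitor identifier per visit session,
    as a web traffic package might export it.
    """
    return len(sessions), len(set(sessions))

sessions = ["cookie-a", "cookie-b", "cookie-a", "cookie-c", "cookie-a"]
visits, visitors = visits_and_visitors(sessions)
print(visits, visitors)  # 5 3
```

The gap between the two numbers is itself informative: a high visits-to-visitors ratio indicates repeat traffic.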
MAF CATEGORY: PEOPLE
At publishing time, KPIs for the MAF PEOPLE category had not yet been proposed to the working group for review. Some examples of KPIs that might be suitable for this MAF category include:
Total Months Staff on Strength, Average Months on Strength per Agent: A measure of the total experience level of the agents within the call centre. Monitoring this over time provides a measure of the impact of staff turnover.

Staff Turnover Ratio: A measure of the churn rate within the agent team. Provides a secondary indicator of call centre health; it often correlates to overall customer satisfaction levels.
Agent Coaching Ratio: Number of hours of one-on-one coaching time per agent. Helps measure the utilization of call centre supervisor time as well as the investment in agent skill improvement.

Training Days/Agent: Total training days delivered during the measurement period divided by the number of agents. Training is required for program/service delivery, for technology, and for the development of skills related to professionalism and customer interaction.
Further discussion with departments and agencies will be conducted to identify effective KPIs under this MAF category.
11.0 KEY PERFORMANCE INDICATORS - Mail Channel
MAF CATEGORY: CITIZEN FOCUSED SERVICE
Metrics for Access
KPI: Applications/Pieces Opened
Description: Count of new envelopes opened during the reporting period.
Objective: Basic volume measure.
Definition: Total envelopes opened, less inappropriate mail (junk mail, wrongly addressed, etc.)
Suggested benchmark / Range:
Status: Proposed as Core KPI.
KPI: Applications Completed
Description: Outbound mail for completed files.
Definition:
Status: Proposed as Core KPI.
KPI: Applications/Mail in Process
Description: All files remaining open at the end of the reporting period. Represents the work in progress within the processing centre.
Definition: Previous open files + applications received, less applications completed.
Status: Proposed as Core KPI.
Metrics for Delay
KPI: Average Cycle Time (ACT)
Description: The average elapsed time that the application/mail was held within the processing centre prior to completion.
Objective: Primary indicator of client satisfaction.
Definition: The total number of minutes from opening of the envelope to mailing of the response.
Derivation: Measured by mail tracking system.
Status: Recommended as Core KPI.
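The Average Cycle Time definition above can be sketched from per-piece open and mail-out timestamps. The timestamps are illustrative; real values would come from the mail tracking system:

```python
from datetime import datetime

def average_cycle_time(pieces):
    """Average Cycle Time: mean elapsed minutes from envelope opening
    to mailing of the response.

    `pieces` is a list of (opened, mailed) datetime pairs.
    """
    total_minutes = sum(
        (mailed - opened).total_seconds() / 60 for opened, mailed in pieces
    )
    return total_minutes / len(pieces)

pieces = [
    (datetime(2004, 9, 1, 9, 0), datetime(2004, 9, 3, 9, 0)),  # 2 days
    (datetime(2004, 9, 1, 9, 0), datetime(2004, 9, 2, 9, 0)),  # 1 day
]
print(average_cycle_time(pieces))  # 2160.0 minutes (i.e. 1.5 days)
```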
KPI: Pass Through Ratio
Description: Ratio of total handling time to total cycle time.
Objective: Primary indicator of workflow efficiency.
Definition: Total minutes of processing time (time in agent) divided by total elapsed time. The ratio should approach 1.0 to indicate zero delay between processes.
Derivation: Measured by mail tracking system.
Status: Recommended as Core KPI.
KPI: Service Level
Description: Percentage of mail completed within target processing time.
Objective: This measure is required in order to set and publish mail service standards.
Definition: Applications completed within the service threshold divided by total applications completed.
Derivation: Measured by mail tracking system.
Status: Recommended as operational KPI.
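The Pass Through Ratio and Service Level definitions above are both simple ratios over cycle-time data. A minimal sketch with illustrative minute values:

```python
def pass_through_ratio(handling_minutes, cycle_minutes):
    """Pass Through Ratio: total handling time / total cycle time.
    Approaches 1.0 as delay between processes approaches zero."""
    return handling_minutes / cycle_minutes

def service_level(cycle_times, threshold):
    """Service Level: share of items completed within the target time.
    `cycle_times` holds per-item cycle times in minutes."""
    return sum(1 for t in cycle_times if t <= threshold) / len(cycle_times)

# 45 minutes of actual handling inside a 2-day (2880-minute) cycle:
print(pass_through_ratio(45, 2880))  # 0.015625

# Two of three items completed within a 2-day threshold:
print(round(service_level([2880, 1440, 4500], threshold=2880), 3))  # 0.667
```

A low pass-through ratio, as in the example, points at queueing delay between processing steps rather than at slow handling itself.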
Metrics for Quality
KPI: Response Accuracy
Description: Reliability of mail response/completion.
Objective: To ensure program integrity.
Definition: Local quality scorecard assessed by quality assurance review of outbound mail, plus "write backs" (one or more subsequent mail receipts for the same application).
Derivation: QA report.
Status: Under review.
KPI: Professionalism
Description: Encompasses a range of soft skills that govern the approach to delivering accurate information and reliable services.
Objective: Identifies and reinforces effective communication.
Definition: Best measured through the use of an enclosed feedback postcard or through alternate-channel surveys (e.g. post-response phone call).
Status: Under review. May be impractical in several service models.
KPI: Critical Error Rate
Description: Some operations monitor application/transaction errors (typically omission of required information) requiring additional interactions with clients.
Objective: Assessment of application instructions to clients.
Status: Proposed as an operational measure. Applicability to be reviewed.
Metrics for Client Satisfaction
KPI: Client Satisfaction Level
Description: Application of the Common Measurement Tool (CMT) to assess and benchmark client satisfaction.
Objective: Primary indicator of client perception of service quality; measured repeatedly, trends provide strong evidence of service improvement levels.
Derivation: The multi-channel survey tool will be used (at a minimum) to establish the CSat level for mail services.
Status: Recommended as Core KPI.
KPI: Service Complaints
Description: Count and categorization of complaints received through all channels concerning the Mail channel.
Objective: Primary indicator of service quality, particularly when measured over time.
Definition: Total service complaints received during the reporting period divided by total mail received.
Derivation: Counted by incident tracking system. Complaints received through other channels must be added to the total.
Status: Recommended as Core KPI. Caveat: Members noted that current systems do not support the collection and categorization of service complaints received through a wide variety of channels (e.g. Minister's correspondence, general e-mails, phone calls).
MAF CATEGORY: STEWARDSHIP
Metrics for Agent Utilization
KPI: Cost per Contact
Description: Total labour costs divided by total service requests.
Objective: Provides a snapshot of current operational efficiency specifically related to agent/manpower.
Definition: TBD
Status: Recommended as Core KPI. Definition of labour cost to be determined.
KPI: Agent Capacity
Description: The anticipated number of hours of agent time available for mail service for each agent.
Objective: Ensures that agent resources are dedicated to required service
functions.
Status: Recommended as operational KPI for mail processing service models.
KPI: Resource Allocation
Description: A management indicator assessing allocated agent positions to service delivery.
Objective: Measures effective use of channel resources.
Definition: Locally defined
Status: Recommended as operational KPI for mail processing service models.
KPI: Agent Adherence
Description: An assessment of service agent adherence to schedule and availability during anticipated service periods.
Objective: Contributes to resourcing effectiveness.
Definition: Calculated as total agent login time divided by scheduled work time.
Status: Recommended as operational KPI for mail processing service models.
KPI: Agent Occupancy
Description: The percentage of agent time spent in direct mail service, including wrap-up time.
Objective: Ensures accurate resourcing levels to achieve target service levels.
Definition: (Response time + wrap-up time) divided by total agent login time over the measured period.
Suggested benchmark / Range: TBD
Status: Recommended as operational KPI for mail service models.
Metrics for Service Effectiveness
KPI:
Description:
Objective:
Definition:
Status:
Metrics for Use of Technology
KPI: Automated Response Ratio
Description: Ratio of applications received and completed without agent handling to total applications received.
Objective:
Definition:
Status: Proposed as an Operational KPI.
Metrics for Channel Take-up
KPI: Applications Received
Description: Total applications/mail entering the processing centre.
Objective: Measures overall service demand.
Definition: See ACCESS measure.
Status: Proposed as Core KPI.
MAF CATEGORY: PEOPLE
Total Months Staff on Strength, Average Months on Strength per Agent: A measure of the total experience level of the agents/staff within the service centre. Monitoring this over time provides a measure of the impact of staff turnover.

Staff Turnover Ratio: A measure of the churn rate within the agent team. Provides a secondary indicator of service centre health; it often correlates to overall customer satisfaction levels.

Agent Coaching Ratio: Number of hours of one-on-one coaching time per agent. Helps measure the utilization of service centre supervisor time as well as the investment in agent skill improvement.

Training Days/Agent: Total training days delivered during the measurement period divided by the number of agents. Training is required for program/service delivery, for technology, and for the development of skills related to professionalism and customer interaction.
Further discussion with departments and agencies will be conducted to identifyeffective KPIs under this MAF category.
12.0 USING SERVICE DELIVERY KPIs IN DEPARTMENTAL REPORTING

The MAF provides the primary framework for departments to prepare required annual performance reports that provide formal feedback to Deputy Ministers. The Treasury Board of Canada Secretariat is developing a Web-based approach to support and streamline departmental performance reporting.

Service delivery key performance indicators will become an important component of these departmental reports and will significantly contribute to a common understanding of overall service channel performance across government.
13.0 CONTACT INFORMATION
Further information, suggestions and contributions can be forwarded to:
Service Delivery Improvement
Treasury Board of Canada, Secretariat
2745 Iris Street
Ottawa, Ontario
K1A 0R5
Victor Abele ([email protected])
Director, Service Delivery Improvement
Telephone: (613) 946-6264
Fax: (613) 952-7232
Shalini Sahni ([email protected])
Analyst
Telephone: (613) 948-1119
Fax: (613) 952-7232
APPENDIX A: Terms and Definitions
ACD (Automatic Call Distributor): a software/hardware device that manages call queues, delivers IVR recordings as selected by the caller, and routes calls from the queue to appropriate agents based on any number of caller parameters.

CTI (Computer Telephony Integration): technology that provides an integrated phone/computer capability to the service agent. CTI provides features such as automatic caller file retrieval, soft phone, referral/call-back electronic forms with response script suggestion, caller wait time, and quick access to the mainframe and online reference material.

Channel: the primary service channels are telephone, Internet, mail and in-person.

IVR/VR (Interactive Voice Response/Voice Recognition): two related terms describing two types of self-service technology employed in the Telephone Service Channel. Interactive Voice Response provides the caller with a series of options to be selected using the telephone keypad. Voice Recognition allows the caller to speak the question or say an option from a recorded list.

KPI (Key Performance Indicator): a measurable objective which provides a clear indication of service centre capability, quality, customer satisfaction, etc.
APPENDIX B: References
Citizen First 3 report, 2003. Erin Research Inc, Institute for Citizen-CentredService, Institute of Public Administration of Canada.
Common Web Traffic Metrics Standards, March 21, 2003. Version 1.1. Treasury Board Secretariat, Canada.
Key Performance Indicators Workshop, 2003. Service Delivery Improvement,Treasury Board Secretariat, Canada.
Performance Management Metrics for DWP Contact Centres, March 14, 2003. Version 2.0. Ivackovic and Costa. Department for Work and Pensions, United Kingdom.
Performance Measures for Federal Agency Websites: Final Report, October 1, 2000. McClure, Eppes, Sprehe and Eschenfelder. Joint report for Defense Technical Information Center, Energy Information Administration and Government Printing Office, U.S.A.
Service Improvement Initiative How-to Guide, 2000. Treasury Board Secretariat, Canada.
Service Management Framework Report, 2004. Fiona Seward. Treasury Board Secretariat, Canada, and Burntsands Consulting.
Summary Report on Service Standards, 2001. Consulting and Audit Canada(Project 550-0743)
APPENDIX C: Summary of Core Key Performance Indicators
These core indicators were vetted by the working group and are recommendedfor inclusion into the MAF.
Phone: Call Access; Caller Access; Abandoned Calls; Average Speed to Answer; Answer Accuracy; Client Satisfaction Level; Cost per Call; First Call Resolution; Call Avoidance; Calls Answered by IVR Successfully; Calls; Callers

In-person: Visitor Access; Client Satisfaction Level; Service Complaints; Cost per Contact; Visitors

Internet: Search Engine Ranking; Direct Access Ratio; Server Availability Percentage; Referral Percentage; Conversion Rate; Site Error Messages; Professionalism; Client Satisfaction Level; Cost per Visit, Cost per Visitor; Visits; Visitors

Mail: Applications/Pieces Opened; Applications Completed; Applications/Mail in Process; Average Cycle Time; Pass Through Ratio; Client Satisfaction Level; Service Complaints; Cost per Contact; Applications Received