{
    "version" : "https://jsonfeed.org/version/1",
    "content" : "news",
    "type" : "single",
    "title" : "Unsolicited data: A valuable resource for digital customer experience enhancement |Digital.gov",
    "description": "Unsolicited data: A valuable resource for digital customer experience enhancement",
    "home_page_url" : "/preview/gsa/digitalgov.gov/bc-archive-content-3/","feed_url" : "/preview/gsa/digitalgov.gov/bc-archive-content-3/2024/06/25/unsolicited-data-a-valuable-resource-for-digital-customer-experience-enhancement/index.json","item" : [
    {"title" :"Unsolicited data: A valuable resource for digital customer experience enhancement","summary" : "Optimizing federal service touchpoints involves analyzing both actively-sought and spontaneous feedback, introducing new metrics and data points.","date" : "2024-06-25T00:00:00Z","date_modified" : "2025-01-27T19:42:55-05:00","authors" : {"isabel-izzy-metzger" : "Isabel (Izzy) Metzger"},"topics" : {
        
            "analytics" : "Analytics",
            "user-experience" : "User experience"
            },"primary_image" : { "uid" : "feedback-communication-gentle-studio-istock-getty-images-1428649542", "alt" :
  "Four columns each have three feedback and communication icons.", "width" :
  "1200", "height" :
  "630", "credit" :
  "", "caption" :
  "Gentle-Studio/iStock via Getty Images", "format" :
  "png" },"branch" : "bc-archive-content-3",
      "filename" :"2024-06-25-unsolicited-data-a-valuable-resource-for-digital-customer-experience-enhancement.md",
      
      "filepath" :"news/2024/06/2024-06-25-unsolicited-data-a-valuable-resource-for-digital-customer-experience-enhancement.md",
      "filepathURL" :"https://github.com/GSA/digitalgov.gov/blob/bc-archive-content-3/content/news/2024/06/2024-06-25-unsolicited-data-a-valuable-resource-for-digital-customer-experience-enhancement.md",
      "editpathURL" :"https://github.com/GSA/digitalgov.gov/edit/bc-archive-content-3/content/news/2024/06/2024-06-25-unsolicited-data-a-valuable-resource-for-digital-customer-experience-enhancement.md","slug" : "unsolicited-data-a-valuable-resource-for-digital-customer-experience-enhancement","url" : "/preview/gsa/digitalgov.gov/bc-archive-content-3/2024/06/25/unsolicited-data-a-valuable-resource-for-digital-customer-experience-enhancement/","weight" : "1","content" :"\u003cp\u003eWhen it comes to digital federal services, the stakes are high. For millions of the American public, these platforms are not just convenient online options; they are lifelines for essential needs. As highlighted in \u003ca href=\"https://designsystem.digital.gov/next/introduction/\"\u003eTransforming the American digital experience: A report about what’s next for the U.S. Web Design System\u003c/a\u003e:\u003c/p\u003e\n\n\n\n\n\n\n\u003cdiv class=\"quote-block \"\u003e\n    \u003cblockquote\u003e\n      \u003cspan class=\"quote-block__quotation-mark\"\u003e“\u003c/span\u003e\n      For millions, access to digital services isn’t a luxury — it’s critical. And their experiences using government websites to find unemployment support, file taxes, apply for student loans, or get assistance with housing, childcare, or food can dramatically affect how they feel about the government.\n      \u003cspan class=\"quote-block__quotation-mark\"\u003e”\u003c/span\u003e\u003c/blockquote\u003e\n  \u003c/div\u003e\n\u003cp\u003eUnderstanding and optimizing digital touchpoints requires listening to users in the various solicited and unsolicited ways they communicate with the government. 
The \u003ca href=\"https://digital.gov/resources/delivering-digital-first-public-experience/\"\u003e21st Century Integrated Digital Experience Act\u003c/a\u003e (21st Century IDEA) emphasizes that federal agencies must design their digital services, including websites and applications, to meet user goals, needs, and behaviors based on analysis of various types of data.\u003c/p\u003e\n\u003cp\u003eAgencies collect data on their services through a variety of avenues, but teams often face barriers to leveraging that data. These barriers may include unfamiliarity with the format of the data, methods for analyzing the data, and types of insights the data can uncover.\u003c/p\u003e\n\u003cp\u003eThis blog provides a checklist for getting started on analyzing unsolicited customer feedback.\u003c/p\u003e\n\u003cp\u003eYou can learn more about survey design and analyzing solicited customer feedback by checking out these blog posts: \u003ca href=\"https://digital.gov/2023/10/30/decoding-public-sentiment-harnessing-open-data-to-gain-insights-into-service-delivery/\"\u003eDecoding public sentiment\u003c/a\u003e and \u003ca href=\"https://digital.gov/2023/12/19/amplifying-customer-voices/\"\u003eAmplifying customer voices\u003c/a\u003e.\u003c/p\u003e\n\u003ch2 id=\"unsolicited-unstructured-customer-feedback-a-treasure-trove-of-data\"\u003eUnsolicited, unstructured customer feedback: A treasure trove of data\u003c/h2\u003e\n\u003cp\u003eStructured data, such as numerical survey response ratings and website traffic, only represent a fraction of the available data on public interactions with digital services. Unstructured data – such as open-ended survey questions, emails, chatbot conversations, and search queries – can also be mined for insights. 
Both structured and unstructured data can be further categorized as “solicited” or “unsolicited,” depending on whether it was initiated by the provider or the user.\u003c/p\u003e\n\u003cp\u003eThe table below shows types of customer feedback along with examples:\u003c/p\u003e\n\u003ctable class=\"usa-table usa-table--striped\"\u003e\n  \u003ccaption\u003e\u003c/caption\u003e\n  \u003cthead\u003e\n    \u003ctr\u003e\n      \u003cth scope=\"col\"\u003eTypes of feedback\u003c/th\u003e\n      \u003cth scope=\"col\"\u003eExample sources\u003c/th\u003e\n    \u003c/tr\u003e\n  \u003c/thead\u003e\n  \u003ctbody\u003e\n    \u003ctr\u003e\n      \u003cth scope=\"row\"\u003e\u003cstrong\u003eSolicited and structured\u003c/strong\u003e: Requested by a provider and limited to predefined response options\u003c/th\u003e\n      \u003ctd\u003eNumerical ratings in survey responses and comment cards\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n      \u003cth scope=\"row\"\u003e\u003cstrong\u003eSolicited and unstructured\u003c/strong\u003e: Requested by a provider but free-form\u003c/th\u003e\n      \u003ctd\u003eOpen-ended survey responses, customer advisory board interviews\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n      \u003cth scope=\"row\"\u003e\u003cstrong\u003eUnsolicited and structured\u003c/strong\u003e: Initiated by customer but limited to predefined response options\u003c/th\u003e\n      \u003ctd\u003eProduct and service ratings on third-party review sites\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n      \u003cth scope=\"row\"\u003e\u003cstrong\u003eUnsolicited and unstructured\u003c/strong\u003e: Initiated by customer and free-form\u003c/th\u003e\n      \u003ctd\u003eSocial media posts, contact center calls, emails, and chats\u003c/td\u003e\n    \u003c/tr\u003e\n  \u003c/tbody\u003e\n\u003c/table\u003e\n\u003ch2 id=\"a-key-distinction-between-solicited-and-unsolicited-feedback\"\u003eA key distinction between solicited and 
unsolicited feedback\u003c/h2\u003e\n\u003cp\u003eSolicited feedback reflects responses to specific inquiries posed by the service provider, and may be skewed toward more-engaged users who are willing to participate in surveys. While extremely valuable, solicited feedback data will present a view constrained by the questions asked. Unsolicited feedback, however, arises from the user\u0026rsquo;s initiative. It offers another view into the customer experience and captures a broad spectrum of user experiences, including needs that might not surface in a structured survey.\u003c/p\u003e\n\u003ch2 id=\"examples-of-unsolicited-data-and-questions-to-ask\"\u003eExamples of unsolicited data, and questions to ask\u003c/h2\u003e\n\u003cp\u003eDigital service providers can leverage unsolicited customer experience (CX) data to understand different user segments and their needs, preferences, and behaviors (e.g., users’ search habits). This can lead to specific changes in how your digital service is provided—for example, by proactively generating or modifying content, or by creating a more targeted experience for specific customer types.\u003c/p\u003e\n\u003cp\u003eEmails are a great place to start for customer feedback analysis, especially since email is a data stream already being collected. Email data is particularly valuable for digital service providers not yet collecting user survey data. Even if you are collecting survey data, emails are important to analyze because there may be differences in usage across feedback channels (e.g., some users may be more likely to email their feedback rather than participate in surveys). Additionally, unsolicited feedback in emails can uncover areas not addressed in surveys. 
For example, emails may capture new user groups outside of those predefined in questionnaires, or provide deeper insight into how well a provider is delivering customer service.\u003c/p\u003e\n\u003cp\u003eYou can analyze the content of the customer-initiated emails to address questions such as:\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eWhat types of information and services do users need?\u003c/li\u003e\n\u003cli\u003eIs there recurring negative and positive feedback in the emails?\u003c/li\u003e\n\u003cli\u003eHow have the topics of emails changed over time?\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003eYou can also segment the emails into customer groups and delve into group-specific trends and patterns. Such segmentation can reveal new, non-obvious insights on service delivery, such as discovering that customer interactions via email might be a more powerful CX-improvement lever for the digital service provider than website content.\u003c/p\u003e\n\u003ch3 id=\"website-search-data\"\u003eWebsite search data\u003c/h3\u003e\n\u003cp\u003eAnalyze the search terms visitors use on the site to see their frequency and patterns over time.\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eWhat are the top search terms used in the digital platform’s search tools, and what does this indicate about your users’ interests?\u003c/li\u003e\n\u003cli\u003eAre there new and emerging search terms that can be used for content strategy?\u003c/li\u003e\n\u003cli\u003eUtilize website URL structure to correlate search terms with user groups; this can help inform how the website tailors resources to its diverse user groups.\u003c/li\u003e\n\u003cli\u003eAnalyze user behavior before and after major site updates to measure the impact of these changes, e.g., did a new feature lead to an increase in searches for specific content or services?\u003c/li\u003e\n\u003c/ul\u003e\n\u003ch3 id=\"emails\"\u003eEmails\u003c/h3\u003e\n\u003cp\u003eAnalyze the content of the emails to 
understand what customers initiating emails are requesting.\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eWhat types of information and services do they need?\u003c/li\u003e\n\u003cli\u003eIs there recurring negative and positive feedback in the emails that echo what we found in the solicited data?\u003c/li\u003e\n\u003cli\u003eHow have the topics of emails changed over time?\u003c/li\u003e\n\u003cli\u003eConduct a deeper sentiment analysis (the process of analyzing digital text to determine if the emotional tone of the message is positive, negative, or neutral) to understand the emotional signals in customer emails.\u003c/li\u003e\n\u003cli\u003eSegment emails into customer groups to identify group-specific trends and patterns.\u003c/li\u003e\n\u003cli\u003eWhat percentage of customer emails received a response? What was the sentiment of the overall email interaction?\u003c/li\u003e\n\u003c/ul\u003e\n\u003ch3 id=\"chatbot-conversations-and-queries\"\u003eChatbot conversations and queries\u003c/h3\u003e\n\u003cp\u003eAnalyze the types of queries and issues raised by users to understand needs and service effectiveness.\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eCan we identify chatbot user archetypes and understand their respective interaction patterns?\u003c/li\u003e\n\u003cli\u003eWhat are key factors of website chatbot user experience? What impact do response relevance and dialogue helpfulness have?\u003c/li\u003e\n\u003cli\u003eLook at the number of interactions between chatbot and user, as well as the emotions detected throughout the chat conversation. Is there a peak negative interaction? 
At what point?\u003c/li\u003e\n\u003cli\u003eWould sentiment pattern mining of chatbot dialogues show common user journeys — for example, from negative to positive indicating maybe a resolution, or from neutral to negative indicating dissatisfaction?\u003c/li\u003e\n\u003cli\u003eLook at the volume of chatbot interactions over time to identify any spikes that may correlate with events such as COVID-19 and major site updates.\u003c/li\u003e\n\u003c/ul\u003e\n\u003ch2 id=\"you-cant-change-what-you-dont-measure-new-performance-metrics-for-digital-services-to-consider\"\u003eYou can’t change what you don’t measure: New performance metrics for digital services to consider\u003c/h2\u003e\n\u003cp\u003eShortcomings with CX metrics based solely on structured data — such as website hits and solicited survey satisfaction ratings like \u003ca href=\"https://digital.gov/2016/08/05/csat-nps-ces-3-easy-ways-to-measure-customer-experience-cx/\"\u003eCSAT and NPS\u003c/a\u003e — are well-documented.\u003c/p\u003e\n\u003cp\u003eAs highlighted in an article about using artificial intelligence to track how customers feel \u003csup\u003e\u003ca aria-describedby=\"footnote-label\" href=\"#fn1\" id=\"footnotes-ref1\"\u003e[1]\u003c/a\u003e\u003c/sup\u003e:\u003c/p\u003e\n\n\n\n\n\n\n\u003cdiv class=\"quote-block \"\u003e\n    \u003cblockquote\u003e\n      \u003cspan class=\"quote-block__quotation-mark\"\u003e“\u003c/span\u003e\n      Companies spend huge amounts of time and money in efforts to get to know their customers better. But despite this hefty investment, most firms are not very good at listening to customers. It’s not for lack of trying, though — the tools they’re using and what they’re trying to measure may just not be up to the task. 
Our research shows that the two most widely used measures, customer satisfaction (CSAT) and Net Promoter Scores (NPS), fail to tell companies what customers really think and feel, and can even mask serious problems.\n      \u003cspan class=\"quote-block__quotation-mark\"\u003e”\u003c/span\u003e\u003c/blockquote\u003e\n  \u003c/div\u003e\n\u003cp\u003eTo address this within the General Services Administration (GSA), the Office of the Chief Financial Officer’s Analytics and Decision Support Division created two open-source tools, the \u003ca href=\"https://github.com/GSA/DigitalCXAnalyzer.git\"\u003eDigitalCXAnalyzer\u003c/a\u003e and \u003ca href=\"https://github.com/GSA/GovCXAnalyzer/\"\u003eGovCXAnalyzer\u003c/a\u003e. These tools support the systematic processing and analysis of vast and diverse structured and unstructured customer feedback data, allowing users to perform trend analysis, sentiment analysis, topic modeling, and more. The team at GSA also developed and published code to support the generation of new performance metrics from structured and unstructured data that could give more granular depictions of customer experience.\u003c/p\u003e\n\u003cp\u003eThe code supports calculating metrics that offer a more comprehensive understanding of how different users engage with the platform’s services; an example metric is shown below. The metrics range in level of difficulty to implement, but offer new ways of understanding digital service performance and customer experience. 
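\u003c/p\u003e
\u003cp\u003eAs a minimal, illustrative sketch of the kind of sentiment analysis named above (this is not the GovCXAnalyzer implementation; the cue-word lists and sample emails are hypothetical), each feedback message can be tagged as positive, negative, or neutral by counting cue words:\u003c/p\u003e

```python
import re

# Hypothetical cue-word lists; a real pipeline would use a vetted
# sentiment lexicon or model instead of these illustrative sets.
POSITIVE = {'thanks', 'great', 'helpful', 'easy', 'resolved'}
NEGATIVE = {'broken', 'confusing', 'error', 'frustrated', 'unable'}

def sentiment(text):
    # Tokenize into lowercase words, then score as positives minus negatives.
    words = re.findall(r'[a-z]+', text.lower())
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return 'positive'
    if score < 0:
        return 'negative'
    return 'neutral'

emails = [
    'The new form was easy and helpful, thanks',
    'I am frustrated, the login page shows an error',
    'Where do I renew my registration',
]
print([sentiment(e) for e in emails])  # ['positive', 'negative', 'neutral']
```

\u003cp\u003eThe value for CX work lies less in any single label than in tracking how the mix of labels shifts over time and across user groups.\u003c/p\u003e
\u003cp\u003e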
While the toolkit was piloted with the Department of Labor’s Employment and Training Administration’s CareerOneStop platform, the code repository can be adapted and applied across various digital services.\u003c/p\u003e\n\n\n\n\u003carticle\n  class=\"dg-note \"\n\u003e\n  \u003ch4 class=\"dg-note__heading\"\u003e\n    \u003csvg\n      class=\"dg-note__icon usa-icon dg-icon dg-icon--large\"\n      aria-hidden=\"true\"\n      focusable=\"false\"\n    \u003e\n      \u003cuse xlink:href=\"/preview/gsa/digitalgov.gov/bc-archive-content-3/uswds/img/sprite.svg#notifications\"\u003e\u003c/use\u003e\n    \u003c/svg\u003e\n    \n      Note\n    \n  \u003c/h4\u003e\n  \u003cp\u003e\u003cstrong\u003eExample of one of the metrics a user can generate from the code repository\u003c/strong\u003e\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eHelp Request Rate Measure\u003c/strong\u003e: You can monitor the rate at which different user groups are making help requests or asking for assistance by leveraging keyword matching on emails (e.g., “please assist,” “having trouble”) to further identify and measure gaps across your user groups. For example, if certain user groups exhibit higher help request rates, this could indicate usability issues that disproportionately affect those users. Differences in rates across user groups could point to issues with content clarity. Content that resonates well with one user group might be confusing or less intuitive for another, prompting more help requests.\u003c/p\u003e\n\n\u003c/article\u003e\n\n\u003ch2 id=\"call-to-action-embrace-unsolicited-customer-feedback\"\u003eCall to action: Embrace unsolicited customer feedback\u003c/h2\u003e\n\u003cp\u003eIn a time when access to digital government services is critical, understanding and enhancing user interactions with those services is essential. Traditional analytics and metrics fall short of capturing the full spectrum of user experiences. 
GSA encourages readers to adopt a more comprehensive approach to digital user experience data analysis by leveraging the unsolicited feedback that users are providing.\u003c/p\u003e\n\u003cp\u003eThere are various areas where your agency can begin to tap into unsolicited feedback:\u003c/p\u003e\n\u003cdiv class=\"box \"\u003e\n  \u003cp\u003eWe recommend starting with the available data and then incorporating more data and advanced analytics.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eInventory existing CX data\u003c/strong\u003e: Identify the data you currently have, its sources, and formats (structured vs. unstructured). This includes both solicited data (such as surveys and feedback forms) and unsolicited data (such as emails).\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eCreate a CX pilot\u003c/strong\u003e: Start small, with simpler implementations and straightforward analysis questions, to test your approach, tools, and team’s analytical capabilities. Starting with a pilot can also help avoid substantial information technology (IT) changes. For example, if big data cloud processing is not an option for you, consider analyzing a subset of CX data on a government-furnished computer or virtual desktop. Leverage the open-source Python \u003ca href=\"https://github.com/GSA/GovCXAnalyzer/tree/main/notebooks/digitalcx\"\u003eCX analysis toolkits\u003c/a\u003e on the available data and get preliminary CX insights on specific customer experience issues or opportunities.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eStart using new CX metrics\u003c/strong\u003e: Leverage the CX analysis toolkit \u003ca href=\"https://github.com/GSA/GovCXAnalyzer/blob/main/notebooks/digitalcx/digital_metrics.py\"\u003eto implement new CX metrics\u003c/a\u003e. 
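\u003c/p\u003e
\u003cp\u003eAs one illustrative, self-contained sketch of the Help Request Rate idea (the group names, phrases, and sample emails below are hypothetical, and this is not the toolkit\u0026rsquo;s API), the rate is the share of emails per user group that contain a help-seeking phrase:\u003c/p\u003e

```python
# Hypothetical help-seeking phrases; tune these to your own mailbox.
HELP_PHRASES = ['please assist', 'having trouble', 'need help', 'cannot find']

def help_request_rate(emails_by_group):
    # For each group, count emails containing any help phrase,
    # then divide by the group's email volume.
    rates = {}
    for group, emails in emails_by_group.items():
        hits = sum(any(p in e.lower() for p in HELP_PHRASES) for e in emails)
        rates[group] = hits / len(emails) if emails else 0.0
    return rates

sample = {
    'job_seekers': [
        'I am having trouble uploading my resume',
        'Thanks, the new search filters are great',
    ],
    'employers': [
        'Please assist with posting a new listing',
        'I cannot find the wage data page',
        'The apprenticeship finder works well now',
    ],
}
print(help_request_rate(sample))  # rates: job_seekers 0.5, employers 2/3
```

\u003cp\u003eComparing these rates across groups, or watching one group\u0026rsquo;s rate over time, is what turns the raw keyword matches into a usable CX signal.\u003c/p\u003e
\u003cp\u003e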
In the toolkit, measures such as Help Request Rate, User Sentiment Score, Resource Utilization Rate, and Service-Specific Metrics enable you to make informed decisions by pinpointing specific issues and successes within different user groups or service areas. For instance, high Help Request Rates can directly signal areas where users need more support, allowing you to focus improvements where they are most needed. Similarly, you can use the User Sentiment Score to get a detailed view of user emotions, which can guide enhancements in user interaction and service delivery. We encourage you to adopt these measures and metrics to understand your service’s impact, actively refine your strategies, and drive meaningful improvements in your digital offerings.\u003c/p\u003e\n\n\u003c/div\u003e\n\n\u003cp\u003eWe hope this work holds value for you and your agency. If you have questions or would like to learn more about this work, please reach out to the Analytics and Decision Support Division at \u003ca href=\"mailto:bia@gsa.gov\"\u003ebia@gsa.gov\u003c/a\u003e.\u003c/p\u003e\n\u003cdiv class=\"dg-footnote\"\u003e\n\u003ch3 class=\"dg-footnote__heading\" id=\"footnote-label\"\u003eFootnotes\u003c/h3\u003e\n\u003col class=\"dg-footnote__list\"\u003e\n\u003cli class=\"dg-footnote__list-item\" id=\"fn1\"\u003eZaki, Mohamed, Janet McColl-Kennedy, and Andy Neely. 2021. “Using AI to Track How Customers Feel — in Real Time.” \u003cem\u003eHarvard Business Review\u003c/em\u003e, May 4, 2021. \u003ca href=\"https://www.hbr.org/2021/05/using-ai-to-track-how-customers-feel-in-real-time\"\u003ewww.hbr.org/2021/05/using-ai-to-track-how-customers-feel-in-real-time\u003c/a\u003e \u003ca href=\"#footnotes-ref1\" aria-label=\"Back to content\"\u003e↩\u003c/a\u003e\u003c/li\u003e\n\u003c/ol\u003e\n\u003c/div\u003e\n"}
  ]
}
