{
    "version" : "https://jsonfeed.org/version/1",
    "content" : "news",
    "type" : "single",
    "title" : "Amplifying customer voices |Digital.gov",
    "description": "Amplifying customer voices",
    "home_page_url" : "/preview/gsa/digitalgov.gov/bc-archive-content-3/","feed_url" : "/preview/gsa/digitalgov.gov/bc-archive-content-3/2023/12/19/amplifying-customer-voices/index.json","item" : [
    {"title" :"Amplifying customer voices","deck" : "A guide to natural language processing in customer experience data analysis","summary" : "Understand how to use natural language processing and advanced analytics to gain insights on high-impact service performance and survey design.","date" : "2023-12-19T17:38:00-05:00","date_modified" : "2025-01-27T19:42:55-05:00","authors" : {"isabel-izzy-metzger" : "Isabel (Izzy) Metzger"},"topics" : {
        
            "customer-experience" : "Customer experience",
            "product-and-project-management" : "Product and project management",
            "software-engineering" : "Software engineering"
            },"primary_image" : { "uid" : "green-structured-unstructured-data-iceberg-chavapong-prateep-na-thalang-istock-getty-images-1353745656-b", "alt" :
  "An iceberg illustration in shades of green and white. The small 20% section above water represents structured data, while the larger 80% section below water is unstructured data.", "width" :
  "1200", "height" :
  "630", "credit" :
  "", "caption" :
  "Chavapong Prateep Na Thalang/iStock via Getty Images", "format" :
  "png" },"branch" : "bc-archive-content-3",
      "filename" :"2023-12-19-amplifying-customer-voices.md",
      
      "filepath" :"news/2023/12/2023-12-19-amplifying-customer-voices.md",
      "filepathURL" :"https://github.com/GSA/digitalgov.gov/blob/bc-archive-content-3/content/news/2023/12/2023-12-19-amplifying-customer-voices.md",
      "editpathURL" :"https://github.com/GSA/digitalgov.gov/edit/bc-archive-content-3/content/news/2023/12/2023-12-19-amplifying-customer-voices.md","slug" : "amplifying-customer-voices","url" : "/preview/gsa/digitalgov.gov/bc-archive-content-3/2023/12/19/amplifying-customer-voices/","weight" : "1","content" :"\u003cp\u003eAgencies rely on customer experience surveys to collect critical feedback about how their services and products are working for the public. Some services across the government have been designated as \u003ccode\u003ehigh impact\u003c/code\u003e because they have an extensive customer base or a profound impact on the individuals they serve.\u003c/p\u003e\n\u003cp\u003e\u003ca href=\"https://www.whitehouse.gov/wp-content/uploads/2018/06/s280.pdf\"\u003eOMB Circular A-11 Section 280 (PDF, 385 KB, 14 pages, 2023)\u003c/a\u003e provides guidance on managing customer experience and improving service delivery. This guidance requires \u003ca href=\"https://www.performance.gov/cx/hisps/\"\u003eHigh-impact Service Providers\u003c/a\u003e (HISPs) to collect feedback after each transaction across seven indicator measures of experience. Agencies survey customers and ask them to rate different aspects of their experience on a 5-point scale, ranging from very negative (1) to very positive (5); these ratings generate structured data. The survey design also includes an optional open-ended response allowing customers to describe their experiences in their own words, which creates unstructured data. This work has resulted in a large collection of structured and unstructured data on the performance of high-impact services.\u003c/p\u003e\n\u003ch2 id=\"bridging-the-gap-from-structured-to-unstructured-data\"\u003eBridging the gap: From structured to unstructured data\u003c/h2\u003e\n\u003cp\u003eThe customer experience data we have collected resembles an iceberg. 
The more visible, structured data at the tip of the iceberg can offer insight, while the massive, unstructured data below the surface holds great potential for developing a deeper understanding of customer experiences.\u003c/p\u003e\n\n\n\n\n\n\n\n\u003cdiv class=\"image\"\u003e\n  \u003cimg\n        src=\"https://s3.amazonaws.com/digitalgov/nlp-cx-customer-experience-large-iceberg-in-water-simon-lee-unsplash-comp.png\" alt=\"A text box near the tip of an iceberg says, NUMERICAL RATINGS FROM SURVEY RESPONSES (STRUCTURED DATA). Another text box that reads, OPEN-ENDED FREE-TEXT SURVEY RESPONSES (UNSTRUCTURED DATA), is beneath the water\u0026#39;s surface to symbolize the depth of information hidden from view.\"/\u003e\u003cp\u003eIceberg photo by Simon Lee on Unsplash.\u003c/p\u003e\u003c/div\u003e\n\n\n\u003cp\u003eWhen customers answer survey questions like \u003ccode\u003eWere you satisfied?\u003c/code\u003e, the structured response data helps us gauge whether we are providing an effective service. But it doesn’t do much to shed light on why that service was effective (or not), or how we can improve that service. These insights are more likely to exist in the free-text data, where individuals have an opportunity to explain their service experience.\u003c/p\u003e\n\u003cp\u003eGSA’s Analytics and Decision Support Division within the Office of the Chief Financial Officer sought to dive deeper into this unstructured data to gain richer insights into customer experiences with two different services at the \u003ca href=\"https://www.dol.gov/\"\u003eDepartment of Labor\u003c/a\u003e: an in-person service, and an online, digital-based service. We set out to answer two questions:\u003c/p\u003e\n\u003col\u003e\n\u003cli\u003e\u003cstrong\u003eWhat\u0026rsquo;s driving customer satisfaction?\u003c/strong\u003e Beyond surface-level metrics and aggregated scores, we wanted to pinpoint specific elements and features that resonate positively with users. 
Using natural language processing, especially techniques like \u003ccode\u003etopic modeling\u003c/code\u003e, we sought to discover hidden themes and sentiments. The goal was not only to identify these themes but to understand their correlation with satisfaction levels and potential pain points.\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eAre there varied experiences across customer types?\u003c/strong\u003e With the digital service catering to a broad spectrum of users, it was important to discern variations in experiences. Key questions posed included: \u003cem\u003eDo different user categories experience similar challenges? Is the level of satisfaction consistent across the board?\u003c/em\u003e Comprehensive analysis and statistical tests of ratings – segmented by user groups and coupled with deep dives into their responses via natural language processing – enabled us to discern these differences, offering insights into the distinct journeys and experiences of various users.\u003c/li\u003e\n\u003c/ol\u003e\n\u003cp\u003eIn our efforts to support High-impact Service Providers in analyzing and understanding their data, we created a \u003ca href=\"https://digital.gov/2023/10/30/decoding-public-sentiment-harnessing-open-data-to-gain-insights-into-service-delivery/\"\u003edashboard to decode public sentiment\u003c/a\u003e and a \u003ca href=\"https://github.com/GSA/GovCXAnalyzer\"\u003ereusable code repository\u003c/a\u003e. 
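To make the topic-modeling idea concrete, here is a toy sketch in Python. It is not the GovCXAnalyzer implementation, and the keyword-to-theme map is invented for illustration; real topic models such as latent Dirichlet allocation learn themes statistically from word co-occurrence rather than from a hand-built map.

```python
# Toy sketch of topic-style clustering: assign each free-text comment to a
# theme based on telling keywords. The keyword-to-theme map below is
# hypothetical; statistical topic models discover themes automatically.
import re
from collections import defaultdict

# Hypothetical keyword-to-theme map (invented for this example).
THEMES = {
    'wait': 'Efficiency', 'slow': 'Efficiency', 'delay': 'Efficiency',
    'links': 'Site content', 'outdated': 'Site content',
    'staff': 'People', 'helpful': 'People',
}

def cluster_comments(comments):
    """Group comments under the first theme keyword each one mentions."""
    clusters = defaultdict(list)
    for comment in comments:
        words = re.findall(r'[a-z]+', comment.lower())
        theme = next((THEMES[w] for w in words if w in THEMES), 'Other')
        clusters[theme].append(comment)
    return dict(clusters)

feedback = [
    'Long wait times at the office.',
    'The notification process was slow.',
    'Several links on the site are outdated.',
    'Staff were knowledgeable and courteous.',
]
print(cluster_comments(feedback))
```

A keyword map like this would not scale to thousands of responses per quarter, but the output has the same shape a topic model produces: themes mapped to the comments that support them.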
The code repository provides the statistical and text analysis methods we describe in this blog post.\u003c/p\u003e\n\u003ch2 id=\"transforming-unstructured-data-into-actionable-insights\"\u003eTransforming unstructured data into actionable insights\u003c/h2\u003e\n\u003cp\u003eBefore diving into our findings, it\u0026rsquo;s worth specifying the three natural language processing techniques we used to systematically analyze the unstructured data:\u003c/p\u003e\n\u003col\u003e\n\u003cli\u003e\u003cstrong\u003eAspect-level sentiment analysis\u003c/strong\u003e: Traditional sentiment analysis tells us whether a customer’s free-text response is generally positive, negative, or neutral. Aspect-level sentiment analysis goes one step further by categorizing the specific aspects or features of the statement. For the statement, \u0026ldquo;The information was helpful, but the website was hard to navigate,\u0026rdquo; aspect-level sentiment analysis would identify \u003ccode\u003einformation\u003c/code\u003e as positive and \u003ccode\u003ewebsite navigation\u003c/code\u003e as negative. This allows us to break down the feedback into targeted areas for praise or improvement.\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eTopic modeling\u003c/strong\u003e: This technique automatically identifies the themes or topics that are prevalent in a large collection of texts. In the context of customer feedback, it clusters similar comments together to highlight recurring issues or strengths. For example, if multiple respondents mention phrases like \u003ccode\u003elong wait times\u003c/code\u003e or \u003ccode\u003eslow service\u003c/code\u003e, this technique will cluster these together under a broader topic like \u003ccode\u003eEfficiency\u003c/code\u003e. It offers a scalable alternative to manual categorization and ensures that no major themes are overlooked. 
This was incredibly important when analyzing digital services that often capture thousands of survey responses per quarter.\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eNamed entity recognition\u003c/strong\u003e: This technique identifies and classifies named entities such as organizations, persons, dates, and even specialized terms, like programs, within the text. The approach allows us to track mentions of specific entities within the customer feedback.\u003c/li\u003e\n\u003c/ol\u003e\n\u003ch2 id=\"rethinking-survey-design-the-role-of-sentiment-analysis\"\u003eRethinking survey design: The role of sentiment analysis\u003c/h2\u003e\n\u003cp\u003eIncorporating program-specific questions into your surveys can significantly enhance the depth and relevance of the insights you gather. Specifically, asking questions about the user can enable a more granular and effective analysis of customer segments and their unique needs. For example, a service that supports the private sector might benefit from asking questions about the user’s specific industry sector. Asking at least one user-type question not only enriches your analysis but also helps identify which customer segments are currently well-served and which may be lacking attention.\u003c/p\u003e\n\u003cp\u003eAdditionally, our analysis uncovered a critical issue with survey design that impacts the data’s reliability. It was through sentiment analysis that we identified a disconnect between the sentiment expressed in open-text responses and the numerical ratings. We identified instances where customers gave a \u003ccode\u003e1\u003c/code\u003e (low) rating but conveyed a positive experience in the accompanying text. This discovery highlights the need for survey question clarity, both in the prompts and in the available response options.\u003c/p\u003e\n\u003cp\u003eIn response to this finding, we proposed a survey redesign to accurately capture customer experience. 
We suggested rephrasing the questions in the following formats:\u003c/p\u003e\n\u003cdiv class=\"box \"\u003e\n  \u003col\u003e\n\u003cli\u003eDo you agree or disagree with the following statement:\u003cbr /\u003e\u0026ldquo;The information I needed to complete [service] was easy to find and understand.\u0026rdquo;\u003c/li\u003e\n\u003c/ol\u003e\n\u003cul\u003e\n\u003cli\u003eStrongly Agree\u003c/li\u003e\n\u003cli\u003eAgree\u003c/li\u003e\n\u003cli\u003eNeutral\u003c/li\u003e\n\u003cli\u003eDisagree\u003c/li\u003e\n\u003cli\u003eStrongly Disagree\u003c/li\u003e\n\u003c/ul\u003e\n\u003col start=\"2\"\u003e\n\u003cli\u003eOn a scale of 1-5 (with 1 being very difficult and 5 being very easy), please rate your ease of finding the information you needed to complete [service].\u003c/li\u003e\n\u003c/ol\u003e\n\u003cul\u003e\n\u003cli\u003e1 (very difficult)\u003c/li\u003e\n\u003cli\u003e2\u003c/li\u003e\n\u003cli\u003e3\u003c/li\u003e\n\u003cli\u003e4\u003c/li\u003e\n\u003cli\u003e5 (very easy)\u003c/li\u003e\n\u003c/ul\u003e\n\n\u003c/div\u003e\n\n\u003cp\u003eBy clarifying the phrasing of the question and explicitly labeling the rating scale, these questions aim to prevent misunderstandings and improve the quality of the data collected.\u003c/p\u003e\n\u003ch2 id=\"department-of-labor-case-study-enhancing-services-through-comprehensive-customer-feedback-analysis\"\u003eDepartment of Labor case study: Enhancing services through comprehensive customer feedback analysis\u003c/h2\u003e\n\u003cp\u003eAs mentioned earlier, GSA’s Analytics and Decision Support Division within the Office of the Chief Financial Officer applied natural language processing techniques to customer survey data from two services at the Department of Labor: an in-person service and an online, digital-based service.\u003c/p\u003e\n\n\n\n\n\n\n\u003cdiv class=\"quote-block \"\u003e\n    \u003cblockquote\u003e\n      \u003cspan class=\"quote-block__quotation-mark\"\u003e“\u003c/span\u003e\n      
\u0026hellip;[GSA] was able to look at the data we had collected, along with additional data they collected, and provide us insights into our program that gave us a roadmap to really improve the customer experience\u0026hellip; [Their analysis] provided us with an excellent summary of their findings, and some concrete actions we could take to improve our interactions with both our internal and external stakeholders. At the same time that [GSA] was analyzing our processes and data, we were working on a project to modernize our program. The recommendations from [the analysis] dovetailed well with our own discoveries during this process and provided support for making changes\u0026hellip;we intend to investigate those recommendations further and incorporate them in the upcoming fiscal year, where we can.\n      \u003cspan class=\"quote-block__quotation-mark\"\u003e”\u003c/span\u003e\u003ccite\u003e— Department of Labor HISP Customer Experience manager (in-person service)\u003c/cite\u003e\u003c/blockquote\u003e\n  \u003c/div\u003e\n\u003cp\u003e\u003cstrong\u003eHighlights from the in-person service deep dive analysis\u003c/strong\u003e\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003e\u003cstrong\u003eUnderstanding high satisfaction drivers\u003c/strong\u003e: Customers reported overwhelmingly high satisfaction, with the majority of survey respondents rating the service at 4 or 5. Free-text responses offered important context because there was little variation in the numerical customer ratings. Sentiment analysis highlighted employee interactions (people factor) and the perceived service quality as the primary drivers behind positive customer experience. 
Customer free-text responses praised Department of Labor personnel for their helpfulness, dedication, and competence.\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eEfficiency challenges\u003c/strong\u003e: Despite the high satisfaction ratings, analysis of the free-text responses revealed issues with the speed and efficiency of the service. In particular, customers highlighted delays in the notification process. Additional analysis revealed that certain industry sectors felt these pain points more acutely than others. Based on these findings, we recommended that the service:\n\u003cul\u003e\n\u003cli\u003eEstablish clear benchmarks for each stage of the process to identify and address inefficiencies,\u003c/li\u003e\n\u003cli\u003eNotify customers upfront about expected timeframes (particularly if lengthy) to set accurate expectations and alleviate user anxieties.\u003c/li\u003e\n\u003c/ul\u003e\n\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003e\u003cstrong\u003eHighlights from the website service deep dive analysis\u003c/strong\u003e\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003e\u003cstrong\u003eVaried experience across customer segments\u003c/strong\u003e: Customer segments interacted with the platform differently, leading to remarkably varied feedback scores. One customer segment in particular reported a less satisfactory experience across most of the survey prompts. Their main pain points revolved around the perceived absence of required services in certain geographic areas, and the platform\u0026rsquo;s complexity. This insight indicates that they may require a user-specific interface or website page to address a perceived lack of clear information or service availability. Other customer segments had much more positive experiences in comparison. Testimonials such as, “I have been using this website for a couple of years\u0026hellip; It has been very helpful,” highlighted their positive journey. 
They expressed genuine appreciation for the resources provided, often citing specific tools and pages the platform offers.\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eEmotion as a key dimension\u003c/strong\u003e: For users with complex information needs, such as searching for information on federal disaster assistance and unemployment benefits, emotion played a pivotal role in their customer experience. When these users didn’t find solutions to their challenges, they often expressed stronger negative emotions and broader frustrations that went beyond the platform to include the entire U.S. government. This aligns with the \u003ca href=\"https://www.performance.gov/cx/life-experiences/recovering-from-a-disaster\"\u003eRecovering from a Disaster\u003c/a\u003e project on Performance.gov, which focuses on developing trauma-informed communication guidelines. The project aims to provide agency staff with the knowledge, skills, and support for a trauma-informed approach, enhancing the recovery experience for disaster-impacted individuals. It also introduces a holistic methodology to calculate the end-to-end burden on users, considering psychological and learning costs. Adopting these practices in customer service is highly recommended, especially for federal services that are serving customers in challenging situations.\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eEmergent theme from analyzing customer responses\u003c/strong\u003e: Topic modeling identified a theme in the text feedback around outdated content and broken links. For this web-based service, we recommended leveraging the user feedback to identify and prioritize areas for content updates.\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003eThe comprehensive analysis provided a rich and detailed picture of user sentiment, allowing for targeted recommendations for improving the online platform\u0026rsquo;s user experience across diverse customer segments. 
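As a minimal sketch of the aspect-level sentiment scoring described earlier, the following Python snippet uses a tiny hand-built lexicon. The aspect terms and opinion words are invented for illustration; production pipelines rely on trained NLP models rather than word lists.

```python
# Toy sketch of aspect-level sentiment analysis: score each mentioned
# aspect by the opinion words that appear in the same clause. The lexicon
# below is hypothetical and exists only to illustrate the technique.
import re

ASPECTS = {
    'information': 'information',
    'website': 'website navigation',
    'staff': 'employee interactions',
}
POSITIVE = {'helpful', 'easy', 'clear', 'friendly'}
NEGATIVE = {'hard', 'slow', 'confusing', 'broken'}

def aspect_sentiment(comment):
    """Return {aspect: 'positive' | 'negative'} for aspects in `comment`."""
    results = {}
    # Split on clause boundaries so each aspect is judged by nearby words.
    for clause in re.split(r'[,;.]|\bbut\b', comment.lower()):
        words = set(re.findall(r'[a-z]+', clause))
        for term, aspect in ASPECTS.items():
            if term in words:
                if words & POSITIVE:
                    results[aspect] = 'positive'
                elif words & NEGATIVE:
                    results[aspect] = 'negative'
    return results

print(aspect_sentiment('The information was helpful, but the website was hard to navigate.'))
# {'information': 'positive', 'website navigation': 'negative'}
```

Splitting on clause boundaries is what separates this from whole-response sentiment scoring: the same comment yields one label per aspect instead of a single mixed verdict.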
We’re continuing to partner with this Department of Labor service to perform a \u003ca href=\"https://www.performance.gov/policies/terms/cx/#voc\"\u003eVoice of the Customer\u003c/a\u003e analysis that goes beyond the survey data to look at unsolicited feedback from customers, such as emails and chat messages.\u003c/p\u003e\n\u003cp\u003eIn parallel, GSA\u0026rsquo;s \u003ca href=\"https://coe.gsa.gov/coe/customer-experience.html\"\u003eCustomer Experience Center of Excellence\u003c/a\u003e also worked with the Department of Labor to help them identify ways to improve their end-to-end customer experience through a customer experience maturity assessment, a website assessment, and customer and stakeholder interviews.\u003c/p\u003e\n\u003cp\u003eTheir findings from this work echoed many of the insights uncovered by our team. By combining advanced data science techniques (such as natural language processing) and the Center of Excellence’s mixed methods work, GSA was able to provide a fuller picture of customer experience.\u003c/p\u003e\n\u003ch2 id=\"closing-thoughts-and-call-to-action\"\u003eClosing thoughts and call to action\u003c/h2\u003e\n\u003cp\u003eNatural language processing opens up a new world in customer experience analytics. With it, we\u0026rsquo;re not just measuring experiences. 
We\u0026rsquo;re also using advanced analytics to understand the experiences — and, most importantly, to drive how we can improve them.\u003c/p\u003e\n\u003cp\u003eWhether it\u0026rsquo;s understanding the pivotal role of federal staff in delivering an in-person program or addressing the user-friendliness of a digital platform, combining analysis of structured and unstructured customer feedback provides actionable insights for enhancing customer experience with federal services.\u003c/p\u003e\n\n\n\n\u003carticle\n  class=\"dg-note \"\n\u003e\n  \u003ch4 class=\"dg-note__heading\"\u003e\n    \u003csvg\n      class=\"dg-note__icon usa-icon dg-icon dg-icon--large\"\n      aria-hidden=\"true\"\n      focusable=\"false\"\n    \u003e\n      \u003cuse xlink:href=\"/preview/gsa/digitalgov.gov/bc-archive-content-3/uswds/img/sprite.svg#notifications\"\u003e\u003c/use\u003e\n    \u003c/svg\u003e\n    \n      Note\n    \n  \u003c/h4\u003e\n  For those interested in implementing these techniques, the code is available in GSA’s \u003ca href=\"https://github.com/GSA/GovCXAnalyzer/\"\u003eGovCXAnalyzer GitHub repository\u003c/a\u003e. Feel free to dive in, adapt it, and start to analyze your own structured and unstructured customer feedback data.\n\u003c/article\u003e\n\n"}
  ]
}
