{
    "version" : "https://jsonfeed.org/version/1",
    "content" : "news",
    "type" : "single",
    "title" : "Case Study: OCSIT’s Email Customer Survey Process |Digital.gov",
    "description": "Case Study: OCSIT’s Email Customer Survey Process",
    "home_page_url" : "/preview/gsa/digitalgov.gov/bc-archive-content-3/","feed_url" : "/preview/gsa/digitalgov.gov/bc-archive-content-3/2016/01/29/case-study-ocsits-email-customer-survey-process/index.json","item" : [
    {"title" :"Case Study: OCSIT’s Email Customer Survey Process","summary" : "To help us gauge the effectiveness of the programs we offer to other government agencies, we launched our first Government Customer Experience Index (GCXi) survey in 2013.","date" : "2016-01-29T10:00:34-04:00","date_modified" : "2025-01-27T19:42:55-05:00","authors" : {"rflagg" : "Rachel Flagg"},"topics" : {
        
            "customer-experience" : "Customer experience",
            "information-collection" : "Information collection",
            "research" : "Research"
            },"branch" : "bc-archive-content-3",
      "filename" :"2016-01-29-case-study-ocsits-email-customer-survey-process.md",
      
      "filepath" :"news/2016/01/2016-01-29-case-study-ocsits-email-customer-survey-process.md",
      "filepathURL" :"https://github.com/GSA/digitalgov.gov/blob/bc-archive-content-3/content/news/2016/01/2016-01-29-case-study-ocsits-email-customer-survey-process.md",
      "editpathURL" :"https://github.com/GSA/digitalgov.gov/edit/bc-archive-content-3/content/news/2016/01/2016-01-29-case-study-ocsits-email-customer-survey-process.md","slug" : "case-study-ocsits-email-customer-survey-process","url" : "/preview/gsa/digitalgov.gov/bc-archive-content-3/2016/01/29/case-study-ocsits-email-customer-survey-process/","content" :"\u003cp\u003eAt GSA’s Office of Citizen Services and Innovative Technologies (OCSIT), we offer technology services and tools to make government work better. To help us gauge the effectiveness of the programs we offer to other government agencies, in 2013 we launched our first Government Customer Experience Index (GCXi) survey. This annual email survey consistently measures customer satisfaction, loyalty and ease of use for various OCSIT programs.\u003c/p\u003e\n\u003cdiv class=\"image\"\u003e\n  \u003cimg\n    src=\"https://s3.amazonaws.com/digitalgov/_legacy-img/2016/01/600-x-400-Online-Survey-devke-iStock-Thinkstock-469618252.jpg\"\n    alt=\"The word survey in giant red lettering, with a computer mouse plugged into the letter R.\"/\u003e\u003cp\u003edevke/iStock/Thinkstock\u003c/p\u003e\u003c/div\u003e\n\n\n\u003cp\u003eA previous post about the GCXi (\u003ca href=\"/preview/gsa/digitalgov.gov/bc-archive-content-3/2015/12/28/ocsits-2015-customer-survey-what-we-learned/\"\u003eOCSIT’s 2015 Customer Survey—What We Learned\u003c/a\u003e) generated lots of questions from readers about the back-end processes we use to conduct the survey and turn customer data into action. Since we’re big fans of transparency, we’re sharing this case study in hopes that it’s helpful to you as you build your own Voice of the Customer (VOC) program.\u003c/p\u003e\n\u003ch2 id=\"the-big-picture\"\u003eThe Big Picture\u003c/h2\u003e\n\u003cp\u003eThough we survey our government customers just once per year via the GCXi, we actually work on the process throughout the entire year.\u003c/p\u003e\n\u003cp\u003eWe conduct our annual survey in the spring. The survey is created in SurveyMonkey (though any modern survey tool would work), and delivered via email. We have a \u003ca href=\"/preview/gsa/digitalgov.gov/bc-archive-content-3/resources/paperwork-reduction-act-fast-track-process/\"\u003ePRA clearance\u003c/a\u003e, since our customer base includes not just federal, but also state and local government folks. (Note, if you’re looking for a survey tool, check out the list of free tools that have a \u003ca href=\"/preview/gsa/digitalgov.gov/bc-archive-content-3/resources/negotiated-terms-of-service-agreements/\"\u003efederal-friendly terms of service agreement\u003c/a\u003e.)\u003c/p\u003e\n\u003cp\u003eDuring the summer, we review and analyze the data, then develop action plans for each program. We work to implement improvements during the fall and winter, and by then, it’s time to gear up for the next year’s survey.\u003c/p\u003e\n\u003cp\u003eA benefit of keeping this process top-of-mind for staff all year long is that it enforces the importance of customer-centric thinking in all we do. It’s also just one of many tools in our VOC toolbox. 
Other tools we use include:\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003e\u003ca href=\"/preview/gsa/digitalgov.gov/bc-archive-content-3/guides/dap/\"\u003eWeb analytics\u003c/a\u003e\u003c/li\u003e\n\u003cli\u003e\u003ca href=\"/preview/gsa/digitalgov.gov/bc-archive-content-3/resources/digitalgov-user-experience-resources/digitalgov-user-experience-program-usability-starter-kit/\"\u003eUsability testing\u003c/a\u003e\u003c/li\u003e\n\u003cli\u003e\u003ca href=\"/preview/gsa/digitalgov.gov/bc-archive-content-3/2014/02/28/what-do-people-think-of-your-content-ask-your-contact-center/\"\u003eCall center data\u003c/a\u003e\u003c/li\u003e\n\u003cli\u003eWeb pop-up surveys\u003c/li\u003e\n\u003cli\u003eFree-form customer comments from social media, chat, blogs or email\u003c/li\u003e\n\u003cli\u003eAgency consultations and office hours\u003c/li\u003e\n\u003cli\u003eEmployee training and engagement activities\u003c/li\u003e\n\u003cli\u003ePost-event surveys\u003c/li\u003e\n\u003cli\u003eTalking to customers one-on-one\u003c/li\u003e\n\u003c/ul\u003e\n\u003ch2 id=\"goals\"\u003eGoals\u003c/h2\u003e\n\u003cp\u003eOne of our primary goals was to develop a framework to consistently measure customer satisfaction across all our programs, on an ongoing basis. Consistency is important to benchmark progress, and the index gives us a framework to do just that. By asking the same core questions for all our programs, we’ve created a baseline to help us evaluate whether we’re improving over time.\u003c/p\u003e\n\u003ch2 id=\"the-questions\"\u003eThe Questions\u003c/h2\u003e\n\u003cdiv class=\"image\"\u003e\n  \u003cimg\n    src=\"https://s3.amazonaws.com/digitalgov/_legacy-img/2015/12/600-x-400-Feedback-Text-on-Small-Wooden-Cube-Gajus-iStock-Thinkstock-536974445.jpg\"\n    alt=\"The word Feedback seen on a small wooden cube sits on a laptop keyboard\"/\u003e\u003cp\u003eGajus/iStock/Thinkstock\u003c/p\u003e\u003c/div\u003e\n\n\n\u003cp\u003eThe email survey currently consists of six questions. Four of the questions are multiple-choice and are used to calculate a “score” for each program; this is our baseline. The last two are open-ended, and provide the most actionable data, because they give customers a platform to tell us, in their own words, what’s working and where we can improve.\u003c/p\u003e\n\u003cp\u003eThe core questions measure satisfaction, loyalty and ease of use. Questions may be customized slightly, such as to identify a specific program name. 
Here are the questions we use in our email surveys:\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eHow would you rate your overall experience with [this program/service]?\u003c/strong\u003e\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eVery good\u003c/li\u003e\n\u003cli\u003eGood\u003c/li\u003e\n\u003cli\u003eFair\u003c/li\u003e\n\u003cli\u003ePoor\u003c/li\u003e\n\u003cli\u003eVery poor\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003e\u003cstrong\u003eHow likely are you to recommend [this program/service] to a friend?\u003c/strong\u003e\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eVery likely\u003c/li\u003e\n\u003cli\u003eLikely\u003c/li\u003e\n\u003cli\u003eNeither likely nor unlikely\u003c/li\u003e\n\u003cli\u003eUnlikely\u003c/li\u003e\n\u003cli\u003eVery unlikely\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003e\u003cstrong\u003eHow likely are you to use [this program/service] in the future?\u003c/strong\u003e\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eVery likely\u003c/li\u003e\n\u003cli\u003eLikely\u003c/li\u003e\n\u003cli\u003eNeither likely nor unlikely\u003c/li\u003e\n\u003cli\u003eUnlikely\u003c/li\u003e\n\u003cli\u003eVery unlikely\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003e\u003cstrong\u003eHow easy or difficult was it to [use this program/service]?\u003c/strong\u003e\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eVery easy\u003c/li\u003e\n\u003cli\u003eEasy\u003c/li\u003e\n\u003cli\u003eNeither easy nor difficult\u003c/li\u003e\n\u003cli\u003eDifficult\u003c/li\u003e\n\u003cli\u003eVery difficult\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003e\u003cstrong\u003eWhat are the greatest strengths of [this program/service]?\u003c/strong\u003e\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eOpen-ended\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003e\u003cstrong\u003eWhat are the greatest weaknesses?\u003c/strong\u003e\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eOpen-ended\u003c/li\u003e\n\u003c/ul\u003e\n\u003ch2 id=\"the-score\"\u003eThe Score\u003c/h2\u003e\n\u003cp\u003eResponse choices are listed from positive to negative. The top two (positive) responses are “promoters,” the middle response is “neutral,” and the bottom two (negative) responses are “detractors.” We subtract the percentage of detractors from the percentage of promoters to get the score. We score each question, as well as calculate an overall score for each program. Note, that if you have more detractors than promoters, it’s possible to get a negative score (range is plus or minus 100).\u003c/p\u003e\n\u003cp\u003eI know you’re thinking, “So, where are your scores?” While the actual numbers are for internal management purposes only, we’ve shared some overall insights in this post, \u003ca href=\"/preview/gsa/digitalgov.gov/bc-archive-content-3/2016/01/05/digging-into-the-data-of-our-customer-survey/\"\u003eDigging Into the Data of Our Customer Survey\u003c/a\u003e.\u003c/p\u003e\n\u003ch2 id=\"closing-the-loop\"\u003eClosing the Loop\u003c/h2\u003e\n\u003cp\u003eOne main reason this has worked for us is management support. Our management team is committed to improving the customer experience for all our programs across OCSIT, and that commitment is embraced by the entire team.\u003c/p\u003e\n\u003cp\u003eThe other main reason for success is our action planning process, which “closes the loop.” We don’t just collect data, we actually do something with it. Program managers are tasked to review customer feedback and identify areas for improvement. 
We develop action plans that outline when and how we’ll address issues and make those improvements. We share these plans across the team, and report out to senior management on what we learned, the actions we took and how the feedback is expected to improve our programs.\u003c/p\u003e\n\u003ch2 id=\"evolution\"\u003eEvolution\u003c/h2\u003e\n\u003cp\u003eLike anything else in life, you try something, learn, adapt, move forward. We learn something new each time we run the GCXi, and continue to iterate and improve as time goes on. For example, the “ease of use” question was added in 2015, and has provided us with a clear call to action to provide more direct training and support to our customers, particularly for our more technical programs.\u003c/p\u003e\n\u003cp\u003eAs background, here are some of the resources that inspired us as we developed the GCXi:\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003e\u003ca href=\"https://www.youtube.com/watch?v=9VxW7mFZUc4\u0026amp;list=PLd9b-GuOJ3nH7xSSjL1XBXPfVqw68BNbW\u0026amp;index=15\"\u003eDesigning a Better Customer Survey\u003c/a\u003e—video\u003c/li\u003e\n\u003cli\u003e\u003ca href=\"/preview/gsa/digitalgov.gov/bc-archive-content-3/guides/dap/\"\u003eDigital Metrics Guidance and Best Practices\u003c/a\u003e\u003c/li\u003e\n\u003cli\u003e\u003ca href=\"https://www.forrester.com/CX-Index/-/E-MPL191\"\u003eForrester’s Customer Experience Index\u003c/a\u003e\u003c/li\u003e\n\u003cli\u003e\u003ca href=\"http://www.clicktools.com/wp-content/uploads/2015/04/Navigating-the-Alphabet-Soup-of-Survey-Methodologies.pdf\"\u003eNavigating the Alphabet Soup of Survey Methodologies\u003c/a\u003e—ClickTools (PDF)\u003c/li\u003e\n\u003cli\u003e\u003ca href=\"https://media.clemson.edu/cbshs/prtm/research/resources-for-research-page-2/Vagias-Likert-Type-Scale-Response-Anchors.pdf\"\u003eLikert-Type Scale Response Anchors (PDF, 57 kb, 2 pages)\u003c/a\u003e—recommended wording and rating scales for a variety of survey questions\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003eWhile this post focused on our email surveys, it’s worth noting that we follow a similar process for most of our website surveys, asking the same core questions, as well as asking about task completion (but that’s a topic for another post!).\u003c/p\u003e\n\u003cp\u003eOur GCXi has given us a framework to listen to customers, benchmark progress in a consistent way and evaluate whether we’re improving over time. If you have suggestions or ideas to help us better serve you, \u003ca href=\"mailto:rachel.flagg@gsa.gov\"\u003eplease let us know\u003c/a\u003e! Interested in learning more about improving the government customer experience? Join the \u003ca href=\"https://digital.gov/communities/customer-experience/\"\u003eGovernment Customer Experience Community\u003c/a\u003e and review the \u003ca href=\"https://digital.gov/resources/customer-experience-toolkit/\"\u003eCustomer Experience Toolkit\u003c/a\u003e.\u003c/p\u003e\n"}
  ]
}
