Five Lessons Learned During a Content Audit
As part of USAGov’s efforts to provide our audience with the reliable, high-quality information they need, the Health, Education & Benefits (HE&B) topic desk completed its first content audit this summer.
Methodology and Results
Data informed every step we took. To determine which areas to focus on first, the desk gathered data from four distinct sources: the Contact Center’s use of our content to answer customers’ inquiries, site analytics, a content inventory and review, and website survey comments. From that data we produced a report and an action plan, which became the main drivers of our content overhaul.
Steps We Took and Lessons Learned
- Approach. The team decided that all the subtopics would be part of the audit and took a bottom-up analytical approach during the discovery phase, in which metrics are analyzed without a defined goal. Although we technically had a goal (improve our content), it proved to be too high level. We struggled with the approach because we were trying to find answers before we knew which questions to ask.
_Lesson learned: Select a representative sample of the content, observe the metrics over time, and clearly define the focus before beginning the work._
- Timeframe for Analysis. Time constraints forced us to focus on our objectives and prioritize the data we expected to be most insightful. We settled on our customers’ page-level comments as a good starting point.
_Lesson learned: Most things take longer than expected, so allow more time for the discovery portion of a content audit._
- Survey Comments Analysis. What users were telling us in the survey comments was the key to setting our goals. We systematized the analysis so we could define what would eventually become our content goals: determine what users were trying to do on our pages and how they were trying to do it, and turn around the negative sentiment toward some of our pages with better content. The results of that systematization informed the usability and information architecture tests and the content analysis heuristics.
_Lesson learned: Challenge everyone’s assumptions about the content and users’ interaction with it, and turn survey comments into quantitative data. Quantifying the comments makes it easy to identify trends and opens up a conversation on user versus “creator.” A rough sketch of that quantification follows below._
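For illustration only, here is a minimal sketch of how free-text survey comments could be coded and counted. The comments, keyword-to-category mapping, and sentiment word list are hypothetical placeholders, not our actual coding scheme, and a spreadsheet can do this job just as well as a script.

```python
# Minimal sketch: tag each survey comment with a task category and a rough
# sentiment, then count the combinations to turn free-text feedback into numbers.
# The comments, keywords, and categories below are hypothetical examples,
# not USAGov's actual coding scheme.
from collections import Counter

comments = [
    "Couldn't find the form I needed to apply for benefits",
    "Clear explanation, thanks",
    "The eligibility page is confusing and out of date",
]

# Hypothetical coding scheme: keyword -> task category
categories = {
    "form": "find a form",
    "apply": "apply for benefits",
    "eligibility": "check eligibility",
}

negative_words = {"couldn't", "confusing", "out of date"}

def code_comment(text):
    """Assign a task category and a rough sentiment to one comment."""
    lowered = text.lower()
    category = next((cat for kw, cat in categories.items() if kw in lowered), "other")
    sentiment = "negative" if any(w in lowered for w in negative_words) else "positive/neutral"
    return category, sentiment

# Count each (category, sentiment) pair across all comments
tally = Counter(code_comment(c) for c in comments)
for (category, sentiment), count in sorted(tally.items()):
    print(f"{category:20} {sentiment:17} {count}")
```

Counting category and sentiment pairs this way surfaces which user tasks draw the most negative feedback, which is the kind of trend the lesson refers to.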
- Content Analysis. As part of the discovery phase, we ran a content analysis heuristic in which we compared existing content with the survey comments and the analytics data, and we made a list of potential content to test, improve, move, or delete. The heuristic took the form of regular face-to-face meetings between the desk members and a UX team member. We considered issues with the quality of our content, such as usefulness and relevance, clarity and accuracy, completeness, and style, and with the information architecture, such as usability and findability, completeness, consistency, and appropriate structure of our assets. A rough sketch of how criteria like these could feed a triage decision follows the lesson below.
_Lesson learned: We don’t know our web content until we audit it, and all team members are figuring it out together._
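As a rough illustration of how heuristic criteria could be turned into a triage list, here is a sketch that records a few flags per page and maps them to a recommended action. The page URLs, flags, and decision rules are hypothetical, not the desk’s actual rubric.

```python
# Minimal sketch: record the heuristic criteria for each page and derive a
# triage recommendation. The page URLs, flags, and decision rules are
# hypothetical placeholders, not the desk's actual rubric.
from dataclasses import dataclass

@dataclass
class PageReview:
    url: str
    useful: bool            # usefulness and relevance
    accurate: bool          # clarity and accuracy
    complete: bool          # completeness
    findable: bool          # usability and findability
    negative_comments: int  # count from the survey comment analysis

def recommend(review: PageReview) -> str:
    """Map the heuristic flags to a triage action (illustrative rules only)."""
    if not review.useful:
        return "consider moving or deleting"
    if not review.accurate or not review.complete:
        return "improve the content"
    if not review.findable or review.negative_comments > 5:
        return "test with users"
    return "keep as is"

reviews = [
    PageReview("https://www.usa.gov/example-page-1", useful=True, accurate=False,
               complete=True, findable=True, negative_comments=3),
    PageReview("https://www.usa.gov/example-page-2", useful=True, accurate=True,
               complete=True, findable=False, negative_comments=8),
]

for r in reviews:
    print(f"{r.url}: {recommend(r)}")
```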
- Testing. We know that optimization is an ongoing, iterative process, and that the next time our content is ready for another audit, our objectives and planning will be different.
_Lesson learned: There is no such thing as a standard content audit process. Data (or external circumstances) will set the stage for the next steps. It is imperative to lay out a framework and context for what you’re auditing, why you’re auditing it, and what your main objectives are. The goal is always the same: improve the public’s experience with your content._
Next Steps
With the results of the content analysis and audit, combined with the report of findings and recommendations, we’re revamping our content on various USA.gov pages.

Andrea M. Castelluccio, Ph.D., is a bilingual member of the USAGov content team. This was originally published on the USAGov blog.