{
    "version" : "https://jsonfeed.org/version/1",
    "content" : "news",
    "type" : "single",
    "title" : "The Data Briefing: the Promise &#8211; and Perils &#8211; of Artificial Emotional Intelligence |Digital.gov",
    "description": "The Data Briefing: the Promise &#8211; and Perils &#8211; of Artificial Emotional Intelligence",
    "home_page_url" : "/preview/gsa/digitalgov.gov/bc-archive-content-3/","feed_url" : "/preview/gsa/digitalgov.gov/bc-archive-content-3/2017/06/14/the-data-briefing-the-promise-and-perils-of-artificial-emotional-intelligence/index.json","item" : [
    {"title" :"The Data Briefing: the Promise \u0026#8211; and Perils \u0026#8211; of Artificial Emotional Intelligence","summary" : "People are quick to treat computer programs as being more aware than computer code could be. However, can computer programs – especially chatbots – have a significant effect on people? Especially people suffering from depression or anxiety? A recent academic study shows some promising results from using a chatbot as a complement to a human therapist.","date" : "2017-06-14T11:37:13-04:00","date_modified" : "2025-01-27T19:42:55-05:00","authors" : {"bbrantley" : "Bill Brantley"},"topics" : {
        
            "artificial intelligence" : "Artificial intelligence",
            "emerging-tech" : "Emerging tech",
            "innovation" : "Innovation",
            "mobile" : "Mobile"
            },"primary_image" : { "uid" : "creative-brain-concept-thanaphiphat-istock-thinkstock-494897999", "alt" :
  "Illustrated concept of the creative brain.", "width" :
  "4556", "height" :
  "3644", "credit" :
  "", "caption" :
  "thanaphiphat/iStock/Thinkstock", "format" :
  "png" },"branch" : "bc-archive-content-3",
      "filename" :"2017-06-14-the-data-briefing-the-promise-and-perils-of-artificial-emotional-intelligence.md",
      
      "filepath" :"news/2017/06/2017-06-14-the-data-briefing-the-promise-and-perils-of-artificial-emotional-intelligence.md",
      "filepathURL" :"https://github.com/GSA/digitalgov.gov/blob/bc-archive-content-3/content/news/2017/06/2017-06-14-the-data-briefing-the-promise-and-perils-of-artificial-emotional-intelligence.md",
      "editpathURL" :"https://github.com/GSA/digitalgov.gov/edit/bc-archive-content-3/content/news/2017/06/2017-06-14-the-data-briefing-the-promise-and-perils-of-artificial-emotional-intelligence.md","slug" : "the-data-briefing-the-promise-and-perils-of-artificial-emotional-intelligence","url" : "/preview/gsa/digitalgov.gov/bc-archive-content-3/2017/06/14/the-data-briefing-the-promise-and-perils-of-artificial-emotional-intelligence/","content" :"\u003cp\u003eThe first chatbot, ELIZA, was created back in 1964 to demonstrate that communication between humans and computers would be superficial. However, much to Dr. Weizenbaum’s (ELIZA’s creator) surprise, people easily formed friendly relationships with the computer program. People forming relationships with ELIZA was especially surprising considering just \u003ca href=\"http://www.filfre.net/2011/06/eliza-part-2/\"\u003ehow simple the program was regarding generating conversational responses\u003c/a\u003e. ELIZA essentially parroted back what the users typed but, this was enough to convince people that the program seemed to care about the person. The \u003ca href=\"https://en.wikipedia.org/wiki/ELIZA_effect\"\u003eELIZA Effect\u003c/a\u003e was coined to describe how users attribute humanlike motives to computer programs. You can talk to ELIZA yourself at this \u003ca href=\"http://psych.fullerton.edu/mbirnbaum/psych101/Eliza.htm\"\u003esite\u003c/a\u003e.\u003c/p\u003e\n\n\n\n\n\n\n\n\u003cdiv class=\"image\"\u003e\n  \u003cimg\n        src=\"https://s3.amazonaws.com/digitalgov/creative-brain-concept-thanaphiphat-istock-thinkstock-494897999.png\"alt=\"Illustrated concept of the creative brain.\"/\u003e\u003cp\u003ethanaphiphat/iStock/Thinkstock\u003c/p\u003e\u003c/div\u003e\n\n\n\u003cp\u003ePeople are quick to treat computer programs as being \u003ca href=\"https://chatbotsmagazine.com/why-people-treat-bots-like-people-1c3d7afafca8\"\u003emore aware than computer code could be\u003c/a\u003e. 
However, can computer programs – especially chatbots – have a significant effect on people, especially those suffering from depression or anxiety?\u003c/p\u003e\n\u003cdiv class=\"image image-right\"\u003e\n  \u003cimg\n    src=\"https://s3.amazonaws.com/digitalgov/woebotapp_w200.png\" alt='Screen capture of a mobile phone with Woebot chat messages.'\n    srcset=\"https://s3.amazonaws.com/digitalgov/woebotapp_bu.jpg 48w,https://s3.amazonaws.com/digitalgov/woebotapp_w400.png 400w,https://s3.amazonaws.com/digitalgov/woebotapp_w200.png 200w\"\n    sizes=\"(max-width: 600px) 40vw, 400px\"\n  /\u003e\u003c/div\u003e\n\u003cp\u003eA recent academic study shows some promising results from \u003ca href=\"https://mental.jmir.org/2017/2/e19/\"\u003eusing a chatbot as a complement to a human therapist\u003c/a\u003e. Called \u003ca href=\"https://www.woebot.io/\"\u003eWoebot\u003c/a\u003e, the chatbot used \u003ca href=\"https://psychcentral.com/lib/in-depth-cognitive-behavioral-therapy/\"\u003ecognitive behavioral therapy\u003c/a\u003e techniques to “\u003ca href=\"https://chatbotsmagazine.com/a-therapist-bot-actually-works-e27c72b9632e\"\u003ehelp people identify and manage symptoms of anxiety and depression, through identifying and changing patterns of distorted negative thinking\u003c/a\u003e.” Students who used Woebot were found to have significantly reduced symptoms of anxiety and depression compared to students who only used a self-help book. Please note that this is only one study of 70 students over a two-week period, but it shows promise.\u003c/p\u003e\n\u003cp\u003eOn September 30, 2015, the \u003ca href=\"/preview/gsa/digitalgov.gov/bc-archive-content-3/2015/10/07/the-data-briefing-white-house-asks-data-scientists-and-app-developers-to-help-suicide-prevention-efforts/\"\u003eWhite House called on data scientists and app developers to help in creating suicide prevention apps\u003c/a\u003e. 
A chatbot like Woebot is on call 24/7, does not judge, and can keep the person talking while first responders are being called to the scene. Similarly, Facebook is working on \u003ca href=\"https://www.scientificamerican.com/article/can-facebooks-machine-learning-algorithms-accurately-predict-suicide/\"\u003emachine-learning algorithms to detect suicidal thoughts in Facebook postings\u003c/a\u003e. The Facebook algorithms could be embedded in apps provided by the Department of Veterans Affairs or the Substance Abuse and Mental Health Services Administration.\u003c/p\u003e\n\u003cp\u003e\u003ca href=\"http://affect.media.mit.edu/\"\u003eAffective computing\u003c/a\u003e is a growing computer science field that researches how machines can detect and respond to human emotion. Thanks to affective computing research, computer programs can now successfully detect human emotions by scanning faces. Companies are now working to embed emotion-detecting sensors into \u003ca href=\"http://www.telegraph.co.uk/technology/2016/01/21/affective-computing-how-emotional-machines-are-about-to-take-ove/\"\u003ewearables, smartphones, and other Internet-of-Things devices\u003c/a\u003e. Imagine how emotion-detecting technology can help improve relations between citizens and the American government.\u003c/p\u003e\n\u003cp\u003eDetecting emotions while a citizen is on the phone with a government representative may seem like a joke, but think about how an affective computing chatbot could detect when a citizen is becoming frustrated during a call to a government agency. Calming techniques could lower the tension and steer the interaction toward a more productive result. Emotion-detecting chatbots could also help veterans deal with emotional issues brought on by post-traumatic stress disorder or wartime injuries. 
Many affective computing applications can help people live better and happier lives through effective emotional management.\u003c/p\u003e\n\u003cp\u003eAffective computing can also be a dangerous tool of manipulation. Here, too, the government must step in to guard against the misuse of affective computing and to protect the American public from emotional manipulation and abuse. \u003ca href=\"http://www.zdnet.com/article/emotional-intelligence-is-the-future-of-artificial-intelligence-fjord/\"\u003eArtificial emotional intelligence has great potential\u003c/a\u003e, but like all technologies, it must be used wisely.\u003c/p\u003e\n\u003chr\u003e\n\u003cp\u003e\u003cstrong\u003eDisclaimer\u003c/strong\u003e: All references to specific brands and/or companies are used only for illustrative purposes and do not imply endorsement by the U.S. federal government or any federal government agency.\u003c/p\u003e\n\u003cp\u003e\u003cem\u003eEach week, \u003ca href=\"/preview/gsa/digitalgov.gov/bc-archive-content-3/topics/emerging-tech/\"\u003eThe Data Briefing\u003c/a\u003e showcases the latest federal data news and trends. Visit this blog every week to learn how data is transforming government and improving government services for the American people. If you have ideas for a topic or have questions about government data, please contact me via email.\u003c/em\u003e\u003c/p\u003e\n\u003cp\u003e\u003cem\u003e\u003ca href=\"/preview/gsa/digitalgov.gov/bc-archive-content-3/authors/bbrantley/\"\u003eDr. William Brantley\u003c/a\u003e is the Training Administrator for the U.S. Patent and Trademark Office’s Global Intellectual Property Academy. All opinions are his own and do not reflect the opinions of the USPTO or GSA.\u003c/em\u003e\u003c/p\u003e\n"}
  ]
}
