{"id":2250,"date":"2019-03-19T16:55:11","date_gmt":"2019-03-19T16:55:11","guid":{"rendered":"https:\/\/www.emrsystems.net\/blog\/?p=2250"},"modified":"2019-03-19T16:55:11","modified_gmt":"2019-03-19T16:55:11","slug":"emr-usage-easier-with-voice-tools-ai-and-nlp","status":"publish","type":"post","link":"https:\/\/emrsystems.net\/blog\/emr-usage-easier-with-voice-tools-ai-and-nlp\/","title":{"rendered":"EMR Usage Easier with Voice Tools, AI and NLP"},"content":{"rendered":"<p style=\"text-align: justify;\">Voice-based documentation, Natural Language Processing (NLP), and Artificial Intelligence (AI) can eliminate many of the pain points of using Electronic Health Records (EHR) software.<\/p>\n<p style=\"text-align: justify;\">Machine learning and Artificial Intelligence (AI) are relatively new additions to the healthcare industry. However, many healthcare providers have already been using pieces of these technologies for a long time without even realizing it.<\/p>\n<p style=\"text-align: justify;\">In many diagnostic disciplines, such as <a href=\"https:\/\/www.emrsystems.net\/pathology-emr\/\">Pathology<\/a>\u00a0and <a href=\"https:\/\/www.emrsystems.net\/radiology-imaging-emr\/\">Radiology<\/a>, voice recognition tools are mainly used to document clinical reports and notes in\u00a0<a href=\"https:\/\/www.emrsystems.net\/all-emr-software\/\">Electronic Medical Records (EMR) software<\/a>.<\/p>\n<p style=\"text-align: justify;\"><strong>Voice-to-Text\/Speech-to-Text Applications<\/strong><\/p>\n<p style=\"text-align: justify;\">Voice-to-text applications are also quickly gaining popularity in practices and in primary care. These speech-to-text applications are built on NLP (Natural Language Processing), a common form of machine learning. 
Its purpose is to turn spoken audio into text.<\/p>\n<p style=\"text-align: justify;\">Machine learning is also used to identify useful elements within the text, such as the name of a medication and its dosage.<\/p>\n<p style=\"text-align: justify;\">For physicians and healthcare providers across the globe, natural language processing can revolutionize how they interact with the\u00a0<a href=\"https:\/\/www.emrsystems.net\/\">EMR system<\/a>.<\/p>\n<p style=\"text-align: justify;\">R. Hal Baker, Senior VP of Clinical Information and Chief Information Officer at WellSpan, said, \u201cVoice recognition can help move the EHR from its necessary function as a documentation tool for the business of medicine into a communication tool for the practice of medical care. The two functions are intertwined, but they are also distinctly different.\u00a0 Voice tools and natural language processing make sure that our providers can convey meaning and context using the full breadth of the English language without succumbing to many of the challenges we often see with EHR use.\u201d<\/p>\n<p style=\"text-align: justify;\">No matter how smooth or productive a workflow is by design, asking doctors, nurses, and other healthcare providers to juggle several complex tasks at once remains a problem.<\/p>\n<p style=\"text-align: justify;\">Providers should be giving patients the best treatment without dividing their attention. 
They should make use of all the features and applications provided in the Electronic Health Records (EHR) software system.<\/p>\n<p style=\"text-align: justify;\">Baker further said, \u201cEveryone\u2019s capacity for attention is limited; that\u2019s why we tell people not to text and drive.\u00a0 You simply cannot stay focused on both tasks at once, and one is a lot more critical for safety and getting where you\u2019re going than the other.\u201d He added, \u201cTexting and treating is exactly the same.\u00a0 You\u2019re asking a provider to manage two discordant tasks at the same time.\u00a0 They compete for focus, and as a result, you\u2019re going to miss important parts of both.\u201d<\/p>\n<p style=\"text-align: justify;\">Experts in other fields are not expected to multitask this way, either.<\/p>\n<p style=\"text-align: justify;\">Pointing to how business executives handle meetings, he said, \u201cI know very few board chairs or senior executives who try to type notes when they\u2019re running a business meeting; it\u2019s not what they\u2019re in the room to do.\u00a0 It\u2019s the same with healthcare providers \u2013 being able to sit at the keyboard and type notes is very rarely what attracted these people to medical school or nursing school.\u201d<\/p>\n<p style=\"text-align: justify;\">Across the country, provider burnout is reaching high levels. A recent survey showed that 83% of healthcare organizations are unable to meet patients\u2019 expectations when it comes to documentation and administrative issues.<\/p>\n<p style=\"text-align: justify;\">\u201cTime has turned into the currency of healthcare in the modern era. Right now, the amount of time spent looking at a screen and clicking a mouse is\u00a0becoming unsustainable. 
We\u00a0need to start employing new strategies to solve the problem.\u201d<\/p>\n<p style=\"text-align: justify;\">Voice recognition tools could anchor that new strategy. \u201cThe promise of voice recognition goes beyond dictating clinical notes as a replacement for typing or hand-writing them. In a perfect world, we\u2019ll be able to have a narrative conversation without even thinking about how it\u2019s being recorded,\u201d he stated.<\/p>\n<p style=\"text-align: justify;\"><strong>Interaction with the EMR and Practice Management (PM)<\/strong><\/p>\n<p style=\"text-align: justify;\">Natural language processing tools are not yet refined enough to completely replace conventional interaction with the EMR and\u00a0<a href=\"https:\/\/www.emrsystems.net\/practice-management-emr\/\">Practice Management (PM) software<\/a>. However, Baker strongly believes they are getting close.<\/p>\n<p style=\"text-align: justify;\">He asserted, \u201cProducts like Alexa, Siri, and Google Home have shown us that\u00a0voice recognition with AI behind it\u00a0can do a pretty good job of following verbal instructions. As the industry refines those capabilities, it\u2019s becoming much less of a leap to think that I\u2019ll be able to say, \u2018We are going to put Mrs. Smith on 500mg of amoxicillin, four times a day for seven days.\u00a0 Send that prescription to the Walgreens on Queens Street.\u2019\u201d<\/p>\n<p style=\"text-align: justify;\">Those two sentences carry fairly simple meaning, and current virtual assistants may be able to execute such orders once they achieve HIPAA compliance in the coming years.<\/p>\n<p style=\"text-align: justify;\">\u201cBut if I were to enter that order into the EHR myself, it would take me somewhere between 8 and 15 clicks and several keystrokes. 
It would be a major benefit to my relationship with my patient if I could simply say that sentence out loud, confirm it with the patient, and continue our conversation without losing my focus on the person in front of me,\u201d said Baker.<\/p>\n<p style=\"text-align: justify;\"><strong>Natural Language Processing and Voice Tools<\/strong><\/p>\n<p style=\"text-align: justify;\">The healthcare industry is in the early stages of realizing that vision, but natural language processing and voice tools are already providing a better interactive experience for patients and providers and reducing the complexity of working with electronic health record software.<\/p>\n<p style=\"text-align: justify;\">At the WellSpan health organization, Nuance voice recognition technology and\u00a0<a href=\"https:\/\/www.emrsystems.net\/epic-ehr-software\/\">Epic EHR software<\/a>\u00a0have been paired to support a more natural and efficient patient-provider conversation.<\/p>\n<p style=\"text-align: justify;\">Baker said, \u201cThe goal is to do things\u00a0<em>with<\/em>\u00a0the patient, not\u00a0<em>to<\/em>\u00a0them. I\u2019m a primary care provider by background, and when I dictate my notes in front of the patient, he or she gets to hear what I\u2019m saying and make sure that it\u2019s correct.\u00a0 If I\u2019m wrong, I can just go back and fix the error right there with their confirmation. 
It\u2019s a much more cooperative approach \u2013 not to mention a more efficient one.\u00a0 I can talk to both the record and the patient at the same time, so I don\u2019t have to walk out of the room and recount the entire visit again at some later time.\u00a0 That lets me spend a greater percentage of my time in the patient\u2019s presence.\u201d<\/p>\n<p style=\"text-align: justify;\">WellSpan has extended this cooperative approach by taking an active part in OpenNotes, which lets patients access their complete health record through a patient portal, greatly reducing their frustration.<\/p>\n<p style=\"text-align: justify;\">\u201cWe find that being transparent with the patient from the beginning of the documentation process is a significant benefit,\u201d said Baker. \u00a0\u201cPeople feel more invested in their care, and even more confident in their provider and their data, because they participated in the process of creating their own record and they have experienced their provider listening to them. Patients have a baseline expectation that they\u2019re being listened to, but there are a lot of situations where that isn\u2019t completely evident.\u00a0 It\u2019s very clear that they are the provider\u2019s priority when they\u2019re hearing their story repeated back to them.\u00a0 It\u2019s a much different experience than asking the patient to wait quietly while the provider puts his head down and types for five minutes.\u201d<\/p>\n<p style=\"text-align: justify;\">Patients and providers are not the only ones who benefit from dictation tools. 
This type of documentation is reliable, efficient, and of very high quality, and it can more readily be used for downstream analytics.<\/p>\n<p style=\"text-align: justify;\">When NLP tools can identify important features within the text and distill those elements into structured data, providers can interact naturally with the health record, according to Baker.<\/p>\n<h3 style=\"text-align: justify;\">Human Medical Scribes<\/h3>\n<p style=\"text-align: justify;\">Human medical scribes have been gaining popularity among healthcare providers who want to reduce multitasking without relying on virtual assistants.<\/p>\n<p style=\"text-align: justify;\">According to the American College of Medical Scribe Specialists, the profession is going to see immense growth in the near future.<\/p>\n<p style=\"text-align: justify;\">In 2015, about 15,000 scribes were active in hospitals and ambulatory settings. However, by 2020, the number is expected to reach 100,000 as clinics want more help with documentation, according to the organization.<\/p>\n<p style=\"text-align: justify;\">Baker acknowledged, \u201cScribes can be a very viable option, especially because humans still have a better ability to interpret the subtleties of language than a virtual assistant. 
\u00a0Scribes\u00a0have been used effectively\u00a0in several settings to improve the efficiency of providers, and they can play a valuable role.\u201d<\/p>\n<p style=\"text-align: justify;\">\u201cBut I believe the patient-provider dynamic subtly changes when there\u2019s a third person in the room,\u201d he added.\u00a0 \u201cThe sense of confidentiality changes, through no fault of the scribe themselves.\u00a0 You could think about utilizing a human scribe remotely, through video or audio, but then you are running into new questions of data privacy and security, not to mention infrastructure investment.\u201d<\/p>\n<p style=\"text-align: justify;\">He also noted that scribes are only human and run the same risk of distraction or confusion as the provider.<\/p>\n<p style=\"text-align: justify;\">\u201cIn contrast to people, computers are eternally vigilant. They don\u2019t accidentally tune out; they don\u2019t think about what\u2019s for lunch.\u00a0 Computers might make mistakes, but we can go back into the records and look at exactly what the mistake was and why it was made \u2013 and we can improve their programming so that they won\u2019t make that mistake again,\u201d said Baker.<\/p>\n<p style=\"text-align: justify;\">\u201cIt would be a very different world if we could do that with humans, but we can\u2019t.\u00a0 So there\u2019s an advantage there to using virtual assistants or ambient computing devices that can take some of the variability out of the equation.\u201d<\/p>\n<p style=\"text-align: justify;\">While there is still work to be done before voice recognition virtual assistants are ready for routine use, NLP tools are already live and are improving providers\u2019 interaction with their electronic health record software.<\/p>\n<p style=\"text-align: justify;\">\u201cVoice has untapped potential to keep improving the provider experience, as well as the patient experience,\u201d he said.<\/p>\n<p style=\"text-align: 
justify;\">\u201cI believe this is a very good place to be putting the creative energy of healthcare, because provider exhaustion and burnout are affecting nurses, physicians, and just about everyone else involved in care right now.\u00a0 We need creative solutions, and I firmly believe voice-based tools are going to be a major part of that process.\u201d<\/p>\n<p>&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Voice-based documentation, Natural Language Processing (NLP), and Artificial Intelligence (AI) are capable of eradicating many issues while using Electronic Health Records (EHR) software. Machine learning and Artificial Intelligence (AI) are the new features and applications that have been added to the healthcare industry. However, some of the healthcare providers have already been using some pieces <a href=\"https:\/\/emrsystems.net\/blog\/emr-usage-easier-with-voice-tools-ai-and-nlp\/\"> [&#8230;]<\/a><\/p>\n","protected":false},"author":5,"featured_media":2246,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"om_disable_all_campaigns":false,"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":"","_links_to":"","_links_to_target":""},"categories":[9,11,13,19,21],"tags":[66,140,147,164,172,268,393],"class_list":["post-2250","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-ehr-software","category-electronic-medical-records","category-services-emr","category-healthcare-news","category-hipaa","tag-artificial-intelligence","tag-ehr","tag-ehr-software","tag-emr","tag-emr-software","tag-machine-leaning","tag-voice-recognition"],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/emrsystems.net\/blog\/wp-json\/wp\/v2\/posts\/2250","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/emrsystems.net\/blog\/wp-json\/wp\/v2\/p
osts"}],"about":[{"href":"https:\/\/emrsystems.net\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/emrsystems.net\/blog\/wp-json\/wp\/v2\/users\/5"}],"replies":[{"embeddable":true,"href":"https:\/\/emrsystems.net\/blog\/wp-json\/wp\/v2\/comments?post=2250"}],"version-history":[{"count":0,"href":"https:\/\/emrsystems.net\/blog\/wp-json\/wp\/v2\/posts\/2250\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/emrsystems.net\/blog\/wp-json\/wp\/v2\/media\/2246"}],"wp:attachment":[{"href":"https:\/\/emrsystems.net\/blog\/wp-json\/wp\/v2\/media?parent=2250"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/emrsystems.net\/blog\/wp-json\/wp\/v2\/categories?post=2250"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/emrsystems.net\/blog\/wp-json\/wp\/v2\/tags?post=2250"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}