Interact Conversationally with AWS HealthLake

Originally published on aws.amazon.com.

Large language models (LLMs) are revolutionizing the way we interact with technology, and AWS HealthLake is no exception. HealthLake is a secure, HIPAA-eligible data lake that allows healthcare organizations to store, transform, and analyze their health data at scale. By combining HealthLake with an LLM, healthcare providers can interact conversationally with their data, gaining insights and making decisions faster than ever before.

One way to use an LLM with HealthLake is through a chatbot interface. A chatbot is a computer program that uses natural language processing (NLP) to simulate a human conversation.

With a chatbot, healthcare providers can ask questions about their data and receive real-time answers. Users can ask questions about both their structured data (e.g., Electronic Health Records (EHR)) and their unstructured data (e.g., doctor’s notes). For example, a provider could ask the chatbot, “What is the average LDL value for all patients since 2017?” In this case, the chatbot would construct a structured query language (SQL) query, run it against HealthLake, analyze the results, and return the answer.

Or, consider a case where Amazon Kendra has indexed the doctor’s notes stored in HealthLake. A clinician could ask the chatbot to “Search doctor’s notes for Tommy814’s socioeconomic status.” The chatbot could extract the key terms (Tommy814 and socioeconomic status), query Kendra, analyze the top suggestions, and summarize them as they relate to the question.
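To make the structured-data example concrete, here is a minimal Python sketch of the SQL path. It assumes details the post does not spell out: that the HealthLake FHIR data has been exported to Amazon S3 and registered as an Athena table named `observation` (with hypothetical columns `patient_id`, `code_text`, `value`, and `issued`), and that an Anthropic Claude model on Amazon Bedrock is used to translate the question into SQL. Database, table, bucket, and model names are placeholders, not the original implementation.

```python
"""Sketch: answer a natural-language question about structured FHIR data.

Assumptions (not from the original post): HealthLake data is queryable
through an Athena table named "observation", and a Claude model on
Amazon Bedrock generates the SQL.
"""
import time

import boto3

bedrock = boto3.client("bedrock-runtime")
athena = boto3.client("athena")

# Hypothetical names -- replace with your own database and output bucket.
ATHENA_DATABASE = "healthlake_export"
ATHENA_OUTPUT = "s3://my-athena-results/healthlake/"


def question_to_sql(question: str) -> str:
    """Ask the LLM to translate the user's question into Athena SQL."""
    prompt = (
        "Translate the question into Athena SQL for a table named "
        "'observation' with columns patient_id, code_text, value, issued. "
        "Return only the SQL.\n\nQuestion: " + question
    )
    response = bedrock.converse(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # example model
        messages=[{"role": "user", "content": [{"text": prompt}]}],
    )
    return response["output"]["message"]["content"][0]["text"].strip()


def run_query(sql: str) -> list:
    """Run the generated SQL in Athena and return the raw result rows."""
    execution = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": ATHENA_DATABASE},
        ResultConfiguration={"OutputLocation": ATHENA_OUTPUT},
    )
    query_id = execution["QueryExecutionId"]
    while True:
        state = athena.get_query_execution(QueryExecutionId=query_id)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(1)
    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query ended in state {state}")
    return athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]


if __name__ == "__main__":
    sql = question_to_sql(
        "What is the average LDL value for all patients since 2017?")
    print(run_query(sql))
```

In practice the chatbot would also pass the query results back to the LLM to phrase a conversational answer, rather than returning raw rows.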
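The unstructured-data example can be sketched the same way. The snippet below assumes the doctor’s notes have already been indexed in an Amazon Kendra index (the index ID is a placeholder) and reuses the same Bedrock model to summarize the top excerpts; it is an illustration of the retrieve-then-summarize pattern described above, not the post’s actual implementation.

```python
"""Sketch: retrieve and summarize unstructured notes with Amazon Kendra.

Assumptions (not from the original post): the notes are indexed under
KENDRA_INDEX_ID, and a Claude model on Bedrock writes the summary.
"""
import boto3

kendra = boto3.client("kendra")
bedrock = boto3.client("bedrock-runtime")

KENDRA_INDEX_ID = "REPLACE-WITH-YOUR-INDEX-ID"  # hypothetical placeholder


def search_and_summarize(question: str, top_k: int = 3) -> str:
    # Query Kendra with the clinician's question.
    results = kendra.query(IndexId=KENDRA_INDEX_ID, QueryText=question)
    excerpts = [
        item["DocumentExcerpt"]["Text"]
        for item in results.get("ResultItems", [])[:top_k]
        if "DocumentExcerpt" in item
    ]
    # Ask the LLM to summarize the excerpts as they relate to the question.
    prompt = (
        f"Question: {question}\n\nDoctor's note excerpts:\n"
        + "\n---\n".join(excerpts)
        + "\n\nSummarize these excerpts as they relate to the question."
    )
    response = bedrock.converse(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # example model
        messages=[{"role": "user", "content": [{"text": prompt}]}],
    )
    return response["output"]["message"]["content"][0]["text"]


if __name__ == "__main__":
    print(search_and_summarize(
        "Search doctor's notes for Tommy814's socioeconomic status"))
```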