PHYSICIAN MEDICAL REPORT SUMMARIZATION USING A FINE-TUNED GOOGLE FLAN-T5 MODEL

ICTACT Journal on Data Science and Machine Learning (Volume: 7, Issue: 1)

Abstract

The complexity of clinical documents often hinders patients' understanding, motivating intelligent systems that automatically summarize medical reports. This paper introduces a patient-centred approach based on a Google Flan-T5 transformer fine-tuned with LoRA (Low-Rank Adaptation) for concise, domain-adaptive learning. The model accepts PDF inputs and generates simplified, fluent, and verifiable summaries understandable to lay readers. Evaluated with ROUGE-1 (0.6916), ROUGE-2 (0.5045), ROUGE-L (0.6390), and BLEU (0.4584), the model demonstrates strong phrasal precision and lexical recall. Readability metrics such as Flesch Reading Ease (47.92) and Grade Level (9.74) confirm usability. Implemented as a Flask-based web application with a React frontend, the system provides real-time responses and generalizes across physician clinical inputs.
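The abstract describes fine-tuning Flan-T5 with LoRA adapters for summarization. The following is a minimal sketch of that setup using the Hugging Face transformers and peft libraries; the checkpoint size, LoRA rank, and target modules are illustrative assumptions, not details reported in the paper.

```python
# Minimal sketch (not the authors' released code): LoRA fine-tuning setup for
# Flan-T5 summarization with Hugging Face transformers + peft.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
from peft import LoraConfig, get_peft_model, TaskType

base = "google/flan-t5-base"  # assumed checkpoint; the paper does not state the size
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSeq2SeqLM.from_pretrained(base)

# Low-Rank Adaptation: small rank-r update matrices are trained on the attention
# projections while the original Flan-T5 weights stay frozen.
lora_cfg = LoraConfig(
    task_type=TaskType.SEQ_2_SEQ_LM,
    r=16, lora_alpha=32, lora_dropout=0.05,   # assumed hyperparameters
    target_modules=["q", "v"],                # T5 attention query/value projections
)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()  # only the LoRA adapters are trainable

# Inference: generate a simplified summary from report text (e.g. extracted from a PDF)
report_text = "summarize: Patient presents with ..."  # placeholder input
inputs = tokenizer(report_text, return_tensors="pt", truncation=True, max_length=1024)
summary_ids = model.generate(**inputs, max_new_tokens=200, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

In a deployment like the one described, this generation step would sit behind a Flask endpoint that receives the extracted PDF text and returns the summary to the React frontend.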

Authors

H.Y. Vani, Anirudh Bhat, U. Anupama, S.G. Bhoomika, Harish Hebbar
JSS Science and Technology University, India

Keywords

Medical Report Summarization, Fine-Tuned LLM, T5 Transformer, Human-Computer Interaction, SentencePiece Tokenizer, NLP, Doctor-Patient Interface, Healthcare Accessibility

Published By
ICTACT
Published In
ICTACT Journal on Data Science and Machine Learning
(Volume: 7, Issue: 1)
Date of Publication
December 2025
Pages
938 - 942