The prompt
Copy and paste this into your AI tool of choice. The prompt assumes you can attach or paste the input documents inline; substitute as needed for the tool’s interface.
You are a medical records analyst. The user will provide a set of medical records related to a personal-injury case. Your task is to produce a structured, date-ordered chronology of the patient's medical encounters relevant to the injury and treatment, formatted for use in a demand letter, mediation memorandum, or trial exhibit.
Format the output as a chronological table with these columns:
DATE: The encounter date in YYYY-MM-DD. Where the records show only month and year, use YYYY-MM-XX.
PROVIDER: The healthcare provider or facility name as it appears in the record.
ENCOUNTER_TYPE: One of: ER, urgent care, primary care, specialist, physical therapy, surgery, imaging, lab, follow-up, or other.
CHIEF_COMPLAINT: The patient-reported complaint, in the patient's words where possible.
EXAMINATION_FINDINGS: Objective findings from the encounter (vital signs, physical examination, observations).
DIAGNOSTIC_TESTS: Tests ordered or performed at this encounter (X-ray, MRI, CT, lab panel, etc.).
ASSESSMENT: The provider's assessment / diagnosis.
PLAN: The treatment plan from the encounter.
MEDICATIONS: Medications prescribed, refilled, or discontinued.
RECORD_PAGE: The page number(s) in the produced records where this encounter appears.
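If the chronology will feed downstream tooling, the column set above maps naturally to a record type. This is an illustrative sketch only: the field names mirror the prompt's columns, and the sample values are invented, not taken from any real record.

```python
from dataclasses import dataclass

@dataclass
class ChronologyRow:
    """One encounter row in the chronology table.

    Field names mirror the prompt's column list; this schema is
    illustrative, not something the prompt itself mandates.
    """
    date: str                  # "YYYY-MM-DD", or "YYYY-MM-XX" if only month/year appear
    provider: str              # provider or facility name as written in the record
    encounter_type: str        # ER, urgent care, primary care, specialist, ...
    chief_complaint: str       # subjective: patient-reported, verbatim where possible
    examination_findings: str  # objective: vitals, physical exam, observations
    diagnostic_tests: str      # tests ordered or performed at this encounter
    assessment: str            # provider's assessment/diagnosis, verbatim
    plan: str                  # treatment plan from the encounter
    medications: str           # prescribed, refilled, or discontinued
    record_page: str           # page number(s) in the produced records

# Hypothetical example row (all values invented for illustration)
row = ChronologyRow(
    date="2023-04-17", provider="Mercy General ED", encounter_type="ER",
    chief_complaint='"neck pain after rear-end collision"',
    examination_findings="Cervical paraspinal tenderness; ROM limited",
    diagnostic_tests="Cervical X-ray",
    assessment="Cervical strain", plan="NSAIDs; follow up with PCP",
    medications="Ibuprofen 600 mg", record_page="14-18",
)
```

Keeping subjective (`chief_complaint`) and objective (`examination_findings`) data in separate fields enforces the subjective/objective constraint listed below at the schema level.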
After the chronology table, produce three summary sections:
SECTION 1: TREATMENT TIMELINE NARRATIVE
A 2-4 paragraph narrative describing the treatment course in chronological order, focused on the injury at issue. Use neutral, factual language. Cite to the table rows by date.
SECTION 2: KEY ENTRIES
The 5-10 entries most likely to be material to a damages or liability analysis, with the reason each is material noted in one sentence.
SECTION 3: GAPS, INCONSISTENCIES, AND FLAGS
- Records gaps (e.g., a 6-month period with no entries)
- Inconsistencies between providers (e.g., one provider notes "no prior history" while another notes prior treatment)
- Pre-existing conditions referenced in the records
- Indications of subsequent injury or aggravation
- Quality-of-records flags (illegible, partial, missing pages)
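The records-gap flag in Section 3 is mechanical enough to double-check in code rather than by eye. A minimal sketch, with two stated assumptions: the 90-day threshold is arbitrary (adjust per case), and "YYYY-MM-XX" month-only dates are approximated to the first of the month.

```python
from datetime import date

def find_gaps(encounter_dates, min_gap_days=90):
    """Return (start, end, days) tuples for gaps longer than min_gap_days.

    encounter_dates: ISO strings from the DATE column; "YYYY-MM-XX"
    entries (month/year only) are approximated to the 1st of the month.
    """
    def parse(d):
        y, m, day = d.split("-")
        return date(int(y), int(m), 1 if day == "XX" else int(day))

    ds = sorted(parse(d) for d in encounter_dates)
    gaps = []
    for a, b in zip(ds, ds[1:]):
        delta = (b - a).days
        if delta > min_gap_days:
            gaps.append((a.isoformat(), b.isoformat(), delta))
    return gaps

# Hypothetical dates: a roughly six-month gap before a month-only entry
print(find_gaps(["2023-04-17", "2023-05-02", "2023-11-XX"]))
# → [('2023-05-02', '2023-11-01', 183)]
```

Every gap the script surfaces still needs the lawyer's confirmation against the source records, per the verification section below.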
Constraints:
- Use the records' language verbatim for diagnoses and assessments. Do not paraphrase clinical findings.
- Where an entry is unclear or incomplete in the records, write [UNCLEAR] or [PARTIAL] rather than guessing.
- Distinguish between what the patient reported (subjective) and what the provider observed/measured (objective). Both go in the same row but in different columns.
- Maintain HIPAA-conformant handling: patient name, date of birth, and medical record number can be referenced once at the head of the chronology and abbreviated thereafter (e.g., "Pt." rather than the full name).
- Do not include any analysis of legal causation. The chronology is fact-extraction; legal causation is the lawyer's analysis.
Output in this order:
1. Patient identifier block (one line: name, DOB, MRN, date range covered).
2. Chronology table.
3. Treatment timeline narrative.
4. Key entries.
5. Gaps, inconsistencies, and flags.
Input
Input format
Medical records produced by treating providers. Typically PDFs from hospital medical-records departments, urgent-care centres, primary-care offices, specialists, physical therapists, and imaging centres.
For best results: bookmark the records by provider in the PDF and ensure the PDF text is searchable (run OCR on scanned records before submitting if necessary).
For privacy: ensure the AI tool you are using has an executed Business Associate Agreement covering PHI processing. Do not use consumer-tier AI for medical records.
Expected output
Output format
A structured chronology table (one row per encounter, typically 30-200 rows depending on case complexity) followed by three analysis sections. Total length 8-25 pages depending on records volume.
The chronology is structured for direct copy-paste into a demand letter, mediation memorandum, or expert-disclosure exhibit. The table format converts cleanly to Excel.
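When the tool emits the table as markdown rather than a native spreadsheet, conversion to a CSV that Excel opens directly is straightforward. A hedged sketch using only the standard library; it assumes a well-formed pipe table (header row, `---` separator row, data rows) and does not handle escaped `\|` characters inside cells.

```python
import csv
import io

def markdown_table_to_csv(md: str) -> str:
    """Convert a markdown pipe table to CSV text (openable in Excel).

    Assumes a well-formed table: header row, '---' separator, data rows.
    """
    out = io.StringIO()
    writer = csv.writer(out)
    for line in md.strip().splitlines():
        cells = [c.strip() for c in line.strip().strip("|").split("|")]
        if all(set(c) <= set("-: ") for c in cells):  # skip the |---|---| separator row
            continue
        writer.writerow(cells)
    return out.getvalue()

# Hypothetical three-column fragment of a chronology table
table = """
| DATE | PROVIDER | ENCOUNTER_TYPE |
| --- | --- | --- |
| 2023-04-17 | Mercy General ED | ER |
"""
print(markdown_table_to_csv(table))
```

Using `csv.writer` rather than joining on commas keeps cells containing commas (common in EXAMINATION_FINDINGS and PLAN) correctly quoted.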
Verification — what the lawyer must do after
- Sample-check verification. Pick 10-20 chronology rows at random and verify each against the source records. AI chronology accuracy on medical records is typically 90-95%; a random sample check estimates the residual error rate and surfaces systematic extraction failures before the chronology is relied on.
- Verify the gaps section. The gaps the AI flags are exactly what defence counsel will probe at deposition. Confirm each is real before the document leaves the firm.
- Cross-check pre-existing conditions. Pre-existing references in the records may or may not be material; the lawyer’s judgement determines materiality. The AI extracts; the lawyer evaluates.
- HIPAA chain-of-custody. Document the chain of custody from the records producer to the AI tool to the firm's case file. Do not retain the chronology output beyond the firm's standard records-retention policy.
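The random sample in the first verification step is worth actually randomizing: ad-hoc "random" picks tend to skew toward memorable rows. A minimal sketch, assuming the chronology rows are numbered 1..N; the `seed` parameter is an optional convenience so the sample can be reproduced if the check is later questioned.

```python
import random

def sample_rows(n_rows, k=15, seed=None):
    """Pick k distinct 1-based row numbers to verify against the source records.

    Recording the seed makes the sample reproducible if the
    verification process is ever questioned.
    """
    rng = random.Random(seed)
    return sorted(rng.sample(range(1, n_rows + 1), min(k, n_rows)))

# Hypothetical 120-row chronology, 15 rows checked, seed recorded in the file memo
print(sample_rows(120, k=15, seed=42))
```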
⚠ Risks and failure modes
- HIPAA risk: Using an AI tool without an executed BAA to process PHI is a HIPAA violation regardless of how good the chronology is. Confirm the BAA before uploading.
- Specialist-terminology risk: General-purpose AI sometimes mishandles specialist medical abbreviations (e.g., reading “c/o” as “care of” rather than “complains of”). Specialist medical-records tools handle these reliably; general AI does not always.
- Demand-letter risk: The chronology is the foundation for the demand letter; the demand letter is not the chronology. Use the chronology as input; do not let AI write the demand-letter prose. (See the IXSOR Personal Injury case study.)
- Discovery-disclosure risk: The chronology may be discoverable in subsequent litigation if not protected as work-product. Generate it through firm-internal tools, not consumer-tier AI, to maintain the work-product claim.
Vendor compatibility
Use a HIPAA-compliant tool with an executed Business Associate Agreement. Specialist tools (RecordsONE, Casepoint, the medical-records modules in CoCounsel and Lexis+ AI) outperform general-purpose AI on specialist medical terminology. For simpler chronologies, Claude Enterprise or GPT-4 Enterprise work; verify the BAA covers PHI processing.
Citations and further reading
- Health Insurance Portability and Accountability Act (HIPAA), 42 U.S.C. § 1320d et seq.
- IXSOR: AI in Personal Injury — vendor-agnostic case study.
- ABA Formal Opinion 512.
- United States v. Heppner — case authority on consumer-tier vs enterprise-tier AI for client documents.
- IXSOR: Federal Rule of Evidence 707 — relevant for any chronology used as substantive evidence.