Oct 4, 2024


AI in Recordkeeping: Responsibility, Control, and Documentation for Dentists

AI-based functions are increasingly used in and around recordkeeping – such as speech-to-text, summaries, text suggestions, and automatic “fill-ins” based on previous notes. For many clinics, this promises more efficient documentation work.

At the same time, the record is a core document in healthcare: it must be verifiable, understandable to other qualified healthcare professionals, and linked to whoever actually provided the information.

This article explains where the question of responsibility arises when AI is used in recordkeeping in dental clinics and what management and documentation principles should typically be in place to ensure control and traceability in practice.

What is meant by AI use in dental care?

In the context of recordkeeping, “AI use” often means that systems help produce, structure, or suggest record text – without it always being perceived as “an AI tool.” This can include functions like:

  • Speech-to-text and automatic transcription: the clinician dictates, the system drafts.

  • Summarization and structuring: long notes are compressed, or information is suggested to be placed in fixed fields.

  • Text suggestions and standard formulations: the system suggests sentences based on context (for example, symptoms, procedures, previous notes).

  • Reuse and autofill: previous record data is used to suggest content in new notes.

The common theme is that AI can affect what is written, how it is formulated, and which details are emphasized or omitted. This makes recordkeeping more vulnerable to gradual drift: drafts may appear “almost finished,” and review may be less thorough than assumed.

Where does the responsibility question arise?

The responsibility question rarely arises because “AI makes a mistake” in isolation. It arises when the clinic's practices do not clearly delineate who is responsible for content, quality, and verifiability – and when it is not possible to reconstruct what actually happened.

Typical situations include:

  1. When AI produces text that looks plausible
    Record text can be linguistically correct, yet professionally inaccurate, overly general, or incorrectly nuanced. This is especially relevant for summaries and text suggestions, where small changes can impact what is actually documented.

  2. When it becomes unclear who has “recorded” the information
    The record should make it possible to see who has recorded information, and it should be understandable to other qualified healthcare professionals. If the record text is practically “machine-produced,” the clinic must still be able to connect the content to human judgment and responsibility.

  3. When correction and rectification become unclear
    Errors and omissions can occur in any documentation process. There are clear expectations for rectifying incorrect or incomplete record information and for ensuring that rectification occurs in a traceable manner (not by “erasing” history). AI-supported recordkeeping increases the need for concrete routines for quality assurance and rectification.

  4. When data flow and privacy become part of the record process
    Recordkeeping involves the processing of personal information, with fundamental principles such as accuracy, integrity/confidentiality, and accountability. AI functions can change where data is processed, who has access, and what is logged – without it being visible in the user interface.

  5. When supplier and clinic responsibility mix
    Suppliers can have responsibilities related to the product's characteristics, but the clinic must manage usage in its own context: what tasks are supported, who can use the function, and how quality is controlled. This is particularly relevant if an AI function falls under “high-risk” logic in the EU AI Act in its specific use; then, organizations deploying the system have obligations related to, among other things, human oversight, relevant input data, and logging. 

Common misconceptions

“AI can write the record as long as I quickly skim over it”

Quick skimming can give a false sense of security because AI text often appears consistent and professional. The risk is in the details: incorrect time indications, unsupported conclusions, omissions, or “drift” in clinical precision.

In practice, the record must still document the healthcare and assessments as they were actually performed. Recordkeeping and content requirements are tied to those providing healthcare, not to the tool aiding in text production.

“If the text is generated by the system, it’s the supplier's responsibility”

Supplier responsibility may be relevant at the product level, but it doesn't change the fact that the clinic must stand behind how recordkeeping is actually conducted: what functions are used, what control points exist, and how errors are captured.

Additionally, the record is part of the clinic's organizational management. For example, the Norwegian Directorate of Health's comments on recordkeeping obligations indicate that healthcare institutions should have one individual with overarching responsibility for each record when several collaborate. AI functions do not change this need – they can increase it.

“Recordkeeping with AI is just a privacy issue”

Privacy is one aspect, but responsibility also concerns professional verifiability and documentation quality. The record must be understandable for other qualified healthcare professionals, and it must be clear who provided the information. 

This means that AI content must not only be “safely stored” but also professionally precise, contextually correct, and linked to a real human assessment.

“We don't need extra traceability – the record is in the system”

Storing the record electronically is not the same as having a traceable process. In AI-supported text production, it can be important to be able to explain:

  • what was an AI draft,

  • what was altered by the practitioner,

  • and what assessments were made before signing.

For certain types of AI systems (depending on classification and use), logging may also be an explicit expectation under the EU AI Act for deployers of high-risk AI. The point in the recordkeeping context remains: without traceable practices, it becomes difficult to verify quality and responsibility afterward.
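Purely as an illustration, the three points above could be captured in a simple audit record per note. The structure and field names below are hypothetical, not taken from any specific record system; they only sketch what a traceable AI-assisted note might retain:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class NoteAuditEntry:
    """Hypothetical audit-trail entry for one AI-assisted record note."""
    note_id: str
    ai_draft: str         # text as suggested by the AI function
    final_text: str       # text as signed by the practitioner
    reviewed_by: str      # who reviewed and, if needed, altered the draft
    assessment_note: str  # brief record of the assessment made before signing
    signed_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def draft_was_altered(self) -> bool:
        # A draft signed completely unchanged may indicate that review
        # was less active than assumed.
        return self.ai_draft != self.final_text

entry = NoteAuditEntry(
    note_id="2024-10-04-001",
    ai_draft="Patient reports pain in tooth 36.",
    final_text="Patient reports intermittent pain in tooth 36 on chewing.",
    reviewed_by="dentist_a",
    assessment_note="Draft lacked symptom detail; clarified before signing.",
)
print(entry.draft_was_altered())  # True: the practitioner altered the AI draft
```

The point is not the technology but the content: draft, change, reviewer, and assessment are each preserved, so quality and responsibility can be verified afterward.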

“This can be resolved with a simple internal guideline”

General statements like “use AI with caution” offer little guidance in a hectic clinical workday. AI-supported recordkeeping often requires more concrete clarifications: when to use the function, when not to use it, and which minimum controls must be performed before the record is considered professionally and documentation-wise robust.

What should be in place in practice?

For dental clinics that use (or are considering using) AI in recordkeeping, there are some management principles that typically provide control without making the processes burdensome:

  • Mapped actual use of AI in the record process
    A simple overview: where is speech-to-text, summarization, or text suggestions used, and what does it affect in the record (free text, structured fields, coding/procedure registration)?

  • Clear responsibility for record content and changes
    Establish who is responsible for quality before signing, and how overarching responsibility for the record is managed when several contribute. 

  • Defined control points before signing
    Not as “extra work,” but as a standardized check: do facts, timeline, clinical assessments, and measures align? Is the text precise and verifiable? This should be adapted to the risk level of the specific AI functions.

  • Routines for correction and error handling
    AI can increase the risk of systematic errors (the same error type repeated). The clinic should have a practical way to capture, correct, and document corrections, in line with requirements for traceable rectification (not deletion). 

  • Privacy and security integrated into record work
    Ensure that fundamental privacy principles such as accuracy, integrity/confidentiality, and accountability are actually maintained in the specific solution and usage. 

  • Traceability that withstands inspection and verification
    The clinic should be able to explain how AI is used in recordkeeping at a practical level: what is automated, what always requires human judgment, and how this can be documented afterward.

Actera's Role in This

Actera is established to provide dental health practices with structure around the responsible use of AI. 

We do not engage in technology development or clinical decision-making, but in governance structure, lines of responsibility, and documentation – so that AI can be used in a secure, predictable, and verifiable manner.

Concluding Reflection

AI will increasingly become part of day-to-day recordkeeping, often as features that “simply exist” within the tools the clinic uses. This makes it more, not less, important to have clear frameworks for responsibility, control, and traceability.

When the record must be understandable, verifiable, and linked to who has entered the content, managing AI use becomes a question of practice – not technology. With clear role anchoring and documented control points, it becomes easier to manage errors, inspections, and tool changes over time.


Related articles


AI in documentation: responsibility and record-keeping

AI in dental clinics: who is responsible?

AI and Patient Information: Traceability and Accountability

Business Intelligence in Dental Care: Data as Decision Support

Deviations in AI at Dental Clinic: Responsibility and Follow-up

Third-party AI in dental clinics: what cannot be outsourced?

Documentation of AI Use in Dentistry

Human oversight in the use of AI in dental clinics

Responsible AI in Dental Clinics: Which Roles Need to be Clear?
