Bridging the gap between models and reality: development of a research environment for an object-oriented hospital information system to integrate artificial intelligence and robotics into clinical practice

Requirements for an object-oriented HIS

Following Haux [15], the expansion into regional and global health information systems, the integration of patients as active users, and the enhancement of functionality beyond basic administration to encompass proactive therapy planning are not yet fully addressed by current HIS solutions. Particularly relevant are the demands for strategic information management, the expansion of data modalities to the molecular level, and a strategy for including new, even sensor-based technologies.

Regarding the acceptance of HIS, human factors have a more pronounced impact than organizational or financial factors. A fundamental understanding of how IT solutions improve work processes is crucial. From a technical point of view, enabling continuous development focused on future functionalities and the reliability of data management are the most significant elements regarding acceptance [24].

Clinical software solutions that facilitate the seamless replacement of outdated applications with advanced, e.g., AI-driven solutions would be highly advantageous, particularly in terms of ensuring their maintainability within clinical practice [25]. In particular, the sustainable integration of medical robotic systems into clinical routine remains under-researched. Nonetheless, it is essential to develop systematic implementation strategies for MR, as these systems offer significant potential to enhance quality in patient care [26].

Process analyses on the surgical wards of the TUM University Hospital in Munich supplemented the results of the literature research. The expert team then formulated basic requirements for oHIS within several focus group discussions (Table 1). It became apparent that the most significant challenges lay in the tension between a complex hospital environment and the demand for the most straightforward and intuitive design.

Fundamental principles

The fundamental principles of an oHIS must remain valid even when future technologies that are still unimaginable today are integrated. We addressed this issue by applying the dimensions of the real world to the PDT. Dimensions such as location and time will never change, as they are imposed by natural laws. Thus, we introduced a Location dimension to represent the fact that a particular object is situated at a defined Location at a specific Time.

However, to comprehensively map the processes in a hospital, a further dimension was required, namely that of 'purpose' or 'meaning'. No human or material object exists without context, even when waiting for an examination or being stored for later use. Thus, the third dimension of the oHIS framework is Context. Hence, all objects operate at runtime in an abstract space spanned by Time, Location, and Context. Figure 3 illustrates interactions among objects within this virtual space.
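As a minimal illustration of this idea (a Python sketch with hypothetical names, not taken from the described prototype), an object's state at runtime can be captured as a point in the Time-Location-Context space:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ObjectState:
    """A point in the abstract Time-Location-Context space of the oHIS."""
    timestamp: datetime   # Time dimension
    location: str         # Location dimension, e.g. "Examination room"
    context: str          # Context dimension, e.g. "Examination"

# Hypothetical state of the patient at t1 in Fig. 3:
patient_state_t1 = ObjectState(
    timestamp=datetime(2025, 1, 1, 9, 30),
    location="Examination room",
    context="Examination",
)
```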

The next step was implementing the representation of actually existing agents and objects. Here, we strictly followed the principles of OOP. The main ideas of OOP are encapsulation, inheritance, information hiding, data abstraction, and polymorphism. For an object to be created, it must belong to a class defined in the program. A class defines the set of methods and data that all objects created from it will contain. Encapsulation is the bundling of data and methods: it hides inner details and exposes only what is necessary, restricting direct access to some components, which promotes modularity and prevents unintended interference.
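The following Python sketch illustrates encapsulation with hypothetical class and attribute names (it is not part of the prototype): the device's data and the methods that operate on them are bundled in one class, and state changes go through defined methods rather than direct attribute access.

```python
class UltrasoundDevice:
    """Bundles the device's data and the methods that operate on them."""

    def __init__(self, device_id: str, location: str):
        self._device_id = device_id            # internal state, not accessed directly
        self._location = location
        self._measurements: list[dict] = []

    def move_to(self, new_location: str) -> None:
        """The only sanctioned way to change the device's location."""
        self._location = new_location

    def record_measurement(self, data: dict) -> None:
        self._measurements.append(data)

    @property
    def location(self) -> str:
        """Read-only view of the internal location attribute."""
        return self._location
```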

Classes can inherit methods and data from other classes. For example, if the Doctor class inherits from a Person class, then the Doctor class will also have all the characteristics of the Person class. Hence, the ultimate superclass provides the most abstract interpretation of what it means for an object to be of a certain category. Taking this concept of abstraction from OOP principles, we model the agents flexibly by starting from an abstract root class from which many different classes can be derived. Accordingly, we included a future object class to anticipate technological applications or roles within patient care that do not yet exist. Thus, future objects and models can be registered when available without compromising the system's stability (Fig. 4).
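A minimal Python sketch of such a class hierarchy, using hypothetical names (Fig. 4 defines the framework's actual hierarchy): an abstract root class, derived classes such as Person and Doctor, and a placeholder class for objects that do not exist yet.

```python
from abc import ABC

class HospitalObject(ABC):
    """Abstract root class: the most general notion of an object in the oHIS."""
    def __init__(self, object_id: str):
        self.object_id = object_id

class Person(HospitalObject):
    def __init__(self, object_id: str, name: str):
        super().__init__(object_id)
        self.name = name

class Doctor(Person):
    """Inherits all characteristics of Person (and, transitively, HospitalObject)."""
    def __init__(self, object_id: str, name: str, specialty: str):
        super().__init__(object_id, name)
        self.specialty = specialty

class FutureObject(HospitalObject):
    """Placeholder for technologies or roles that do not exist yet; concrete
    classes can later be derived from it without touching the rest of the system."""
    pass
```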

Fig. 3

Concept of object representation and object interaction in the three-dimensional space of Time, Location, and Context: t0 (Timepoint 0): An ultrasound device (blue), a doctor (purple), and a patient (orange) are situated in the separated contextual and local spaces storage (transparent blue), examination room (transparent purple), and ward (transparent green). The patient is interacting (red lines) with a nurse (green). t1: The patient moves to the examination room and interacts with the doctor. t2: The ultrasound device moves from the storage to the examination room and interacts with the patient

One of the key concepts in OOP is polymorphism, which allows objects to take on different forms, enabling the same method or interface to behave differently depending on the object on which it is invoked. For example, a Doctor class with a method like Examine Patient could have subclasses such as Pediatrician or Surgeon, each providing a specialized implementation of Examine Patient.
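A short Python sketch of this example, with hypothetical method and class names: the caller always uses the same examine_patient interface, and the specialized implementation is selected at runtime.

```python
class Patient:
    def __init__(self, name: str):
        self.name = name

class Doctor:
    def examine_patient(self, patient: Patient) -> str:
        return f"General examination of {patient.name}"

class Pediatrician(Doctor):
    def examine_patient(self, patient: Patient) -> str:      # specialized override
        return f"Pediatric examination of {patient.name}"

class Surgeon(Doctor):
    def examine_patient(self, patient: Patient) -> str:      # specialized override
        return f"Pre-operative assessment of {patient.name}"

def run_examination(doctor: Doctor, patient: Patient) -> str:
    # The same interface dispatches to the subclass implementation at runtime.
    return doctor.examine_patient(patient)

print(run_examination(Surgeon(), Patient("Bob")))   # Pre-operative assessment of Bob
```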

Another OOP principle, information hiding, restricts access to an object's internal details, exposing them only through a controlled interface. For instance, a Patient class containing sensitive data like medical history might provide summary methods instead of direct access, ensuring data protection while maintaining simplicity and security. This approach promotes modularity and robustness [27].
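Continuing the hypothetical Python sketch, the medical history is kept internal to the Patient object and only an aggregate summary is exposed:

```python
class Patient:
    """Sensitive data are kept internal; only a controlled summary is exposed."""

    def __init__(self, name: str):
        self.name = name
        self.__medical_history: list[str] = []   # name-mangled, no direct external access

    def add_entry(self, entry: str) -> None:
        self.__medical_history.append(entry)

    def history_summary(self) -> str:
        """Controlled interface: expose aggregate information, not the raw history."""
        return f"{self.name}: {len(self.__medical_history)} documented entries"
```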

Thus, inspired by OOP, we propose modeling the hospital as a system of interconnected components represented as objects with attributes and functionalities that interact within specific contexts and timepoints.

Fig. 4

Hierarchical classification structure for current and future anticipated object representations in an object-oriented hospital information system (HIS)

Functionality of the prototype

Following the work packages, the OMNI-SYS research framework includes a virtual space spanned by Time, Location, and Context and a comprehensive object representation for people, things, and models. In the first functional version, the person groups of patients, nurses, and doctors are already considered central actors. On the 'things' and 'model' side, non-human objects such as robots and models of all kinds can be registered in the system. Object interactions are transferred from the real world to the conceptual space by updating the system's status as objects in the real world move to the same location and interact in the same context. This is made possible using QR codes or RFID tags, so that initially no input via a GUI is required during an interaction in the real world and the natural interaction is disturbed as little as possible.
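As an illustration of this mechanism (a hypothetical, in-memory Python sketch; the framework's actual data model and tag handling may differ), scanning a location/context tag updates the object's state without any GUI input:

```python
from datetime import datetime, timezone

# Hypothetical in-memory registry: object ID -> current (time, location, context)
object_states: dict[str, dict] = {}

# Hypothetical mapping of scanned tag IDs (QR/RFID) to locations and contexts
TAG_TABLE = {
    "TAG-EXAM-01": {"location": "Examination room", "context": "Examination"},
    "TAG-WARD-03": {"location": "Nursing ward", "context": "Patient care"},
}

def on_tag_scanned(object_id: str, tag_id: str) -> None:
    """Update an object's state when it scans a location/context tag."""
    tag = TAG_TABLE[tag_id]
    object_states[object_id] = {
        "time": datetime.now(timezone.utc),
        "location": tag["location"],
        "context": tag["context"],
    }

# Example: the patient from Fig. 3 enters the examination room (t1)
on_tag_scanned("patient-bob", "TAG-EXAM-01")
```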

To document examinations and create orders, and to provide medical professionals with an overview of all objects (devices and persons) in their own area of responsibility, we created an exemplary, intuitive GUI prototype in which the user is always situated at the center. Other relevant objects are displayed in a view resembling the solar system (solar view), surrounding the user like planets. When the user hovers over an object, further details regarding its properties can be viewed. The GUI prototype is displayed in Fig. 5a.

Fig. 5

Two exemplary GUI prototypes of the oHIS research framework. a Solar system view. b Network view

In addition to the solar system view, Fig. 5b displays a more scientific network view that also depicts objects and their relationships relative to the central user. In this view, all detailed interactions, relations, and properties of surrounding objects are displayed with a deliberately chosen, variable path depth. Both the location and the context are also displayed as nodes in order to obtain a flat, non-3D rendering of the 3D conceptual space shown in Fig. 3. Objects such as doctors, nurses, devices, models, and patients can be related to more than one context to mirror reality as closely as possible. The network view is therefore a projection of the 3D representation of objects in space, time, and context onto a graph database view, displaying objects moving in real time.
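A minimal sketch of such a projection, using the networkx library and hypothetical node names (the prototype's actual graph schema is not reproduced here): objects, locations, and contexts all become nodes, relations become edges, and the variable path depth around the central user corresponds to an ego graph of a chosen radius.

```python
import networkx as nx

# Flat graph projection of the 3D Time-Location-Context space (cf. Fig. 3):
# objects, locations, and contexts all become nodes; relations become edges.
G = nx.MultiDiGraph()

G.add_node("Doctor", kind="person")
G.add_node("Bob", kind="patient")
G.add_node("Ultrasound", kind="device")
G.add_node("Examination room", kind="location")
G.add_node("Examination", kind="context")

G.add_edge("Bob", "Examination room", relation="located_in")
G.add_edge("Bob", "Examination", relation="situated_in")
G.add_edge("Ultrasound", "Examination", relation="situated_in")
G.add_edge("Doctor", "Bob", relation="examines")

# Variable path depth around the central user, as in the network view:
neighbourhood = nx.ego_graph(G, "Doctor", radius=2)
print(neighbourhood.nodes())
```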

Already in its current version, the OMNI-SYS framework provides ample opportunity for testing the idea of an oHIS by intuitively changing the properties and relationships of the displayed objects within the graph network. It also provides insight into how data are related to particular objects. For example, in line with the principle of data encapsulation, examination data such as sonography results would be stored exclusively within the corresponding patient object to achieve data security and privacy. The network view allows one to examine which kind of data belongs to a particular object. In addition, several GUI applications can easily be adapted within the research framework with little programming effort.

Finally, we designed an interface to edit object properties and relationships for quick testing and evaluation of object-oriented oHIS concepts. This feature allows users to modify object attributes and instantly view the changes in the network and solar view, eliminating the need to write queries to the graph database.

Integration of artificial intelligence models and medical robotics

A fundamental goal of developing an oHIS is the simple and future-proof integration of AI models and MR in a clinical environment. In principle, these two key technologies are integrated in the same way as other existing agents. Every robotic system and every model also receives an object representation in the oHIS that is compatible with all other objects, allowing them to interact as if they were real existing agents themselves. In the future, robotic systems could access relevant patient and user data to adapt their behavior in the real world. If a robot or an AI model is defective, it can be removed by deactivating its object representation at runtime without affecting the integrity of the overall system (Figs. 4 and 6).
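The following Python sketch illustrates the idea of runtime registration and deactivation with a hypothetical registry class (not the framework's actual implementation):

```python
class ObjectRegistry:
    """Minimal sketch of runtime (de)registration of object representations."""

    def __init__(self):
        self._objects: dict[str, dict] = {}

    def register(self, object_id: str, kind: str, **properties) -> None:
        self._objects[object_id] = {"kind": kind, "active": True, **properties}

    def deactivate(self, object_id: str) -> None:
        """Remove a defective robot or AI model from participation
        without affecting the rest of the system."""
        if object_id in self._objects:
            self._objects[object_id]["active"] = False

    def active_objects(self) -> list[str]:
        return [oid for oid, obj in self._objects.items() if obj["active"]]

registry = ObjectRegistry()
registry.register("robot-01", kind="medical_robot", task="sample transport")
registry.register("ai-sono-seg", kind="ai_model", modality="ultrasound")
registry.deactivate("robot-01")        # e.g. the robot reports a defect
print(registry.active_objects())       # ['ai-sono-seg']
```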

Fig. 6

Integration of artificial intelligence models and medical robotics into the object-oriented HIS. a Interacting agents in the real world. b Object representations in the oHIS. c Integrated AI models. d Data storage

Fig. 7

Object representation and interaction in the 3D space of Time, Location, and Context: t0 (Timepoint 0): Main user (blue dot) in the examination room and examination context, patient (yellow dot) and nurse (gray dot) in the nursing ward and patient care context. Ultrasound device (gray triangle) in the storage but linked to examination context. t1: Patient moves to the examination room, interacts with the doctor. t2: Ultrasound device moves to the examination room and interacts with the patient

Evaluation of the prototype

The prototype was evaluated using the sonographic examination use case shown in Fig. 3 with results displayed in Fig. 7. Objects such as doctor, nurse, ultrasound device, and patient were simulated as nodes in the graph database, along with location and context. Relationships between objects were represented as directed edges. For example, at time t0, the patient ‘Bob’ is linked to the location ‘Nursing ward’ via the edge located_in and to the context ‘Patient care’ via situated_in. Other relationships were also simulated, such as ‘Bob’ owning a medical record written by the doctor. The prototype was tested by team members acting as ‘Bob’ (the patient) and the doctor. QR codes were created for each location and context, and for the patient, encoding their unique ID.
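For illustration, the t0 state could be expressed as follows, assuming a Neo4j-style graph database and its Python driver (the concrete database product, connection details, and labels are not specified in the text and are therefore hypothetical):

```python
from neo4j import GraphDatabase

# Hypothetical connection details
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

CREATE_T0 = """
MERGE (p:Patient  {name: 'Bob'})
MERGE (w:Location {name: 'Nursing ward'})
MERGE (c:Context  {name: 'Patient care'})
MERGE (p)-[:located_in  {t: $t}]->(w)
MERGE (p)-[:situated_in {t: $t}]->(c)
"""

with driver.session() as session:
    session.run(CREATE_T0, t="t0")   # directed edges carry the timepoint
driver.close()
```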

During the evaluation, the actor patients, recruited from our team, used their phone's camera to scan a QR code containing a link that sent an HTTP request to the web-based application, registering them in the system. Initially located in the nursing ward, the patient's goal was to move to the examination room for an ultrasound examination. Upon arrival, the patient scanned the examination room's QR code to update their location and context. Since the patient had already scanned their identification QR code, the system used cached cookies to recognize them and update their location accordingly. This change, shown in Fig. 7, removed the nursing ward and nurse nodes from the patient object, since they were no longer relevant to the main user (the doctor). The ultrasound device was also registered as an object.

The system reliably logged changes in object properties and relationships, along with timestamps, enabling the collection of a comprehensive event-based dataset. Such a dataset can easily be utilized for process mining techniques to model workflows within the hospital environment, offering valuable insights into potential areas for improvement. In the future, sensors that measure vital signs could, for example, be integrated into the system, linking patient objects to the corresponding sensor data, and AI models could instantly evaluate the device signals stored as properties of the related patient object.

Although a phone-based QR scanner was used during the prototype evaluation, it is not intended to be the sole method of data input. Further development will explore alternative interfaces such as sensors or patient wristbands with RFID chips to support patient registration and data capture. This is particularly important in cases where patients are not independent or are unable to interact with the system directly, such as those who are unconscious. Moreover, patient registration in this case was linked to the registration of the patient's location; entering the patient's medical data into the system remains the responsibility of the doctor.
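As a rough sketch of how such a QR-triggered registration and event log could look (a hypothetical Flask endpoint; the prototype's actual web application, routes, and cookie handling are not documented here):

```python
from datetime import datetime, timezone
from flask import Flask, request, jsonify

app = Flask(__name__)

event_log: list[dict] = []            # event-based dataset for later process mining
patient_state: dict[str, dict] = {}   # patient ID -> current location

@app.route("/scan/<location_id>")
def scan_location(location_id: str):
    """Called when a patient's phone opens the link encoded in a location QR code."""
    patient_id = request.cookies.get("patient_id")   # set when the ID code was scanned
    if patient_id is None:
        return jsonify(error="patient not registered"), 400

    patient_state[patient_id] = {"location": location_id}
    event_log.append({
        "object": patient_id,
        "event": "located_in",
        "location": location_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    return jsonify(status="location updated")

# app.run(debug=True)  # would be started locally for a test run
```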

The graph database approach efficiently and properly mirrored the interactions and localizations of the objects involved in this use case. However, for real-world implementations, especially in critical domains where consistency, availability, scalability, and security are crucial [28], relational databases may offer a more robust solution [29]. Their vertical scalability helps avoid the complexities that arise in graph databases as interconnected nodes and edges increase [29].

The evaluation results were finally presented and discussed in an interdisciplinary team meeting at the end of the project period to set the direction for further development of the research framework. The most important conclusions are addressed in the discussion section.

Structured interviews with the involved physicians indicated a high degree of acceptance and understanding of the principles of oHIS. The proposed GUIs were considered functional; however, extended functionality was demanded for integrating an oHIS into clinical routine.
