Digital myopia of nursing and geriatric care


The growing use of smart control and monitoring tools in nursing is increasingly decoupling care from ethical and moral considerations. That is the core thesis of an internationally published analysis by a researcher at Karl Landsteiner University for Health Sciences in Krems, Austria. The argument, published in Nursing Philosophy, sets out how digital monitoring and AI-based decision-making processes are changing the role of nurses: the perception of care recipients as people with individual needs could fall victim to standardized, “smart” decision-making processes. Involving nursing staff in the development of monitoring technologies, or redefining the nursing profession, could provide a remedy.

Digital monitoring has long been part of everyday life in nursing, and in the long-term care sector it covers many areas of life: vital functions are monitored, as are daily activities and behaviors. Sensors on clothing record physiological data, while sensors in the floor record walking and falls. Smart mattresses provide information about sleep patterns, and door sensors raise alerts when residents leave their accommodation. Increasingly, the collected data is integrated and analyzed by artificial intelligence (AI) for deviations from desired patterns. But how does the ever-growing use of these digital assistants affect what is known as the “nursing gaze”? This question has now been analyzed for the first time worldwide by Prof. Giovanni Rubeis, Head of the Department of Biomedical Ethics and Health Care Ethics at Karl Landsteiner University for Health Sciences (KL Krems) in Austria.

Data double

The “nursing gaze” describes viewing the person being cared for both as an individual personality and as an embodiment of a medical or age-related need. “But the greatly increased use of digital control and monitoring tools is increasingly narrowing this view to quantifiable, standardized values,” argues Prof. Rubeis. “Pain assessment, progression prognosis and treatment recommendations are increasingly done by algorithms.” In fact, Prof. Rubeis already speaks of the creation of a “data double”: a digital representation of a patient in need of care that consists of purely technical values. “The more we hide the person in need of care behind their data, the more we also decouple decisions about care measures from their individual needs,” notes Prof. Rubeis. “And, despite all the advantages these technologies naturally offer, decisions are then based on standard assumptions that cannot be optimal for every individual. This is where the dehumanization of care begins.”

Moral immunization

The use of digital and AI-based technologies, however, says Prof. Rubeis, is often portrayed to nursing staff in a purely positive light, thus “immunizing” these technologies against moral judgment. “Digital monitoring technologies are seen as a means to achieve a higher quality of life and a life free of limitations,” Prof. Rubeis said. “This also relieves caregivers of their moral responsibility to assess whether these noble goals are even being achieved with the chosen technologies, or whether those affected even want to live such a monitored life. The ‘nursing gaze’ becomes blind in one eye.”
