The assisted-living facility in Edina, Minnesota, where Jean H. Peters and her siblings moved their mother in 2011, looked lovely. “But then you start uncovering things,” Peters said.
Her mother, Jackie Hourigan, widowed and developing memory problems at 82, too often was still in bed when her children came to see her in midmorning.
“She wasn’t being toileted, so her pants would be soaked,” said Peters, 69, a retired nurse-practitioner in Bloomington, Minnesota. “They didn’t give her water. They didn’t get her up for meals.” She dwindled to 94 pounds.
Most ominously, Peters said, “we noticed bruises on her arm that we couldn’t account for.” Complaints to administrators — in person, by phone and by email — brought “tons of excuses.”
So Peters bought an inexpensive camera at Best Buy. She and her sisters installed it atop the refrigerator in her mother’s apartment, worrying that the facility might evict her if the staff noticed it.
Monitoring from an app on their phones, the family saw Hourigan going hours without being changed. They saw and heard an aide loudly berating her and handling her roughly as she helped her dress.
They watched as another aide awakened her for breakfast and left the room even though Hourigan was unable to open the heavy apartment door and go to the dining room. “It was traumatic to learn that we were right,” Peters said.
In 2016, after filing a police report and a lawsuit, and after her mother’s death, Peters helped found Elder Voice Advocates, which lobbied for a state law permitting cameras in residents’ rooms in nursing homes and assisted-living facilities. Minnesota passed it in 2019.
I’d say putting up cameras violates the person’s dignity, but knowing how hellish these places can be, I’m not surprised well-meaning people have to resort to it to protect their loved ones.
A thought: we should probably have an AI model trained on proper elder care, detecting when someone is gotten out of bed, clothed and/or changed, and brought food; whether they’re eating the food; and falls, shouting, screaming, crying, and gruff tones. On an average day no human would need to look at the footage. An administrator would only receive a message when certain clips need review.
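The triage idea above can be sketched in a few lines. This is a minimal, hypothetical sketch: the event names, `ClipResult` type, and confidence threshold are all assumptions, and the classifier itself is stubbed out entirely. The point is just the routing logic: routine events never surface footage, and only high-confidence alert events reach the administrator.

```python
from dataclasses import dataclass

# Hypothetical event labels a trained classifier might assign to a clip.
ALERT_EVENTS = {"fall", "shouting", "crying", "rough_handling"}
ROUTINE_EVENTS = {"out_of_bed", "dressed", "meal_delivered", "eating"}

@dataclass
class ClipResult:
    clip_id: str
    event: str
    confidence: float

def clips_for_review(results, threshold=0.8):
    """Return only the clips an administrator needs to review:
    high-confidence alert events. Routine events would just update
    a daily care log and never surface footage."""
    return [r for r in results
            if r.event in ALERT_EVENTS and r.confidence >= threshold]

# An example day: mostly routine events, one probable fall.
day = [
    ClipResult("c1", "out_of_bed", 0.95),
    ClipResult("c2", "meal_delivered", 0.90),
    ClipResult("c3", "fall", 0.92),
    ClipResult("c4", "shouting", 0.40),  # low confidence, not surfaced
]
flagged = clips_for_review(day)
print([r.clip_id for r in flagged])  # ['c3']
```

On most days `flagged` would be empty and nobody watches anything, which is the whole appeal.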
They actually already have a thing that basically turns the people on camera into stick figures. You can turn it off if you need to, but a quick check can confirm they’re up and moving around versus fallen down, without invading their privacy too much, at least.
There could be room for error there. An alert could read that a patient is getting out of bed when they’re just tossing and turning. Not that this isn’t a good idea; I would just use it in addition to human observation.
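One standard way to cut down exactly this kind of false positive is debouncing: only alert when the same state persists across enough consecutive frames, so a brief flicker (tossing and turning momentarily read as "out of bed") never fires. A minimal sketch, with the frame count chosen arbitrarily:

```python
def sustained(detections, min_consecutive=5):
    """Alert only when the detector reports the same state for
    min_consecutive frames in a row; brief flickers are ignored."""
    run = 0
    for detected in detections:
        run = run + 1 if detected else 0
        if run >= min_consecutive:
            return True
    return False

# Tossing and turning: the detector flickers, so no alert.
print(sustained([True, False, True, True, False, True]))  # False
# Actually getting up: a sustained run triggers review.
print(sustained([True] * 6))  # True
```

This doesn’t remove the need for human review, but it shrinks the pile of clips a human has to look at.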