Your computer usage can predict early signs of Alzheimer's

Researchers at Oregon Health & Science University have found that infrequent computer use may be an early indicator of declining cognitive abilities in older adults.

For some time, it's been known that the size of an area of the brain called the hippocampus, which is important for memory function, can be used to predict the eventual development of dementia. A smaller hippocampus is a well-known sign of Alzheimer's disease.

Now, in a study involving 27 cognitively healthy adults aged 65 or older, researchers used MRI scans to measure the volume of the hippocampus. The participants' computer use was also tracked over a one-month period using mouse-movement detection software.
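The paper doesn't describe the tracking software in detail, but the basic idea of inferring daily computer use from mouse activity is simple. Here is a minimal sketch, not the researchers' tool: it assumes you already have a log of mouse-event timestamps, and it counts an hour as "used" if any mouse movement was recorded during it (the function name and the sample data are hypothetical).

# Minimal sketch (not the study's software): estimate daily computer use
# from a log of mouse-event timestamps.
from collections import defaultdict
from datetime import datetime

def daily_active_hours(event_timestamps):
    """Return a mapping of date -> hours of computer use, where an hour
    counts as used if at least one mouse event fell inside it."""
    active = defaultdict(set)              # date -> set of hours with activity
    for ts in event_timestamps:
        active[ts.date()].add(ts.hour)
    return {day: len(hours) for day, hours in active.items()}

# Hypothetical example: three events on 1 July (two in the 9 o'clock hour,
# one in the 14 o'clock hour) and one event on 2 July.
events = [
    datetime(2016, 7, 1, 9, 15), datetime(2016, 7, 1, 9, 40),
    datetime(2016, 7, 1, 14, 5), datetime(2016, 7, 2, 11, 30),
]
print(daily_active_hours(events))
# {datetime.date(2016, 7, 1): 2, datetime.date(2016, 7, 2): 1}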

Lower Hippocampal Volume

Their results show that each additional hour of daily computer use was associated with a 0.025% larger hippocampal volume. Since a smaller hippocampus is a known marker of dementia risk, the researchers conclude that unusually low computer use could serve as an early warning of cognitive decline.

It's unclear whether there's a causal relationship at work here, but Lisa Silbert, who led the research, wrote in a paper describing the study: "Successful computer use likely requires the ability to effectively call upon multiple cognitive domains, including executive function, attention, and memory."

The team will continue to follow the study participants to see how their cognitive ability develops in the coming years.

Silbert wrote: "Continuous monitoring of daily computer use may detect signs of preclinical neurodegeneration in older individuals at risk for dementia."

Duncan Geere
Duncan Geere is TechRadar's science writer. Every day he finds the most interesting science news and explains why you should care. You can find him on Twitter under the handle @duncangeere.