Human Rights Watch has produced a report on the use of sedation in nursing homes. The report, titled “They Want Docile”, highlights the plight of people with dementia who are chemically restrained through the overuse of antipsychotic drugs.
Too many times I’m given too many pills…. [Until they wear off], I can’t even talk. I have a thick tongue when they do that. I ask them not to [give me the antipsychotic drugs]. When I say that, they threaten to remove me from the [nursing] home. They get me so I can’t think. I don’t want anything to make me change the person I am.
—Walter L., an 81-year-old man given antipsychotic drugs in a Texas nursing facility, December 2016.
It used to be like a death prison here. We cut our antipsychotics in half in six months. Half our residents were on antipsychotics. Only 10 percent of our residents have a mental illness.
—A director of nursing at a facility in Kansas that succeeded in reducing its rate of antipsychotic drug use, January 2017.
The phrase was coined to describe the practice of groundlessly asserting that design solutions would change behaviour in a predictable and positive way.
It was a new phrase, but the belief system behind it – that buildings shape behaviour – had long allowed the heroes of architecture to make all kinds of outlandish claims.
A hopeful history
Leon Battista Alberti, an Italian Renaissance-era architect, claimed in the 1400s that balanced classical forms would compel aggressive invaders to put down their arms and become civilians.
Frank Lloyd Wright, the US architect who designed one of the most famous buildings in America, Fallingwater, similarly believed appropriate architecture would save the US from corruption and turn people back to wholesome endeavours.
British author and thinker Ebenezer Howard believed companies would be more efficient if their employees lived in village-like garden communities.
Swiss-born French architect Le Corbusier claimed his Villa Savoye building in France would heal the sick – and when it did just the opposite, he avoided court only because of the outbreak of the second world war.
It took a long list of failures over the centuries before postmodern theorists took to critiquing architectural fantasy with a vengeance. The high point of this trend was the delight shared over the demolition of the famously dangerous and dysfunctional Pruitt-Igoe urban housing complex in St Louis in the US.
It was designed by architects George Hellmuth, Minoru Yamasaki and Joseph Leinweber to provide “community gathering spaces and safe, enclosed play yards.” By the 1960s, however, it was seen as a hotspot for crime and poverty and demolished in the 1970s.
The loss of faith in architecture’s power has been regrettable. Architects’ well-meant fantasies once routinely provided clients with hope and sometimes even with results.
Without this promise, the profession was left powerless in the face of the superior structural knowledge of engineers, the cumulative restrictions imposed by generations of planners, the calculations of project managers and the expediency of a draughtsman’s CAD (computer-aided design) skills in turning a client’s every whim into reality.
Without fiction, architecture has become a soulless thing. But was determinism dismissed too soon? Is there a role for imagined futures without rationalist restrictions?
The fact is that environments do affect us, regardless of whether by design or by accident. In 2008, researchers in the UK found that a ten-minute walk down a South London main street increased psychotic symptoms significantly.
In my own research, I find that the healthier a person is, the more a good environment will affect them positively and the less a bad one will affect them negatively. Mentally ill patients show about 65 times more negative reactivity to bad environments than controls and all these reactions translate directly into symptoms.
The same patients have about half the positive responsiveness. That’s fewer smiles, less laughter and a reported drop in feeling the “fun of life”.
But that’s not all. The potential for architecture is richer still. The ease with which architecture can embrace sublime aesthetics makes it great for generating awe.
The psychological effects of architecture are difficult to prove, but difficulty doesn’t dilute the value of a building that hits the right notes and creates a sense of awe. Each building type has different functions, and for each there’s an imperative to use the building to help create an optimal mood, desire or sense of coherence, security or meaning.
Most existing studies focus on health care design, because that’s where behavioural changes have life-and-death consequences.
But nobody dares make any promises. As a result, research rarely opens the black box of environmental psychology, leaving findings unexplained and prone to failure.
To give architecture back its mojo, a new interest in how architecture changes us must be fostered. Clients have to learn to trust architects again and research funding bodies have to re-gear to encourage research into how buildings affect our mood, health and behaviours.
Finally, architecture schools have to teach students how they might predict psychological, emotional, healing and functional effects.
All innovation, ultimately, is led by the imagination – even if that means taking risks and sometimes getting it wrong.
The most common question I get asked is “Will my child get Alzheimer’s disease?” In my experience, this is one of the biggest worries for sufferers, and given the devastating effects of the disease, it is not hard to see why.
For those people with a familial form of Alzheimer’s disease, the answer is quite straightforward. This type of disease is caused by one or more mutations in one of three genes: amyloid precursor protein (APP), presenilin 1 (PSEN1) or presenilin 2 (PSEN2). All three genes are involved in the production of the amyloid protein. This protein accumulates to form sticky build-ups known as plaques, which are found between the cells of the Alzheimer’s brain and are characteristic of the disease.
People concerned that they may be at risk of familial Alzheimer’s disease can get a definitive answer through one of the many genetic tests available. A single copy of the mutated gene inherited from an affected parent will ultimately cause disease, with symptoms likely to be noticed before the age of 65 and typically between 30 and 60 years of age. Anyone concerned that they may suffer from this form of Alzheimer’s should seek a referral to a genetic counsellor.
Fortunately, families with a familial form of disease represent less than 1% of all families afflicted by this debilitating disease. For the remaining Alzheimer’s disease families, the answer as to the inheritance of disease is much less clear, and disease onset is certainly not inevitable.
A combination of genetic and environmental factors, along with age and gender, contributes to non-familial (also known as sporadic) disease risk, but how these risk factors interact and how many are required to cause disease is still unknown.
The genetics of non-familial Alzheimer’s is complex: we know that nearly 30 genes, common in the general population, influence disease risk, with potentially hundreds more involved. Two low-frequency genes have also been consistently identified, and a forthcoming publication by the International Genomics of Alzheimer’s Project shows that another two rare genes have a relatively large effect on disease risk.
Perhaps most excitingly for researchers, geneticists have shown that four biological processes – not previously thought to play a causal role in disease onset – are actually involved in Alzheimer’s disease. The first is the immune response: in particular, the way immune cells can become dysfunctional and attack the brain, resulting in brain cell death.
The second is the transport of molecules into the cell, suggesting there is a mechanism for the movement of damaging proteins into the brain. The third is the synthesis and breakdown of fatty molecules. And the fourth is protein processing, which alters protein breakdown, movement, activity and interactions – all of which are essential for normal protein function.
Age is the greatest risk factor for disease, with the likelihood of developing Alzheimer’s roughly doubling every five years over the age of 65. Women also have more chance of developing the disease than men, potentially due to a reduction in female hormones after menopause.
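The age effect described above is exponential: if relative risk doubles every five years past 65, an 80-year-old has roughly eight times the risk of a 65-year-old. A minimal sketch of that back-of-the-envelope arithmetic (the function and its baseline are illustrative assumptions, not figures from any study):

```python
def relative_alzheimers_risk(age, risk_at_65=1.0):
    """Relative likelihood of developing Alzheimer's at a given age,
    assuming risk roughly doubles every five years past 65, as stated
    above. risk_at_65 is an arbitrary illustrative baseline."""
    if age < 65:
        raise ValueError("this rough model applies from age 65 onwards")
    return risk_at_65 * 2 ** ((age - 65) / 5)

# A 70-year-old has about twice the risk at 65; an 80-year-old, eight times.
print(relative_alzheimers_risk(70))  # 2.0
print(relative_alzheimers_risk(80))  # 8.0
```

The point is only that a five-year doubling compounds quickly; it says nothing about absolute risk, which depends on the genetic and lifestyle factors described here.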
Medical conditions that increase the risk of dementia include cardiovascular factors (type 2 diabetes, high blood pressure, high cholesterol and obesity) and depression. Lifestyle factors such as physical inactivity, a cholesterol-raising diet, smoking and excessive alcohol intake have also been shown to influence disease risk.
Even for those with a high number of genetic, environmental and lifestyle risk factors, Alzheimer’s disease is not inevitable. Likewise, individuals with a low number of risk factors for disease are not precluded from developing Alzheimer’s.
Given this lack of certainty and the lack of effective treatments for Alzheimer’s, most experts don’t recommend genetic testing for non-familial disease. This thinking may well evolve in the future, however, when research identifies new risk genes and improves our understanding of the dysfunctional processes in Alzheimer’s disease.
Answering the burning question of whether you will pass Alzheimer’s disease on to your children is therefore still a near impossibility. But, as early diagnostic techniques improve, and with a number of vaccines and therapeutics currently in clinical trials, risk prediction for Alzheimer’s disease may become mainstream and part of a developing precision medicine culture.
16 January 2018 | New York – Study Finds Racial Differences in Reporting and Overall Trend of Underreporting Cognitive Impairment
An increasing number of older adults are reporting cognitive impairment in their families over the past two decades, according to a new study led by researchers at NYU Rory Meyers College of Nursing and East Carolina University’s Brody School of Medicine.
The study, which also finds ethnic and racial differences in reporting cognitive impairment, is published in Preventing Chronic Disease, a journal of the Centers for Disease Control and Prevention.
The aging population in the U.S. is growing rapidly: the number of people age 65 and over (40.2 million in 2010) is projected to more than double by 2050. As the population ages, the number of people with cognitive impairment and dementia will continue to grow, highlighting the importance of identifying cognitive changes.
“Cognitive impairment may serve as a precursor to future dementia. Early detection of cognitive impairment can facilitate timely medical treatments, appropriate care planning, and prevention efforts,” said Bei Wu, PhD, Dean’s Professor in Global Health and director of Global Health & Aging Research at NYU Meyers, co-director of NYU Aging Incubator, and the study’s senior author.
The study sought to examine trends in self-reported cognitive impairment among five major racial/ethnic groups in the United States from 1997 to 2015. The researchers used data from the National Health Interview Survey, with a sample of 155,682 individuals age 60 and above spanning five racial and ethnic groups: Asian Americans, Hispanics, Native Americans, non-Hispanic Blacks, and non-Hispanic Whites.
Rather than using a screening test or clinical examination to evaluate cognitive impairment, the survey asked respondents to report whether any family member was “limited in any way because of difficulty remembering or because of experiencing periods of confusion.”
The researchers found an increasing trend in self-reported cognitive impairment: the overall rate increased from 5.7 percent in 1997 to 6.7 percent in 2015 among older adults in the U.S. This finding may suggest that awareness of cognitive impairment, perhaps from heightened public attention to and interest in Alzheimer’s disease, has improved to some extent.
When looking at each racial/ethnic group, however, the increasing trend was significant only among White respondents. In Whites, the rate of self-reported cognitive impairment increased from 5.2 percent in 1997 to 6.1 percent in 2015. Asian American, Black, Hispanic, and Native American respondents had higher rates of self-reported cognitive impairment than Whites, but these rates did not significantly increase from 1997 to 2015.
Despite the overall increasing trend, the rates of self-reported cognitive impairment were still low, which may suggest underreporting. The researchers note that these rates are much lower than the estimated prevalence of cognitive impairment. For adults 65 years and older, the rate of self-reported cognitive impairment was 6.3 percent in 2000 and 7.5 percent in 2012, while the estimated prevalence of cognitive impairment in the same age group was 21.2 percent in 2000 and 18.8 percent in 2012.
These findings underscore the need to further promote awareness of cognitive impairment, especially in minority populations. Different cultures hold different beliefs and perceptions of disease and aging. For instance, research has found that compared to Whites, minorities are less likely to seek treatment for psychiatric symptoms because of lack of access to care or due to stigma.
“Culturally specific health education is needed for individuals, family members, and healthcare providers to improve awareness and knowledge of the signs and early symptoms of Alzheimer’s and other dementias,” said Huabin Luo, PhD, of East Carolina University.
In addition to Wu and Luo, Gary Yu of NYU Meyers coauthored the study.
Finding a cure for neurodegenerative diseases such as Alzheimer’s is challenging. They’re difficult to diagnose, and drugs struggle to get into the brain, whose blood supply is largely separate from that of the rest of the body. Not surprisingly, several companies have left this territory in recent years. This week, pharmaceutical giant Pfizer announced it will stop research into developing drugs to treat Alzheimer’s disease, after costly failed attempts over the past decade.
In recent years some clinical trials involving potential dementia drugs have had disappointing setbacks. In 2012, Pfizer and Johnson & Johnson halted development of the antibody drug bapineuzumab, after it failed in late-stage trials to treat patients with mild to moderate Alzheimer’s.
Despite this week’s announcement, Pfizer’s support of the UK’s Dementia Discovery Fund, an initiative involving the government, major pharmaceutical companies, and Alzheimer’s Research UK, may be where their money can make the most impact in this space. The fund aims to boost dementia research investment by financing early-stage drug development projects. And other pharma companies, such as Eli Lilly, Biogen and Novartis have continued to pursue dementia drug development with modest but promising success to date.
So what makes dementia such a difficult condition to treat with drugs, and is progress being made towards a treatment?
Despite the vast number of people affected globally – an estimated 46.8 million people are currently living with dementia – there is no cure. Current treatments manage symptoms (the latest drug to gain FDA approval was memantine, in 2003) but offer no prospect of recovery.
Part of the difficulty in finding treatments for dementia stems from the fact it’s not a single disease, but a complex health problem with more than 50 underlying causes. Dementia can be better thought of as an umbrella term describing a range of conditions that cause parts of the brain to deteriorate progressively.
Most drug treatments currently in development have targeted the pathology of Alzheimer’s disease, the most common form of dementia, which accounts for about 60 to 70% of all cases.
Finding a successful treatment for Alzheimer’s faces two major hurdles. The first is that we still don’t know enough about the disease’s underlying biology. For example, we don’t know exactly what regulates the toxic build-up of amyloid-β plaques and tau tangles found in the brains of Alzheimer’s patients, which specific forms of these proteins are toxic, or why the disease progresses at different rates in different people.
It doesn’t help that symptoms of Alzheimer’s develop gradually, so a diagnosis might only be made years after the brain has started to undergo neurodegenerative changes. It’s also not uncommon for Alzheimer’s to be present alongside other forms of dementia.
The second major hurdle to finding a treatment is that drugs need to first cross the blood-brain barrier. The blood–brain barrier provides a defence against disease-causing pathogens and toxins that may be present in our blood, and by design exists to keep out foreign substances from the brain. The downside is that it also keeps the vast majority of potential drug treatments from reaching the brain.
Currently available medications temporarily manage symptoms: acetylcholinesterase inhibitors block the actions of an enzyme that destroys a chemical messenger important for memory, while memantine blocks the toxic effects of another messenger, glutamate. New treatments, by contrast, aim to slow or reverse the disease process itself by targeting the underlying biology.
One approach, called immunotherapy, involves creating antibodies that bind to abnormal developments in the brain (such as amyloid-β or tau), and mark them for destruction by a range of mechanisms. Immunotherapy is experiencing a surge of interest and a number of clinical trials – targeting both amyloid-β and tau – are currently underway.
It’s estimated only 0.1% of antibodies circulating in the bloodstream enter the brain – this also includes the therapeutic antibodies currently used in clinical trials. An approach my team is taking is to use ultrasound to temporarily open the blood-brain barrier, which increases the uptake of Alzheimer’s drugs or antibody fragments.
We’ve had success in mice, finding ultrasound can clear toxic tau protein clumps, and that combining ultrasound with an antibody fragment treatment is more effective than either treatment alone in removing tau and reducing Alzheimer’s symptoms. The next challenge will be translating this success into human clinical trials.
The task of dementia drug development is no easy feat, and requires collaboration across government, industry and academia. In Australia, the National Dementia Network serves this purpose well. It’s only through perseverance and continued investment in research that we’ll one day have a treatment for dementia.
With thanks to Queensland Brain Institute Science Writer Donna Lu.
When you enter a room, your brain is bombarded with sensory information. If the room is a place you know well, most of this information is already stored in long-term memory. However, if the room is unfamiliar to you, your brain creates a new memory of it almost immediately.
MIT neuroscientists have now discovered how this occurs. A small region of the brainstem, known as the locus coeruleus, is activated in response to novel sensory stimuli, and this activity triggers the release of a flood of dopamine into a certain region of the hippocampus to store a memory of the new location.
“We have the remarkable ability to memorize some specific features of an experience in an entirely new environment, and such ability is crucial for our adaptation to the constantly changing world,” says Susumu Tonegawa, the Picower Professor of Biology and Neuroscience and director of the RIKEN-MIT Center for Neural Circuit Genetics at the Picower Institute for Learning and Memory.
“This study opens an exciting avenue of research into the circuit mechanism by which behaviorally relevant stimuli are specifically encoded into long-term memory, ensuring that important stimuli are stored preferentially over incidental ones,” adds Tonegawa, the senior author of the study.
Akiko Wagatsuma, a former MIT research scientist, is the lead author of the study, which appears in the Proceedings of the National Academy of Sciences the week of Dec. 25.
In a study published about 15 years ago, Tonegawa’s lab found that a part of the hippocampus called the CA3 is responsible for forming memories of novel environments. They hypothesized that the CA3 receives a signal from another part of the brain when a novel place is encountered, stimulating memory formation.
They believed this signal to be carried by chemicals known as neuromodulators, which influence neuronal activity. The CA3 receives neuromodulators from both the locus coeruleus (LC) and a region called the ventral tegmental area (VTA), which is a key part of the brain’s reward circuitry. The researchers decided to focus on the LC because it has been shown to project to the CA3 extensively and to respond to novelty, among many other functions.
The LC responds to an array of sensory input, including visual information as well as sound and odor, then sends information on to other brain areas, including the CA3. To uncover the role of LC-CA3 communication, the researchers genetically engineered mice so that they could block the neuronal activity between those regions by shining light on neurons that form the connection.
To test the mice’s ability to form new memories, the researchers placed the mice in a large open space that they had never seen before. The next day, they placed them in the same space again. Mice whose LC-CA3 connections were not disrupted spent much less time exploring the space on the second day, because the environment was already familiar to them. However, when the researchers interfered with the LC-CA3 connection during the first exposure to the space, the mice explored the area on the second day just as much as they had on the first. This suggests that they were unable to form a memory of the new environment.
The LC appears to exert this effect by releasing the neuromodulator dopamine into the CA3 region, which was surprising because the LC is known to be a major source of norepinephrine to the hippocampus. The researchers believe that this influx of dopamine helps to boost CA3’s ability to strengthen synapses and form a memory of the new location.
They found that this mechanism was not required for other types of memory, such as memories of fearful events, but appears to be specific to memory of new environments. The connections between the LC and CA3 are necessary for long-term spatial memories to form in CA3.
“The selectivity of successful memory formation has long been a puzzle,” says Richard Morris, a professor of neuroscience at the University of Edinburgh, who was not involved in the research. “This study goes a long way toward identifying the brain mechanisms of this process. Activity in the pathway between the locus coeruleus and CA3 occurs most strongly during novelty, and it seems that activity fixes the representations of everyday experience, helping to register and retain what’s been happening and where we’ve been.”
Choosing to remember
This mechanism likely evolved as a way to help animals survive, allowing them to remember new environments without wasting brainpower on recording places that are already familiar, the researchers say.
“When we are exposed to sensory information, we unconsciously choose what to memorize. For an animal’s survival, certain things are necessary to be remembered, and other things, familiar things, probably can be forgotten,” Wagatsuma says.
Still unknown is how the LC recognizes that an environment is new. The researchers hypothesize that some part of the brain is able to compare new environments with stored memories or with expectations of the environment, but more studies are needed to explore how this might happen.
“That’s the next big question,” Tonegawa says. “Hopefully new technology will help to resolve that.”
The research was funded by the RIKEN Brain Science Institute, the Howard Hughes Medical Institute, and the JPB Foundation.
How does the hospital environment affect our rehabilitation? New research from the University of Gothenburg, Sweden, into how the space around us affects the brain reveals that well-planned architecture, design and sensory stimulation increase patients’ ability to recover both physically and mentally. Digital textiles and multisensory spaces can make rehabilitation more effective and reduce the amount of time spent in care.
In an interdisciplinary research project, Kristina Sahlqvist has used research into the recovery of the brain to examine how hospitals can create better environments for rehabilitation.
“We want to help patients to get involved in their rehabilitation, a side effect of which can be an improvement in self-confidence,” says Sahlqvist, interior architect and researcher at the University of Gothenburg’s School of Design and Crafts (HDK).
The project drew on all the expertise used on a ward, with input from neurologists, rehabilitation doctors, nurses, psychologists, occupational therapists and physiotherapists. The result is a conceptual solution for an optimal rehabilitation ward.
“Our concept gives the ward a spatial heart, for example, where patients and their families can prepare food and eat together, which allows for a more normal way of spending time together in a hospital environment,” says Sahlqvist.
In tandem with her research work, she has teamed up with a designer and researcher at the Swedish School of Textiles in Borås on an artistic development project where they redesigned furniture, developed easy-grip cups and cutlery and used smart textiles, in other words textiles with technology embedded in them. The concept includes a table and chairs, a rug and a muff with integral heating, a cardigan with speakers and a soft bracelet that is also a remote control.
In order to measure and test the research theories Sahlgrenska University Hospital will be developing an intensive care room featuring multimodal stimulation, where all the senses are affected. The work involves an architect, doctors, hospital staff, musicians, a designer, an acoustician and a cognition specialist. In a bid to see what kind of results the environment can produce in practice, the researchers will take account of the entire social situation of patients, family and staff.
There are other interesting possibilities in the field of neuroarchitecture, where spatial design can be used, for example, to improve learning. Although such approaches are currently applied predominantly in schools, they could also have potential for the elderly.
“It’s worth wondering why there are so many educational models for preschool children but so few for the elderly. Many old people need a far more stimulating environment than they have at the moment,” says Sahlqvist.