An autocondimentor is someone who will put certainly salt and probably pepper on any meal you put in front of them, whatever it is, regardless of how much it's got on it already and regardless of how it tastes.
Behavioural therapists working for fast food outlets around the universe have saved these outlets billions of whatever the local currency is by noting the autocondimenting phenomenon and advising their employers to leave seasoning out in the first place.
This strange phenomenon was described by Terry Pratchett on page 66 of his Discworld® novel 'Reaper Man' (1991).
Some studies link this behaviour to high and low self-monitors. High and low self-monitors exhibit distinct behaviours related to social situations and self-presentation[1]. High self-monitors are adept at adapting their behaviour to fit different social contexts, while low self-monitors tend to be more consistent in their behaviour, regardless of the situation.
Studies have found that this difference can be seen in how they approach tasks like tasting food and adding salt: high self-monitors will taste first, while low self-monitors may add salt (or any other condiment) based on their internal preference, regardless of taste.
[1] Prislin, Kovrlija: Predicting behavior of high and low self-monitors: an application of the theory of planned behavior in Psychological Reports - 1992
Shrunken Pore Syndrome
Shrunken Pore Syndrome (SPS) is a recently identified kidney condition characterized by a disparity in the glomerular filtration rate (GFR) for molecules of different sizes, specifically affecting the filtration of medium-sized molecules like cystatin C compared to smaller molecules like creatinine. This difference is often observed when comparing cystatin C-based estimated GFR (eGFR) to creatinine-based eGFR, with SPS defined as eGFR(cystatin C) being significantly lower than eGFR(creatinine). SPS has been associated with increased mortality and morbidity, particularly in patients with cardiovascular and kidney diseases.
Your kidneys filter blood through tiny structures called glomeruli, which have pores that allow small molecules to pass through while retaining larger ones. Think of it as a sieve.
Normally, both small molecules like creatinine and slightly larger molecules like cystatin C are filtered efficiently.
However, in Shrunken Pore Syndrome, the pores in the glomeruli are thought to be narrowed or 'shrunken', selectively impacting the filtration of medium-sized molecules like cystatin C, while smaller molecules like creatinine are still filtered relatively well.
This leads to a lower eGFR calculated based on cystatin C levels compared to creatinine-based eGFR, indicating a selective impairment in the kidney's ability to filter larger molecules.
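To make that comparison concrete, here is a minimal sketch (Python, illustrative only) of the ratio test that appears in the SPS literature. The 0.6 cutoff is an assumption on my part of one commonly used definition (eGFR(cystatin C) at or below 60% of eGFR(creatinine)); some studies use 0.7 instead, so check the definition used in any particular paper or laboratory.

```python
def egfr_ratio(egfr_cystatin_c: float, egfr_creatinine: float) -> float:
    """Ratio of cystatin C-based eGFR to creatinine-based eGFR.

    Both values in mL/min/1.73 m^2.
    """
    return egfr_cystatin_c / egfr_creatinine

def suggests_sps(egfr_cystatin_c: float, egfr_creatinine: float,
                 cutoff: float = 0.6) -> bool:
    # SPS is commonly defined as eGFR(cystatin C) <= 60% (sometimes 70%)
    # of eGFR(creatinine); the 0.6 default here is an assumed cutoff,
    # not a universal standard.
    return egfr_ratio(egfr_cystatin_c, egfr_creatinine) <= cutoff

# Example: a cystatin C-based eGFR of 45 against a creatinine-based
# eGFR of 90 gives a ratio of 0.5, which would flag possible SPS.
print(suggests_sps(45.0, 90.0))  # True
```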
Is this bad, you might ask. The answer is: very bad. Shrunken Pore Syndrome is associated with increased mortality and morbidity, especially in individuals with type 2 diabetes and kidney diseases[1]. Shrunken Pore Syndrome has also been linked to the progression of atherosclerosis and poor outcomes in patients with cardiovascular disease[2]. Shrunken Pore Syndrome has been observed in severe cases of COVID-19, potentially contributing to poorer outcomes[3].
While the exact cause of Shrunken Pore Syndrome is still being investigated, it is hypothesized that changes in the glomerular filtration barrier, such as narrowing of the pores, contribute to the syndrome. Other factors like thickening of the glomerular basement membrane may also play a role. Inflammatory and neurohormonal mechanisms are also implicated in the progression of Shrunken Pore Syndrome.
[1] Bruce et al: The relationship between shrunken pore syndrome and all-cause mortality in people with type 2 diabetes and normal renal function: the Fremantle Diabetes Study Phase II in Diabetologia - 2025
[2] Xhakollari et al: The Shrunken pore syndrome is associated with poor prognosis and lower quality of life in heart failure patients: the HARVEST-Malmö study in ESC Heart Failure - 2021
[3] Larsson et al: Shrunken Pore Syndrome Is Frequently Occurring in Severe COVID-19 in International Journal of Molecular Sciences - 2022
[Schematic view of possible pathophysiology of Shrunken Pore Syndrome]
Malignant Narcissism
Malignant Narcissism is a severe toxic personality disorder characterized by a combination of Narcissistic Personality Disorder traits, antisocial behavior, and sometimes paranoid or sadistic tendencies.
It’s not (yet) a formal diagnosis in the DSM-5 but a term used in psychology to describe individuals with an extreme, toxic form of narcissism. It is thus conceptualized as a subcategory of Narcissistic Personality Disorder.
The proposed symptoms include:
- Grandiosity and self-centeredness: an inflated sense of self-importance, craving constant admiration, and believing they are superior to others.
- Lack of empathy: inability or unwillingness to recognize or care about others’ feelings or needs.
- Manipulative and exploitative behavior: using others for personal gain, often with deceit or charm, without remorse.
- Antisocial traits: disregard for rules, laws, or social norms, often engaging in aggressive or harmful behavior.
- Paranoia or suspicion: a tendency to mistrust others, believing they are out to undermine or harm them.
- Sadistic tendencies: deriving pleasure from others’ suffering or exerting control through fear or intimidation.
Unlike typical narcissism, which may involve arrogance but not necessarily malice, malignant narcissism includes a deliberate intent to harm or dominate others. It is often associated with destructive relationships, workplace toxicity, or even criminal behavior in extreme cases.
The term was first introduced by psychoanalyst Erich Fromm in 1964 and later expanded by others, notably Otto Kernberg in 1984. Fromm characterized the condition as a form of narcissism in which the individual takes pride in their own inherent traits rather than their achievements, and thus does not require a connection to other people or to reality[1]. Fromm, a German Jew who fled the Nazi regime in 1934, suggested that malignant narcissism is a severe and destructive pathology that can lie at the heart of the inhumane acts exhibited by dictatorial tyrants such as Hitler and Stalin. Some modern psychologists have suggested that Donald Trump is also a candidate for a diagnosis of Malignant Narcissism.
While not a formal diagnosis, it is mentioned in alternative models of personality disorders as a severe subtype of Narcissistic Personality Disorder, adding traits like antisocial behavior.
[1] Fromm: Individual and Social Narcissism in The Heart of Man: Its Genius for Good and Evil (1964)
Hanlon's Razor
Most of us know about Occam's Razor, the problem-solving adage that recommends searching for explanations that are constructed with the smallest possible set of elements.
Attributed to the 14th-century English philosopher, theologian, and writer William of Ockham, it is frequently cited as Entia non sunt multiplicanda praeter necessitatem, which translates as "Entities must not be multiplied beyond necessity", although Occam never used these exact words in his writings. Popularly, the principle is sometimes paraphrased as "of two competing theories, the simplest explanation of an entity is to be preferred."
But there's also a competing adage that suggests we should "Never attribute to malice that which is adequately explained by stupidity." In essence, it's a reminder to consider incompetence or oversight as a more likely explanation for negative actions or outcomes than deliberate ill intent.
While Occam's Razor and Hanlon's Razor are both simplifying principles, Occam's Razor focuses on the simplest explanation possible, while Hanlon's Razor specifically focuses on the absence of malice as the simpler explanation.
Hanlon's Razor first appeared in print as a submission credited to Robert J. Hanlon in 'Murphy's Law Book Two: More Reasons Why Things Go Wrong!' (1980), a compilation of jokes related to Murphy's Law.
Much earlier, a similar quotation appeared in Robert A. Heinlein's novella 'Logic of Empire' (1941). The character Doc in the story describes the "devil theory" fallacy, explaining, "You have attributed conditions to villainy that simply result from stupidity."
Writer Terry Pratchett also seems to have grasped the difference between stupidity and intelligence when he remarks on page 215 of 'Hogfather': 'Real stupidity beats artificial intelligence every time.'
Dark Forest Hypothesis
Despite the claims of some deluded minds that alien UFOs exist, there is no known reliable or reproducible evidence that aliens have ever visited or attempted to contact Earth. No transmissions and no firm evidence of intelligent extraterrestrial life have been detected or observed.
That is a problem, because cosmologists presume that the universe is filled with a very large number of planets, some of which probably present conditions hospitable to life. Some of these planets must be home to intelligent life. This discrepancy between the lack of conclusive evidence of advanced extraterrestrial life and the apparently high likelihood of its existence is called the Fermi Paradox.
The Dark Forest Hypothesis is the conjecture that many alien civilizations do exist throughout the universe, but that they are both silent and hostile, maintaining their undetectability for fear of being destroyed by another hostile and undetected civilization[1]. Also, according to the dark forest hypothesis, since the intentions of any newly contacted civilization can never be known with certainty, if one is encountered it is best to shoot first and ask questions later, in order to avoid the potential extinction of one’s own species.
Think of a dark and forbidding forest filled with armed hunters stalking through it in search of prey. Any sound made by potential prey would likely result in a kill. Viewed the other way: any sound made by the hunters might attract a top predator.
Therefore, any space-faring civilization would view any other intelligent life as an inevitable threat, and would strive to destroy any nascent life that makes itself known. As a result, electromagnetic radiation surveys would find no evidence of intelligent alien life.
The hypothesis derives its name from Liu Cixin's science fiction novel 'The Dark Forest' (2008), although the concept predates the novel[2].
[1] Yu: The Dark Forest Rule: One Solution to the Fermi Paradox in Journal of the British Interplanetary Society - 2015
[2] Brin: The Great Silence - the Controversy Concerning Extraterrestrial Intelligent Life in Quarterly Journal of the Royal Astronomical Society - 1983
Napoleon Complex
The Napoleon Complex, also known as Napoleon syndrome or short-man syndrome, is a popular, but not scientifically recognized, term describing a domineering or aggressive behavioural pattern, often associated with short men. It's believed that individuals with this 'complex' overcompensate for their perceived lack of stature through assertiveness and ambition. Both commonly and in psychology, the Napoleon complex is regarded as a derogatory social stereotype.
The term is named after Napoleon Bonaparte (1769-1821), the first emperor of the French, who, despite being of average height for his time at 1.67 metres, is often depicted as having a short stature. This, coupled with his ambitious and often aggressive leadership style, led to the association of his personality with the complex.
People often attributed with the Napoleon complex are said to exhibit certain traits, such as aggressiveness, assertiveness, ambition, and competitiveness.
They might be quick to anger or have a strong need to dominate situations (Aggressiveness). They tend to be forceful and direct in their communication towards others (Assertiveness). They may pursue highly ambitious goals as a way to compensate for perceived shortcomings (Ambition). They might be overly competitive, especially with taller individuals (Competitiveness).
Captain Mainwaring in Dad's Army is a perfect example of this complex.
It's important to understand that the Napoleon complex is not (yet) a recognized mental health condition. It's a social term used to describe a perceived pattern of behaviour.
While the term is widely used, research on whether shorter men are inherently more aggressive or ambitious is limited and has yielded mixed results. Some studies suggest that height might influence social dynamics, but it's not a definitive predictor of personality or behaviour.
Researchers found that men who were 1.63 metres in height were 50% more likely to show signs of jealousy than men who measured 1.98 metres[1]. Even evolutionary psychologists found evidence for the Napoleon complex in human males[2].
It is important to consider other factors that might contribute to aggressive or assertive behaviour, such as an inferiority complex, personality traits, social experiences, and individual motivations.
[1] Buunk et al: Height predicts jealousy differently for men and women in Evolution and Human Behavior - 2008
[2] Knapen et al: The Napoleon Complex: When Shorter Men Take More in Psychological Science - 2018
West Nile Virus Neuroinvasive Disease
West Nile Virus (WNV) is an emerging or re-emerging mosquito-borne virus, increasingly present in most European countries.
Though most infected individuals are asymptomatic (80%) or develop a self-limited flu-like disease (20%), around 1% will develop the severe form of West Nile Virus infection, called West Nile Virus Neuroinvasive Disease (WNV NID)[1]. The case fatality rate in West Nile Virus Neuroinvasive Disease is approximately 10%, and patients may have prolonged Intensive Care Unit (ICU) stays with considerable long-term morbidity and mortality[2].
West Nile Virus Neuroinvasive Disease manifests itself as meningitis, encephalitis, Acute Flaccid Paralysis (AFP), or a combination of those.
West Nile Virus meningitis is characterized by fever and signs of meningeal inflammation, such as nuchal rigidity, photophobia, and nausea and vomiting. West Nile Virus encephalitis is associated with prolonged altered mental status, seizures, or focal neurological signs.
Patients with severe West Nile Virus encephalitis may present with stupor or coma. Acute paralysis associated with West Nile Virus infection has been attributed to a poliomyelitis-like syndrome, myeloradiculitis, and Guillain–Barré Syndrome (GBS). West Nile Virus poliomyelitis with or without brainstem involvement is the most common neuromuscular manifestation of a West Nile Virus infection, resulting in asymmetric paralysis.
The motor neurons in the anterior horns and in the brainstem are the major sites of pathology responsible for neuromuscular signs; however, inflammation may also involve motor axons (polyradiculitis) and peripheral nerves (GBS). In comparison to patients with poliomyelitis-like syndrome, those resembling GBS have symmetric weakness with sensory loss.
No vaccine for humans is yet available. Horses can be vaccinated with a choice of vaccines.
[1] Santini et al: Severe West Nile Virus Neuroinvasive Disease: Clinical Characteristics, Short- and Long-Term Outcomes in Pathogens - 2022
[2] Sejvar: Clinical manifestations and outcomes of West Nile virus infection in Viruses - 2014
Cultural Cringe
The term 'Cultural Cringe' was originally coined by Australian writer, critic and teacher Arthur Angell Phillips (1900-1985) in his pioneering essay 'The Cultural Cringe' (1950), which set the early terms for post-colonial theory in Australia[1]. The term is now widely used to describe feelings of inferiority about one's own culture compared to another, often more dominant, culture.
'The Cultural Cringe' explored the ingrained feelings of inferiority that local Australian intellectuals struggled against, which were most clearly pronounced in the Australian theatre, music, art and letters. Phillips pointed out that the public widely assumed that anything produced by local dramatists, actors, musicians, artists and writers was necessarily deficient when compared against the works of European counterparts. The only ways local arts professionals could build themselves up in public esteem were either to follow overseas fashions or, more often, to spend a period of time working in Britain. In some professions this attitude even affected employment opportunities, with only those who had worked in London being treated as worthy of appointment or promotion. Thus, over the early to mid 20th century, the cultural cringe brought about a pattern of temporary residence in Britain for many young, talented Australians across a broad range of fields, from the arts to the sciences.
The term 'Scottish Cringe' is a cultural phenomenon described by some commentators, politicians, and scholars. It refers to a sense of cultural inferiority or embarrassment some Scots may feel about their own cultural identity, particularly in relation to the perceived dominance of English or Anglocentric British culture within the United Kingdom.
In Scotland, this phenomenon manifests itself with the following symptoms:
- Feelings of low self-worth or embarrassment when expressing overt Scottish cultural identity[2], such as using the Lowland Scots or Scottish Gaelic languages, wearing kilts, or embracing traditional Scottish heritage.
- A perceived sense of inferiority relative to English culture, attributed by some to historical and ongoing dominance of English cultural norms within the UK, particularly centered in London.
- Internalized self-doubt about Scotland’s ability to govern itself or succeed independently, often linked to historical events like the 1707 Act of Union and centuries of cultural and political subordination.
The term is often used in discussions about Scottish identity, national confidence, and the psychological impact of historical and cultural dynamics, particularly post-Union and during debates about Scottish independence.
[1] Phillips: The Cultural Cringe in Meanjin (p. 299-302) - 1950
[2] Unger: Legitimating inaction: Differing identity constructions of the Scots language in European Journal of Cultural Studies - 2010
Shit Life Syndrome
Shit Life Syndrome (SLS) is a phrase used by physicians in English-speaking nations to describe the detrimental effect that a variety of poverty- or abuse-induced disorders can have on patients.
It was US doctors who coined the phrase. Poor working-age Americans of all races are locked in a cycle of poverty and neglect, amid wider affluence. They are ill educated and ill trained. The jobs available are drudge work paying the minimum wage, with minimal or no job security. They are trapped in poor neighbourhoods where the prospect of owning a home is a distant dream. There is little social housing, scant income support and contingent access to healthcare. Finding meaning in such a life is close to impossible; the struggle to survive commands all intellectual and emotional resources. Yet turn on the TV or visit a middle-class shopping mall and a very different and unattainable world presents itself. Knowing that you are valueless, you resort to drugs, antidepressants and alcohol. You eat junk food and watch your ill-treated body grow obese. It is not just poverty, but growing relative poverty in an era of rising inequality, with all its psychological side effects, that is the killer.
In 2017, Sarah O’Connor wrote an article for the Financial Times titled ‘Left behind: can anyone save the towns the economy forgot?’, in which she observed Shit Life Syndrome in the English coastal town of Blackpool. The article won the 2018 Orwell Prize for Exposing Britain’s Social Evils. O’Connor wrote: "Blackpool exports healthy skilled people and imports the unskilled, the unemployed, and the unwell. As people overlooked by the modern economy wash up in a place that has also been left behind, the result is a quietly unfolding health crisis."
She observed: “More than a tenth of the town’s working-age inhabitants live on state benefits paid to those deemed too sick to work. Antidepressant prescription rates are among the highest in the country. Life expectancy, already the lowest in England, has recently started to fall. Doctors in places such as this have a private diagnosis for what ails some of their patients: ‘Shit Life Syndrome’ … People with SLS really do have mental or physical health problems, doctors say. But they believe the causes are a tangled mix of economic, social, and emotional problems that they — with 10- to 15-minute slots per patient — feel powerless to fix. The relationship between economics and health is blurry, complex and politically fraught. But it is too important to ignore.”
Illusory Truth Effect
Some define truth by what it is not, rather than what it is. Some argue that fake news has no factual basis, which implies that truth is equated simply to facts[1].
The illusory truth effect is also known as the illusion of truth effect, validity effect, truth effect, or even the reiteration effect. It is the tendency to believe false information to be correct after repeated exposure to it.
This phenomenon was first identified in a 1977 study[2]. The study found that, when truth is assessed, people tend to rely on whether the information is in line with their understanding or whether it feels familiar. The first condition is logical, as people compare new information with what they already know to be true. Repetition makes statements easier to process relative to new, unrepeated statements, leading people to believe that the repeated conclusion is more truthful.
“Repeat a lie often enough and it becomes the truth” is a law of propaganda often (wrongly) attributed to Joseph Goebbels (1897-1945), chief propagandist for the Nazi Party. Another variant is the große Lüge ('big lie'), a gross distortion or misrepresentation of the truth primarily used as a political propaganda technique.
The illusory truth effect has also been linked to hindsight bias, in which the recollection of confidence is skewed after the truth has been received.
Even incidental exposure reaches further than previously thought, with potentially consequential implications for concerns around mis- and dis-information[3].
The illusory truth effect plays a significant role in fields such as advertising, news media, political propaganda, and religious indoctrination. Conspiracy theories rely heavily on the illusory truth effect. People with below-average intelligence and little schooling are said to be especially prone to believing anything that aligns with their pre-existing beliefs.
[1] Paskin: Real or fake news: Who knows? in Journal of Social Media in Society - 2018
[2] Hasher et al: Frequency and the conference of referential validity in Journal of Verbal Learning and Verbal Behavior - 1977
[3] Mikell, Powell: Illusory implications: incidental exposure to ideas can induce beliefs in Royal Society - 2025
Autogynephilia
Paraphilia can be described as an experience of recurring or intense sexual arousal to atypical objects, places, situations, fantasies, behaviors, or individuals. Paraphilias are contrasted with normal sexual interests, although the definition of what makes a sexual interest normal or atypical remains controversial.
The exact number and taxonomy of paraphilia is under debate. The DSM-5 adds a distinction between paraphilias and "paraphilic disorders", stating that paraphilias do not require or justify psychiatric treatment in themselves, and defining paraphilic disorder as "a paraphilia that is currently causing distress or impairment to the individual or a paraphilia whose satisfaction has entailed personal harm, or risk of harm, to others".
The DSM-5 has specific listings for eight paraphilic disorders: voyeuristic disorder, exhibitionistic disorder, frotteuristic disorder, sexual masochism disorder, sexual sadism disorder, pedophilic disorder, fetishistic disorder, and transvestic disorder. Other paraphilic disorders can be diagnosed under the Other Specified Paraphilic Disorder or Unspecified Paraphilic Disorder listings, if accompanied by distress or impairment.
But there's another type of paraphilia that seems to have been prevented from entering public awareness: Autogynephilia.
Autogynephilia (derived from Greek for 'love of oneself as a woman') is defined as a male's propensity to be sexually aroused by the thought of himself as a female. It is the paraphilia that is theorized to underlie transvestism and some forms of male-to-female (MtF) transsexualism. Autogynephilia encompasses sexual arousal with cross-dressing and cross-gender expression that does not involve women's clothing per se[1].
The problem with autogynephilia is that it offers predatory men the possibility to put on a wig and to claim that they 'are women' so they can 'rightfully' enter spaces that would normally be exclusively for women, like toilets, changing rooms, and prisons for females.
Political trans activists and their 'allies' have done everything they could to keep the concept of autogynephilia from public awareness. And yet, with excruciating slowness but apparent inevitability, the concept is entering public awareness anyway.
Remember, autogynephilia is a disorder that needs treatment because of the criterion used to diagnose any paraphilic disorder: '... whose satisfaction has entailed personal harm, or risk of harm, to others'.
[1] Lawrence: Autogynephilia: an underappreciated paraphilia in Advances in Psychosomatic Medicine - 2011
Ketamine Bladder Syndrome
Ketamine is often used for its anesthetic and analgesic effects on cats, dogs, rabbits, rats, and other small animals. It is frequently used in induction and anesthetic maintenance in horses.
Ketamine is a dissociative anesthetic used medically for induction and maintenance of anesthesia. It is also used as a treatment for depression and in pain management. Ketamine is an NMDA receptor antagonist which accounts for most of its psychoactive effects.
At anesthetic doses, ketamine induces a state of dissociative anesthesia, a trance-like state providing pain relief, sedation, and amnesia.
As a result of its dissociative and paralytic effects, ketamine is increasingly popular for 'recreational use'. But regular use can have some unexpected detrimental effects on your health. Serious and frequently irreversible damage to the urinary tract is a recently recognised and important side effect of recreational ketamine use.
Ketamine is toxic to your bladder and urinary tract. Urinary toxicity occurs primarily in people who use ketamine routinely, with around 25% of frequent users having bladder complaints. These complaints span a range of disorders, from cystitis to hydronephrosis to kidney failure.
The typical symptoms of ketamine-induced cystitis are frequent urination, dysuria, and urinary urgency sometimes accompanied by pain during urination and blood in urine[1]. The damage to the bladder wall has similarities to both interstitial and eosinophilic cystitis. The wall is thickened and the functional bladder capacity is as low as 10–150 millilitres.
Management of ketamine-induced cystitis involves ketamine cessation as the first step[2]. This is followed by NSAIDs and anticholinergics and, if the response is insufficient, by tramadol. The second line treatments are epithelium-protective agents such as oral pentosan polysulfate or intravesical (intra-bladder) instillation of hyaluronic acid.
Guess who is a regular user of ketamine 'for his depression'? Elon Musk.
[1] Castellani et al: What urologists need to know about ketamine-induced uropathy: A systematic review in Neurourology and Urodynamics - 2020
[2] Srirangam, Mercer: Ketamine bladder syndrome: an important differential diagnosis when assessing a patient with persistent lower urinary tract symptoms in British Medical Journal - 2012
Hound of the Baskervilles Effect
The Hound of the Baskervilles Effect, also known as the Baskervilles Effect, is a supposed self-fulfilling prophecy: an increase in the rate of mortality from heart attacks on days considered unlucky, caused by the psychological stress such days place on superstitious people.
The term derives from the Sherlock Holmes novel 'The Hound of the Baskervilles' in which a hellish-looking dog chases Sir Charles Baskerville, sufferer of a chronic heart disease, and who subsequently dies of a heart attack "with an expression of horror in his face".
The Baskerville Effect was named by David Phillips and his colleagues at the University of California (San Diego, USA) in a paper in which they claimed that the daily number of deaths of Chinese and Japanese Americans from heart attacks between 1973 and 1998 was 7 percent higher on the fourth of the month compared to the average for the other days in that month, while this was not observed in the general American population[1].
Four is considered an unlucky number in Chinese, and hence also in Japanese and Korean, because it sounds like the Chinese word for 'death'. As a result of this superstition, some Chinese and Japanese hotels avoid using it as a room number.
In an analysis of the 20,000 computerized death certificates of Asian-Americans in San Diego, Phillips discovered that there was a 13 percent uptick in death rates on the fourth of the month. The hypothesis was that the peak was caused by stress induced by the superstition surrounding this number.
However, not everyone was convinced by this correlation. In 2002, Gary Smith commented that Phillips and colleagues had omitted data from several heart disease categories, picking only those that happened to have a higher rate on the fourth day and calling them 'chronic heart diseases'. Smith also pointed out that they had not done this in their previous studies of Jewish deaths near Passover and Chinese deaths near the Harvest Moon, where they had used all heart disease categories[2].
Smith also found no statistically significant peaks on day 4 in data from 1969 to 1988 and 1999 to 2001 for total coronary deaths, inpatients, or the subset of heart diseases used by Phillips and colleagues, adding that there were more deaths on day 5 in the 1969–1988 data, and more deaths on day 3 in the 1999–2001 data.
In 2003, Panesar and colleagues looked for this effect in the Chinese population of Hong Kong. They examined mortality data from 1995 to 2000, comparing the days of the month with "deathly connotations" (4, 14 and 24) with the other days of the month on both the Lunar and Gregorian calendars, and found no statistically significant difference in the occurrence of cardiac deaths in Cantonese people[3].
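For readers curious how such claims are tested, the sketch below shows the general shape of the analysis using a chi-square goodness-of-fit test in Python. The counts are invented for illustration (this is not data from any of the papers cited here, and the actual studies used more elaborate comparisons than this):

```python
# Illustrative sketch: is day 4 of the month over-represented among
# cardiac deaths? Uses invented counts, not data from the cited studies.
import numpy as np
from scipy.stats import chisquare

rng = np.random.default_rng(seed=42)

# Hypothetical daily death counts for days 1..28 (so that every month
# contributes each day equally), with an injected excess on day 4.
deaths = rng.poisson(lam=100, size=28)
deaths[3] += 30  # index 3 is day 4

# Under the null hypothesis, deaths are spread evenly across the days.
expected = np.full(28, deaths.sum() / 28)

stat, p_value = chisquare(f_obs=deaths, f_exp=expected)
print(f"chi-square = {stat:.1f}, p = {p_value:.4f}")
```

Note that a significant p-value here only shows that some day deviates from uniformity; pinning the excess on day 4 specifically, as Phillips and Smith both attempted, requires comparing day 4 directly against the average of the other days.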
[1] Phillips et al: The Hound of the Baskervilles effect: natural experiment on the influence of psychological stress on timing of death in British Medical Journal - 2001
[2] Smith: Scared to Death? in British Medical Journal - 2002
[3] Panesar et al: Is four a deadly number for the Chinese? in Medical Journal of Australia - 2003
Swollen Head Syndrome
Avian metapneumovirus (aMPV) causes turkey rhinotracheitis, also known as avian pneumovirus infection, an acute respiratory tract infection of turkeys. The virus is also associated with Swollen Head Syndrome (also known as avian rhinotracheitis) in broilers and broiler breeders, as well as with reproductive disorders that result in a marked drop in egg production in chickens and ducks.
The typical clinical signs seen in chickens infected with the virus include swelling of the periorbital and infraorbital sinuses, particularly around the eye, coupled with mild conjunctivitis. Some respiratory signs may also be observed. The disease usually lasts two to three weeks.
Swollen Head Syndrome affects not only chickens, turkeys, and guinea fowl, but also pheasants and Muscovy ducks. Geese, most other duck species, and possibly pigeons, on the other hand, are thought to be refractory to the disease.
Respiratory signs occur in young birds, while adults are affected by drops in egg production, usually by 5% to 30%. Transmission of the virus is lateral, by aerosol through the respiratory route; it spreads both airborne and mechanically (via feed, water, and equipment).
Avian metapneumovirus was first detected in turkeys in South Africa in the late 1970s, and has since spread to all major poultry-producing regions of the world, except Australia.
In 2001, the first human metapneumovirus (hMPV) was isolated in The Netherlands and classified as a member of the genus Metapneumovirus, which predominantly causes respiratory infections in humans.
Experimental studies suggest that turkeys are also susceptible to hMPV[1]. Complete genome sequencing has confirmed that the genomic organization of hMPV is similar to that of aMPV.
Vaccines are available for immunization of chickens and turkeys, and they are widely used in countries where the disease is endemic.
[1] Velayudhan et al: Human metapneumovirus in turkey poults in Emerging Infectious Diseases – 2006
Rebecca Syndrome
The Rebecca Syndrome is also known as retroactive jealousy: the pathological emergence of jealousy towards an ex-partner of one's current partner. In other words, it describes a severe form of pathological jealousy that a person experiences toward their partner's former lover.
The name of this syndrome was coined around the year 2006 by Dr. Darian Leader, a psychoanalyst and founding member of the Centre for Freudian Analysis and Research in London (UK).
The feeling of jealousy is considered pathological when it arises without solid grounds and when it reaches dimensions that affect the normal behavior of the person suffering from it.
This syndrome is named in homage to the novel Rebecca, written by Daphne du Maurier (1907-1989). The novel tells the story of the new Mrs de Winter, who moves to Manderley with her husband and is haunted by the ghost of his late first wife, Rebecca, convinced that he is still deeply in love with this seemingly perfect woman[1].
For some people, the Rebecca Syndrome is torturous. A relationship between two people might unconsciously be between three people. The ghost is always there.
The Rebecca Syndrome is not (yet) officially recognized as a psychological disorder in mainstream diagnostic classifications. It serves more as a cultural reference. The obsessions associated with this condition can lead to intrusive thoughts, stalking behaviors, and other negative consequences.
[1] Lee Glendinning: Rebecca syndrome: Or why increasing numbers of divorced and bereaved men are remarrying the 'ghosts' of their former wives in The Independent – 2006. See here.
Morton's Foot Syndrome
Morton's Foot Syndrome is known under a plethora of other names, such as Morton's toe, Morton's foot, Greek foot or Royal Foot.
This syndrome is characterized by a second toe that is longer than the big toe. This is because the first metatarsal, behind the big toe, is short compared to the second metatarsal next to it. It is a type of brachymetatarsia.
The most common symptom experienced due to Morton's Foot Syndrome is callusing and/or discomfort of the ball of the foot at the base of the second toe. The base of the big toe would normally bear the majority of a person's body weight during walking, but because the second metatarsal head is now farthest forward, the force is transferred there. Pain may also be felt in the arch of the foot, at the proximal (ankle-side) end of the first and second metatarsals.
Among the issues associated with Morton's Foot Syndrome is that the weight distribution causes the front of the foot to widen as the weight shifts from the first shortened toe to the others. Regular shoes will often cause metatarsalgia and neuromas as the shoe pushes together the toes. Wide shoes are recommended.
The name derives from American orthopedic surgeon Dudley Joy Morton (1884–1960)[1].
The ancient Greeks considered a longer second toe an ideal of beauty. This idealized foot shape can still be seen in art and statues from Greece and Rome: the Venus de Milo, Leonardo da Vinci's Vitruvian Man, and Michelangelo's David all display Morton's Foot Syndrome.
The Vikings thought that if your second toe was long, you would live a long time.
Between 3% and 15% of people have a Greek toe.
[1] Schimizzi, Brage: Brachymetatarsia in Foot and Ankle Clinics - 2004. See here.
Todestrieb or Thanatos Urge
In classical Freudian psychoanalytic theory, the Todestrieb ('death drive') is the unconscious drive towards one's own death and destruction, often expressed through behaviors such as aggression, repetition compulsion, and self-destructiveness. Before Freud incorporated the Todestrieb into his theory, his fundamental opposition was between the Ichtriebe ('ego drives') and the Sexualtriebe ('sexual drives'), a differentiation founded on the two-fold role of each individual being.
Sigmund Freud (1856-1939) first used the term in his 1920 essay Jenseits des Lustprinzips ('Beyond the Pleasure Principle'), but he 'borrowed' it from Sabina Spielrein (1885-1942), who mentioned it earlier in her paper Die Destruktion als Ursache des Werdens ('Destruction as the Cause of Coming Into Being') from 1912.
The concept of Todestrieb has been translated as "opposition between the ego (or death instincts) and the sexual (or life instincts)". The Todestrieb opposes Eros, the tendency toward survival, propagation, sex, and other creative, life-producing drives.
The Todestrieb is sometimes referred to as the Thanatos Urge in post-Freudian thought (in reference to the Greek personification of death), complementing Eros, although this term was never used by Sigmund Freud himself.
The terminology Thanatos Urge or simply Thanatos was introduced by Wilhelm Stekel (1868-1940) in 1909, though he used the words to signify a death-wish.
Still, it is a little odd that Sigmund Freud himself never, except in conversation, used the term Thanatos for the death instinct, one that has become so popular since.
At first, Freud used the terms 'death instinct' and 'destructive instinct' indiscriminately, alternating between them, but in his discussion with Albert Einstein about war he made the distinction that the former is directed against the self and the latter, derived from it, is directed outward.
Koala Immune Deficiency Syndrome
Koala Immune Deficiency Syndrome (KIDS) is caused by the Koala retrovirus (KoRV). Infection with this virus results in an AIDS-like immunodeficiency that leaves infected koalas (Phascolarctos cinereus) more susceptible to infectious diseases and cancers.
Koala retrovirus is closely related genetically to gibbon-ape leukemia virus (GaLV), feline leukemia virus (FeLV), and porcine endogenous retrovirus (PERV)[1].
The Koala retrovirus is thought to be a recently introduced exogenous virus that is also integrating into the koala genome (becoming endogenous). Thus the virus can transmit both horizontally (from animal to animal in the classic viral sense) and vertically (from parent to offspring as a gene).
Koala retrovirus was initially described as a novel endogenous retrovirus found within the koala genome and in tissues as free virions. Analysis showed that KoRV is an active replicating endogenous retrovirus that can also produce infectious virions.
The analysis also showed that KoRV was closely related to the highly pathogenic gibbon ape leukemia virus (GALV).
Some 80% of all deaths of captive koalas in Queensland (Australia) from leukemia, lymphoma, malignant tumours, and immune deficiency disorders are attributable to the virus. The virus is considered a threat that could lead to the extinction of koalas in Queensland within 15 years.
Research has also shown that some populations of koalas, particularly an isolated colony on Kangaroo Island, do not appear to have the endogenous form of the retrovirus. This suggests that the viral gene sequence is a rather recent acquisition in the koala genome.
Prevalence of KoRV (and KIDS) in Australian koala populations suggests a trend spreading from the north down to the south of Australia: northern populations are completely infected, while some southern populations are still free.
In 2013, an exclusively exogenous subtype of KoRV was identified and termed Koala retrovirus-B (KoRV-B), with the endogenous form of KoRV referred to as Koala retrovirus-A (KoRV-A)[2]. KoRV-B will likely remain exogenous and more pathogenic than KoRV-A, because the deleterious effects it causes in its hosts will not be selected against to the extent they would in a virus capable of integrating into the germ line. So far, nine subtypes of KoRV have been isolated (KoRV-A to KoRV-I).
Currently, no vaccine or effective treatment is available for KoRV or its associated neoplastic diseases.
[1] Kayesh et al: Koala retrovirus epidemiology, transmission mode, pathogenesis, and host immune response in koalas (Phascolarctos cinereus): a review in Archives of Virology - 2020
[2] Xu et al: An exogenous retrovirus isolated from koalas with malignant neoplasias in a US zoo in PNAS - 2012
Tiara Syndrome
The Tiara Syndrome describes women being too reluctant to apply for promotions, even well-deserved ones, simply believing that good job performance will naturally lead to rewards[1]. The term was coined by Carol Frohlinger and Deborah Kolb, the founders of Negotiating Women, Inc., an advisory firm committed to helping organizations advance talented women into leadership positions.
The Tiara Syndrome is related to the Imposter Syndrome. Women often undervalue their skills and are less effective at self-promotion than their male counterparts. A number of strategies can help battle this syndrome and ease the stress of 'taking off the tiara'.
As Carol Frohlinger says, "Women expect that if they keep doing their job well someone will notice them and place a tiara on their head. That never happens."
Her comment was made particularly in relation to negotiating starting salary and pay rises, which men tend to be more comfortable doing than women. However, as Sheryl Sandberg writes in her (ghost-written) book 'Lean In: Women, Work, and the Will to Lead' (2013): "Women are also more reluctant to apply for promotions even when deserved, often believing that good job performance will naturally lead to rewards."
Of course a high level of performance is the entry ticket to career progress but sometimes this very diligence gets in the way of fast tracking your career. Many women I work with in the City explain that they are so busy doing the operational aspects of their job that they don't have time to step back and focus on strategic priorities, for example. Nor do they feel they have space in their busy working weeks to fit in networking which is seen as an unnecessary - and often uncomfortable - use of their precious time. Similarly they do not seek mentors to guide them or indeed the support of sponsors to give them the invaluable exposure and opportunities needed to step up to senior leadership positions.
Psychologist Cordelia Fine says such behaviour stems from socialisation, not innate differences between the sexes. Some men also suffer from it, just as many women may not, but like the Imposter Syndrome it does appear to be more of a female behaviour.
[1] Fitzpatrick, Curran: Waiting for your coronation: a career-limiting trap in Nursing Economics – 2014.
Grey Gorilla Syndrome
Assessment and decision-making skills are considered inherent to nursing, but what the underlying cognitive process involves, or how it is developed and used, has received much less attention. The process by which nurses link together basic knowledge, past experiences, and 'gut feelings' as a basis for decisions is called the 'Nursing Gestalt'. New nurses learn to make assessments, diagnoses, and sound judgements about care from a more experienced nurse who supports and teaches the neophyte. Researchers Pyles and Stern called this mentoring relationship the Grey Gorilla Syndrome, in reference to the silverback primate who serves as a leader-teacher-protector-role model for his troop.
Data were collected from interactions and in-depth interviews with 28 subjects from all levels of basic nursing who worked in medical intensive care units. The investigation studied the practice of these nurses in the early detection of cardiogenic shock. Nurses who had the support and guidance of Grey Gorillas expressed feelings of greater self-actualization, more job satisfaction, better peer relationships, and less stress. An emotional involvement and intense relationship developed between the Grey Gorilla and the neophyte. Units having a Grey Gorilla were observed to be quieter and more efficiently organized.
Barriers to this mentor-neophyte relationship are timing and accessibility since potential Grey Gorillas often have conflicting demands or work shifts. Organizational and managerial duties take up the time of head nurses and coordinators, and neophytes find themselves working evenings or nights with other equally inexperienced nurses. Some nurses with the necessary experience and expertise to assume a mentoring role are unapproachable or are reluctant to share their knowledge. Neophytes were not found to be reluctant or too competitive to enter into the relationship, and expressed regret only during the weaning phase.
The researchers suggest that problems of patient care, burnout, and turnover could be reduced by encouraging the use and development of Grey Gorillas, providing positive feedback, and recognizing their contribution.
Source: Pyles, Stern: Discovery of nursing gestalt in critical care nursing: The importance of the Grey Gorilla Syndrome in Journal of Nursing Scholarship - 1983
Immigration Delay Disease
Immigration Delay Disease is the somewhat jocular name of a genetic disorder called adermatoglyphia. This is an extremely rare genetic disorder that prevents the development of fingerprints. Just five extended families worldwide are known to be affected by this condition.
Adermatoglyphia - from Ancient Greek a- (ἀ-) 'not' + dérma (δέρμα) 'skin' + gluphḗ (γλυφή) 'carving' - is the absence of ridges on the skin on the pads of the fingers and toes, as well as on the palms of the hands and soles of the feet. The patterns of these ridges (called dermatoglyphs) form whorls, arches, and loops that are the basis for each person's unique fingerprints. Because no two people have the same patterns, fingerprints have long been used as a way to identify individuals.
The name 'Immigration Delay Disease' was coined by Professor Peter Itin, a dermatologist based in Basel (Switzerland), after his first patient had trouble traveling into the U.S. without any fingerprints for identification.
In 2010, a report was published describing a person from Switzerland who lacked fingerprints[1]. The heterozygous expression of the mutation suggests an autosomal dominant mode of inheritance. The Swiss patient, and eight of her relatives who also had the mutation, all had 'flat finger pads and a reduced number of sweat glands in the hands'.
The medical condition and the 2007 Swiss medical case are both mentioned in the episode entitled "She Was Murdered Twice" (Series 4, Episode 7) of the television series Death in Paradise.
[1] Burger et al: The immigration delay disease: Adermatoglyphia–inherited absence of epidermal ridges in Journal of the American Academy of Dermatology - 2010
Barbed Wire Disease
The Swiss physician Adolf Lukas Vischer (1884–1974) was an observer of the impact of the First World War on the human condition. In 1918 Vischer published an account of the psychological harm done to young men through the modern phenomenon of wartime captivity in POW and internment camps. The name of the book was Die Stacheldrahtkrankheit ('The Barbed-Wire Disease').
Vischer’s observations indicated that those who had been in enemy captivity for extended periods —two years or more— were also suffering from a particular kind of mental illness characterized by disinterest in life beyond the camp, restlessness and an inability to concentrate. He also witnessed similar symptoms among European and Indian POWs in Turkish captivity during an inspection tour with the Red Cross in Asia Minor in 1916–17, and again among German civilian internees on the Isle of Man and prisoners held in military and civilian camps on the British mainland.
This brought Vischer to the conclusion that what was already being dubbed ‘barbed-wire disease’ in some of the camp newspapers was a universal human response to being held behind barbed wire for prolonged stretches of time.
The syndrome was something common to all long-term prisoners. Furthermore, it was not eased or worsened by education, class, ethnicity or religion of any particular group of prisoners; rather, its sole cause was the fact of living behind barbed wire itself. The degree of severity depended primarily on the duration of captivity, not on experiences prior to capture.
At the same time in France, physicians observed these same effects of captivity, but they invented their own term: cafard, from the Arabic kafir ('unbeliever'), but with the corrupted meaning of 'depression' or 'melancholy'. They saw cafard as a form of spiritual home-sickness to be fought against and overcome, rather than as a medical condition that could only be treated, if at all, by release back into civilian life.
Though not exactly the same, modern researchers use the term 'institutionalization' or 'institutional syndrome' to describe deficits or disabilities in social and life skills, which develop after a person has spent a long period living in prisons or mental hospitals. These individuals may be deprived of independence and of responsibility, to the point that once they return to 'outside life' they are often unable to manage many of its demands.
Bromism
Bromism is the syndrome that results from long-term consumption of bromide, usually through bromine-based sedatives such as potassium bromide and lithium bromide. Bromism was once a very common disorder, responsible for 5 to 10% of psychiatric hospital admissions, but is now uncommon since bromide was withdrawn from clinical use in many countries and severely restricted in others.
Bromism is caused by a neurotoxic effect on the brain. The symptoms of bromism include mental dullness, memory loss, slurred speech, tremors, ataxia, muscular weakness, a transitory state resembling paranoid schizophrenia, and a skin eruption called bromoderma, eventually leading to somnolence, psychosis, seizures, and delirium.
High levels of bromide chronically impair the membrane of neurons, which progressively impairs neuronal transmission, leading to toxicity. Doses as small as 0.5 gram per day of bromide can lead to bromism.
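To see why such a small daily dose matters, it helps to remember that bromide is eliminated slowly; a plasma half-life on the order of 10 to 12 days is commonly cited. The sketch below works through the resulting accumulation arithmetic; the half-life and dose are illustrative assumptions, not values from any specific case.

```python
import math

# Illustrative first-order accumulation of bromide under daily dosing.
# The half-life and dose are assumed values for illustration only.
DOSE_G_PER_DAY = 0.5     # daily bromide intake (g)
HALF_LIFE_DAYS = 12.0    # assumed elimination half-life (days)

k = math.log(2) / HALF_LIFE_DAYS    # first-order elimination rate (1/day)
steady_state = DOSE_G_PER_DAY / k   # body burden once intake equals elimination

print(f"Steady-state body burden: {steady_state:.1f} g "
      f"(~{steady_state / DOSE_G_PER_DAY:.0f}x the daily dose)")
```

This is also why the chloride-and-water treatment mentioned below works: extra chloride speeds renal excretion of bromide, which in effect shortens the half-life and lowers the steady-state burden.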
While significant and sometimes serious disturbances occur to neurologic, psychiatric, dermatological, and gastrointestinal functions, death is rare from bromism.
Bromism has also been caused by excessive consumption of soda containing Brominated Vegetable Oil, leading to headache, fatigue, ataxia, memory loss, and, in one reported case, an inability to walk.
Brominated Vegetable Oil (or BVO) is a vegetable oil that is modified with bromine. As (until recently) authorized, it was used in small amounts, not exceeding 15 parts per million, as a stabilizer for fruit flavouring in beverages, keeping the citrus flavouring from floating to the top. Over time, many beverage makers reformulated their products to replace BVO with an alternative ingredient. Today, few beverages in the U.S. contain BVO.
Brominated Vegetable Oils were already banned in Europe, but the American Food and Drug Administration (finally) decided on July 3, 2024 to revoke its authorization as a food additive[1]. Sun Drop, manufactured by Keurig Dr Pepper, is the largest national brand in the US to still include the additive.
There is no specific treatment available for bromism. Increased intake of regular salt and water, which increases the flow of the related chloride ion through the body, is one way of flushing out the bromide.
You might think that you're now safe from bromine poisoning, but that's not entirely true: bromides can still be obtained as unregulated dietary supplements[2].
[1] FDA: Brominated Vegetable Oil (BVO). See here.
[2] Friedman, Cantrell: Mind the gap: Bromism secondary to internet-purchased supplements in American Journal of Emergency Medicine - 2022.
Sèvres Syndrome
Turkey has managed to consistently choose the wrong side of history over the last 200 years. In the First World War, the Ottoman Empire sided with Germany and had to bear the consequences of the Central Powers' defeat. (In the Second World War, Turkey remained neutral until the closing months.)
Because both world wars were ultimately about territorial gain, the losing side was forced to make territorial concessions. After the First World War, the 1919 Treaty of Versailles ordered Germany not only to pay reparations but also to make territorial concessions.
The Treaty of Versailles for Germany had its counterpart in the Treaty of Sèvres for Turkey. The 1920 Treaty of Sèvres was a pact between the Allies and the Ottoman Empire, officially dismantling the Ottoman Empire and forcing it to relinquish claims to territories in North Africa and the Middle East. It also recognized independent and autonomous areas for Armenia, Kurdistan, and Greece.
Turkey's pride was hurt, and the humiliation and the subsequent implosion of the Turkish army eventually resulted in widespread guerilla warfare, led by Mustafa Kemal (1881-1938), later glorified as Atatürk. As the Turks are especially adept at genocide, huge numbers of Greeks, Jews and Armenians were massacred during the 'War of Independence' in 1920. Even in 1955, a pogrom against Greeks and Greek properties was endorsed by the Turkish government.
The Treaty of Sèvres was never implemented since it was left unratified by the Ottoman Parliament and due to Turkish victory during the subsequent 'War of Independence'.
More than a century on, the Treaty of Sèvres still has a psychological effect on modern Turks. So much so that it has become a syndrome: the Sèvres Syndrome (Sevr sendromu)[1]. It refers to a popular belief in Turkey that dangerous internal and external enemies, especially the West, are 'conspiring to weaken and carve up the Turkish Republic'.
This belief is simply a conspiracy theory, because no one would, in their right mind, want to carve up Turkey (again). But it did result in a sort of siege mentality among many Turks. Which is stupid, but conspiracy theories are mostly in the realm of stupidity.
The Sèvres Syndrome is also the reason that Turkey is turning its back on Europe and forging closer ties with Russia and its former satellites. Which is also why Turkey is expected to join the Shanghai Cooperation Organization (SCO), a regional organization helmed by China and Russia. The SCO has renowned members such as China, Russia, Kazakhstan, Kyrgyzstan, Tajikistan, India, Pakistan, Iran, and Belarus.
[1] Guida: The Sèvres Syndrome and “Komplo” Theories in the Islamist and Secular Press in Turkish Studies - 2008
Bald Sea Urchin Disease
Bald Sea Urchin Disease is a bacterial disease known to affect several species of sea urchins in the Mediterranean Sea, North Atlantic and along the California coastline. The disease was first described in the red sea urchin (Mesocentrotus franciscanus)[1].
Research suggests that two pathogens are responsible for the disease: Listonella anguillarum and Aeromonas salmonicida[2].
Infection generally occurs at the site of an existing physical injury. The affected area turns green and spines and other appendages are lost. Urchins also lose control of their tube feet, which they use to walk.
Spine loss is the key characteristic of the disease. If the lesion remains shallow and covers less than 30% of the animal's surface area, the animal tends to survive and eventually regenerates any lost tissue. However, if the damage is more extensive or so deep that the hard inner test is perforated, the disease is fatal.
In the 1980s, the near disappearance of a keystone herbivore, the long-spined black sea urchin (Diadema antillarum), due to a disease of unknown etiology, resulted in a massive ecological phase shift in the Caribbean Sea from coral cover to uncontrolled algal growth on the reefs.
At the beginning of 2023, researchers spotted the first signs of the urchin plague in the Mediterranean Sea, when an invasive species of urchin, the black sea urchin (Diadema setosum), began falling sick in waters around Greece and Turkey. From there, the disease appears to have spread southward through the Suez Canal to the Red Sea.
The epidemic looks set to wipe out all of the Mediterranean and Red Sea’s urchins, and possibly their coral reefs too.
"It's a fast and violent death: within just two days a healthy sea urchin becomes a skeleton with massive tissue loss," Omri Bronstein, a senior lecturer in Zoology at Tel Aviv University, said in a statement. "While some corpses are washed ashore, most sea urchins are devoured while they are dying and unable to defend themselves, which could speed up contagion by the fish who prey on them."
It is unknown why sea urchins are suddenly vulnerable to these bacterial pathogens. Maybe it is a result of climate change, or maybe one of these bacteria mutated and shared some mutated genes with another bacterial species.
[1] Johnson: Studies on Diseased Urchins from Point Loma in Annual Report Kelp Habitat Improvement Project – 1971
[2] Shaw et al: Bald sea urchin disease shifts the surface microbiome on purple sea urchins in an aquarium in Pathogens and disease – 2023. See here.
Wild Boar Paradox
On April 26, 1986, the Number Four RBMK reactor at the nuclear power plant at Chernobyl, Ukraine, went out of control during a low-power test, leading to an explosion and fire that demolished the reactor building and released large amounts of radiation into the atmosphere.
The explosion had a major impact on the forest ecosystem in Central Europe. While the contamination of deer and roe deer decreased over time as expected, the measured levels of radioactivity in the meat of wild boar remained almost constant. The limit values are still being exceeded by a significant margin in some samples. For years, this Wild Boar Paradox was considered unsolved. Until now[1].
To identify the source of radioactive caesium in Bavarian wild boar meat, the scientists turned to a different caesium isotope with a much longer half-life: 135Cs. A nuclear explosion yields a relatively high 135Cs/137Cs-ratio, whereas a reactor yields a low ratio.
After analysing the ratio of caesium isotopes in samples of wild boar meat from eleven districts of Bavaria, Germany, scientists concluded that global fallout from nuclear weapons tests is still responsible for a significant fraction of the contamination, even though Bavaria also experienced heavy fallout from the Chernobyl reactor meltdown.
Although Chernobyl has been widely believed to be the prime source of 137Cs in wild boars, the team found that “old” 137Cs from weapons fallout significantly contributes to the total level (10–68%) in those specimens that exceeded the regulatory limit.
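The attribution arithmetic behind this is a two-component mixing model: if weapons fallout and reactor fallout each carry a characteristic 135Cs/137Cs signature, the measured ratio in a sample pins down the weapons-derived fraction of its 137Cs. A minimal sketch follows; the end-member ratios in it are illustrative placeholders, not the calibrated values from Stäger et al.

```python
# Minimal sketch of the two-source mixing arithmetic behind the study.
# The end-member ratios are illustrative placeholders, NOT the
# calibrated signatures used by Stäger et al.
R_WEAPONS = 1.5   # assumed 135Cs/137Cs signature of global weapons fallout
R_REACTOR = 0.5   # assumed 135Cs/137Cs signature of Chernobyl fallout

def weapons_fraction(r_sample: float) -> float:
    """Fraction of a sample's 137Cs attributable to weapons fallout,
    assuming linear mixing: r_sample = f*R_WEAPONS + (1-f)*R_REACTOR."""
    f = (r_sample - R_REACTOR) / (R_WEAPONS - R_REACTOR)
    return min(max(f, 0.0), 1.0)   # clamp to the physically possible range

# A measured ratio midway between the two signatures implies a 50/50 split:
print(f"{weapons_fraction(1.0):.0%} of the sample's 137Cs is weapons-derived")
```

The mixing is linear because the ratio is taken against the sample's total 137Cs, to which both sources contribute.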
In the paper, scientists note that this paradoxical non-decline is often attributed to the boars’ tendency to root up and eat underground fungi such as deer truffles. Under the 'right' soil conditions, these organisms act as a repository for 137Cs, which migrates downwards through the soil very slowly, sometimes only about one millimetre per year.
This result suggests that there are, in effect, two separate downward-migrating caesium "fronts" contaminating the boars' winter food supply: one from atmospheric nuclear weapons tests, which peaked in 1964, and one from Chernobyl in 1986.
[1] Stäger et al: Disproportionately High Contributions of 60 Year Old Weapons-137Cs Explain the Persistence of Radioactive Contamination in Bavarian Wild Boars in Environmental Science & Technology – 2023. See here.
Flight Shame (or Flygskam)
Flight shame (or flygskam) is an anti-flying social movement that, like Greta Thunberg, originated in Sweden. The aim of this movement is to reduce the environmental impact of aviation.
Flight shame refers to an individual's uneasiness over engaging in activities that are energy-intense and climatically problematic. The movement started in 2018 in Sweden and gained some traction the following years among climate activists throughout northern Europe. The term is also used to shame air travelers as people involved in socially undesirable activities. This way the movement tries to discourage people from flying to lower carbon emissions in order to thwart climate change.
Staffan Lindberg, a Swedish singer, was reported to have coined the term flygskam in 2017. Malena Ernman, a Swedish opera singer and the mother of climate activist Greta Thunberg, also announced publicly that she would stop flying. She was blackmailed into flygskam by her daughter: for about two years, Greta Thunberg challenged her parents to lower the family's carbon footprint and overall impact on the environment by becoming vegan, upcycling, and giving up flying[1].
Tågskryt, a Swedish word that literally means 'train brag', is a term that also derives from the flygskam movement. It wasn't enough just to shame people; you also had to brag that you were better than them. Yes, you have to post your journey by train on social media.
But, as we are all human, some people claim to be part of the flygskam movement yet take the occasional flight. The Swedish call this att smygflyga ('to sneakily fly'), and no one will ever post that on social media.
While the intentions of the adherents of flygskam are perhaps laudable, their impact is minimal. In fact, shaming people into not using a plane is probably counterproductive. As are Greta Thunberg's actions. Have you ever wondered how Greta Thunberg travels to all those locations to participate in strikes?
[1] Jonathan Watts: Greta Thunberg, schoolgirl climate change warrior: ‘Some people can let things go. I can’t’ in The Guardian - March 11, 2019. See here.
Lost Key Syndrome
The official designation of the Lost Key Syndrome is Dysexecutive Syndrome. It consists of a group of symptoms, usually resulting from brain damage to the frontal lobe. The syndrome describes a common pattern of dysfunction in executive functions, such as planning, abstract thinking, flexibility and behavioural control. Although many of the symptoms regularly co-occur, it is common to encounter patients who have several, but not all, of them.
Patients are left wondering whether these symptoms are the result of their brain injury, whether they are simply a result of getting older, or whether they just lost their keys for the umpteenth time.
These symptoms fall into cognitive, behavioural and emotional categories and tend to occur together. Many of them can be seen as a direct result of impairment to the central executive component of working memory, which is responsible for attentional control and inhibition.
The Lost Key Syndrome often seems to occur with other disorders, such as schizophrenia, dementia, Alzheimer's disease, meningitis and chronic alcoholism[1].
Assessment of patients with the Lost Key Syndrome can be difficult because traditional tests generally focus on one specific problem for a short period of time. People with Lost Key Syndrome can do fairly well on these tests because their problems are related to integrating individual skills into everyday tasks. The lack of everyday application of traditional tests is known as low ecological validity.
There is no cure for individuals with the Lost Key Syndrome, but there are several therapies to help them cope with their symptoms in everyday life.
[1] Abbruzzese et al: Persistent dysexecutive syndrome after pneumococcal meningitis complicated by recurrent ischemic strokes: A case report in World Journal of Clinical Diseases - 2023
Female Inconsistency Syndrome
The Female Inconsistency Syndrome describes a very specific type of character, known as the Mary Sue.
A Mary Sue is a character archetype in fiction, usually a young woman, who is often portrayed as inexplicably competent across all domains, gifted with unique talents or powers, liked or respected by most other characters, unrealistically free of weaknesses, extremely attractive, innately virtuous, and/or generally lacking meaningful character flaws[1].
[Image: Batgirl and the actress who played her]
Mostly, she is a slim and beautiful young woman. If a Mary Sue hits a grown man weighing about 100 kilograms, he will fall down like he's been hit by a 10-ton truck.
As a literary trope, the Mary Sue archetype is broadly associated with poor-quality writing, and stories featuring a Mary Sue character are often considered weaker for it. Though the term is mostly used negatively, it is occasionally used positively.
Always female and almost always the main character, a Mary Sue is often an author's idealized self-insertion and may serve as a form of wish fulfillment. Mary Sue stories are often written by adolescent authors or adults who didn't grow out of their adolescence[2].
There's a syndrome lurking beneath that childish writing. Why would a grown man write about a fictional and idealized woman who has no flaws? In this divisive world where 'woke' is equivalent to 'broke', such an unnatural depiction of women will drive away the public that simply wants to watch a movie to be entertained.
[1] Framke: What is a Mary Sue, and does Star Wars: The Force Awakens have one? in Vox - 2015. See here.
[2] Whatsawhizzer: The Mary Sue and Female Inconsistency Syndrome in wattpad. See here.