Inches of panic, life
and death.
Unclaimed disturbances.
We walk
down the same road
but you stopped
to tie your shoelace.
You didn’t see it coming.
And now:
our universes spin and spin
in webs of incomprehensions
as we try
to understand
what cannot be touched
by the other.
I’ve slithered around death –
maybe you haven’t been as lucky.
or maybe I’m the unlucky one
the haunted one.
I can feel them following me around:
An omen of what could happen,
a shadow of what never did.
A parallel outcome,
Pain beyond all I could imagine:
I killed the old lady.
I can feel the crinkly skin
of her neck in my palms.
Impotence disguised as power
I killed you, I killed you
and you die every night
I was laughing
and I wasn’t quick enough.
I was happy
and I wasn’t quick enough.
And now cars are demons
Sirens deafen me
and lights blind me.
And people are evil
They kick dogs and live off arrogance
And I
live off bloodlust and compassion
I live off my own confusion.


Featured image from CRASH by J.G. Ballard. Panther Books, 1975. Cover art by Chris Foss.


I can smell the lethargy in the air as the rain comes down.
Who told you you could write all over my skin?

Territorial disputes.

Casually manhandling death and the rain
don’t stop, the rain           don’t stop.
Biting breasts under neon colours.
Stuffing your face and drowning in the barrel-
Drowning in the rain of your pain.

Contempt for conformity. Body builders of human agony.
The vivid dreams stopped months ago.
Flashes of blood running down my neck.

This winding road is damned and this skin is too tight.
Grinning mouths with men hanging at the corners.
Unsteady flooring and gums aching.

I’m heady from the drinks, the want and the sweat.
This tube smells of metal, blood and piss.
There’s a nightmare pulsing in between my legs.
Laughing hyenas pull at my clothes.           I give in.

Vaccinate me for control.

Chapped lips in the cold. Stomach acid scratches at my soul.
Flashing streetlights, cars, dancing on my window.
Magnetic network of obligations and purpose.

Buzzing in the world and screeching in my ears.
Monotone high pitched frequencies and I’m going mad, I’m going       mad.

The itch, the itch the pulse           in the eye,

the everlasting night, the bite,

the blood.

I’m a mess of filaments,

my nerves are barbwire.

Your fingers feel like bombs.

Psychosis, migraines, want.           A hollowed out gut.

Out of body,

overlooking this city.

You stand next to me, naked and shivering.
My cigarette shakes at the lips.

It falls and I          let          myself                               fall.

OUTRAGEOUS! The Psychology behind morality, mass indignation and self-righteousness.

This is a post about the way people react to the ‘bad things’ going on in the world. Or rather, the way people react to the ‘bad things’ they learn about from the news, Facebook, Twitter, Pinterest or their classmate/workmate, which are by no means exhaustive of the bad things going on in the world.

The illustrations in this post are by Pawel Kuczynski, a brilliant Polish satirical artist and illustrator whom you can check out here.

I would like to direct your attention to how morality causes mass indignation, and how this in turn leads to self-righteous behaviour. I shall keep this brief and technical, so as not to contribute to the various rants and political positions being upheld at this very moment about events happening in the world.

Let us begin with the question of MORALITY. Collins dictionary defines it as ‘conformity, or degree of conformity, to conventional standards of moral conduct’. Pretty accurate, although I quite prefer Nietzsche’s description of it as the ‘herd-instinct in the individual’. What you need to understand is that morality does not rest on absolute truths of what is right and wrong in universal terms. It rests, rather, upon conventional and agreed sets of rules and guidelines, which vary significantly amongst different groups.

So if morality is not an objective truth, but rather a human/social construction, you may ask why exactly it is that we need it. I suggest it is comparable to the ‘social contract’ advocated by luminaries such as Rousseau, Hobbes, Locke and Rawls. We need morality to get along with other people, because we are all different and self-interested, and the world would be absolute chaos if we didn’t have it. In some ways, morality is a partial sacrifice of the self for the benefit of the community: we agree to give up a part of our independence of thought and action, in order to live harmoniously with other people. The reasons behind this, behind why man is a social animal and why we need other people, are a subject I’m not going to go into right now.

What you do need to understand is that for whatever reason, we do make this sacrifice and that this ‘giving up a part of oneself’ to the community, is not an easy sacrifice to make. Just think about all those fuzzy moral questions that continue to raise serious debates: homosexuality, abortion, capital punishment, euthanasia, contraception, animal ethics, torture, slavery, war… the list could go on a while. My point is: morality is a sacrifice, and a big one at that.

And yet some people break this contract, undermining the effort you put into keeping this world order going. Hence comes the INDIGNATION: the feeling of shock and anger which you have when you think that something is unjust or unfair. I work so hard to keep up something, and then you wankers just come and blow shit up and ruin it. Not cool. What did I even work for?

And here we come to the point at which we decide to share our indignation with all the other animals in our group. After all, these moral law-breakers have ruined the outcome of my efforts, so I need to find some other way to benefit from my work. This leaves me with my final point, which is SELF-RIGHTEOUSNESS, which Mr. Collins again describes as ‘having or showing an exaggerated awareness of one’s own virtuousness or rights’. Some of the synonyms listed are: sanctimonious, smug, superior, complacent, hypocritical, goody-goody (informal) and holier-than-thou. This pretty much means that someone exhibiting self-righteous behaviour is not just saying ‘look at me, I’m great’; they are saying ‘look at me, I am better than these other people and I want you all to know it’.

So what does this have to do, you may ask me, with my desire to share my indignation with others? The answer resides in the reasons why you share such information, and/or in the barely existent reflection upon the consequences of this rather simple act.
So I ask all of you: when you post an indignant post, or share a picture to express your solidarity with victims of an event, what, exactly, are you trying to achieve? I’m pretty sure the standard response to this question is that you want to ‘raise awareness on an issue’. But if that is the case, I ask you again: is there not perhaps a better way you could be doing such a thing? Is expressing your indignation really worth your time and energy? The effect of adding your post to the millions of other online shares is minimal compared to the useful things you could be doing with your time. If you are so interested in a particular cause, why is it that the only moments in your life in which you contribute to it are those in which the information hits you right in the face? And why have you chosen to take a stance in this cause, and ignored the millions of other ‘outrageous’ things that go on in the world?

I’ll answer the question for you. You are not writing to implement change; you are doing it because it makes you feel good. Because it puts a safe distance between you and these outrageous moral law-breakers. Your message is not going to implement change and you know it: it is the same as thousands of other messages, and their counter-messages, with the sole purpose of increasing public indignation and making the topic a hot potato all over the world, perhaps because it is in someone’s interest that it be so.

This leads to another argument: are we a group or are we, as Nietzsche classed us, ‘a herd’? Since I am not here to take a political stance, but rather to attempt a technical explanation of mass behaviour, I shall leave you to reflect upon this question yourself. Let us just leave it at the notion that a herd implies the presence of a shepherd, and perhaps of his sheepdogs. And if that were the case, Foucault’s insights on how learning and information are the best form of power in governing masses go a long way towards understanding how morality and indignation can easily be used to direct and influence the masses.

But regardless of political positions, or whose ‘side’ you are on, and who you decide to point your finger at, what you are doing, yet again, is blaming Hitler for Nazism and WWII. Do yourself a favour and read ‘Eichmann in Jerusalem: A Report on the Banality of Evil’, in which Hannah Arendt attends and reflects upon the trial of Nazi architect and executioner Adolf Eichmann.


Arendt described how the officer, far from being a vehement anti-Semite, was a rather innocuous and banal individual. He followed his orders without thinking or asking questions or considering the effects of his actions. The pain of his victims was not apparent to him, nor was his active role in their suffering. It was not hatred that caused him to act as he did, but rather the lack of self-awareness and judgement. The book received much criticism and Arendt was accused by many of justifying the criminal, which in truth was far from what she was doing. She was simply pointing out that if it weren’t for the millions of people, who just went on with their lives and performed their duties without asking questions, Nazism would have probably remained a crazy man’s romanticised idea of a ‘perfect’ society.

It is easy to find a scapegoat and point your finger at authority; much harder to accept responsibility for being an accomplice in a faulty world order, as someone who laments things gone wrong only when they hit you in the face, while the rest of the time your only concern is getting on with your daily life and routine.

My suggestion to you is this: stop thinking about and lamenting what people should be doing; start looking at what they actually are doing, why they are doing it and why you know about it, and then decide how to react. Stop making yourself feel better for doing nothing by publicly pointing at and comparing yourself to greater evils. And understand that you do not have power over other people unless you have power over yourself and your development. Spend your time thinking about your own character and actions before shooting down those of others. Only then will you be able to come up with ideas and ways to ‘change the world’ and, if it is your desire to do so, to help others.



a blanket
on a clothes line           the stains
all washed out
I hang out in boredom,

to dry

I am sick
of the clips
that so wearily hold me up

of this washed out sanity

I am sick.

This is not the best me I can be.

Stop this, run again.


dance away control:
colours bodies           laughter

c a r e l e s s n e s s

the frenzy the rush
the high.

I miss life and I have lied.

burn books thoughts dreams.

They aren’t enough,

I’m going to die.

burn lists

I don’t need to be clever and ok.
people movement fear anger           lust.

– to touch


be touched.

to feel alive.

Featured artwork by Jonas Fyhr. Find him at


A Modern Review of Anti-Psychiatry:

Why do people refuse pharmacological treatment for psychiatric conditions?

Most of you are probably familiar with the famous scene from One Flew Over the Cuckoo’s Nest, where McMurphy (Jack Nicholson) pretends to swallow and then spits out his medication after Nurse Ratched refuses to tell him what it is. It’s just medicine, it’s good for him and he shouldn’t be asking questions.

‘If Mr. McMurphy doesn’t want to take his medication orally, I’m sure we can arrange that he can have it some other way. I don’t think you’d like it.’

The film, made in 1975, was based on the book written by Ken Kesey in 1962, at the height of the anti-psychiatry movement that was pervading the Western world. In this period, the theme of patients evading pharmacological care became quite common in literary and cinematic depictions of psych wards and mental health. Other interesting readings on mental health care at the time include Michel Foucault’s ‘Madness and Civilisation: A History of Insanity in the Age of Reason’ (1961), Szasz’s ‘The Myth of Mental Illness’ (1960), ‘Asylums’ by Goffman (1961) and ‘Psychiatry and Anti-Psychiatry’ by Cooper (1967). The anti-psychiatry movement mainly questioned three things:

(1) the existence of mental illness and the use of psychiatric diagnosis as a power tool to control social deviants; (2) the power of psychiatrists to detain patients against their will and the use of barbaric methods in psych wards; and (3) the medicalisation of madness.

When he spat out the pill, McMurphy was defying a system that was oppressive and malfunctioning, as indeed many internment facilities of the time were. Many of the treatments once used on patients with mental instability were primitive and often barbaric: to name a few, trepanning, lobotomies, insulin shock therapy, bloodletting and badly administered electroconvulsive therapy. Similarly, their pharmacological counterparts were just as invasive and excessive: patients were often stuffed to the brim with sedatives such as bromides and barbiturates, and with primitive anti-psychotics (chlorpromazine was one of the first), which caused severe side effects, drowsiness and physical dependency.

However, things have significantly improved in recent years, and society’s relationship with mental health is consistently changing towards a world of increasing awareness. Pharmacological treatment in psychiatry has also developed much since its origins. Let us look, for example, at the first effective medicine for the treatment of mental illness: lithium carbonate, whose effectiveness as a mood stabiliser was demonstrated in 1948 by Australian psychiatrist John Cade, and which was approved by the US Food and Drug Administration (FDA) for the treatment of acute mania in 1970.

Although the evidence for lithium as an anti-manic agent is incontrovertible, the drug is also known to cause rather serious adverse effects and carries a “black box warning”. It can cause central nervous system (CNS) toxicity, renal toxicity, thyroid toxicity, and teratogenic effects, all of which can be life threatening. It is also associated with non-life threatening but rather bothersome side effects, such as tremor, excessive urination, dry mouth, nausea, sedation, acne, and cognitive dulling. Mild CNS toxicity manifests as restlessness, irritability, and sedation. Severe neurotoxicity can progress to delirium, with ataxia, coarse tremor, seizures, and ultimately coma and death.

Dr. Lembke at Stanford University writes that earlier studies on the dosage of lithium in treating acute mania advocated serum lithium concentrations between 0.9 and 1.4 mEq/L. Severe neurotoxicity is associated with serum concentrations exceeding 1.6 mEq/L, but can occur at lower levels in susceptible individuals. Later studies have shown that an effective anti-manic response can be achieved with doses between 0.5 and 0.72 mEq/kg/day, corresponding to serum lithium levels below 1.0 mEq/L.

With this reduction in the posology of lithium, patients are much less susceptible to the bothersome side effects associated with high levels, which were extremely common in the past. Today, the optimal dosage of lithium ought to be carefully administered by professionals, with attention to factors such as a limited starting dose, rate of titration, serum concentrations for efficacy and toxicity, drug-drug interactions, dosing frequency, and rate of discontinuation.

Furthermore, nowadays patients have increasing access to communication and information technologies, relationships between patient and physician are more dynamic and interactive, there is increasing awareness and de-stigmatisation of mental disorders, and policies are being put in place to assure equal human rights for those suffering from psychiatric disorders and mental health difficulties. It would appear that modern psychiatry has worked consistently towards resolving the accusations of the anti-psychiatry movement of the 1960s, even if there is still much progress and research to be made in the field.

Yet of the three above-mentioned critiques, medicalisation still holds a highly significant following in contemporary society, amongst public figures and patients alike. So what is it, in this day and age, that causes patients to refuse their treatment? My conclusion is that three main factors contribute to this issue: (1) fear of social stigma, (2) fear of physical side effects and (3) fear of loss of control.

Let me explain this further through my own personal experience. A few weeks ago, when my psychiatrist suggested I take a low maintenance dose of Depakine so as to avoid any relapse into manic-depressive episodes, my brain automatically started saying: no, no, no, no. This has led me, over the past few weeks, to reflect at length on what really hides behind my wariness of this type of drug, given that I am well informed about its properties, effects and dosages, and confident in the advancements psychiatry has made in recent years.

In this post, I would like to share my conclusion with you, as I believe it is a plausible hypothesis for many patients’ behaviour. While I do believe that fear of side effects and/or social stigma can play an important role in an anti-medication stance, I think there is another, more subtle, yet profoundly existential reason behind this refusal. When I told my psychiatrist about my doubts over starting a new drug, her response was one that I have encountered many a time in similar situations, and in articles advocating the importance of medication for mental illness.

‘If you were diagnosed with Diabetes, and not Bipolar Disorder, would you be questioning the use of a drug in its cure?’

Well, to be perfectly honest, the answer is no. I’ve never really had a problem with taking a painkiller, in moderation, for a headache; I take daily medication for my asthma and have used antibiotics on several occasions. (Note: I understand there is a whole school of thought that criticises Western medicine in general, as well as the motivations that drive pharmaceutical companies, but this is not the argument I wish to discuss in this post.)

Nonetheless, the idea of constantly taking a small maintenance dose of a mood stabiliser gives me the heebie-jeebies. And I do not think this is entirely related to my skepticism towards labelling, although I do believe that a strict adherence to labels in psychiatry may pose some difficulties when dealing with individual cases (which I will discuss further in future). Nor do I think it is entirely the fear of losing the manic part of myself, which I have come, after much time and consideration, to view as a diversion and not as an ‘up’ side of my mood and personality.

I think what causes my wariness is my desire for control over all aspects of my life. What makes psychiatric medication different from other classes of drugs is that what it acts upon is not a physical ailment, but rather a chemical imbalance that plays an enormous role in what constructs my personality. In some way, I suppose my fear is dictated by the possibility that taking such medication may in some way alter my essence as a human being and my control over my own life.

Let me explain this better. I have been taking 100mg of Sertraline (brand names Zoloft, Lustral) daily for almost a year. It is an anti-depressant classed as an SSRI (selective serotonin reuptake inhibitor), which essentially means it acts on the level of the neurotransmitter serotonin in my synapses, one of the main chemicals responsible for mood. At around the same time I started this therapy, I also took many other steps towards well-being: a better diet, reduced consumption of caffeine, alcohol and nicotine, physical activity, meditation, therapy and a considerable dose of self-awareness and reflection upon my existence.

As of today, I am doing significantly well in my day-to-day life and, allowing for some minor setbacks, am leading a content, enthusiastic and motivated existence. My main difficulty remains the terror of losing this controlled balance I have cultivated for myself. I almost feel as if I were a blanket hanging precariously on a clothes line: if one of the many clips holding me up were removed, I might lose my balance, my control over reality. I have asked myself many a time: how much of my current well-being is due to my actions, and how much is due to my pharmacological treatment? Or better: if I had not started on Zoloft a year ago, would I still be in the same place?

I realise now that this is a rhetorical ‘what if’ question, to which I will never be able to provide an accurate answer. And losing oneself in the hypothetical possibilities of what could have been is something I have long deemed unhealthy and unproductive. I spoke with my psychotherapist about my skepticism towards medication, and a question she asked me really did strike home.

Even if you are doing all these other things for your own stability, and you could potentially be ok without medication, why is it that you still feel inclined to refuse the extra help it could give you, even if that help outweighs the negative effects?

Why is it that we feel inclined to do everything on our own, without other people, without pills, without help of any sort? Is it fear of weakness? Is it the same reason why so many people around the world keep their issues in the closet, and the same reason for which I myself, for many years, ignored my own pain and instability?

I realised in that moment how much my own preconceptions and perception of control have played a part in my decisions, and how hypocritical my wariness of Depakine and Sertraline is. After all, when I drink a beer, spend time with friends, find comfort in a lover, or seek relief in music, art, sex, travelling, food, etc., am I not in some way asking the world for a helping hand? I fear losing control when in truth I have no control: my existence depends upon the world around me and all of the silly little things that keep me hanging on the clothes line. How is medication any different from them?

So yes, until I am conscious and wise enough to be a blanket that holds itself up on its own, I am not ready to give all of these things up and, for the time being, I need them. I need my friends, I need my family, I need distractions, I need beauty and, as hard as it is for me to admit, I need my medication. What I also realise is that the purpose of all these things is not to ‘hold me up’, but rather to ‘hold me upright’, like the training wheels on a bike that prepare you to ride by yourself. They construct me and make me grow, and allow me to pursue the activities and reflections that make me who I am. And one day, I am certain, I will no longer need these wheels. I will be able to live my life with self-awareness and conscientiousness and experience all around me with light-heartedness and care and no longer with visceral need and dependence.

Cooper, David (1967). Psychiatry and Anti-Psychiatry. Routledge, Abingdon, Oxon: 2001

Foucault, Michel (1961). Madness and Civilisation: A History of Insanity in the Age of Reason. Routledge Classics, Abingdon, Oxon: 2005

Goffman, Erving (1961). Asylums. Italian edition: Le istituzioni totali: i meccanismi dell’esclusione e della violenza. Einaudi: 2010

Kesey, Ken (1962). One Flew Over the Cuckoo’s Nest. The Viking Press, Inc.

Lembke, Anna, MD, Clinical Instructor, Stanford University. Optimal Dosing of Lithium, Valproic Acid, and Lamotrigine in the Treatment of Mood Disorders. Accessed on Primary Psychiatry, Nov 12th 2015. URL:

Szasz, Thomas S. (1960). The Myth of Mental Illness: Foundations of a Theory of Personal Conduct. HarperCollins: 2011

Combating Workaholism – Why Leisure is Important and How Our Society Neglects it

“Of all people only those are at leisure who make time for philosophy, only those are really alive.”

Seneca, On the Shortness of Life.

Within our modern culture of productivity and consumption, the concept of ‘leisure’ is not considered a fundamental right and an essential component of being human, but rather a privileged luxury, or a colossal waste of time.

In 1948, German philosopher Josef Pieper published Leisure, the Basis of Culture: a manifesto for the importance of leisure in an age when we have mistaken making a living for having a life. He highlights that the Greek word for leisure, σχολή (scholē), is the origin of the word ‘school’, the institution for learning and contemplation.

Under the tyranny of workaholism, the human being has forgotten the value of leisure and been reduced to a functional piece of a much bigger machine; our work has become the only thing there is to our existence. Pieper writes how our culture has effectively normalised work as a mere obligation:

‘What is normal is work, and the normal day is the working day. But the question is this: can the world of man be exhausted in being “the working world”? Can the human being be satisfied with being a functionary, a “worker”?’

How is it that we have come to view work as a necessary evil that is needed for our survival, and leisure as a luxury we cannot afford? And how is it that we have come to see these two activities as entirely distinct and mutually exclusive?

To answer this question we must look into man’s basic desires and expectations in life, and how these have been manipulated over time. The common modern perception is that the main purpose of a person’s existence is to find happiness and live a good life. The debate over what constitutes the ‘good life’ and the ‘good society’ is age-old: financial security, access to a variety of goods and services, leisure and entertainment, equality, peace, good health care, life expectancy, literacy, cultural development, political rights and freedom, and social civility are only some of the criteria argued to contribute to individual and societal happiness.

Because many of these aspects are obtainable through financial means, Ray suggests that a minimal requirement for the good life of a society is that its ‘physical quality of life be high’. Based on these assumptions, we can quite easily sum up man’s relationship with work:

I want to be happy
I need money to be happy
I need to work to earn money

Conclusion: a workaholic society where leisure has no room.

Thus, according to this logic, work becomes a simple means to an end, in which the human spirit finds no affirmation or growth. However, this reasoning rests on the entirely generalised and constructed premise that material goods = happiness, and it excludes some rather significant variables, such as time and health. If we dedicate our whole life to the means of our objective, and then no longer have the time, the mental enthusiasm or the physical health to benefit from it, what indeed is the point? As Dostoevskij put it: ‘when each man will have reached happiness, there will no longer be time.’

Another question I could ask is: why is it so evident that we have to sacrifice a whole chunk of our lives in order to find a smidgen of happiness in what’s left of it? Our modern work ethic rests on the underlying assumption that work and leisure live planets apart and are mutually exclusive. All you need to do is look at the basic microeconomic model of individual preferences, which contrasts leisure and income, treating work as a disutility and putting us at a fork in the road where there ought to be an intersection.

E.F. Schumacher underlines this concept in his masterpiece Small is Beautiful: A Study of Economics as if People Mattered. He highlights how traditional Western economics has shifted us towards a reality where “goods are more important than people and consumption is more important than creative activity.” He writes:

There is universal agreement that a fundamental source of wealth is human labor. Now, the modern economist has been brought up to consider “labor” or work as little more than a necessary evil. From the point of view of the employer, it is in any case simply an item of cost, to be reduced to a minimum if it cannot be eliminated altogether, say, by automation. From the point of view of the workman, it is a “disutility”; to work is to make a sacrifice of one’s leisure and comfort, and wages are a kind of compensation for the sacrifice. Hence the ideal from the point of view of the employer is to have output without employees, and the ideal from the point of view of the employee is to have income without employment.

Leisure as an affirmation of the human spirit is not ‘taking time off’. It is not switching our mind off from every activity. It is not laziness. It is being able to give the activities we pursue the time and value they deserve. It is being able to sit still and soak in all that is around you, listen to your feelings, and breathe in your own life. It is not detachment; it is full immersion and affirmation. It is distraction, observation and uninfluenced attention. It is that feeling of immensity when you listen to music, look at the stars or listen to a lover’s heartbeat. Leisure feeds our minds and souls, gives us ideas, and puts purpose into our life and into our work. It allows us to be inspired by all that is around us and to let loose the creative spirit that is part of man’s nature. It is a joyful celebration of who we are and what we do, and a quiet meditation on what we desire and believe. Without leisure, the job you choose and the life you live are meaningless, unsatisfactory and deprived of any form of self-awareness.

Leisure may not be what makes you survive in this world of frenzied and money-hungry busybodies, but it sure as hell is the only thing that gives value to your survival.

To conclude, here are a few more beautifully written words from Pieper’s manifesto:

Against the exclusiveness of the paradigm of work as effort, leisure is the condition of considering things in a celebrating spirit. The inner joyfulness of the person who is celebrating belongs to the very core of what we mean by leisure… Leisure is only possible in the assumption that man is not only in harmony with himself … but also he is in agreement with the world and its meaning. Leisure lives on affirmation. It is not the same as the absence of activity; it is not the same thing as quiet, or even as an inner quiet. It is rather like the stillness in the conversation of lovers, which is fed by their oneness.


Dostoevskij, Fedor (1873). I Demoni. Feltrinelli, Milano: 2000

Pieper, Josef (1948). Leisure, the Basis of Culture. Pantheon Books, San Francisco: 2009

Ray, Debraj (1998). Economic Development: Overview. Development Economics. Princeton University Press.

Schumacher, E.F. (1973). Small is Beautiful: A Study of Economics as if People Mattered. Vintage Books, London: 2011

Seneca, Lucio Anneo. La Brevità della Vita, ed. Alfonso Traina. BUR, Milano: 2010

Vabba, Alisha (2014). Economic Development, Globalisation and Human Well-being. Development Economics. Lancaster University.

People, goodbyes.

The brute, astute revelation
Of a painfully insignificant fade out:
You never were, the specialness I craved for.

You never were.

Forcefulness embraced me then,
And now your face I cannot colour with my pain.
I craved the ethereal self, I imagined through your eyes.

I was your portal –

To feel love, for yourself, to feel worth.
A portal for big words, and comforting elation.
I was a beast of beauty to subject,
Like the beast within us all we cannot tame.

I am tall now –

Taller than you now, navigating higher comfort.
We seek the same fulfilment
And project ourselves in winning battles.

I was your projection –

A mirror of the self you wished to be.
Through lust and ego you created many me’s.
We are all just shadows of each other’s dreams.
My existence depends upon you all,

And I need you.

I could only ever gauge myself through you.
Only when you were inside me
Could I smell and taste the colours of me,
Never were they mine to be felt.

I touched myself when you were inside me,
And the walls for a moment crumbled
And we floated, for a moment, in the same chaos.

That me, you made me.

Forever yours it will be and you, will forever be mine.

The Truman Show on Reality, Illusion and Scientific Revolution.

‘We accept the reality of the world with which we are presented.’

Some of you may recognise this quote from the 1998 film The Truman Show, written by Andrew Niccol, directed by Peter Weir and starring Jim Carrey. The film follows the life of Truman Burbank, who lives with his perfect wife in a perfect town full of perfectly happy people who all know and love him. What the unsuspecting protagonist doesn’t know is that since before birth he has been the star of a 24-hour reality TV show, broadcast live around the entire world. His hometown of Seahaven is built under a giant arcological dome in which everyone except Truman himself is an actor involved in the screenplay. He is furthermore classically conditioned by negative imagery and memories that dissuade him from travelling or moving away from the setting.

The film touches on some of the greatest philosophical debates of all time, such as the distinctions between free will and determinism, and appearance and reality. The question of what is real has been debated for centuries. In Ancient Greece, almost as a presage to Einstein’s general relativity, Heraclitus identified the essence of the universe in ‘becoming’, believing that everything is subject to time and change and that even that which appears static is effectively moving. This philosophy is encapsulated in his famous aphorism “πάντα ῥεῖ”, which means “everything flows”.

‘It is not possible to descend into the same river twice, or to touch a mortal substance twice in the same state; due to the impetuosity and speed of change it is dispersed and collected, it comes and goes’

Parmenides, on the other hand, offers a more static and objective notion of reality, according to which man can only choose between truth (ἀλήθεια), based on reason, which guides us towards true essence, and opinion (δόξα), based on sensation, which guides us towards appearance, or false essence.

‘For nothing exists or will exist except being, since Fate fettered it to be whole and unmoving’. (fragment 8)

The most famous analogy to The Truman Show’s depiction of an illusory reality can perhaps be found in Plato’s allegory of the cave, in which tied-up prisoners observe shadows on a cave wall, believing they are all there is to reality. In this analogy one prisoner breaks free from his bonds and notices that the shadows are mere imitations of puppets behind him and, upon leaving the cave, sees the real things these puppets are meant to represent. Truman, until he begins doubting the world around him, is like such a cave prisoner.

The notion of an illusory reality has also been depicted in many fictional masterpieces such as The Matrix, 1984, Blade Runner, Brave New World, Memento and Inception. What, in my opinion, makes The Truman Show such a modern depiction of man’s perception of the world is how it deals with the topic in a lighthearted, almost humorous manner, as a presage to the superficiality of our age, which unsurprisingly is obsessed with reality shows and gossip culture. But most of all, what I find particularly refreshing in The Truman Show, and absent in many film and literary depictions of the topic, is that it provides a motivation as to why Truman begins to question his reality: technical difficulties. While, for example, in The Matrix the protagonist Neo is portrayed as some sort of mystical prophet with a strong inner eye, Truman is a completely normal man, living his day-to-day life. If it weren’t for some particularities in the production, he would most probably never have questioned his odd existence.

This leads to some rather complex questions: (1) Why is it that we ask ourselves certain questions and never think of others? (2) Why is it that some question reality and others do not? (3) When is it, or rather what is it, that makes us question our reality? Which are the technical difficulties that cause our attention to shift away from what we know, and lead us to question our worldview?

To answer questions (1) and (2), let me bring your attention to the topic of ‘attention’ itself, which I believe is extremely relevant to this argument. I want you to imagine for a second walking down a busy city street and paying attention to your surroundings. You are likely to set your eyes on many different people and situations: perhaps a particularly skilled busker, an interesting piece of architecture, a woman talking loudly on the phone about her husband, or a police officer placing a fine on a badly parked vehicle. Now imagine you are not alone on this walk, but that your best friend is walking beside you. You have many things in common, but still, do you think he or she will notice the exact same things you do? You might both notice the busker, as you have a similar taste in music, but for a million reasons, most of them determined by the casual setting of your eyes (maybe you stop to tie your shoelace and notice something on the ground), your 50-metre walk is characterised by a million different particularities. Extend this argument to the whole street and you’ve got 200 people living completely different experiences.

This subject is discussed brilliantly by cognitive scientist Alexandra Horowitz in her research book On Looking: Eleven Walks with Expert Eyes, in which she purposely goes on walks with people in different fields of expertise to see how differently everyone perceives the world around them. The author points out how “attention is an intentional, unapologetic discriminator. It asks what is relevant right now, and gears us up to notice only that.”

I often pride myself upon my ability to be distracted by the beauty in life. In my first post on this blog I wrote about a woman playing the violin in the tube in Berlin, and how angry at humanity it made me that no one else seemed to notice her. Now my mind flutters to all the thousands and thousands of things that, every day and in every situation, skim past me unnoticed. Even in this moment, while I concentrate on writing this article, I am missing out on the majority of things happening around me. In her book Horowitz invites the reader to a similar reflection.

‘By marshalling your attention to these words’ she writes: ‘you are ignoring an unthinkably large amount of information that continues to bombard all of your senses: the hum of the fluorescent lights, the ambient noise in a large room, the places your chair presses against your legs or back, your tongue touching the roof of your mouth, the tension you are holding in your shoulders or jaw, the map of the cool and warm places on your body, the constant hum of traffic or a distant lawn-mower, the blurred view of your own shoulders and torso in your peripheral vision, a chirp of a bug or whine of a kitchen appliance’.

The absurd level of individual bias that affects perception, and hence reality, is rather unsettling. The scary question is: if I had handled and directed my attention differently, would I be a different person? If I hadn’t read a particular book, smelled a particular smell, met a particular person, or been in a specific place at a specific time, would my reality be different? If a stage light had not fallen from the sky right in front of Truman’s nose, would he ever have questioned his world?

How much of our notion of reality is dictated by sheer and utter casualty? As I have previously pointed out, it is impossible for a human being to see the world without the filter of perception, which is made up of our cognitive functions and conserved knowledge. We don’t see the world exactly how it is, but how it is projected through our own beliefs, knowledge and sensations. I am not claiming that man has some magical thinking ability that can create phenomena with his mind, but simply that what we look at, and the way we look at it, are what construct our notion of reality.

These considerations on attention explain both why we pay attention to certain things and not others, and why different people pay attention to different things. Attention is shaped largely by chance, and increasingly by the personality-based decisions constructed over time through the combination of our chance experiences, which eventually determine the objects of our attention.

I have recently embarked upon an online course in Philosophy of the Sciences offered on Coursera by the University of Edinburgh (a brilliant course, by the way; I suggest it to anyone interested in the notion of reality and consciousness and in exploring the origins of our universe and the world as we know and perceive it). During my studies I found a similarity between the casualty of attention and experience and what Australian physicist Brandon Carter referred to in 1974 as the Anthropic Principle, which has since become a key worldview in philosophy of science. Anthropic reasoning is based on the notion that the kind of observer we are sets restrictions on the kind of physical conditions we are likely to observe. In other words, we are context-sensitive physical observers that can only thrive in a narrow range of physical conditions, and are only likely to observe conditions suitable for our observation.

Think of this from a cosmological point of view: our bodies contain a very wide range of elements, from lighter ones such as hydrogen to heavier and rarer ones such as iron and sodium. The latter are only formed in the heart of stars through stellar nucleosynthesis, in which lighter nuclei combine to form heavier ones. This means, quite literally, that our bodies are made of stardust. Now think of all the other natural phenomena that have permitted our existence on planet Earth, in the Solar System, in the Milky Way, in our Universe (for a good picture of the size of Earth in the Universe, check out this interactive scale). Without gravity, carbon chemistry (which is only possible at particular temperature and pressure conditions), the freezing of water or the particular structure of space around us, we wouldn’t even be here to observe these phenomena.

The absurdity of circumstances that has permitted our existence, often referred to as ‘cosmic fine-tuning’, has led to many theories according to which the universe has somehow been ‘designed’ for our specific existence. This is, unfortunately, a categorical generalisation of the anthropic principle, far from what the principle wishes to suggest. Imagine being a frog in a pond. It is one thing to say: ‘it is likely that I have grown up in conditions that allow for frog spawn, and thus these are the conditions I can observe’. It is another to generalise and say: ‘my presence in this pond indicates that the universe was designed with a view to generating frogs’.

The reason why we cannot make this generalisation is intrinsic to the anthropic principle itself. We know what we observe. And it is likely that what we observe is a reality that has allowed for our existence, for us to be there to observe it. In other words, we are in some way codependent on the specific reality we observe. Who is to say that there may not be other types of reality, which lack the conditions for our existence and which we are hence unable to observe? Scientists have speculated that our universe may be merely a subset of a much larger ensemble (often referred to as a multiverse) that contains all the physically possible ways the universe could be. From such a point of view, it is not surprising that we inhabit this particular universe, with just the right conditions for life.

If it isn’t clear by this point, I am offering a critique of the scientific and philosophical notion of causality, which is at the basis of Newtonian science. The sheer casualty (by which I mean the chancy, accidental, unforeseeable nature) of our daily experiences leads me further and further away from the scientific cause-effect laws of physics. And this leads me to another consideration, on scientific progress.

To answer question (3), let me introduce another topic we looked at in the aforementioned course: the different stages of ‘science’ as described by Thomas Kuhn in The Structure of Scientific Revolutions (1962). Before Kuhn, science was seen as a sequence of scientific theories, each building on and perfecting its predecessors by providing a more accurate image of the world.

But according to Kuhn this picture is totally wrong, and there is no such thing as a distinct scientific method. He describes how, during periods of normal science, scientists work within a scientific paradigm. This includes the main scientific theory, the experimental and technological resources, and the system of values of the community, such as simplicity, mathematical elegance, parsimony, etc. During this time, textbook work is fundamental. Kuhn moves away from Popper’s notion of falsificationism, towards a view of scientific research as ‘puzzle solving’, or rather attempting to solve the minor difficulties and discrepancies of textbook knowledge.

When a significantly large number of these anomalies accumulates, normal science enters a period of crisis. At this point the community may decide to abandon the old paradigm and move on to a new one, in what Kuhn refers to as a paradigm shift. The choice of the new theory is dictated not by its superiority over the old one but by its higher puzzle-solving power, which accounts for the anomalies in the old one. In short, according to Kuhn, one scientific paradigm is picked over another not because it is closer to the truth, but because it is better at puzzle-solving than the previous one.

In The Truman Show, Truman constructed his notion of reality with what he was presented. When anomalies started to present themselves, he attempted to find solutions to them, based on his conserved knowledge of how his world worked. When the anomalies accumulated (stage lights falling from the sky, people acting in a repetitive and staged manner, meeting his supposedly deceased father, etc.) he no longer had the ability to solve them according to his rationale. Truman entered a period of crisis, and decided to search for solutions elsewhere, similarly to what Kuhn would define as a paradigm shift. When Truman discovered that his life was a TV show and decided to exit the little door in the sky, he did not move closer to reality. He did not pick reality over fiction. He merely chose a different reality, in which the anomalies he couldn’t account for in the first one made more sense.

Shift this argument to our human notion of reality and you get the same reasoning. Newtonian science has worked so far, and we’ve managed to find solutions to minor difficulties with its basic principles and assumptions. With the introduction of quantum mechanics in the 1920s, this is no longer possible. What quantum mechanics suggests is that the reality we observe is dependent on the observer. This seems to have rather strong connections with the psychological notion of consciousness: the fact that we experience an internal world of images, sensations, thoughts and feelings that are related to the external world.

However, mainstream science seems to have always largely ignored the anomaly of consciousness, which its traditional methods were unable to explain. This seems to go against Kuhn’s view that unexplainable anomalies cause a crisis and then a paradigm shift. Based on recent times, one could ironically revise Kuhn’s theory as follows: when unexplainable anomalies accumulate, science does not immediately enter a crisis. It quite simply ignores the problem until it happens to discover a theory that works better, causing a paradigm shift. Consciousness has been ignored because it didn’t make sense within traditional Newtonian science: it could not be empirically observed, and it clashed significantly with science’s search for objective and universal truths.

However, with the introduction of quantum mechanics, the phenomenon of consciousness is no longer ignorable, and it can in no way be explained by our current paradigm. Many theorists have tried to do so, opting for a better understanding of brain chemistry, for computing theories according to which consciousness arises from the complexity of the brain’s processing, or looking towards chaos theory. But how can something as immaterial as consciousness arise from something as unconscious as matter?

The impossibility of answering such a question leads me to think that we may be approaching the time to stop with the problem solving and justifications, and to question the basic assumptions of science and reality. What I’m trying to get at, in this rather diverse argument which has fluttered from cinema to philosophy and from cosmology to consciousness, is that I feel it might be time to question the validity of cause-effect, materialist Newtonian science. I believe we have reached a point in history where the failure in its explanatory value is significant, as quantum mechanics and consciousness show us. From a broader perspective, cosmology and anthropic reasoning show us how so-called cosmic fine-tuning has pushed us towards an anthropocentric view of reality, widely supported by cause-effect laws. Is it possible that we are now shifting from a causal view of reality to a casual one, in which our existence is based upon chance and not on cause-effect laws?


Gilmore, Robert (1995). Alice in Quantumland. Springer Science and Business Media, illustration.

Horowitz, Alexandra (2012). On Looking: Eleven Walks with Expert Eyes, Simon and Schuster: 2014

Kuhn, Thomas (1962). The Structure of Scientific Revolutions, Einaudi: 1999

Professors Massimi, Michela and Richmond, Alistair. Lectures in Philosophy of Science at the University of Edinburgh.