Busy, distracted, inattentive? Everybody has been since at least 1710
https://aeon.co/essays/busy-and-distract...least-1710
EXCERPT: The rise of the internet and the widespread availability of digital technology has surrounded us with endless sources of distraction: texts, emails and Instagrams from friends, streaming music and videos, ever-changing stock quotes, news and more news. To get our work done, we could try to turn off the digital stream, but that’s difficult to do when we’re plagued by FOMO, the modern fear of missing out. Some people think that our willpower is so weak because our brains have been damaged by digital noise. But blaming technology for the rise in inattention is misplaced. History shows that the disquiet is fuelled not by the next new thing but by the threat this thing – whatever it might be – poses to the moral authority of the day.
The first time inattention emerged as a social threat was in 18th-century Europe, during the Enlightenment, just as logic and science were pushing against religion and myth. The Oxford English Dictionary cites a 1710 entry from Tatler as its first reference to this word, coupling inattention with indolence; both are represented as moral vices of serious public concern.
[...] Amid celebration of attention as a virtue, educators, religious commentators and medical professionals constantly expressed fears about the moral harms of inattention. From the later 18th century onwards, anxieties about the ‘habit of inattention’ were increasingly represented as a moral disease. [...] inattentive people lacked the stability and moral fibre necessary for concentration. They were ‘characterised as unwary, careless, flighty and bacchanal’. They were portrayed as relatively immature, reckless and unreliable.
[...] In fact, until the 1970s, when the legitimate medical diagnosis of attention deficit disorder, or ADD, entered the mainstream vernacular and was used to understand a sub-group with a real disability, the widespread social condition of inattention tended to be represented principally as one of defective moral control. [...] The recent decades have seen a dramatic reversal in the conceptualisation of inattention. Unlike in the 18th century when it was perceived as abnormal, today inattention is often presented as the normal state. The current era is frequently characterised as the Age of Distraction, and inattention is no longer depicted as a condition that afflicts a few. Nowadays, the erosion of humanity’s capacity for attention is portrayed as an existential problem, linked with the allegedly corrosive effects of digitally driven streams of information relentlessly flowing our way. [...] According to the US neuroscientist Daniel Levitin, the distractions of the modern world can literally damage our brains.
Yet the moral concerns that have always underpinned society’s preoccupation with inattention still lurk in the background....
Putnam's Progress
http://www.philosophersmag.com/index.php...s-progress
EXCERPT: In 2001, Julian Baggini interviewed Hilary Putnam (1926-2016), the philosopher who never stood still
"I make no secret of changing my mind on one or two important issues." It's a pretty unexceptional admission, but for the speaker of these words the issue of changing one's mind has a peculiar force. Hilary Putnam has written on a wide range of topics, encompassing metaphysics, the philosophy of language and the philosophy of mind. His output has become essential reading for anyone serious about contemporary debate in all these areas. Yet what he has become most renowned for is changing his mind.
It's something that baffles Putnam, who can reel off a long list of books and papers he wrote a quarter of a century ago which he still stands by. He also points to colleagues, like Jerry Fodor, who have made some pretty big U-turns themselves without being saddled with a reputation for inconstancy.
Fixating on this aspect of Putnam's work is misplaced for two reasons. First of all, not changing one's mind is hardly a virtue in itself. "I've never thought it a virtue to adopt a position and try to get famous as a person who defends that position," says Putnam, "like a purveyor of a brand name, like you're selling corn flakes." Putnam recalls Carnap, with whom he worked at Princeton for a year. "I remember how often he said 'I used to think so and so, now I think so and so'. I remember admiring that very much."
More important, however, is that readers who focus too much on where Putnam has changed his mind are in danger of missing the constants. Putnam himself says, "Much of the apparatus that I depend on in my own reasoning has not changed." This apparatus is most evident when one looks at the backbone of his philosophy in his work on language and meaning....