Monday, January 02, 2012

Blogging and the curse of coolness

Only a few years ago blogging was being hailed, in the mass media and in academia, as a revolutionary medium of communication. The pundits told us, with varying degrees of enthusiasm, that blogs, with their speed of publication and low costs, would inevitably supersede newspapers. Blogs were also tipped to transform literature by complicating or even obliterating the distinctions between author and reader. And politicians were warned that blogging would change the nature of public debate, as hordes of formerly quiescent citizens began posting their opinions online.

In 2004 the Merriam-Webster dictionary pronounced 'blog' the 'word of the year', and in 2005 the media revealed that a new blog was being created every second.

Today, though, blogs are out of fashion. A survey reported by the New York Times showed that blogging is fast losing popularity amongst younger people, who are much more enamoured of newer internet platforms like facebook and twitter. A Pew Research Center survey of thousands of American web users suggests that blogging is also losing its appeal for older folks. The proprietor of Gombeen Nation, one of Ireland's most popular blogs, recently observed that, thanks to steadily falling readerships, his competitors 'are dropping like flies'. The internet is now littered with the hulks of abandoned blogs.

Blogging is in decline because it conflicts with both the profit drive of capitalist corporations and the consciousness of contemporary internet users.

Corporations find it relatively difficult to make money out of blogging - the potential for advertising is limited, compared to that available at email sites, facebook, and twitter, where new internet 'pages' are opened much more often. Blogging has failed to attract many of the celebrities - actors, musicians, sportspeople - whose tweets and facebook updates are followed by huge and lucrative audiences.

If blogging is out of tune with twenty-first century capitalism, it is also at odds with the thought patterns of many of the residents of developed societies. The millions of citizens who created accounts at sites like blogger.com, LiveJournal, and Wordpress soon ran out of enthusiasm for their new hobby, as the prospect of regularly turning out posts hundreds or thousands of words long came to seem oppressive rather than liberating.

With its draconian restrictions on message length, twitter spares its users the troublesome task of advancing and defending arguments. Slogans, non sequiturs, and in-jokes replace premises and citations. Just as the brutally abbreviated 20-20 form of cricket removes most of the distinctions between good and bad players, so twitter destroys much of the difference between good and bad thinking. Martin Guptill can make a 20-20 half-century, and Charlie Sheen can become a literary star on twitter.

It is telling that politicians, who were generally rather unenthusiastic about blogging, have become some of the most prolific and popular tweeters. Political soundbites and slogans have been growing shorter and more fatuous for decades, and twitter is the perfect medium for the shortest and most fatuous of them.

With its endless opportunities for self-indulgence, facebook offers a similar escape from the tyranny of thought.

The popular abandonment of blogs in favour of twitter, facebook and other internet platforms that prioritise brevity and insouciance is part of a wider tendency in our culture. As Nicholas Carr has observed in his famous essay 'Is Google Making Us Stupid?' and his follow-up book The Shallows, the internet is being used to enforce a 'Taylorisation' of the modern mind.

Just as the Taylorist method of factory management seeks to get higher and higher yields from workers by dividing their time into smaller and smaller portions and making their tasks more and more specific, so companies like Google are today trying, with the help of technology, to speed up and simplify our thinking. As workers become accustomed to speed-reading and multitasking at their computer terminals, their brains are, to some extent at least, rewired, so that they find it harder to do the sort of 'deep' reading and thinking which literature and serious political discourse demand. Blogging - good blogging, anyway - becomes more onerous, and the inanities of twitter and facebook become more appealing.

Whenever I wonder whether blogging will survive the second decade of the twenty-first century, I think of Alex Wild's The Constant Losers. In Wild's novel, which was published late in 2010 and recently received some well-deserved praise from Landfall, a young man and woman conduct a strange dialogue through the fanzines they self-publish and the cassette mix tapes they create and circulate.

The protagonists of The Constant Losers are twenty-something hipsters, familiar with the topography and nightlife of the bohemian zones of Auckland. Both characters are nevertheless preoccupied with the artefacts left behind by a pre-internet, pre-digital era of youth culture. For most members of their generation, the cassette anthology and the photocopied, stapled-together fanzine seem to require pointless amounts of solitary labour, and appear maddeningly resistant to the desires of readers and listeners. Their tracks and pages cannot be skipped or rearranged, or posted to a filesharing forum. But Wild's characters are enchanted by the fustiness of their cassettes and their fanzines. The scarcities and uncertainties inherent in old-fashioned DIY publishing excite them, as they seek rare issues of fanzines in the freebie racks of inner city music shops. They enthuse over the clunky fragility of their tapes, as well as the archaic hissing and sighing sounds which the cassettes impose like overdubs on the tunes they collect.

Some of Wild's readers have considered her novel a homage to youth and coolness, but I prefer, in my uncool, curmudgeonly way, to see the text as a dig at a civilisation addicted to technological and cultural innovation. Wild's characters rebel against the twenty-first century not by stripping off their clothes and heading for the nearest forest, but by retrieving and aestheticising the obsolete innovations of their parents' generation. Like Joseph Cornell, who made surreal worlds out of old-fashioned objects arranged in boxes, or Laurence Aberhart, who uses Victorian technology to photograph contemporary New Zealand, the heroes of The Constant Losers are determined to find a future in the past. The architecture of Wild's novel complements its theme. The Constant Losers appears, on the surface, to be a chaotic work, consisting as it does of facsimile-style 'reproductions' of the protagonists' respective fanzines. With their zany fonts, smudged black and white images, and corny or esoteric headlines, the zines don't initially look very considered, let alone artful.

But a closer examination of The Constant Losers reveals the author's almost classical concern with form. As they take turns entreating each other, Wild's protagonists balance and stabilise her text. The Constant Losers can be considered a novel of letters, in the tradition of Dostoevsky's Poor Folk and Alice Walker's The Color Purple. Like her characters, Alex Wild refuses to prioritise fashion over history.

Blogging may have been superseded by new and inferior innovations, but the medium need not die. Indeed, bloggers should treat the rise of alternative forms of online communication as a liberation, rather than a disaster. Freed from the curse of coolness, blogging can now develop as a literary and artistic genre, or set of genres. Blogging may have lost some of its old practitioners, but it should be able to attract writers, artists, and political thinkers dissatisfied with the short attention span of twitter and the ritualised onanism of facebook. Blogging may become an act of resistance against the dumbing down of culture and political discourse in the twenty-first century.

Here in New Zealand, Richard Taylor's exciting, perplexing Eyelight is exploring the aesthetic possibilities of the hyperlink, and testing the limits of the internet 'page'. On a series of quieter but equally strange sites, Jack is showing that the blog can become a sort of cultural memory bank. Ross' A Gentle Madness documents his bibliomania, while his edition of the late Leicester Kyle's lost works is bringing an important writer out of the shadows. With his insistence on publishing one seriously researched blog-essay at the same time every week, Giovanni Tiso is using blogging to make a stand against our culture's tendency towards brevity and superficiality. Over at the Kea and Cattle blog, the newly-minted Rhodes Scholar Andrew Dean has been showing that wild eclecticism and intellectual rigour can go together, as he publishes mini-essays about subjects as different as depressed cricketers, South Island regionalism, Rilke, and The Simpsons.

Like the characters in Alex Wild's first novel, today's bloggers are consciously rejecting fashion, and showing the possibilities inherent in a supposedly outmoded medium of communication.

[Posted by Maps/Scott]

66 Comments:

Blogger Steve Robinson said...

Thank you for this. My blog of 7 years has been on hiatus for about 5 months. You motivate me to get back to it.

5:03 am  
Blogger Giovanni Tiso said...

The problem with Nicholas Carr is that he's an idiot. I would urge you to stop quoting him as any kind of authority, since he exhibits all of the superficiality that you ascribe to the masses who flocked to Twitter and Facebook instead of remaining faithful to blogging.

The decline of blogging is in fact a rather more complicated affair. For one thing, people have turned to platforms like Facebook and Twitter because they are better suited to the kind of writing they used to do on their blogs. Many if not most blogs were geared towards the short post hoping to garner instant reaction, and for that Twitter and Facebook work a treat. Autobiographical writing (which, by the way, I wouldn’t knock off-hand. Besides, there is plenty of ‘self-indulgence’ in what you and I do), rants/complaints and casual observations don’t always need to be in long form, and since you address form in your post, I would argue that the formal constraints of, say, Twitter, like most formal constraints, can actually sharpen and focus one’s writing. Simply put, some tweeters are fantastic writers, and to knock their chosen medium as an ‘inferior innovation’ just because of its brevity would be like saying that Borges is an inferior author because he didn’t write novels, instead of pausing to think of why he didn’t (bother to) write novels.

Indeed, speculating on why many former bloggers no longer bother to blog would be rather more productive. Carr, like Jaron Lanier, has a very poor understanding of the economics of it all, by which I don’t mean simply the commercial return or otherwise of online writing, but also the perceived return in terms of social capital for one’s efforts and more generally the issue of the value of blogging. The continuing fortune of fanzines and other forms of what library collection speak pointedly calls “ephemera” would be a much better starting point for a discussion on that than The Shallows.

Two starters for ten: the farewell posts of Nina Power and Evan Calder Williams, undoubtedly the two most grievous losses to blogging in the English language of this past year. The latter is an extraordinary piece of writing.

9:38 am  
Anonymous AHD said...

all of which is to remind me that I need to write something, and I need to return to my former once-a-week publishing schedule.

If only this pile of marking would do itself...

12:11 pm  
Blogger Matthew R. X. Dentith said...

Gio has said most of what I would have brought to the table, so I'll just say a little about the contrarian aspect of your post, specifically:

Like the characters in Alex Wild's first novel, today's bloggers are consciously rejecting fashion, and showing the possibilities inherent in a supposedly outmoded medium of communication.

which is one of the most ludicrous things I have read this year (admittedly, this year has not been going on for very long). Do you have citations and evidence which would support this fantastically bold claim?

Long-form writing is not an inherently superior form of discourse, in the same way that short-form writing is not an inherently inferior method of communication. I don't usually bring Nietzsche to the table, but Nietzsche's philosophy was largely delivered in what we might now call "tweet-sized chunks."

Indeed, the problem with your criticism of Twitter seems to stem, in part, from not recognising that individual tweets are not the be-all and end-all of that form of communication: tweets exist in a context (both with respect to the individual author and with respect to the community in which the author participates) and people can use Twitter to put forward sophisticated arguments (see, for example, #twecon, my attempt to run Twitter-based conferences). Like a lot of blogs (perhaps the majority), some tweets are fatuous, vapid and pointless. Then again, that's true of any medium.

You've admitted to being fairly anti-technology in the past, so I do wonder if you are trying to justify your dislike of modern innovations with post facto arguments.

12:20 pm  
Blogger Timespanner said...

While I had a few weeks' absence from blogging due to work pressures (which will shortly start up again, for as long as they last), I had folk who know me and my blog remarking on the lack of new posts. With some concern, bless them. I reckon there are still readers out there. I still get emails from it. All right, if blogs go, they go. But I, for one, haven't lost interest. Not yet.

2:44 pm  
Blogger Giovanni Tiso said...

The man is planting his flag in the soil of new territory, by making a series of bold and creative claims. He admits that his arguments are speculative, and suggests, in that famous article for the Atlantic, 'Is Google Making Us Stupid?', that it'll take a good deal of time to know whether his arguments are correct or not.

He does no such thing. I would (might) indeed admire him if he were speculative, as I do Mark Poster, but The Shallows is in fact a litany of dubious, unexamined empirical evidence about the brain being rewired, within the conceptual framework of a McLuhanesque analysis of media as metaphor. The resulting hodge-podge falls apart quickly and in fairly embarrassing fashion. And what remains is the unpleasant feeling that Carr’s real beef is with all these unqualified people writing.

2:45 pm  
Anonymous Scott said...

Sorry, I went to remove the typos (the wages of haste!) from the comment Giovanni was replying to, and it's disappeared. Here it is again:

I'm pleased to have motivated you, s-p - I'll add your blog to my roll (when I remember how, that is!)

I have to disagree, provisionally at least, with Giovanni and Matthew.

I'd argue that their apparent demand for overwhelming empirical evidence in favour of or against claims about a new and evolving technology is in itself evidence of the hegemony that quantitative, calculative thinking is winning in our culture. The 'Taylorisation of the mind' (sorry Richard!) which poor old Nicholas Carr describes is reinforcing this tendency.

In a different time and place - in fin de siècle Austria, or ancient Greece, or nineteenth century Germany - Carr would be admired rather than scorned for the speculative quality of many of his arguments.

The man is planting his flag in the soil of new territory, by making a series of bold and creative claims. He admits that his arguments are speculative, and suggests, in that famous article for the Atlantic, 'Is Google Making Us Stupid?', that it'll take a good deal of time to know whether his propositions are correct or not.

Intellectuals have always behaved like this. Are we going to test the propositions of Heraclitus empirically? Try to falsify Nietzsche? Demand data in support of The Myth of Sisyphus? Condemn Freud for publishing his theories without decades of field trials?

I do agree with Giovanni and Matthew about the possibility of producing resonant and intellectually robust work in small-scale forms, like the aphorism or the haiku or the koan or even, in a better world, the tweet.

But I'd argue that in our present situation the potency of a lot of aphoristic thinking, in particular, is lost, because we have grown accustomed, under the influence of advertising, political soundbites, and internet platforms like twitter, to chomping little chunks of thought like lollies. There's a very deep-set and very negative trend towards greater and greater brevity and greater and greater (pseudo-)clarity of language.

Sun Tzu's The Art of War, The Book of Tao, and Heraclitus' fragments are all fascinating and profound compilations of aphoristic texts, but all of them have been corrupted in recent decades. The Book of Tao and the fragments of Heraclitus are New Age faves; The Art of War is popular in China as a business manual!

Texts do not have the same qualities in every time and place. Matthew mentions Nietzsche as a profound philosopher but, as Max Horkheimer showed in an important essay, Nietzsche's texts became poisonous in the 1930s, because they resonated (or had been made to resonate) so perfectly with the designs of the Nazis. They had been corrupted. I'd argue that, in our present conjuncture, a lot of profound short-form works are undergoing a similar corruption.

I don't think that all short-form works are liable to this sort of corruption, though - those which value ambiguity over clarity and rhetorical power, like (say) the micropoems of Robert Grenier or the fragments of Wittgenstein, may actually grow in subversive power.
But I do feel sorry for Heraclitus and Sun Tzu...

2:53 pm  
Anonymous Scott said...

Hi Timespanner, I don't read your blog from day to day, but I often use it as a sort of reservoir - if I'm trying to find something out, especially about your stomping ground of West Auckland, I turn up and do a search. If your blog was a book I'd have been reading it out of order, but then again I often read books out of order! I think the archived pages of blogs like yours are very valuable.

2:56 pm  
Anonymous Scott said...

Hi Giovanni,

I've only read a little of The Shallows (it was too expensive to buy, wasn't in the library and - irony alert - I couldn't find it online!) but I'm happy to defend 'Is Google Making Us Stupid?' I think it's a splendid text, which presents some intriguing evidence in favour of its arguments without presenting that evidence as conclusive, and which makes an eloquent case for the importance of literature in the twenty-first century. I thought some of the responses to Carr's essay from IT gurus were revealingly philistine.

On a slightly different track, what do you make of Heidegger's take on this general subject in 'The Question Concerning Technology'? I hated that essay when I read it as a student in my early 20s, but it makes a good deal more sense to me today.

3:02 pm  
Blogger Matthew R. X. Dentith said...

Matthew mentions Nietzsche as a profound philosopher but, as Max Horkheimer showed in an important essay, Nietzsche's texts became poisonous in the 1930s, because they resonated (or had been made to resonate) so perfectly with the designs of the Nazis. They had been corrupted. I'd argue that, in our present conjuncture, a lot of profound short-form works are undergoing a similar corruption.

A work being misused is different from it being problematic because of its brevity. Derrida, for example, was never particularly brief yet his works (whose value I am fairly agnostic on) have been misused by some students of cultural studies. Invoking the spectre of the Nazis doesn't say anything much about the form of Nietzsche's work (it says more about the shared zeitgeist of Nietzsche and the Nazis). Any work of any length can be used or misused, so your argument doesn't really show anything about the corruption or inferiority of Twitter or Facebook.

Also, I just want to point out, for the sake of the future generations, that I do not think Nietzsche's work is profound. However, in continental philosophy he is a) taken to be important and b) wrote aphoristically.

Are we going to test the propositions of Heraclitus empirically? Try to falsify Nietzsche? Demand data in support of The Myth of Sisyphus? Condemn Freud for publishing his theories without decades of field trials?

Yes, yes, some of us are (especially given the danger to public health that Freud's theories caused at the time).

3:13 pm  
Blogger Giovanni Tiso said...

Carr's essay for The Atlantic was a bombastic exposition of a preconceived, poorly thought through notion, the limits of which were more sharply exposed when he tried to expand that idea into a book-length argument. I think looking at Heidegger would be infinitely more useful, but I am rather more fond of later repurposings of the Greek concept of techne by Ong, Bolter, Yates, Poster, Havelock, to a lesser extent McLuhan - all of whom understood that rhetorical shifts promoted by new technologies always produce epistemic shifts. Carr just gave us the pop treatment of those ideas - ironically, he is the soundbite you deplore.

3:27 pm  
Blogger Giovanni Tiso said...

(If you're in the market for something that might confirm your bias - and I see no harm in that - I'd far rather opt for Amusing Ourselves to Death.)

3:28 pm  
Blogger Giovanni Tiso said...

It occurs to me I reviewed the damn thing. Carr's book, I mean.

3:35 pm  
Blogger Matthew R. X. Dentith said...

In a different time and place - in fin de siècle Austria, or ancient Greece, or nineteenth century Germany - Carr would be admired rather than scorned for the speculative quality of many of his arguments.

As would Ken Ring, Martin Doutre and the like.

3:50 pm  
Blogger Dave Brown said...

Here's a promiscuous use of all media. I'll take to any medium. There is also a print and paper version. Haven't got a live stream yet.
http://redrave.blogspot.com/2011/07/draft-action-program-for-europe-rising.html

4:57 pm  
Anonymous Scott said...

Hi Matthew,

I don't think that the aphoristic quality of Nietzsche's work led to its abuse by the Nazis - I mentioned him because the different ways his work has communicated itself in different times and places show that we can't assume that a text or a literary form will always have the same qualities.

As Horkheimer pointed out, Nietzsche's work seemed in the late nineteenth century like a romantic rebuke to the industrialisation of Germany and a rejection of the technocratic state Bismarck had created. Nietzsche was swimming against a fairly strong stream. He inspired many young people who had progressive politics.

In the 1930s, though, the model of capitalism and the state Bismarck had created were in crisis, and reactionary Romanticism was on the rise. Nietzsche looked different, and his cooption by the Nazis was not surprising.

Because of the way that historical conjunctures can overdetermine the reception of texts, it's not good enough to point to, say, the fragments of Heraclitus or the very short short stories of Borges or the Book of Tao and say 'look, short works can be profound, therefore twitter can be profound'. We have to examine the conjuncture we live in and ask whether it corrupts certain forms of communication and certain texts.

Do you really think Martin Doutre would be popular in pre-war Vienna or the Greece of Socrates and Plato? I don't think our age and our intellectual milieux have a monopoly on reason and critical thinking...

5:01 pm  
Anonymous ellen p said...

It's ironic how those opposed to Scott's arguments (Giovanni & HORansome) seem to back those very arguments by the length and detail of their comments - comments that would never be able to appear in their present form in such forums as Twitter etc.

5:09 pm  
Anonymous for your information said...

THE INTERNET WILL BE SHUT DOWN ON MAY 11 2018.

5:14 pm  
Blogger Matthew R. X. Dentith said...

Let me get this right:

Because of the way that historical conjunctures can overdetermine the reception of texts, it's not good enough to point to, say, the fragments of Heraclitus or the very short short stories of Borges or the Book of Tao and say 'look, short works can be profound, therefore twitter can be profound'. We have to examine the conjuncture we live in and ask whether it corrupts certain forms of communication and certain texts.

So, we shouldn't ask for evidence for the speculation of Nicholas Carr when it comes to his hypothesis about the trend of dumbing down via the internet (to quote: "In a different time and place - in fin de siècle Austria, or ancient Greece, or nineteenth century Germany - Carr would be admired rather than scorned for the speculative quality of many of his arguments.") but we must marshal evidence about the milieu in which texts are created and disseminated if we want to argue against speculative claims like Carr's? I think you are trying to have it both ways here, Scott.

Anyway, my argument is not that if A can be profound aphoristically, then B, which is aphoristic, should also be profound: that would be a silly argument. My argument is that you seemed initially dismissive of Twitter and Facebook as just vapid soundbites, as if the form of tweeting dictated its function.

Do you really think Martin Doutre would be popular in pre-war Vienna or the Greece of Socrates and Plato? I don't think our age and our intellectual milieux have a monopoly on reason and critical thinking...

Well, yes, I do think he would be popular at those times: he is a thinker whose particular set of ideas was popular in those times.

5:17 pm  
Blogger Giovanni Tiso said...

This comment has been removed by the author.

5:56 pm  
Blogger Giovanni Tiso said...

It's ironic how those opposed to Scott's arguments (Giovanni & HORansome) seem to back those very arguments by the length and detail of their comments - comments that would never be able to appear in their present form in such forums as Twitter etc.

Mmmh... no, sorry, it's not ironic at all. We're having this discussion on the internet, which is anathema to Carr's argument on which Scott's rests. I don't see anything particularly ironic in the fact that I'm very generously singled out by Scott as a virtuous blogger either: I do in fact see merit in long form writing, and I don't think that Twitter is a good medium for conducting in-depth arguments. But that's not to say that I think Twitter is not good for anything, or that the fact that some erstwhile bloggers have transited to Twitter is a bad thing - in many cases, Twitter is a medium more suited to their writing.

The problem as I see it is in the mix: is there any less freely accessible quality critical writing now that blogs are on the decline from the peak of their popularity? I think on balance - and in spite of some very notable defections - the answer is no. And one of the notable virtues of Facebook and Twitter is precisely to link to these writings and make them more visible. I think the media ecology is richer for that.

5:58 pm  
Anonymous Scott said...

I don't think that Nicholas Carr should be able to say anything he wants and get away with it, but nor do I think it's fair to expect him to have burst on the scene with a fully consistent theory backed by a heap of incontestable evidence.

There's a tendency in our culture to take an 'all or nothing' approach to theorising about society. Either a set of ideas about society has the precision and testability of a physics theorem, or else it's completely subjective, and we can either agree or disagree with it depending on whim.

I find the philosophers Imre Lakatos and Hans-Georg Gadamer useful in getting beyond this false dichotomy.

Lakatos pointed out that even the most robust scientific theories were originally bold speculations with internal inconsistencies and relatively little empirical evidence in their favour. It often took a long time for evidence to accumulate and for inconsistencies to be ironed out.

Gadamer condemned the tendency of our society to value 'calculative' thinking, which uses models taken from the hard sciences, over the looser, more creative 'judgement-based' thought, which is more indebted to the arts and 'softer' parts of the humanities like history.

I think it is dangerous to imagine that the vast majority of important thinkers about society can be assessed in a 'calculative' manner drawn from the hard sciences - and I don't fancy throwing the likes of Freud, Marx, Nietzsche, and so on into the dustbin of history...

7:01 pm  
Blogger Giovanni Tiso said...

There's a tendency in our culture to take an 'all or nothing' approach to theorising about society.

Yes. It's Carr's approach.

7:14 pm  
Blogger Matthew R. X. Dentith said...

Lakatos pointed out that even the most robust scientific theories were originally bold speculations with internal inconsistencies and relatively little empirical evidence in their favour. It often took a long time for evidence to accumulate and for inconsistencies to be ironed out.

That's part of Lakatos's theory, sure, but he also argued that the burden of proof is on people who hold radical theories (like Carr's) to provide extraordinary evidence for their extraordinary claims.

(Indeed, Lakatos, Kuhn et al. have to put in provisos like that because, otherwise, you get the situation we have now with anthropogenic climate change deniers and intelligent design proponents, who propose the "Galileo was proved right!" argument to justify their continuing to speculate wildly.)

I think it is dangerous to imagine that the vast majority of important thinkers about society can be assessed in a 'calculative' manner drawn from the hard sciences - and I don't fancy throwing the likes of Freud, Marx, Nietzsche, and so on into the dustbin of history...

You are, I think, confusing asking for evidence for a view with the methodologies of the hard sciences. I'm a social scientist and I know the limits of evidence, inferential reasoning and the like, but I can still assert that views which make startling claims suffer from holding a heavier burden of proof than others. Gadamer essentially set up a strawman argument about this kind of thing: the hard sciences are also somewhat creative in their calculations, and the social sciences et al. can have calculi of their own: pluralism in methodologies is fairly common.

8:48 pm  
Anonymous Scott said...

I think there are different threads to Carr's argument in 'Is Google Making Us Stupid?' which could be taken in different directions. On the one hand, his suggestion that the way we are using the net is rewiring our brains could be developed into a coherent set of propositions which might be tested, over time, by various scientific experiments. But as Lakatos recognised, that would have to be a long-term project. Giovanni obviously feels that Carr has made an indifferent start to such a project, in The Shallows!

On the other hand, Carr's polemic against shallow reading and an obsession with efficiency is more in line with the sort of 'cultural critique' which we can find in the work of people like Leavis, Orwell, Raymond Williams, EP Thompson and, today, Terry Eagleton (obviously I'm picking on the Poms when I make this list, but I could cite other traditions in other places).

I don't think that the credibility of claims made in this second tradition hinges upon the presentation of vast amounts of evidence.

EP Thompson actually argued that it was dangerous for scholars of history and literature to ponder and present too much evidence. What was crucial, he considered, was the quality of the analysis the scholar made of the samples he had gathered. A brilliant analysis of a single quotation might defeat a meticulous presentation brimming with citations.

We cannot falsify the arguments that Thompson makes in his famous essays, in the way that we can (with the sort of trouble Lakatos makes clear is required) falsify a theory in physics or biology. Nor can we falsify Freud, or Heidegger, or Nietzsche, or most of the thinkers who have shaped the humanities. How could we scientifically test the notion of an eternal recurrence or Dasein? To try would be to make the mistake of mixing up apples and pears. That's the point I think Gadamer is making when he calls for the recognition of the place that 'judgement-based thinking' has had in intellectual history.

10:17 pm  
Blogger Giovanni Tiso said...

I'm (largely) with you, although Freud probably deserves a separate treatment. The problem with Carr is that he went straight for the "science help me" school of evidence.

10:23 pm  
Blogger Matthew R. X. Dentith said...

Once again, pluralism. This isn't a game of one methodology over another. It's Gadamer and co who misrepresent the methodologies.

Also, although the doctrine of falsificationism is pretty much a dead horse, look at Alan Chalmers's book "What Is This Thing Called Science?" for a discussion of just how easy it is to falsify several of Freud's core doctrines in psychoanalysis. Indeed, given the quite robust debate in my home department, it is quite plausible to say that you can, in some updated sense of "falsify," criticise the core tenets of Nietzsche, Aristotle, Derrida, et al.

I have to say, coming as I do from the position of a lapsed continental philosopher turned analytic, that I'm unimpressed by any notion of "judgement" which claims that "a brilliant analysis of a single quotation might defeat a meticulous presentation brimming with citations." Sure, a brilliant analysis of a sub-set of citations might, in some cases, defeat a less brilliant analysis of a greater set of citations but to make the greater claim, as you say Thompson did, seems foolhardy.

10:25 pm  
Anonymous Scott said...

Thompson made the claim back in the early '70s, in a review of a book called Religion and the Decline of Magic by Keith Thomas.

Thompson was impressed by Thomas' argument that the early modern era saw a decline in traditional folk beliefs and a rise in popular fidelity to systematic religious doctrine - and he was not the only one to be impressed, because Thomas' book quickly became a classic - but he tired of having to read quote after quote, and see citation after citation.

Eventually, Thompson argued, even the most meticulous scholar runs out of quotes and citations, and well before that happens the reader decides whether or not to trust the judgement of the scholar. Thompson felt that the sensitivity and skill with which the scholar presented and analysed his pieces of evidence counted for more than the quantity of evidence.

I think that some of Thompson's greatest moments come when he's improvising to deal with the fact that he lacks much evidence. His extraordinary reinterpretation of the Luddite movement in The Making of the English Working Class, for instance, runs on the empirical equivalent of the smell of an oily rag.

I think Thompson probably wanted to rehabilitate the Luddites because he hated the patronising way they were treated by apologists for the darker side of the industrial revolution, and then decided to press ahead even when he couldn't find a huge amount of evidence to support his reinterpretation. But what an argument he makes!

10:42 pm  
Blogger Matthew R. X. Dentith said...

Well, that's all very nice and all, but that kind of thing isn't the kind of activity that impresses me. I like arguments to rest upon premises which, if they make interesting and perhaps controversial statements, have arguments to back them up which aren't merely based on trusting the "brilliance" of the scholar. There's a fairly long and sullied history of previously good scholars suddenly making new and radical interpretations of data (in history, in the sciences, et al) which turn out bad and tend to say more about the psychology of the researcher than they do about the quality of their ideas.

10:50 pm  
Anonymous Anonymous said...

Nietzsche quite clearly states that no university philosopher is qualified to read him, let alone assess him.

11:03 pm  
Anonymous Scott said...

'I like arguments to rest upon premises which, if they make interesting and perhaps controversial statements, have arguments to back them up'

But Thompson does have arguments aplenty to back his claims, in The Making of the English Working Class and in his other classic texts - he just doesn't have masses of data.

In the whole of The Making, a book which revolutionised our understanding of industrialisation, modernity, and class conflict, Thompson never once offers an estimate for the size of the English working class, let alone the rate of its growth. Nor does he bother with the economic data that preoccupied so many previous scholars of the industrial revolution.

What Thompson does is present new readings of a series of primary texts. And the authority of his book rests on the power of these readings.

We can't contest Thompson's claim that Luddism was a form of resistance to capitalism, not a mindless reaction to modernity, by quoting statistics - we have to go to the arguments Thompson brings to the (meagre) primary material. And though we may well disagree with Thompson's reading of Luddism, we won't falsify him, in the way we might be able to falsify a theory in the hard sciences. We're dealing, in texts like The Making, with a different discourse and a different, though no less valuable, method of evaluation.

12:01 am  
Blogger Richard said...

This is a great review, Scott. I felt guilty as I had put Alex's book aside.

I agree. I am on Facebook but I feel we are in a time where everyone is moving so fast. Progress is equated with quickness or brevity. (I used to drive fast and even dangerously in my late twenties to thirties, but now I am quite slow and leisurely on the road... people should realize that older people take more time in making decisions; also one becomes more alert to dangers and to checking things out...) It can be that, but there are other, deeper values.

I don't have a cell phone, which is ironic as I was in telecommunications for many years. I also leave my landline phone off the hook to avoid early morning calls, or those calls selling me things or offering me things etc.

I deliberately work very, very slowly on my blog. I wait for things to come to me. I don't try to make myself famous or anything. Of course one always hopes, one is human! (And it IS good to get comments.) But I feel that a writer or an artist should be true to his or her self (their art, in all its forms and aspects), and from looking again at Alex Wild's book (and Jack's The Kingdom of Alt, which also got a good review and of which I have read a good deal) I can see that with new material Art is still continuing. The modish has its place but there are deeper issues and questions. I forgot (!) [strange "conversation" with a mysterious "anon" on your other post] that in fact I do owe some debt to Roland Barthes... now his Camera Lucida is really worth studying in the light of all this. In a deep sense he is resistant but wants deeper and even "traditional" values... he wanted truth, or he wants (or wanted) to continually question truth and ideas... photography, not film and not the internet, is defining... but the intent is hugely significant.

Alan Sondheim went so far as to do a huge project called a Meditation on the Internet. (He has disappeared for some reason I know not.) Mark Young keeps on with his "Series Magritte" and so on. Bronwyn Lloyd also has a blog. Facebook has its place but I limit time on there.

Well, I am motivated to read Alex Wild's book!

1:40 am  
Blogger Richard said...

Yes Giovanni is right, there are many different ways or forms. Blogging is one way and twitter etc fulfill other functions.

Some people find that Trade Me is really the basis of their social life. After all, markets still exist in the 'real' world as well as on Trade Me, the online market, and both fulfill similar functions.

Buying and selling, but also social intercourse etc. For some, Trade Me etc. is the only way they can communicate, which is sad... but it is also good that they have it, as some use talkback radio to contact others.

We live in a very lonely world.

1:48 am  
Blogger Richard said...

Logical reasoning is useful as a tool, but Empiricism broke down (hence Kant's critique, though some of Hume's questions [and those of Locke and Berkeley, who verges on divine madness!] remain as mysterious and as bafflingly ambiguous (in their implications) as those of Heraclitus) and is always in dialectical conflict with Idealism.

Doutre, Heidegger, Marx, Plato, Nietzsche and Foucault all have something to say. All can be useful and can stimulate ideas, but none of their ideas can be "proven". There are no wrong ideas. It is the mode of discourse that is important, or the process of the "discussion" or the dialectic.

Science breaks down in looking at questions of ultimate truth*, and it keeps forgetting it is in a probabilistic world. Religious fanatics (are they fanatics?) are equally absolutist, or can be.

Chris Hedges rejects religion per se, but believes, against Richard Dawkins (with whom he argues about evolution being proven), that religion has a deep place even in a world as conflicted and wounded as the one we are in now.

*No one will ever know how the universe was formed, or "how everything works", for example. Reality as we live in it will forever be a mystery, hadrons or no hadrons, Higgs particles or no Higgs particles. Wittgenstein knew that.

2:16 am  
Anonymous Anonymous said...

Taylorisation of the mind...

does that mean RICHARD Taylor?????

lol

2:30 am  
Blogger Matthew R. X. Dentith said...

You keep focussing on falsificationism, as if it's some important doctrine that distinguishes the sciences from the non-sciences. Philosophers gave up on that distinction quite some time ago: falsificationism is a dead horse. That's the lesson of Lakatos and Kuhn. Falsificationism is a very troubled concept that isn't even in any real use in the hard sciences: probably the best model for explaining theoretical change in both the social and hard sciences at the moment is Quine's notion of the web of belief.

Anyway, it sounds as if Thompson is dealing with an underdetermination of evidence problem, which raises the spectre of "Given a lack of evidence, why should we trust any analysis?" Whilst you can be impressed by a "brilliant analysis," a lack of evidence (or a very meagre set thereof) can be grounds for questioning whether the analysis, brilliant or otherwise, is something an epistemic agent should reasonably come to believe. One person's "brilliant analysis," in such situations, can be another's "dumb hypothesis" and often the demarcating factor is to do with the background assumptions both the writer of the analysis and the reader bring to the table (for example, I'm still not convinced that Marxist readings of history provide fuller explanations than explanations from other schools, which affects my appreciation of any piece of historical interpretation).

Also, I think we're probably not talking about the same thing when it comes to what "evidence" is. You seem to be taking evidence as pieces of hard data, whilst I'd add that arguments can be evidence too (though not all evidence is an argument), given that arguments give reasons to believe some conclusion (or sub-conclusion), which then gives weight to some interpretation of an event.

8:54 am  
Blogger Ben Wilson said...

Interesting post. I'd answer in long form, as has been my only style for years, but my New Year's resolution is not to do that because it ate my life.

@HORansome - I'm on the opposite trajectory to you: I was exclusively analytic when studying philosophy, since I was also a computer scientist and the methods fit well together; indeed, people with my kind of skills tend to end up as "analysts".

I can't say I enjoyed it, really. Time for a change. Probably not more philosophy, and hopefully less computers.

9:54 am  
Blogger Matthew R. X. Dentith said...

I realise you probably aren't making this mistake, Ben, but it's highly problematic to associate analytic philosophy with the kind of work analysts in the business world (et al.) do. For one thing, I know several continental philosophers who are, or have been, business analysts, and for another thing, "analytic philosophy" is a group term referring to a number of really quite different philosophies which tend to have their point of origin in the US or the UK rather than Europe (talking here about Western philosophy, obviously). Peirce is an analytic philosopher, for example, as are most of the pragmatists.

10:04 am  
Blogger Ben Wilson said...

I wasn't referring to business analysts. I was referring to a role in software development, although there is some crossover to business analysis there. The most useful business analysts tend to at least understand how to do logic.

I'd be interested to hear of continental philosophy as applied to business analysis. I wonder if there are postmodern use-cases and diagramming techniques, or methods of factory optimization. I suspect not, that the value of such thought would be at the highest levels only, analyzing business culture and language, or some such thing. From what I can tell about senior management, so long as you can tell a good story, you're in. If you've mastered the language of continental writing, you'll have no trouble with the far more pedestrian business of corporate language. And in the end, it's not the quality of ideas that matter there at all, just the relationships you build. If your boss likes French wine, you're probably in. Or you were buddies at high school or varsity with some of the same people. I'm not disparaging this - it's just the same for anyone claiming to be analytic too.

Maybe I'm wrong. Like I said, I went for the analytic schools. There could be a vast untapped world of highly appropriate, specific, and technical techniques from the continent, that are mightily appropriate to the business of software development. I may have been programming in an analytic cocoon for 30 years.

10:34 am  
Blogger Ben Wilson said...

Also, FWIW, I find the analytic/non-analytic distinction itself problematic, which is why I only went for the analytic stuff. In artificial intelligence research it has been credibly said that the reason they all believe in a mechanistic model of the human mind is that there really is no actual rival to it that isn't mystical non-science. Any time you can think of a model, you're describing a machine. The best you can hope for is to come up with a different program, but under that, it's still a machine.

Trying to remember who said that. I think it was Dennett.

10:57 am  
Anonymous Scott said...

Would it be fair to say that the analytic/continental distinction in philosophy has become a bit blurred in recent times, with the Brits and Americans, who have traditionally been the curators of the analytic tradition, starting to take a strong interest in Heidegger and certain other continental types? The distinction was always a bit confusing for laypeople like me, because it suggested a geographical divide which didn't really exist. Popper and Wittgenstein, after all, were archetypal analytic philosophers, yet came from Austria...

12:32 pm  
Anonymous Scott said...

Here's our man Kendrick Smithyman unifying the analytic and continental traditions in one of his late poems:

BRIEF HISTORY OF WESTERN PHILOSOPHY: THE BICYCLE

Did Bertrand Russell first adjust his hat?
Probably not, but carefully he wrapped
his trousers round his ankles, then clipped,
hiked his leg over and pushed off
pedalling away towards his mistress.
Marriage was finished, as he thought.
The road ahead was fairly level going.
Traffic was not heavy.

Heidegger never learned to drive a car.
The enemy was getting close; he might be
arrested on the spot. He took his bicycle,
pedalling east towards his old hometown,
Messkirch. The good life of Freiburg was behind.
Student Nolte pedalling faster overtook:
"Professor, sir, your wife has sent you this,"
a knapsack stuffed with laundry freshly ironed,
some food. Thoughtful Elfride, ever on the job,
she knew the road ahead would be tough going
and the traffic – best not dwell on that.

‘There are, indeed, things that cannot be
put into words. They make themselves manifest.
They are what is mystical.’
Wittgenstein, 6. 522

1. 6. 93
Editor's note
Brief History of Western Philosophy: The Bicycle: first published in Landfall 191 (Autumn 1996), 112, also in Last Poems; Bertrand Russell: (1872-1970), British philosopher and mathematician; Martin Heidegger: (1889-1976), German philosopher, author of Being and Time (1927), was professor at the University of Freiburg, 1929-45; Ludwig Wittgenstein: (1889-1951), Austrian-born British philosopher who studied at Cambridge University with Russell; the quotation is from his Tractatus Logico-Philosophicus (1922)


12:53 pm  
Blogger Ben Wilson said...

>Popper and Wittgenstein, after all, were archetypal analytic philosophers, yet came from Austria...

Yes, and the Vienna Circle was the heart of Logical Positivism.

4:39 pm  
Anonymous Anonymous said...

some dirty lies here!!!!!!!!!!!!

4:53 pm  
Blogger Dave Brown said...

Hey Scott, get hip with ex-Workers' Power blogger Paul Mason, who also tweets, texts and is on facebook, as well as being BBC Newsnight economics editor. Very cool dude.
http://www.guardian.co.uk/world/2012/jan/03/how-the-revolution-went-viral?CMP=twt_gu

Wrote an excellent critique of state capitalism about 20 years ago.
http://www.fifthinternational.org/content/crisis-stalinism-and-theory-state-capitalism

More power to your blogger's wrist!

9:31 pm  
Blogger Matthew R. X. Dentith said...

Would it be fair to say that the analytic/continental distinction in philosophy has become a bit blurred in recent times, with the Brits and Americans, who have traditionally been the curators of the analytic tradition, starting to take a strong interest in Heidegger and certain other continental types?

This is a legitimate (rather than trouble-making) question: who are these curators (I'm not even sure what that means, truth be told) of the analytic tradition who are beginning to take a strong interest in Heidegger and certain other continental types? I'm not aware of this trend meself.

1:12 am  
Blogger Matthew R. X. Dentith said...

The distinction was always a bit confusing for laypeople like me, because it suggested a geographical divide which didn't really exist. Popper and Wittgenstein, after all, were archetypal analytic philosophers, yet came from Austria...

It's a confusing divide, to be sure, but there are cultural differences which end up being important. In analytic philosophy, loosely speaking, we end up being not so impressed by who said what but rather by what the best arguments say in support of some position. So, the fact that Popper came up with falsificationism isn't particularly important, or that Lakatos first expressed the notion of research programmes, but rather what the best construal of the thesis of falsificationism looks like now, or what further modifications to the notion of research programmes say about the notion of theory change over time. Contrast that with, say, Nietzsche scholarship, which (loosely speaking) focuses on new reinterpretations of Nietzsche rather than what the successors to Nietzsche said (as I said, this is very loosely speaking).

1:19 am  
Anonymous Anonymous said...

Don't forget some of the serious writers from New Zealand who blog regularly, like Chad Taylor (http://chadtaylormarginalia.blogspot.com).

There are a couple of others, but his is probably one of the best examples of the author blog.

9:27 am  
Blogger Ben Wilson said...

Yes, analytic philosophers are not really fans of radical reimaginings, preferring slow incremental knowledge growth. Modernists, ultimately. Usually, they've got quite tight fields of specialization, although after establishing a reputation, as is the trend in public life, they'll often speak far more widely. So you get Bertrand Russell, who spent most of his early years on logic, number theory, etc., finally publishing Principia Mathematica, in which practically his only original contributions to philosophy occur, later taking on The History of Western Philosophy (which culminated in his own contributions). It seems to me rather a similar trend to famous scientists taking up general commentary late in their lives, or our successful business people turning politician on the strength of the wide experience they garnered trading currency.

Most of them do seem to see their own positions very much as the end of philosophy, the last speakers before the business is taken over by engineers. It's quite interesting that they almost always flip to general interests later in life - like cognitive dissonance has finally got to them, that in actual fact they loved the generality of philosophy or science, despite having taken stands that are all about industrializing knowledge, making it highly specialized and specific, which of course kills general studies.

It was rather depressing to see that the central position of the philosophy department at Auckland - which was right next to the library, right in the middle of the university, in a charming original old building - is now only a fond old memory of mine. The building has been removed and not replaced - where it was is an empty courtyard, and philosophy is taught in the most depressing building on campus, on levels that are mostly below the ground. Just from an architectural point of view alone I couldn't bear to revisit studying there - walking those halls felt even more soulless than a corporate high rise, a long dank corridor with side rooms for tutorials or old philosophers (most of whom taught me 20 years ago). It was like walking into a crypt.

10:02 am  
Blogger Matthew R. X. Dentith said...

Sorry, Ben, but that's one of the most inaccurate and ungenerous descriptions of analytic philosophy I've ever read.

That being said, your comments about the dullness of our current location in Arts II are mostly right (although most of the Department is above ground and we don't teach tutorials in that building).

10:06 am  
Blogger Ben Wilson said...

@HORansome Feel free to have a crack at any particular part of the characterization. I'll agree that it's not generous, and will point out again that analytic was my school, so it's not like I've never seen any use in it. I still do. It's one way of doing things, one way of thinking about things, and a powerful one at that, probably the most powerful one that has ever gripped the world apart from religious faith. But it's not the only one, and I don't think my comment on the psychology of it is without truth. It may be hypocritical, but that is, as the analytics say, a tu quoque argument.

We do also seem to be digressing majorly, and I'm breaking my NY resolution. Gah.

10:39 am  
Blogger Dave Brown said...

@Ben Agree about what's happening in Uni. Had a look at University of Fuzzo. How's it going? Do you know about the new University without Conditions? http://www.fu.ac.nz
Your sort of self-learning could be shared around.

10:46 am  
Blogger Matthew R. X. Dentith said...

The thing is, I think your description is so inaccurate that I'm not really sure where to start taking it apart (I don't buy your psychological thesis at all: if it's true, it's limited to a small subset of philosophers, arguably a subset proportionate to the members of other academic disciplines who show the same tendencies). I'm also aware that to critique your position I might fall into the trap of provisionally accepting some of your claims and thus arguing on your own ground (which, as I say, I just don't think is a fair description). I'm also now in the midst of teaching a course in summer school, so I no longer have lots of free time to argue on the internet.

So, this will be brief.

1. I think your description is inaccurate because it, in part, suggests that analytic philosophy is about incremental steps whilst non-analytic philosophy is about radical reimaginings.

Firstly, the latter isn't true: non-analytic philosophy (say, continental and Eastern) is a widely divergent group of philosophies, some of which are radical reimaginings and some of which are small theories focussing on minutiae. It's a common mistake people make with respect to continental philosophy (and I was a student of continental philosophy up until the beginning of my PhD) to say that it's radical and challenging of the status quo when, really, most of it is small theories dealing with specific fundamental philosophical questions and not necessarily giving you big picture theories.

Secondly, the former claim isn't even true: over the course of the 20th Century, in analytic philosophy, we've seen radical reimaginings of models of explanation, models of epistemic justification, modal realism, the nature and structure of time, counterfactuals, causation, and probability theory, just to name a few. Most of the earlier accounts have been thrown away (goodbye, JTB account of knowledge) or are in limited use (we still teach falsificationism, because the problems of falsificationism are as interesting as the problems of confirmationism).

2. I think your description is inaccurate because you seem to suggest we analytic philosophers are all specialists (until the end of our lives) and don't focus on big-picture theories. That ignores a whole host of philosophers like Searle, Nozick and Rawls (whose theories, although they look specialised, basically provide big-picture answers to fundamental questions), and it ignores another host of analytic philosophers like David Lewis, who were pluralists and didn't think there could be big-picture theories (or who held that the big-picture theory just was a form of pluralism).

So, for example, I'm a specialist in epistemology, but I have developed views on ontology, the structure of mathematics, ethics and the like, which give me a nice big-picture pluralism. Just because my thesis doesn't crank out an answer to life, the universe and everything doesn't mean I don't have a view on it (42).

3. I think your description is inaccurate because you treat people changing research interests late in life as some kind of cognitive dissonance rather than, say, a matter of deciding that after thirty years on causation you'd finally like to formalise that theory of art you've been thinking about for the last thirty years as well.

I could go on, but in the process of writing this I received three e-mails from students I really should deal with.

11:17 am  
Blogger Ben Wilson said...

Not well, Dave. It's turned into a school of hard knocks, as predicted by those of my friends who had tried it themselves.

At this point I'd be reasonably happy with a job pulling up weeds - indeed, I'm heading to WINZ to look for one right now.

11:31 am  
Blogger Ben Wilson said...

@HORansome I made no claims about non-analytic schools.

Seriously evaluating the claim that developments in analytic philosophy have been radical requires a definition of 'radical' - at least in the analytic school it does. I didn't give one, and I don't really think it's easily done.

I also never said that I thought analytic philosophers aren't actually radical. They just don't really consider themselves to be. They want to work within existing structures, to reform modernity from within. They believe in the project. The concept of burning down huge swathes of thought, as Nietzsche might have, isn't their style.

Within those tight constraints, they have the greatest power to be very radical indeed. Change one element of the logic of any major thought structure and you can invert outcomes, taking an opposite stance on something that was received wisdom only the day before.

I'm never sure if non-analytic people get this. Perhaps they really, really do, and they just don't like it, much the way I don't like how a tiny change to software can completely break it.

As I said already, I find the distinction itself problematic. It's not like logical analysis never occurred to anyone outside of this collection of people, and it's not like analytic people don't constantly fall back onto their humanist backdrop when actually trying to make the theories work.

I certainly never said they can't be generalists - my example of Russell is of a major generalist, in the end, AFTER he had demonstrated his worthiness to modernism by adding a couple of pieces to a massive puzzle that was the culmination of the work of hundreds of mathematicians over centuries.

Perhaps this is a phenomenon that happens outside analytic philosophy too - I drew parallels to the same thing happening with technocrats in public life. Maybe the same goes for the continental philosophers - I wouldn't know. They've never been my study.

My cognitive dissonance point is weak, though - you're right there. It's a very vague pattern: highly specific technical mastery ends up being a prerequisite for speaking about generalities in that school. It's by no means confined to philosophers - practically no one is respected as a generalist until they have first been a specialist. To me there's something quite silly about that, something highly prone to killing a subject like philosophy. Why study it first? You might as well study science or maths or economics first.

Maybe that's a fair cop too - that the killing of western philosophy is the natural outcome of western philosophy. Philosophy used to be the *only* subject taught at Auckland University. How's the roll looking now, as a proportion of the students enrolled there? I'm really curious.

12:23 pm  
Blogger Matthew R. X. Dentith said...

I don't know the exact numbers, but this year we were the biggest Department by effective full-time students (EFTS) in the Arts Faculty (and, I believe, across the University as a whole, although don't quote me on that, as estimating and comparing EFTS across faculties is tricky), so we're doing well.

"I also never said that I thought analytic philosophers aren't actually radical. They just don't really consider themselves to be. They want to work within existing structures, to reform modernity from within. They believe in the project. The concept of burning down huge swathes of thought, as Nietzsche might have, isn't their style."

I really don't agree with that claim: any academic (philosopher or otherwise) dreams of being able to topple a system of thought with a new paper, and people try to do it all the time in analytic philosophy. Admittedly, we often fail to show this to students in the BA, because we tend to teach the old standards from the usual textbooks, but when you start looking at journal articles you find that burning down the house is something everyone seems to attempt at least once. I know I want to reinvent epistemology by focussing on units of epistemic agents rather than individual epistemic agents, but it's going to be several years before I can write that one up.

12:31 pm  
Blogger Ben Wilson said...

@HORansome I'd like to thank you for taking the time on this discussion, btw. It does seem off topic and I'd actually like to pick it up with you directly, if you are willing.

12:46 pm  
Blogger Matthew R. X. Dentith said...

Can do.

12:47 pm  
Blogger Ben Wilson said...

Cool, check Twitter.

12:54 pm  
Anonymous http://www.jaketkulitnusantara.com said...

I LIKE THIS AND I SALE LEATHER JACKET .THANK GAYS

10:15 am  
Blogger Richard said...

Out of all this you get the offer - with what I suppose is a (deliberate or not?) spelling error, so that 'guys' becomes 'gays'! It seems that now, while blogs continue, the debate came to a head when Ted attacked FB and Twitter. It is still an interesting post, though. But, indeed, I think many of the bloggers have moved to Twitter and FB and other such platforms.

The debate is still important. It may be an empty room, but it is still a room with resonance.

And Alex Wild's book was overlooked (more or less) in the ensuing debate.

By the way, some older people, such as myself, find looking at any screen tiresome (and reading gets harder as one's sight deteriorates). Books with a largish font are good. Larry McMurtry, who is a writer, a book dealer with a huge book store, and a collector of books (he has several thousand of his own, and his store did have about 300,000 books), points out that book lovers simply like books. To them, the debate is irrelevant. They don't enter the debate or listen to it. They keep on collecting and loving books. (I might get an e-book, as I suppose the font size can be enlarged, but I will still keep my physical books.)

What many people are missing is the tactility of life as lived. They never pull a clock or an engine apart, or build a shelf, or paint a house, or mix concrete: their life can become increasingly a fantasy on YouTube and so on. Still, people can be on Twitter, build and do hands-on things, and have physical books and cell phones as well as old-fashioned clunky stuff.

I'm sorry I Taylorised everything! I'd never heard of the fellow Nicholas Carr.

But 'falsification' is used in logic. It is nothing new, and isn't a proof or disproof of anything except a fairly tight chain of logic. And epistemology? Have new things knocked out the old questions of justified belief and so on? That is like saying Aristotle or Buddha have been disproved. Or, as I famously quipped (in the SH and HH Salt, I think it was): "Shakespeare's plays have been disproved."

12:47 pm  
