Craft Chat: What will happen to Writing in the Age of Generative AI?

wrightstuff
Full Member
Joined: Aug 23, 2024
Location: Singapore
Paul Graham, co-founder of Y Combinator, probably the world's most prestigious and worthwhile business incubator, has made a rare prediction about technology.

He believes that generative AI will reduce the need to write so much that the world will be divided into the 'writes' and the 'write-nots'. Since he also believes that those who can't write can't think, the consequence is that the world will be divided into thinkers and non-thinkers. That is a truly scary prediction.

I think he is spot on with his prediction, but I'm not so sure about the follow-on division into thinkers and non-thinkers. Here's the article: Writes and Write-Nots

What do you think?

By the way, I'm claiming dibs on the use of this concept in a future dystopian novel...
 
It's an interesting article, and I think he makes some excellent points about the likely future of writing in general. People who are driven to write will still write, though, and that's a lot of us.

Clinical pedant that I am, and on the subject of AI, I must take issue with the example he uses that 'Doctors know how many people have a mole they're worried about.' It makes his point, I suppose, but it's simply not true. Doctors don't know how many people are worried about anything. And they probably couldn't tell you how many people they have referred in recent months with suspicious-looking skin lesions. But the GP surgery's AI can. The AI can tell me how many of our 18,000 patients have presented with skin lesions in the past week/month/year, and how many have been fast-tracked for specialist dermatology assessment, and how many of them have gone on to have cancer diagnoses. Not how many of them were worried, though (presumably all of them, but let's not presume).
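To make that concrete, here's a toy sketch of the sort of tally I mean - the field names and figures are entirely made up, and it's nothing like the real software:

```python
# Toy illustration only: the kind of aggregate question a practice system could answer.
# All columns and values are invented for the example.
import pandas as pd

records = pd.DataFrame({
    "patient_id":   [1, 2, 3, 4, 5],
    "presented":    pd.to_datetime(["2024-11-02", "2024-11-10", "2024-10-01",
                                    "2024-06-15", "2024-11-12"]),
    "complaint":    ["skin lesion", "skin lesion", "cough", "skin lesion", "skin lesion"],
    "fast_tracked": [True, False, False, True, True],
    "cancer_dx":    [False, False, False, True, False],
})

# Filter to skin-lesion presentations, then count the things a GP couldn't recall offhand
lesions = records[records["complaint"] == "skin lesion"]
this_month = lesions[lesions["presented"] >= pd.Timestamp("2024-11-01")]

print("Presented with a skin lesion this month:", len(this_month))
print("Fast-tracked to dermatology:", int(lesions["fast_tracked"].sum()))
print("Went on to a cancer diagnosis:", int(lesions["cancer_dx"].sum()))
```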
AI has revolutionised our triage and appointments system over the last 12 months. There is no longer a wait for face-to-face appointments - you'll be seen the same day if it's urgent, or within the week if it's routine. Pretty good, huh?
But we don't let the AI make any clinical decisions. We don't let it look at the images of skin lesions and decide if they need fast-tracking or not. Because there is so much more to clinical diagnosis than pattern recognition. Every clinical decision is made by a senior clinician. For now at least...

I look forward to your dystopian novel, @wrightstuff. There's a lot for you to go on, isn't there?
 
That's really his point - only those who choose to do so will write - and he believes that these will be the only thinking people. Initially, that will be a significant percentage, but it will decline over time. A reference might be horse-riding. Before cars, you walked or (if you could afford it) rode. That meant that all but the very poorest could ride, even if they didn't do so frequently. Over time, this skill became unnecessary for everyday life and became a hobby, with the percentage riding shrinking decade by decade. This is what he sees happening with writing.

Ironically, I suspect it should have been 'Doctors know that many people have a mole they're worried about.'

Actually, I believe what you're describing is the application of data analytics - the use of data interrogation techniques to find patterns and predict outcomes. It used to be called data mining when I was selling the service to clients in the 90s.

Here in Singapore, a combination of data analytics and learning systems (not quite full AI) has been tested as a diagnostic tool. The data-driven system produced more accurate diagnoses than even the most experienced clinician. This was especially true of visual information (e.g.: skin lesions), perhaps because the learning system had the ability to reference a vastly greater data-set, with perfect recall. It points to a future in which any occupation currently relying on the application of knowledge will be more about managing AI systems (and people) than making the actual decision.
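If it helps to see the shape of the idea, here's a toy sketch of that kind of pattern matching - a nearest-neighbour lookup over invented lesion "features", nothing like the actual Singapore system:

```python
# Toy sketch: label a new lesion by its similarity to past cases (1-nearest neighbour).
# The feature vectors and labels below are invented for illustration.
import numpy as np

# Hypothetical features for past lesions, e.g. size, border irregularity, colour variance
past_cases = np.array([
    [2.1, 0.2, 0.1],
    [6.5, 0.9, 0.8],
    [3.0, 0.3, 0.2],
    [7.2, 0.8, 0.9],
])
labels = np.array(["benign", "malignant", "benign", "malignant"])

def classify(new_case: np.ndarray) -> str:
    """Return the label of the most similar past case."""
    distances = np.linalg.norm(past_cases - new_case, axis=1)
    return str(labels[np.argmin(distances)])

print(classify(np.array([6.8, 0.85, 0.7])))  # -> "malignant" on this toy data
```

The advantage the post describes is simply scale: the reference set can be millions of cases rather than one clinician's memory, recalled perfectly every time.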

That simple idea is a whole world, and if you'll excuse me, I have to go and build it...
 
Okay, using MLK and JFK Jr. as examples of those who plagiarize and pay folks to write for them seems a bit off. While MLK is known to have plagiarized pieces of his doctoral thesis, nobody reads that, and there is no suggestion he didn't write Letter from Birmingham Jail or the stuff we do read. And JFK hired a speechwriter? Like every other politician? A ghostwriter, ditto. Picking out these two leaves me pretty convinced he's a conspiracy-theory advocate.
Beyond that, 'writes and write-nots' ignores that social media is largely text-based, doesn't it?
That said, every week there is another example of writing tasks being turned over to AI. Now, this is not because people cannot write, but rather that it costs more to hire a person.
 
Yeah - I didn't think the examples were helpful.

Also, his conclusion that professors plagiarise trivial writing says (to me) that they consider that task beneath them, rather than that they're not capable of it.

As for social media, I would expect input to shift more and more from typing to voice over time, but it will still be read to a significant degree.

The article is actually poorly written, which is unusual for him, but it does show how little even those in significant positions value writing skills - even now.

It's a difficult ask, but look past the presentation to the idea: writing as part of a work-related skillset is likely to decline, and basic communication is moving towards voice, video and ideograms. Project far enough, and writing of any sort will be voluntary rather than needed.
 
Ironically, I suspect it should have been 'Doctors know that many people have a mole they're worried about.'
Ah, that would make more sense.
Actually, I believe what you're describing is the application of data analytics - the use of data interrogation techniques to find patterns and predict outcomes. It used to be called data mining when I was selling the service to clients in the 90s.
Data analytics is part of its responsibility, but it is a triage AI. It asks questions of the patient in order to sort and sift urgency and book appointments. It's like a receptionist capable of dealing with unlimited cases at once - there's a rough sketch of what I mean at the end of this post.
Here in Singapore, a combination of data analytics and learning systems (not quite full AI) has been tested as a diagnostic tool. The data-driven system produced more accurate diagnoses than even the most experienced clinician. This was especially true of visual information (e.g.: skin lesions), perhaps because the learning system had the ability to reference a vastly greater data-set, with perfect recall. It points to a future in which any occupation currently relying on the application of knowledge will be more about managing AI systems (and people) than making the actual decision.

That simple idea is a whole world, and if you'll excuse me, I have to go and build it...
Yes, AI is really good at pattern recognition - ours is learning from us so that it will do this in the future - but its bedside manner is rather cold.
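Roughly this sort of thing, though the real questions, rules and thresholds are nothing like these made-up ones:

```python
# Toy, rule-based sketch of question-driven triage: answers in, urgency out.
# Every question and rule here is invented for illustration.
def triage(answers: dict) -> str:
    """Map a patient's yes/no answers to an appointment urgency."""
    if answers.get("chest_pain") or answers.get("breathless_at_rest"):
        return "same-day (urgent)"
    if answers.get("new_or_changing_lesion") and answers.get("bleeding"):
        return "same-day (urgent)"
    if answers.get("symptoms_over_two_weeks"):
        return "within the week (routine)"
    return "self-care advice / routine booking"

print(triage({"new_or_changing_lesion": True, "bleeding": True}))  # same-day (urgent)
print(triage({"symptoms_over_two_weeks": True}))                   # within the week (routine)
```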
 
Yes @Bloo, totally scream-worthy. I'll likely be long gone by the time it all reaches its logical conclusion. For now, my speciality (geriatrics and palliative care) is fairly safe. It's a job most humans don't want to do, but it's low priority for AI too.
 