What matters the most is not the words within an utterance, but the discourse conveyed by that utterance. [Translation: how you say it matters less than what you say.]
Word usage is prone to trends. Not just slang. Easy come, easy go.
What concerns me, however, is that those chatbots babble a bloody lot. And people might be willing to accept babble a bit more, due to exposure lowering their standards. And they kind of give up looking for meaning in what others say.
What matters the most is not the words within an utterance, but the discourse conveyed by that utterance. [Translation: how you say it matters less than what you say.]
Under certain circumstances. How you say things in work settings and in personal settings such as dating can absolutely affect outcomes.
The number of times I’ve been attacked over tone growing up tells me that either I had abusive parents, or that how you say stuff matters a lot. Intonation can also turn a statement into a question or even make it sarcastic. Words come with baggage beyond their meaning, and using a word with negative connotations can turn a compliment into an insult.
In the specific case of clanker vocab leaking into the general population, that’s no big deal. Bots are “trained” towards bland, inoffensive, neutral words and expressions; stuff like “indeed”, “push the boundaries of”, “delve”, “navigate the complexities of $topic”. Mostly overly verbose discourse markers.
However, when speaking on general grounds, you’re of course correct, since the choice of words does change the meaning. For example, a “please” within a request might not change the core meaning, but it still adds meaning, because it conveys “I believe it’s necessary to show you respect”.
I take the specific view that inadvertent phrasing of three words or less has repeatedly changed my life. Three is actually really excessive. One usually does the job; I only had to go to three once.
But then again, I’m a writer. Oh, I sure as shit wasn’t the first time, but this is where serendipity comes into play. You know what you have to say without even thinking about it.
Or use an AI to summarize it…
And AI sucks at that. If you interpret its output as a human-made summary, it shows everything you shouldn’t do, such as conflating what’s actually written with its own assumptions about what’s written, or missing the core of the text for the sake of random excerpts (which might imply the opposite of what the author wrote).
But, more importantly: people are getting used to babble, to the idea that what others say carries no meaning. They won’t bother throwing it into an AI to summarise it, and when they do, they won’t understand the AI’s output.