Some economists are warning there’s no sign of AI-related job displacement appearing in the labor data. Altman claimed it’s just a matter of time until it does.
This doesn’t seem particularly shocking for him to say; pointing out that his product is being used as a scapegoat seems like sensible reputation management.
Same reason a CEO with a half-arsed strategy they want to enact hires a highly paid management consultancy, which by an amazing coincidence produces an “analysis” or “study” that “identifies” “problems with the company” and “recommends” that the company execute said half-arsed strategy. Later, after it fails miserably, said CEO blames the consultancy.
Modern publicly traded company CEOs aren’t hired for actual competence as strategists or top-level organizers; they’re hired for self-aggrandizing skill, personal connections, and near-fraudulent misportrayal of their capabilities, or, in other words, salesmanship.
AI is taking some jobs, specifically in situations where the limiting factor is the rate at which work can be done rather than the skill required to do it. Say you have five people in PR: three trawl through sources to find out what’s actually being said about the company, and two write press releases. The jobs of two of the source-trawlers could be replaced with AI, because the limiting factor is the sheer volume of posts, documents and stories to read. You’d still need an overseer to fact-check and collate, but that work goes much faster than actually reading and finding sources. If the company also lays off one of the press-release writers, however, that layoff would have nothing to do with AI, since that sort of job isn’t replaceable (right now, at least): the majority of the work is something probabilistic models just aren’t correct enough to do. It’d be an unrelated layoff being blamed on AI, even if whoever orders it genuinely believes AI has replaced the job.
I mean it’s also kind of true though…