> The chief constable of one of Britain’s largest police forces has admitted that Microsoft’s Copilot AI assistant made a mistake in a football (soccer) intelligence report. The report, which led to Israeli football fans being banned from a match last year, included a nonexistent match between West Ham and Maccabi Tel Aviv.
> We’ve reached out to Microsoft to comment on why Copilot made up a football match that never existed, but the company didn’t respond in time for publication.
So far, none of these "let AI make decisions for us" workflows seems to have led to an obvious human death, but it is starting to feel like only a matter of time.
In this case I think AI was used to find justifications for a decision that had already been made, not to make the decision itself.