A chatbot used by Air Canada hallucinated a nonexistent bereavement discount policy, and a customer who flew based on that information sued the company for being misled. The small claims tribunal sided with the customer, holding that the chatbot was acting as an official agent of Air Canada and that a customer had no reason to double-check information from one part of the Air Canada website against other parts of the same website.
Prepare for more of that, applied to weaponized drones.
Oh, that wedding? The drone just did that, sorry.
Actually, the human AI helper making $1/hr has legal responsibility for that strike.
Is this also how we end up giving AI chatbots legal personhood by accident?