AI is a Weirdo


An artificial general intelligence (AGI) could beat you at chess, tell you a story, bake you a cake, describe a sheep, and name three things larger than a lobster. For now, though, AGI is solidly the stuff of science fiction: most experts agree it is many decades away, if it ever becomes a reality at all.

What we have today is the less advanced artificial narrow intelligence (ANI). The algorithms that grab attention by outperforming humans at games like chess and Go are better than we are at one specific, specialized task. Unfortunately, real-world problems are often more complex than they first appear. Sometimes you'll encounter algorithms like those in the GPS app that guided drivers toward burning neighborhoods during California's 2017 holiday wildfire season. The app wasn't trying to kill anyone; it simply saw that those neighborhoods had less traffic. It had never been made aware of the fire.

Self-driving cars are another example of a problem that is more complex than it first appears:

In 2016, a driver was killed while using Tesla's autopilot feature on city streets rather than the highway driving it had been designed for. A truck crossed in front of the car, and the autopilot's AI failed to brake: it didn't register the truck as an obstacle that needed to be avoided.

Mobileye, the company that created the collision-avoidance system, explained in its analysis that because the system was designed for highway driving, it had only been trained to prevent rear-end collisions. That is, it had only been trained to recognize trucks from behind. According to Tesla, when the AI did spot the truck, it decided it was looking at an overhead sign and concluded there was no need to apply the brakes.