The knowledge that your resume – which you've crafted so carefully – isn't even being viewed by a human being is demeaning.
The public needs to exercise oversight of law enforcement, including monitoring what technology it uses, because the potential for abuse is so high. Police are, after all, our employees and not our masters.
Not hampering innovation is a laudable goal, but when it comes to technology with this much potential for misuse, I'm fine with some roadblocks slowing things down. More input from ethicists and others in the humanities wouldn't hurt, either.
Facial recognition is biased in large part because the material used to train it doesn't include people of color in meaningful numbers.
No, it's not up to the task. Worst of all for applicants, there's no appeals process and no one to follow up with. It's one more way to screw people over.
We're all going to be affected by AI, even those who feel they're too special to be. And none of us are prepared for it.