It won’t break the laws of physics. Scientists are going to increase the speed of light in 2208.
I challenge anyone who says sugar isn’t addictive to go a week without it. No sugar. No sugar substitutes like fructose. I’ve done it. It is awful.
I’ve also done hard drugs. Quitting those is awful too.
The difference is that I haven’t done drugs in decades but I still have a pack of Oreos on my counter.
I never realized all this but it’s so true. I browse and comment until I’m caught up, then log off.
Wow
I completely disagree. It absolutely is AI doing this. The point the article is trying to make is that the data used to train the AI is full of exclusionary hiring practices. AI learns this and carries it forward.
Using your metaphor, it would be like training AI on hundreds of Excel spreadsheets that were sorted by race. The AI learns this and starts doing it too.
This touches on one of the huge ethical questions with regulating AI. If you are discriminated against in a job hunt by an AI, whose fault is that? The AI is just doing what it’s taught. The company is just doing what the AI said. The AI developers are just giving it previous hiring data. If the previous hiring data is racist or sexist or whatever, you can’t retroactively correct that. This is exactly why we need to regulate how AI is trained, not just how it’s deployed.
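To make the point concrete, here’s a minimal sketch of how this happens. The data, groups, and model are entirely hypothetical, just an illustration: a model that learns from biased historical hiring decisions will reproduce the bias, with no one explicitly programming it to.

```python
# Illustrative sketch with synthetic data (not any real hiring system):
# a model trained on biased historical decisions carries the bias forward.

from collections import defaultdict

# Hypothetical "historical hiring" records: (group, qualified, hired).
# The baked-in bias: group B candidates were not hired even when qualified.
history = [
    ("A", True, True), ("A", True, True), ("A", False, False),
    ("B", True, False), ("B", True, False), ("B", False, False),
]

def train(records):
    """Learn the historical hire rate for each (group, qualified) combo."""
    stats = defaultdict(lambda: [0, 0])  # key -> [times hired, total seen]
    for group, qualified, hired in records:
        stats[(group, qualified)][0] += hired
        stats[(group, qualified)][1] += 1
    return {key: hired / total for key, (hired, total) in stats.items()}

def predict(model, group, qualified):
    """Recommend hiring if the historical hire rate was over 50%."""
    return model.get((group, qualified), 0.0) > 0.5

model = train(history)
# Two equally qualified candidates get different recommendations,
# purely because the training data was discriminatory:
print(predict(model, "A", True))  # True  -> recommended
print(predict(model, "B", True))  # False -> rejected; bias learned and repeated
```

The model never sees a rule like “reject group B”; it simply optimizes to match past decisions, which is exactly how real systems trained on discriminatory data end up discriminating.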
That’s the huge takeaway here. The Chinese can’t comprehend that the DOD doesn’t have a social media control division. Yes, we have the NSA and such spying, but they don’t control anything.