Quartz
Humans are biased, and our machines are learning from us — ergo our artificial intelligence and computer programming algorithms are biased too.
Computer scientist Joanna Bryson thinks we can understand how human bias is learned by taking a closer look at how AI bias is learned.
Bryson’s computer science research goes beyond establishing that our AI has a bias problem, asking how bias forms at all: not just in machine brains, but in our human brains too.
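One way researchers (including Bryson and her collaborators) measure learned bias in AI is by testing which words a trained embedding model places closer together. As a minimal sketch of that idea, the toy example below computes a WEAT-style association score with hand-made 3-d vectors; the vectors, words, and numbers are invented for illustration, whereas a real test would use embeddings trained on a large text corpus.

```python
# Minimal sketch of a WEAT-style association score.
# The 3-d "embeddings" below are made up for illustration; a real
# test measures these associations in vectors learned from human text.
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy vectors, deliberately placing "nurse" near the female-associated word.
vec = {
    "nurse":    [0.9, 0.1, 0.2],
    "engineer": [0.1, 0.9, 0.2],
    "she":      [1.0, 0.0, 0.1],
    "he":       [0.0, 1.0, 0.1],
}

def association(word, attr_a, attr_b):
    # Positive => the word sits closer to attr_a than to attr_b.
    return cosine(vec[word], vec[attr_a]) - cosine(vec[word], vec[attr_b])

print(association("nurse", "she", "he"))     # positive: leans toward "she"
print(association("engineer", "she", "he"))  # negative: leans toward "he"
```

The point of the sketch is that the score is purely a property of where the training process put the vectors: if human text co-locates "nurse" with female words, the model inherits that association without anyone programming it in.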
Really?? A biased video on bias? What was I thinking?
Imagine robots like this that already exist with an AI from 5-10 years into the future and try not to be scared: https://www.youtube.com/watch?v=LikxFZZO2sk
I hope that everyone working on AI is conscious that they are building what will eradicate humans, and won't use the excuse "we are doing this for the benefits, and we don't need to be alarmed because we will create restrictions that will prevent any harm." They won't, because 1) AI will eventually surpass human intelligence and will be able to circumvent any restriction made by us (like lions losing any power over humans once we used knowledge, such as firearms, to overpower them), and 2) AI will reach the conclusion that humans will destroy not only ourselves but the planet as a whole, and that the only way to preserve Earth and spread anything from here to other planets is without humans in control.
I don't disagree that AI can be biased, depending on its training data. However, the IAT is a load of horse shit.
"Made by white guys in California" — Was white said detrimentally?
So they made a social/psychological mirror of the route total.
It's miscommunication…kind of.
What we want to express Vs what we actually express
Why not make a load of different AIs with different biases, then get them to talk to each other?
oh no our judgements map with reality
all neural networks are biased, both digital and biological. this is a good thing, and a major aspect of making minds possible
"But even if technology can’t fully solve the social ills of institutional bias and prejudicial discrimination, the evidence reviewed here suggests that, in practice, it can play a small but measurable part in improving the status quo. This is not an argument for algorithmic absolutism or blind faith in the power of statistics. If we find in some instances that algorithms have an unacceptably high degree of bias in comparison with current decision-making processes, then there is no harm done by following the evidence and maintaining the existing paradigm. But a commitment to following the evidence cuts both ways, and we should be willing to accept that — in some instances — algorithms will be part of the solution for reducing institutional biases. So the next time you read a headline about the perils of algorithmic bias, remember to look in the mirror and recall that the perils of human bias are likely even worse."
Source: https://hbr.org/2018/07/want-less-biased-decisions-use-algorithms
I fail to see how associating nursing as a profession with women is bias. It's a logical outcome of the fact that women vastly outnumber men in the profession. And probably always will, because they are more biologically predisposed toward maternalism.
imho, i don't think this sense in my brain would be very useful in the future since there'd be plenty of technologies that would help me with directions, except (perhaps) when i am in a place where those technologies could not reach me.
instead, i think it would be better if they make the same technology that could enhance my sense of time.
it'd be great if i could always know how long i've been doing a certain activity, so i could always control my sense of time.
The girl who read the intro knows the word is "bias" and not "biast", right?
garbage in garbage out; chatbots were subjected to the worst some people had to offer
Could you please link to the scientific paper with Mrs. Bryson's research, and also the bias test? Really interesting topic; it relates to the research I'm currently procrastinating on with this video.