
Solving Tech’s Ethics Problem Could Start In The Classroom



Ethics is something the world's largest technology companies are being forced to consider. Facebook has been criticized for not quickly removing toxic content, including the live broadcast of the New Zealand mosque shooting. YouTube had to disable comments on videos of minors after pedophiles flocked to the platform.

Philosophy professor Abby Everett Jaques of the Massachusetts Institute of Technology created a class called Ethics of Technology to help future engineers and computer scientists understand the risks of technology.

Courtesy of Kim Martineau, MIT Quest for Intelligence



Some companies have hired ethicists to help them spot these problems. But philosophy professor Abby Everett Jaques of the Massachusetts Institute of Technology says that's not enough. Understanding the risks of technology is crucial for future engineers and computer scientists, she says. So she created a class at MIT called Ethics of Technology.

As artificial intelligence reaches further into our lives, Jaques is concerned about privacy. She is especially worried about facial recognition being used to track people in a continuous, pervasive way.

Studies have already shown that facial recognition misidentifies people with darker skin more often. Google came under fire when its photo application labeled Black people as gorillas.

"I'm an ethicist, and I'm especially interested in these questions about the ethics of the things we do," says Jaques.

In one exercise, Jaques has her class of 30 students play a game designed to make them think about how to achieve fairness.


Jaques places a large paper bag at the front of the room. The students don't know its exact contents, only that there are goodies inside. And they have to figure out the best way to share them.

"Okay, let's listen to some ideas," Jaques tells the clbad.

One student suggests dumping everything out of the bag and sorting things out from there. Another says they should put someone in charge of deciding what to do.

After considering a dozen ideas, the class votes to do it this way: each student is assigned a random number and, once the bag is opened, gets to pick an item in that order.


Jaques empties the bag. It turns out to be filled with a variety of baked goods, including rice crispy treats and chocolate chip cookies.

A student has a concern: "Sorry, can we determine who is vegan here?"

The class had not taken different dietary needs into account. And that's exactly what Jaques wants the students to think about.

"Our system did not protect a certain important minority," she says. "So we're trying to build something after [to account for that]. "


That resonates with Cel Skeggs, a senior in computer science:

"I've been the person for the entire semester defeating the dead horse of & # 39; How does this technology affect LGBTQ people? & # 39;" Skeggs says. "To the extent that some people have suggested solutions to things and then, when that question is imposed, they say:" Oh, I did not really think about that. "

This comes into play in real life, too. For example, some transgender Uber drivers were locked out of the app when a security feature failed to recognize them. The feature required drivers to take a selfie to verify their identity, but it did not account for people who were transitioning.

Srinivas Kaza, a computer science student, says that learning about ethics has influenced which companies he is willing to work for. "I eliminated a lot of options," he says, laughing.


Kaza says he wants to work with image technology but is really worried about manipulated photos and the spread of misinformation. "I think it's important not to contribute to the problem," he says.

And that is exactly why Jaques created this class: so that these students understand that ethics is essential to their work as engineers and computer scientists.

"It's better for companies to prepare because students are going to ask a lot of questions," she says.
