Isabel Castañeda’s first words were in
Spanish. She spends every summer with relatives in Mexico. She speaks
Spanish with her family at home. When her school, Westminster High in
Colorado, closed for the pandemic in March, her Spanish literature class
had just finished analyzing an entire novel in translation, Albert
Camus’s “The Plague.” She
got a 5 out of 5 on her Advanced Placement Spanish exam last year,
following two straight years of A+ grades in Spanish class.
And yet, she failed her International Baccalaureate Spanish exam this year.
When she got her final results, Ms. Castañeda was shocked.
“Everybody believed that I was going to score very high,” she told me.
“Then, the scores came back and I didn’t even score a passing grade. I
scored well below passing.”
How did this happen? An algorithm assigned a grade to Ms. Castañeda and 160,000 other students...
Computers are excellent at doing math, but education is not math; it's a social system. And algorithmic systems repeatedly fail at making social decisions. Algorithms can't monitor or detect hate speech, they can't replace social workers in public assistance programs, they can't predict crime, they can't determine which job applicants are more suited than others, they can't do effective facial recognition, and they can't grade essays or replace teachers...
The process worked like this: Data scientists took student information and fed it into a computer. The computer then constructed a model that produced individual student grades, which the International Baccalaureate claimed the students would have gotten had they taken the standardized tests that didn't happen. It's a legitimate data science method, similar to the methods that predict which Netflix series you'll want to watch next or which deodorant you're likely to order from Amazon.
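The International Baccalaureate has not published its actual model, so the sketch below is only an illustration of the general approach the article describes: fit a model on historical students whose exam scores are known, then use it to impute scores for students whose exams were cancelled. Every feature name, the synthetic data, and the choice of a plain linear regression here are assumptions for illustration, not the IB's method.

```python
# Illustrative sketch of grade imputation, NOT the IB's actual model.
# Train on historical students with known exam grades, then predict a
# grade for a student who never sat the exam.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Hypothetical historical data: coursework average, teacher-predicted
# grade, and the school's average result in prior years (all made up).
n = 500
X_hist = np.column_stack([
    rng.uniform(1, 7, n),   # coursework average (IB 1-7 scale)
    rng.integers(1, 8, n),  # teacher-predicted grade
    rng.uniform(3, 6, n),   # school's historical mean grade
])
# Synthetic "known" final exam grades for those historical students.
y_hist = np.clip(
    0.5 * X_hist[:, 0] + 0.3 * X_hist[:, 1] + 0.2 * X_hist[:, 2]
    + rng.normal(0, 0.5, n),
    1, 7,
).round()

model = LinearRegression().fit(X_hist, y_hist)

# "Grade" a current student who never took the exam: strong coursework
# and a top teacher prediction, but a historically low-scoring school.
student = np.array([[6.5, 7, 3.2]])
print(model.predict(student))
```

Note what the toy model does with that last student: because the school's historical average is one of its inputs, a strong individual at a school with weak past results gets pulled toward those past results. If the real model weighted school-level history in anything like this way, that would be one plausible route to an outcome like Ms. Castañeda's.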
Source: The New York Times