Young people are increasingly at ease consuming culture delivered via digital avatars or made with artificial intelligence. Should the same moral guidelines and laws apply to those works?
The story of FN Meka — a fictitious character billed as the first musical artist partly powered by artificial intelligence to be signed by a major record label — might seem like a bizarre one-off. In August, Capitol Records dropped FN Meka, whose look, outlaw persona and suggestive lyrics were inspired by real-life music stars like Travis Scott, 6ix9ine and Lil Pump, amid criticism that the project trafficked in stereotypes.
But to seasoned observers of technology in pop music and the debate over cultural appropriation, the rise and fall of this so-called robot rapper, whose songs were actually written and voiced by humans, has raised important questions that are not going away anytime soon.
Last month alone, an A.I. artwork won a prize in Colorado and a computer program improvised a classical music solo in real time in New York City. From DALL-E 2, the technology that creates visual art on command, to Hatsune Miku, a Japanese software that does something similar for music, the arts world may be on the precipice of a sea change in how its products are created.
And young people feel increasingly at ease consuming culture via digital avatars like FN Meka. It has already been happening in hip-hop: A hologram of the rapper Tupac Shakur, who died in 1996, performed at a music festival in 2012; Travis Scott gave a concert through his avatar in the video game Fortnite in 2020; and Snoop Dogg and Eminem rapped as their digital selves and their Bored Ape avatars in a metaverse performance at the MTV Video Music Awards last month.
In this brave new world, do fake characters based on real people amount to unseemly borrowing, even theft, or just the kind of homage that has always defined pop music? Even when artificial intelligence does help write music, should the humans behind it be accountable for the machine-created lyrics? And as far as race is concerned, how do the rules of cultural appropriation work when the person doing the appropriating is not a human being with a unique cultural background but a fictitious identity backed by an anonymous, multiracial collective?
“A lot of our moral intuitions and codes as humans may have evolved for a context where we have discrete human actors,” said Ziv Epstein, a Ph.D. student at the M.I.T. Media Lab who studies the intersection of humans and technology. “These emerging technologies require new legal frameworks and research to understand how we reason about them.”
For FN Meka’s critics, the presence of more Black people or people of color in the rooms where the character was conceived, designed and promoted may have helped prevent the negative stereotypes that they say it furthered. Industry Blackout, a nonprofit advocacy group, said FN Meka “insulted” Black culture and leeched off the sounds, looks and life experiences of real Black artists. Capitol seemed to agree when it apologized for its “insensitivity” in a statement.
To the critics, FN Meka’s (exaggerated) debt to A.I. and its exclusively digital existence had the effect of absolving the people who were really calling the shots. “There are humans behind technology,” said Sinead Bovell, a futurist and the founder of WAYE, an organization that educates young people about technology. “When we disconnect the two, that’s where we could potentially risk harm for different marginalized groups.
“What concerns me about the world of avatars,” she added, “is we have a situation where people can create and profit off the ethnic group an avatar represents without being a part of that ethnic group.”
In pop music generally and especially in hip-hop, the culture most likely to be exploited is Black culture, said Imani Mosley, a professor of musicology at the University of Florida.
“There’s so much overlap between digital culture and Gen Z culture and Black culture, to the point where a lot of people don’t necessarily recognize that a lot of things Gen Z says are pulled from African American vernacular,” she said. “To interact with that culture, to be a part of that discourse, is to use certain digital and cultural markers, and if you don’t have access to that discourse because you’re not Black, one way to do that is to hide one’s own ethnicity behind the curtain of the internet.”
For some, though, vilifying FN Meka’s creators raised the specter of artistic censorship.
James O. Young, a professor of philosophy at the University of Victoria who studies cultural appropriation in art, acknowledged there is a long tradition in music of placing a premium on the artist's lived experience. Young quoted the famous line attributed to the jazz legend Charlie Parker: "If you don't live it, it won't come out of your horn."
But recently the consensus has moved toward endorsing only art that arises out of lived experience, to the detriment of both art and political solidarity, Young argued. He pointed to an episode five years ago in which a white artist was pilloried for painting the Black civil rights martyr Emmett Till's corpse.
“One of the claims is, ‘This is digital blackface,’” Young said of FN Meka. “Maybe it is.” But he advocated for balanced examination, rather than swift reaction. “You’ve got to be very careful: I don’t think you want to claim that all representations of Black people are somehow morally offensive.”
What both sides of this debate expose is a shared impoverishment: a lack of language and concepts for discussing art that is not, or not entirely, made by people.
Epstein, of the M.I.T. Media Lab, cited the thinking of Aaron Hertzmann, a scientist at Adobe Research. In a paper called "Can Computers Create Art?," Hertzmann argued that at the moment art can be made only by humans, who are the only ones capable of interacting socially with other humans. In this understanding, machine learning is a tool; the artist behind a drawing made by DALL-E or the similar program Midjourney is not the software, but the person who gave it instructions.
However, Hertzmann allowed, “Someday, better A.I. could come to be viewed as true social agents.”
Meanwhile, as culture is increasingly mediated through the digital realm, questions of how to account for all of the other people who directly or indirectly touched that art will multiply, undermining the conventional notion of the artist as expressing her indivisible perspective.
Some art is now the result of “a complex and diffuse system of many human actors and computational processes interacting,” Epstein said. “If you generate a DALL-E 2 image, is that your artwork?” he added. “Can you be the social agent of that? Or are they scaffolded by other humans?”
A final question is deceptively profound: Does it even matter who, or what, composes the song, paints the painting, writes the book? Metaverse avatars and A.I. programs are intrinsically derivative: They are all but guaranteed to be riffs on already existing artists and their works.
Anthony Martini, a co-founder of Factory New, the virtual music company that created FN Meka, stands firmly on one side of that debate: “If you’re mad about the lyrical content because it supposedly was A.I.,” he said, “why not be mad about the lyrical content in general?”