When privacy expert Bill Fitzgerald tweeted about a conversation in which an Amazon representative said that the company’s voice assistant devices are not intended for classroom use, it got over 40 retweets and more than 100 likes.
“The person was crystal clear,” Fitzgerald expanded in another tweet, “that in the classroom Alexa and Dot posed compliance and privacy issues.”
Those sentiments may have come as a surprise to some in the education community, especially since Fitzgerald’s tweets came during ISTE 2018, a conference where vendors push technology into classrooms and where several sessions explored classroom uses of voice assistants from Amazon and Google. The exchange also raised questions about whether such devices have a place in schools, given privacy and other concerns.
A spokesperson for Amazon tells EdSurge that the company “doesn’t have anything to add to this story.” For its part, Google tells EdSurge that the company is currently focused on making the Google Home “experience exceptional in the home.”
‘Googler of the Day’
But just because company reps are keeping quiet doesn’t mean that educators are. Rayna Freedman is a 5th-grade teacher at Jordan/Jackson Elementary School in Mansfield, Mass. This past school year, she started piloting a Google Home in her classroom. She decided to do so because she heard students using the voice assistant on their Chromebooks, and thought the Google Home would be a faster alternative.
“These kids are going to be working in environments where they’re going to be using assistive devices,” says Freedman, who is the president-elect of MassCUE, a nonprofit. “And if we don’t teach them now how to talk to them, what’s going to happen?”
Freedman says that while her students didn’t use the device as much as she thought they would, one of the biggest surprises was how much it helped her English language learners. When students don’t know a word, she often suggests they ask her or consult a dictionary. That wasn’t happening. But the kids did ask the Google Home.
The device also came in handy during science lessons. Freedman explains that her class’s science books are so outdated that they cover only three states of matter, when there are actually five well-established ones, with more still being discovered. Admittedly, it’s a confusing topic, so her students used the Google Home to ask deeper questions about matter that Freedman didn’t have answers to right away.
“Those are things that I could easily Google and throw up on a smartboard,” Freedman says. “But to have somebody go over to the Google Home, it’s like they’re learning for themselves. So now they’re really understanding this self-directed learning.”
Freedman’s school has an existing responsible use policy, which asks students (and staff) to follow certain guidelines when using devices. Before introducing her voice assistant, she blogged about it and made both her principal and parents aware. She invited parents who had concerns to come in and speak with her directly; among those she met with, privacy was the biggest worry. But Freedman wanted her students to have a say in the matter too.
Working in small groups, her students took the school’s responsible use policy and created their own version of it for the Google Home. Part of that policy? Unplugging the device when it’s not in use. (Her class this upcoming school year will create their own policy as well.)
The kids also decided to have a “Googler of the day” classroom job. Only that student was permitted to ask the device questions; classmates with questions had to route them through the “Googler of the day.” The same student was also responsible for unplugging the device once it was no longer in use. Any student who forgot lost the role the next time their turn came around.
Freedman adds that she developed a presentation for kids to understand that the device’s use in the classroom would be different from home use. The Google Home was connected to a “completely clean” Google account, and would be used to help them learn, not for music or shutting the lights off.
“For the teacher who just puts the Google Home on the table and says ‘let’s go,’ I disagree with that,” Freedman says, explaining that policies, procedures and purpose are important.
Voice Assistants in Higher Ed
If Fitzgerald’s tweet suggested that Amazon does not view its voice assistant devices as appropriate for the K-12 space, the company’s efforts at the university level tell a different tale. Last year, Amazon gave Arizona State University 1,600 Echo Dots for engineering students living in a new dorm. Amazon also runs the Alexa Fund Fellowship, in which four universities currently participate (Amazon gives them money, Alexa-enabled devices and mentorship to create a graduate or undergraduate class curriculum), as well as the Alexa Prize, a university-level contest focused on conversational artificial intelligence.
Most recently, Northeastern University announced its plans to give some students the option to connect an Echo Dot to their university accounts, starting this fall.
Shelly Sanchez, who teaches English at a community college in San Antonio, encourages students to use voice assistants such as Siri and Google on their own devices in the classroom to help them with questions that come up during peer editing and reading. She says that in addition to being engaging, the voice assistants offer a great way for language learners to develop their speaking and listening skills—something that traditional search lacks.
Sanchez cites privacy concerns around sharing information with third-parties as the reason she doesn’t use physical voice assistant devices in her classroom.
Jason Hong, an associate professor at Carnegie Mellon University’s Human-Computer Interaction Institute, has done research on Alexa in the home. He explains that Amazon’s devices don’t really record until a person says the keyword. He thinks Google’s devices probably work in a similar way.
Hong adds that the devices are always listening for the keyword that prompts them to execute a command. He says that data is probably stored at least partially on the devices themselves, just enough to determine if you used the keyword.
Once the devices think you’re done with a query, they'll “ship off that command, or that voice, over to the [company’s] servers to process it and then execute whatever demand there is.”
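Hong’s description maps onto a common architecture for wake-word devices: a small amount of audio is buffered locally, a lightweight on-device check watches only for the keyword, and the query is shipped to the vendor’s servers only after that keyword fires. As a rough illustration of that flow, here is a minimal Python sketch that simulates it with words standing in for audio; the buffer size, the send_to_cloud stand-in and the overall behavior are assumptions for illustration, not Amazon’s or Google’s actual implementation.

```python
from collections import deque

WAKE_WORD = "alexa"
BUFFER_WORDS = 4  # small rolling window that stays on the device


def send_to_cloud(query):
    # Stand-in for the vendor's servers: in a real device, transcription,
    # answering and logging would all happen remotely once audio leaves the device.
    return f"[cloud] processed query: {query!r}"


def run_assistant(word_stream):
    rolling_buffer = deque(maxlen=BUFFER_WORDS)  # old words simply fall off
    words = iter(word_stream)
    for word in words:
        rolling_buffer.append(word.lower())
        # On-device check: nothing has left the device at this point.
        if WAKE_WORD in rolling_buffer:
            rolling_buffer.clear()
            # Only after the keyword is the rest of the utterance "shipped off."
            query = " ".join(words)
            print(send_to_cloud(query))
            return


# In this toy model, the chatter before the keyword never reaches the cloud.
run_assistant("the class is loud alexa what are the states of matter".split())
```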
Hong points out that whenever data is being stored, there will always be questions surrounding how that data can be used. If voice assistant devices are used in the classroom, there’s a risk that people outside that classroom, such as parents, principals or superintendents, will want to see the dialogue history.
Potential emerging risks are also a factor, he believes. For instance, future iterations of voice assistant devices might use sensor data to identify the people around them, or might be able to determine who is talking to whom inside a class.
“These kinds of things are probably possible in the near future, probably within ten years,” he says, noting that he and his colleagues are trying to use sensor data to build out similar capabilities as part of their research.
Privacy aside, Hong doesn’t think voice assistant devices are really ready for educational environments. They are intended for home use, he says, and teachers should consider the potential for misfires (Alexa could be accidentally activated) and disruptions (a kindergartner who keeps yelling out for Alexa to turn off the lights). Voice assistants could be useful in specific instances in a college setting, such as a lab where students need hands-free interaction, but even then, there are risks.
“These things are not geared for schools and for lots of people at the same time,” Hong says.
However, Hong contends that in the future, voice assistant devices could also help accomplish “cool things” in schools, such as offering kids interactive stories.
“I think it could be really exciting but also rather thorny.”