[Be My Eyes user Jesper Holten demonstrating Be My Eyes app’s new AI virtual volunteer, identifying a can of sardines]
Holten: “[inaudible] …to get it started.”
AI: “Virtual volunteer: This is a picture of a can of John West sardines. The label is green and yellow and there is a wrinkle on the top of the can.”
[Jesper Holten demonstrating the AI virtual volunteer on another canned food]
Holten: “[inaudible] …but maybe let’s say I want to know how much is in the packet. It could be interesting …[inaudible]”
AI: “Virtual volunteer: This is a picture of package of Sønderjysk Spegepølse, which is a type of Danish sausage.”
Holten (interview): “I use it on a fairly regular basis, maybe once or twice a week. It tends to be in daily situations, like when I’m cooking and want to have a look at tins and cans. Especially with canned food, it can be very hard to ascertain what’s in it. Of course, you can shake it, but it could be coconut milk or it could be canned tomatoes, and I wouldn’t be able to tell.”
[Jesper Holten demonstrating the AI virtual volunteer on a bottle of wine]
AI: “[inaudible] …which is a type of South African sparkling wine. The label is green and gold and there is a red emblem on the top.”
Holten (interview): “I see a potential for people to have more independence. For some people, it can even be difficult to ask other people for help, but if you ask a machine, it’s a machine, so it’s there to serve you.”
[Hans Jørgen Wiberg, founder of Be My Eyes, demonstrating the AI virtual volunteer on a box of pasta]
Jørgen Wiberg (interview): “I have really had a hard time sleeping since this, because I think there are so many possibilities for where this could go. And right now, it is only a picture that you can get described, but imagine if we can get this to work on a video stream, where you can kind of walk and get things described to you.”
[Jørgen Wiberg identifying a brand of pasta]
AI: “Virtual volunteer: Yes, the cooking instructions for Barilla Spaghetti.”
Jørgen Wiberg (interview): “They are blown away by what it can describe. I mean, we have seen image recognition before this, but that was like ‘chair, sofa, television’ or something like that. But now, it describes in detail what it sees, and there is the fact that you can ask further questions about the picture: Okay, what kind of TV is it? Or can you tell me what show is on the TV? And so on.”
Jesper Hvirring Henriksen (interview): “I think there are a lot of use cases where, you know, you may feel like you’re burdening someone with something that seems unimportant. There are lots of important use cases, and maybe I shouldn’t take up the time of a volunteer with what seems like a silly little thing. Or maybe it’s just Monday morning, it’s early, and you don’t feel like talking to a human being right now. In all those cases, you can just use AI now, and you’re talking to a computer.”
Holten (interview): “I want to have a level of confidence that I don’t necessarily have in unfamiliar spaces. If the AI technology can help me gain or regain that level of confidence, that would really be something.”
This script was provided by The Associated Press.