SAN FRANCISCO — Wearing Amazon’s latest smart delivery glasses felt surprisingly natural from the start. Despite the built-in technology and a slightly bulkier frame, they were comfortable right away and only a bit heavier than my usual glasses.
A few lines of monochrome green text and a square target appeared in the right-hand lens, reminding me that these were not my ordinary frames.
Occupying only a portion of my field of view, the text showed an address and a sorting code: “YLO 339.” I soon learned that “YLO” referred to the yellow tote bag where the package would typically be found, and “339” was a unique code on the package label.
My objective was to locate the package with that specific code, or, more precisely, to let the glasses find it for me.
Once I directed my gaze at the correct package label, the glasses promptly recognized the code and scanned the label automatically. A checkmark appeared next to that item on a list of packages shown in the display.
Next, an audio alert emanated from the glasses: “Dog on property.”
Once all the packages were scanned, the small green display switched to wayfinding mode. A basic map appeared, showing my location as a dot and marking the delivery destination with pins. In this simulation, there were two pins, indicating two stops at that location.
After navigating to the doorstep, the final step was proof of delivery. Rather than reaching for a phone, I focused on the package on the doorstep and pressed a button on the small controller unit — the “compute puck” — on my harness. The glasses snapped a photo, completing my simulated delivery without the need for a handheld device.
During my brief time with the glasses, my primary concern was the potential for distraction — focusing on the text in front of my eyes rather than on the environment around me. I now understand why the display automatically deactivates when the van is in motion.
But when I raised this concern with the Amazon representatives guiding me through the demo, they pointed out that the alternative is looking down at a handheld device. With the glasses, your line of sight stays up and mostly unobstructed, making it easier to spot potential hazards.
Beyond the fact that they are not intended for public release, that simplicity is what distinguishes Amazon’s utilitarian design from augmented reality devices such as Meta’s Ray-Ban glasses, Apple’s Vision Pro, and Magic Leap’s headsets, which aim to enhance or overlay the user’s surroundings more extensively.