Meta's smart glasses can now tell you where you parked your car

Meta updates its Ray-Ban glasses with a car locator, reminders, and voice messages, but live translation is still missing.

Meta is enhancing its Ray-Ban smart glasses with new AI-powered features for users in the US and Canada. The update includes natural language recognition improvements, allowing more conversational interaction with the AI assistant. Users can now set timers and reminders, send voice messages, and ask the glasses where they parked their car. However, the anticipated live translation feature is not yet available.

Meta is rolling out updates for its AI-powered Ray-Ban smart glasses in the US and Canada. The new features include the ability to locate a parked car using the glasses, as well as setting reminders and sending voice messages.

The update lets users engage the AI assistant more conversationally, eliminating the need for stilted commands. Users can also use the glasses to call phone numbers or scan QR codes, further expanding their functionality.

Despite these additions, the live translation capability is notably absent from this update. CTO Andrew Bosworth has not provided a timeline for the feature, leaving users waiting for its release.