Lunchbot

Eliminating Lunchtime Hangriness

What We’re Trying to Achieve

To eliminate the indecision over what’s for lunch, we created Lunchbot, an Alexa skill that holds a conversation with the user, sorts through the options at local lunch spots, and proposes a place to grab a bite.

How Did We Do This

We developed an Alexa skill that searches Yelp for nearby restaurants. When prompted, Lunchbot extracts the user’s request and sends it to Yelp’s search API. Lunchbot understands these requests through natural language processing and a conversational UI, the same approach used by chatbots.
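
To make that flow concrete, here is a minimal Python sketch (not the production skill) of the extraction step: it pulls cuisine and location values out of an incoming Alexa IntentRequest so they can be forwarded to the search API. The slot names cuisine and location are illustrative assumptions, not the skill’s actual schema.

# Minimal sketch of the extraction step: pull the cuisine and location
# values out of an incoming Alexa IntentRequest event.
def extract_lunch_request(event):
    """Return (cuisine, location) from an Alexa IntentRequest event."""
    slots = event["request"]["intent"].get("slots", {})

    def slot_value(name, default):
        # A slot can be present but empty if the user never said it.
        return slots.get(name, {}).get("value") or default

    # "cuisine" and "location" are placeholder slot names for illustration.
    return slot_value("cuisine", "lunch"), slot_value("location", "nearby")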

Technology Overview

Amazon Alexa

Amazon’s cloud-based voice service was critical in building a customized voice experience, giving users an intuitive, lower-effort way to decide where to get lunch.

Yelp API

The Yelp API let us find the best match for each search query and locate spots that are open nearby.
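
A hedged sketch of what that lookup could look like against the Yelp Fusion business search endpoint, assuming an API key stored in a YELP_API_KEY environment variable. The helper name is ours, but term, location, categories, open_now, sort_by, and limit are standard Fusion search parameters.

import os
import requests

YELP_SEARCH_URL = "https://api.yelp.com/v3/businesses/search"

def search_lunch_spots(term, location, limit=3):
    """Return up to `limit` open restaurants best matching the search term."""
    headers = {"Authorization": "Bearer " + os.environ["YELP_API_KEY"]}
    params = {
        "term": term,              # e.g. "tacos"
        "location": location,      # free-text location, e.g. "downtown Chicago"
        "categories": "restaurants",
        "open_now": "true",        # only places that are currently open
        "sort_by": "best_match",
        "limit": limit,
    }
    response = requests.get(YELP_SEARCH_URL, headers=headers, params=params, timeout=5)
    response.raise_for_status()
    return response.json().get("businesses", [])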

AWS Lambda

Amazon’s serverless computing platform lets us map spoken language to intents and connect the dots between the lunch request and Yelp.
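
As an illustration of that glue, the sketch below shows a bare Lambda entry point that routes an Alexa IntentRequest to the helpers sketched above and wraps the recommendation in the Alexa response envelope. The intent name FindLunchIntent is an assumption, not the skill’s actual interaction model.

def build_speech_response(text, end_session=True):
    """Wrap plain text in the Alexa Skills Kit response envelope."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": end_session,
        },
    }

def lambda_handler(event, context):
    request = event["request"]

    # Opening the skill without a specific ask starts the conversation.
    if request["type"] == "LaunchRequest":
        return build_speech_response("What are you in the mood for?", end_session=False)

    # Map the lunch intent to a Yelp search and speak the top result.
    if request["type"] == "IntentRequest" and request["intent"]["name"] == "FindLunchIntent":
        cuisine, location = extract_lunch_request(event)   # helper sketched earlier
        spots = search_lunch_spots(cuisine, location)      # helper sketched earlier
        if spots:
            return build_speech_response("How about " + spots[0]["name"] + "?")
        return build_speech_response("I couldn't find anything open nearby.")

    return build_speech_response("Sorry, I didn't catch that.")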

Business Application

Voice interfaces are becoming an expectation for established companies: they provide natural, human experiences and help solve users’ complex problems. A multi-client application like this one makes life easier by shifting the burden of sorting through endless information from the user to the system. While we used lunch as our example, the same technology applies to broader solutions and other interfaces. In a hospital, for instance, visitors could use the interface to find their way around the building, and staff could use it to locate a nurse or doctor. In the end, this application goes far beyond getting lunch: it effortlessly presents the information a user wants and needs in a timely fashion.

