Announcing Tasted!

Regan Burns Cafiso, our Head of Content Strategy, demonstrating Tasted

I wanted to give a bit of an update on a product I’ve been working on for the last year. Today, our company Pylon ai is launching a new conversational media brand called Tasted. Tasted wants to be your cooking companion, helping you discover great recipes and walking you step by step through preparation. You can try it today by asking your Alexa to “Enable Tasted,” or by saying “Open Tasted” on Google Home.

Tasted is a multimodal (that means you can use either your voice or your hands to interact with it), cross-platform (that means it will know who you are whether you use it on Alexa, Google Home, Cortana, Slack, Facebook Messenger, etc.) cooking companion service.

Pylon ai (just “Pylon” for short) was started last year with my friend, mentor and co-founder Shelby Bonnie. We started Pylon because we are excited about conversational ai products and want to be part of the new wave of companies defining this type of media. Having lived through major platform shifts like the web and mobile before it, we are thrilled to be in this early wave of conversational media companies. Conversational media happens when you can ask personalized questions of a media brand and it responds and remembers what you ask. We fundamentally believe that, over time, conversational media will be the primary way consumers receive information, acquire goods and services, and accomplish tasks.

Pylon has built a publishing platform that enables human experts (a.k.a. editors) to scale their knowledge to customers individually on voice-enabled platforms like Amazon’s Alexa, Google Home and Microsoft Cortana, as well as chat platforms like Facebook Messenger and Slack. Instead of broadcasting, our experts are able to program to each consumer at the individual level. The ability to have a conversation with a computer trained by a category expert may be the most useful communication tool ever conceived.

Think about it. Why should everyone get the same recipe recommendations this week? Why should everyone get the same top 5 camera recommendations? In the conversational media future, you won’t. You’ll receive recommendations based on your requirements and your specialized needs. For example, my family has gluten, nut, shellfish and lactose allergies (yes, we’re that family!). Now add in time and ingredient requirements like “pork for four people that will take 20 minutes” and you have a request even Google struggles to fulfill. Vertically focused, AI-powered brands like Tasted will be able to meet that request because they are domain specific and can personalize the response based on conversations with you, not just your clickstream data.
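
To make that concrete, here’s a rough sketch of the idea in Python. The catalog, field names and allergen tags are all invented for illustration (this is not Tasted’s actual code); the point is that a request like mine becomes a set of hard constraints rather than a pile of keywords.

```python
from dataclasses import dataclass, field

# Hypothetical, simplified recipe record. The real Tasted catalog and schema
# aren't public, so these fields are assumptions made for illustration only.
@dataclass
class Recipe:
    title: str
    protein: str
    minutes: int
    serves: int
    allergens: set = field(default_factory=set)

@dataclass
class Request:
    protein: str
    serves: int
    max_minutes: int
    avoid_allergens: set

def matches(recipe: Recipe, req: Request) -> bool:
    """A recipe qualifies only if it satisfies every stated constraint."""
    return (
        recipe.protein == req.protein
        and recipe.serves >= req.serves
        and recipe.minutes <= req.max_minutes
        and not (recipe.allergens & req.avoid_allergens)
    )

catalog = [
    Recipe("20-Minute Pork Stir-Fry", "pork", 20, 4, {"soy", "gluten"}),
    Recipe("Skillet Pork Chops", "pork", 18, 4, set()),
    Recipe("Slow-Roasted Pork Shoulder", "pork", 240, 8, set()),
]

# "Pork for four people that will take 20 minutes," for an allergy-prone household.
request = Request(protein="pork", serves=4, max_minutes=20,
                  avoid_allergens={"gluten", "nuts", "shellfish", "lactose"})

print([r.title for r in catalog if matches(r, request)])
# -> ['Skillet Pork Chops']
```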

We can make these recipe recommendations because we have a recipe editor, Regan Burns Cafiso, who has worked at the Food Network and Martha Stewart and now curates and trains our machine learning algorithms on how to think about recipe suggestions and categorization. When we make a recommendation, it starts with the logic Regan put into our system. Our development team then works on tools that take Regan’s recommendations and scale and customize them for our users.

Speaking of our development team, we are so fortunate to have hired a fantastic one! Shelby and I initially started working with old friends from our CNET Networks days, which includes Regan, but also Cliff Lyon and Stephen Maggs, who had been working at places like StubHub and other startups after CNET. Soon after, we were able to hire a ridiculously talented team from OpenTable who happened to be based in my hometown of Chattanooga, TN. More on that team and working in Chattanooga in a separate post.

One of the things you might enjoy with Tasted is the ability to cook hands-free. Ever been in the kitchen trying to cook a recipe off your phone, or heaven forbid your laptop? It’s kludgy at best. Cooking recipes off the internet is one of the few times I still use my printer, because it’s easier to read a sheet of paper than to squint at my phone or a laptop.

Not anymore. With Tasted, you can use your iPhone or Android device, or any web browser, as a visual companion that moves based on what you tell your voice device. Telling Google Home to “Show me ingredients” or “next step” will advance the recipe instructions on the screen and keep your greasy fingers off it. Try it. It feels like magic!
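
Under the hood, the idea is roughly this: the voice device sends small intents, and shared session state decides what the companion screen renders next. The sketch below is a simplified illustration in plain Python; the intent names and recipe shape are assumptions, and the real service runs on the Alexa and Google Assistant platforms rather than a toy class.

```python
# A minimal sketch of the hands-free walkthrough: voice intents update a
# shared cooking session, and the companion screen renders whatever the
# session says. Intent names and recipe shape are invented for illustration.

RECIPE = {
    "title": "Skillet Pork Chops",
    "ingredients": ["4 pork chops", "2 tbsp olive oil", "salt", "pepper"],
    "steps": [
        "Pat the chops dry and season both sides.",
        "Sear 4-5 minutes per side in a hot skillet.",
        "Rest 5 minutes before serving.",
    ],
}

class CookingSession:
    def __init__(self, recipe):
        self.recipe = recipe
        self.step = 0  # index of the step currently shown on screen

    def handle(self, intent: str) -> dict:
        """Map a voice intent to what the companion screen should render."""
        if intent == "ShowIngredients":
            return {"view": "ingredients", "items": self.recipe["ingredients"]}
        if intent == "NextStep":
            self.step = min(self.step + 1, len(self.recipe["steps"]) - 1)
        elif intent == "PreviousStep":
            self.step = max(self.step - 1, 0)
        return {"view": "step", "number": self.step + 1,
                "text": self.recipe["steps"][self.step]}

session = CookingSession(RECIPE)
print(session.handle("ShowIngredients"))  # phone/browser renders the list
print(session.handle("NextStep"))         # screen advances to the next step
```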

Special thanks to all of the friends and folks who have supported the development of Pylon and Tasted. Dick and Danny from Index for leading our round. Old friends like Kevin Bandy and Neil Ashe who participated in our angel round with a whole bunch of other folks. There’s no way we could have put our heads down for almost a year and built a multi-platform, multimodal ai service without their belief, advice and support.

If you have an Alexa or Google Home device, please try it and tell me what you think!

