Amazon’s Alexa is miles ahead of rivals like Microsoft and Google with its “skills” — around 10,000 mini-apps that let you use your voice to control your lights or music, order an Uber, learn first aid and more. To help close that gap, Microsoft has finally launched the Cortana Skills Kit in a public preview, allowing developers to build new skills or convert existing ones from Alexa or bots built on Microsoft’s Bot Framework.
Developers will be able to use the kit to build skills and publish them to a new Cortana channel on its Bot Framework. Right now, the skills will work on Cortana for Windows 10, Android, iOS and the recently announced Cortana-enabled Harman Kardon Invoke speaker.
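As a rough illustration of what a skill’s reply looks like at the Bot Framework level, the sketch below builds a minimal message activity with both displayed and spoken text — `text`, `speak` and `inputHint` are real activity-schema fields, but the helper function and its wiring are our own simplification, not code from Microsoft’s kit:

```javascript
// Illustrative sketch: a minimal Bot Framework message activity that the
// Cortana channel could render. `text` is shown on screen, `speak` is read
// aloud (as SSML), and `inputHint` tells Cortana whether to keep listening.
// The helper name `buildCortanaReply` is hypothetical, not part of the SDK.
function buildCortanaReply(displayText, spokenText) {
  return {
    type: 'message',
    text: displayText,
    speak: `<speak version="1.0" xml:lang="en-US">${spokenText}</speak>`,
    inputHint: 'acceptingInput',
  };
}

// Example: a skill confirming a calendar entry.
const reply = buildCortanaReply(
  'Meeting added for 3 PM.',
  'OK, I added your meeting at 3 PM.'
);
console.log(reply.text);
```

In the real SDK, an object like this would be passed to the bot’s send method rather than logged; separating the on-screen `text` from the SSML `speak` string is what lets the same skill work on both a screenless speaker like the Invoke and a Windows 10 PC.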
A key piece of Microsoft’s Cortana strategy is Echo-like hardware, but Microsoft hasn’t revealed a device of its own (yet). Instead, the first Echo-like device comes from a third party: Harman Kardon’s Invoke. At the Build conference today, HP also said it would build Cortana-specific devices, and Intel revealed that it would develop reference designs for the AI assistant.
Microsoft has been testing the Cortana Skills Kit in a private beta with select developers to work out the kinks. So far, early partners include Expedia, Capital One and handyman outfit TalkLocal. However, that list should grow rapidly — today at its Build conference, it’s holding a training session showing developers how to design, build, test and publish skills using the Azure Bot Framework. Other sessions will follow over the next two days on building voice-centric experiences and business-oriented apps.
In a demo during Build, Microsoft’s Laura Jones showed that developers can create Cortana skills across various platforms, including PCs, Harman Kardon’s Invoke, car infotainment systems and mobile devices. For instance, you can tell Cortana to enter a business meeting appointment while at home, and once you’re on the road, it can inform you about an accident that will make you late and notify attendees. “Because Cortana is aware of the device I’m on, she can provide me with contextually-aware responses,” says Jones.