18 November 2019
All major digital assistants, including Google Assistant, Apple's Siri, Microsoft's Cortana, and Amazon's Alexa, have come a long way since their early versions. They are now much smarter and faster, but their makers continue to refine them. A major focus of these efforts is contextual conversation, with the goal of making the assistants feel more human and less robotic.
Along these lines, Microsoft took the stage at its Build 2019 conference to show what it has been doing to make Cortana feel more human. It played a roughly three-minute video of Cortana holding contextual conversations with a user who interacts with it in a natural way. The point is to demonstrate that a person can talk to Cortana much as two people talk to each other.
The video shows a woman walking around her office while talking to Cortana. She asks for her daily schedule, tweaks it, queries Cortana about a restaurant booking, and more. You can watch the whole video below to get a better idea.
Microsoft acquired Semantic Machines last year and has used its technology to improve Cortana. The effort is aimed primarily at enterprise customers, and Microsoft will make the new conversational engine available to developers through its Bot Framework.