Microsoft Virtual Academy is a website for Developers and IT Pros offering free Microsoft training delivered by experts. Follow us on Twitter @MSVirtAcademy
With Microsoft Cognitive Services, your app can not only recognize someone but also tell you how old that person is, where they are, what they meant by what they said, and maybe even how they feel.
Developers, get ready to be inspired by all the cool services available in Microsoft Cognitive Services—an easy-to-use suite of APIs, SDKs, and services that enable you to integrate machine learning and artificial intelligence into your apps. Why reinvent the AI wheel when you can leverage the work that’s already been done? Cognitive Services are the product of years of ongoing research by data scientists and AI experts at Microsoft. From generating captions for photos and recognizing faces in videos, to performing sentiment analysis and translating speech from one language to another in real time, Cognitive Services can help you build smarter, richer, and more sophisticated apps.
MCT Scott J. Peterson, who loves writing software and has a gift for sharing developer insights, takes us on a tour of Microsoft Cognitive Services with a focus on how to employ them in your apps. Build a COGS Explorer app, step by step, starting with Computer Vision and Face APIs, adding Emotion and Text Analytics APIs, working with LUIS and QnA Maker, and incorporating Search. The courses only take about an hour each, and they include lots of demos. Follow along with Visual Studio 2017 and an Azure trial account. And watch this space for additional courses in the Cognitive Services series, including an exploration of real-time speech translation and a look at Custom Vision.
Part 1: Computer Vision API. Get the details on how this API can recognize objects in photos, caption photos, extract text from images, and more. Filter out images with adult content, and create apps that allow photos to be searched using computer-generated keywords.
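To give a feel for what the course covers, here is a minimal sketch of how a Computer Vision "analyze" call is shaped, using only the Python standard library. The westus region, the v1.0 path, and the YOUR_KEY placeholder are assumptions for illustration; substitute the endpoint and key from your own Azure subscription.

```python
import json
import urllib.request

def build_analyze_request(image_url, key, features=("Description", "Adult")):
    """Assemble the POST request for /vision/v1.0/analyze without sending it."""
    url = ("https://westus.api.cognitive.microsoft.com/vision/v1.0/analyze"
           "?visualFeatures=" + ",".join(features))
    return urllib.request.Request(
        url,
        data=json.dumps({"url": image_url}).encode("utf-8"),
        headers={"Ocp-Apim-Subscription-Key": key,
                 "Content-Type": "application/json"},
        method="POST")

req = build_analyze_request("https://example.com/photo.jpg", "YOUR_KEY")
# With a real key, send it and read the caption the service generated:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["description"]["captions"][0]["text"])
```

Requesting the Adult visual feature is what lets an app filter out images with adult content, and the Description feature supplies the computer-generated keywords that make photos searchable.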
Part 2: Face API. Detect faces in images and identify attributes of those faces, such as age and gender. See how to use the Face API to build intelligent apps that treat faces as just another type of data.
Part 3: Emotion API and Text Analytics API. Use these APIs to detect emotion in faces in photos and videos, analyze written communications, such as tweets and e-mails, for sentiment, and extract topics and key phrases from text documents.
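Sentiment analysis in Text Analytics works on batches: you send a documents array and get back a score per document between 0 (negative) and 1 (positive). A sketch of shaping that request body; the endpoint path mentioned in the comment and the key are assumptions for illustration.

```python
import json

def build_sentiment_body(texts, language="en"):
    """Shape a batch of strings into the documents payload the API expects."""
    return {"documents": [
        {"id": str(i), "language": language, "text": text}
        for i, text in enumerate(texts, start=1)
    ]}

body = build_sentiment_body(["I love this phone!", "The battery life is awful."])
payload = json.dumps(body)
# POST payload to .../text/analytics/v2.0/sentiment with your
# Ocp-Apim-Subscription-Key header to get back one score per document id.
```

The same batched-documents shape works for the key-phrase extraction operation the course also demonstrates.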
Part 4: LUIS and QnA Maker. See how LUIS (Language Understanding Intelligent Service) enables developers to build apps that respond to natural-language commands and how, combined with QnA Maker, it can be used to build bots and more.
Part 5: Search API. Find out how to use the Bing Search API to incorporate rich search functionality into your apps. Explore web, image, news, and video search, and enhance your search with Autosuggest.
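A Bing search call is simpler still: a GET with the query in the q parameter and the key in the Ocp-Apim-Subscription-Key header. The endpoint below and YOUR_KEY are assumptions for illustration; the same pattern reaches the image, news, and video verticals by swapping the path.

```python
import urllib.parse
import urllib.request

def build_search_request(query, key,
                         endpoint="https://api.cognitive.microsoft.com/bing/v7.0/search"):
    """Assemble a Bing Web Search GET request without sending it."""
    params = urllib.parse.urlencode({"q": query, "count": 10})
    return urllib.request.Request(
        endpoint + "?" + params,
        headers={"Ocp-Apim-Subscription-Key": key})

req = build_search_request("cognitive services", "YOUR_KEY")
# With a real key:
# with urllib.request.urlopen(req) as resp:
#     print(resp.read())
```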
Get inspired today!