Summary

A review of concepts that were covered in this course.


What have we covered?

Let’s quickly summarize what we’ve learned in this course.

  • First, we discussed the FastAPI web framework for Python. We covered the implementation of different types of parameters in FastAPI, such as path parameters, query parameters, default and optional parameters, and the request body (a minimal sketch of these parameter types appears after this list).

  • Then, we discussed the concepts of cloud technology, Microsoft Azure, and Azure Cognitive Services. We started exploring the Azure Vision Cognitive service and learned the following:

    • We started our journey in building intelligent solutions using the Azure Computer Vision service. We built many small projects to perform OCR, identify objects in an image, identify landmarks and brand logos in an image, and more (see the image-analysis sketch after this list).

    • After that, we learned about building custom image classification and object detection models using the Azure Custom Vision service.

    • Finally, we covered the Azure Face service to perform face detection and facial recognition.

    • We also developed our first capstone project to detect whether a person is wearing a face mask.

  • After covering the Azure Vision Cognitive service, we started exploring the Azure Language Cognitive services. We covered the following topics:

    • We explored the Azure Language Understanding Intelligent Service (LUIS), which lets us build conversational experiences, and built apps using both the web portal and the Python SDK. We also developed our second capstone project on LUIS.
    • Then, we discussed the Azure QnA Maker service, which helps us quickly build question-answering systems. We also built the third capstone project, in which we created a question-answering chatbot and deployed it on a website.
    • After that, we started exploring the Azure Text Analytics service and the Azure Translator service. We built a handful of mini-projects, such as translating documents from one language to another, extracting key phrases from text, and understanding users' sentiments and opinions about a product or service (a sentiment-analysis sketch appears after this list).
  • Then, we started exploring the Azure Speech Cognitive Services and built projects that converted speech to text, synthesized text to speech in different voices, and even translated speech from one language to another (a text-to-speech sketch appears after this list).

  • After this, we began to explore the Azure Decision Cognitive Services. We covered the following topics:

    • First, we explored how to build anomaly detection systems, for both univariate and multivariate datasets, using the Azure Anomaly Detector service.

    • Then, we learned about the content moderation service, Azure Content Moderator. We discussed how the service can be used to moderate content in different formats and flag inappropriate material.

    • Finally, we covered the Azure Personalizer service, which lets us build personalized recommendation systems, and we got a quick introduction to reinforcement learning.

  • Then came the last topic of the course: the Azure Bing Search services. In this chapter, we built search engines to retrieve webpages, images, videos, and news from the internet and displayed the results in a format similar to Google's search results. We also covered the autosuggestion and spell-correction features of the service.
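
To make this recap a little more concrete, here are a few minimal code sketches of the tools we revisited above. First, a FastAPI sketch showing path, query, default, and optional parameters along with a request body. It is an illustrative example rather than the course's original code; the `Item` model and route names are placeholders.

```python
# Minimal FastAPI app illustrating the parameter types recapped above.
# The Item model and route names are illustrative placeholders.
from typing import Optional

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Item(BaseModel):
    # Request body: parsed and validated from the JSON payload by Pydantic.
    name: str
    price: float
    description: Optional[str] = None


@app.get("/items/{item_id}")
def read_item(
    item_id: int,              # path parameter, taken from the URL path
    limit: int = 10,           # query parameter with a default value
    q: Optional[str] = None,   # optional query parameter
):
    return {"item_id": item_id, "limit": limit, "q": q}


@app.post("/items/")
def create_item(item: Item):   # request body parameter
    return {"created": item.name, "price": item.price}
```

Saved as `main.py`, this app can be served locally with `uvicorn main:app --reload`.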
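
Next, an image-analysis sketch for the Azure Computer Vision service, assuming the `azure-cognitiveservices-vision-computervision` Python package. The endpoint, key, and image URL are placeholders that you would replace with your own values.

```python
# Minimal image-analysis sketch using azure-cognitiveservices-vision-computervision.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
from msrest.authentication import CognitiveServicesCredentials

# Placeholders: replace with your own Computer Vision resource endpoint and key.
endpoint = "https://<your-resource>.cognitiveservices.azure.com/"
key = "<your-key>"

client = ComputerVisionClient(endpoint, CognitiveServicesCredentials(key))

# Hypothetical image URL, used only for illustration.
image_url = "https://example.com/street-scene.jpg"

# Ask the service for detected objects and brand logos in the image.
analysis = client.analyze_image(
    image_url,
    visual_features=[VisualFeatureTypes.objects, VisualFeatureTypes.brands],
)

for obj in analysis.objects:
    print("Object:", obj.object_property, "confidence:", obj.confidence)
for brand in analysis.brands:
    print("Brand:", brand.name, "confidence:", brand.confidence)
```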
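
Here is a sentiment-analysis sketch for the Azure Text Analytics service, assuming the `azure-ai-textanalytics` package. Again, the endpoint, key, and sample document are placeholders.

```python
# Minimal sentiment-analysis sketch using the azure-ai-textanalytics package.
from azure.ai.textanalytics import TextAnalyticsClient
from azure.core.credentials import AzureKeyCredential

# Placeholders: replace with your own Text Analytics resource endpoint and key.
endpoint = "https://<your-resource>.cognitiveservices.azure.com/"
key = "<your-key>"

client = TextAnalyticsClient(endpoint=endpoint, credential=AzureKeyCredential(key))

documents = ["The delivery was fast, but the packaging was damaged."]
results = client.analyze_sentiment(documents=documents)

for doc in results:
    if not doc.is_error:
        # Overall document sentiment plus per-sentence breakdown.
        print("Overall sentiment:", doc.sentiment, doc.confidence_scores)
        for sentence in doc.sentences:
            print(" -", sentence.text, "->", sentence.sentiment)
```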
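
Finally, a text-to-speech sketch for the Azure Speech service, assuming the `azure-cognitiveservices-speech` package. The key, region, and voice name are placeholders; any voice supported by the service can be substituted.

```python
# Minimal text-to-speech sketch using the azure-cognitiveservices-speech package.
import azure.cognitiveservices.speech as speechsdk

# Placeholders: replace with your own Speech resource key and region.
speech_config = speechsdk.SpeechConfig(subscription="<your-key>", region="<your-region>")
speech_config.speech_synthesis_voice_name = "en-US-JennyNeural"  # example voice

# With no explicit audio config, the synthesized speech plays on the default speaker.
synthesizer = speechsdk.SpeechSynthesizer(speech_config=speech_config)
result = synthesizer.speak_text_async("Hello from Azure Speech!").get()

if result.reason == speechsdk.ResultReason.SynthesizingAudioCompleted:
    print("Speech synthesized successfully.")
```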

We hope you enjoyed the course and learned many new concepts. We would love to hear your feedback on this course; it will help us improve it and deliver richer content in the future. We appreciate your time!
