A Case Study of Rapid AI Service Deployment - Iris Classification System

인공지능연구 / Korean Journal of Artificial Intelligence, eISSN 2508-7894
2023, v.11 no.4, pp.29-34
https://doi.org/10.24225/kjai.2023.11.4.29
Yonghee LEE (Department of AI Software, Shingu College)
Abstract

The path from developing a machine learning model to deploying it in a production environment presents challenges. Efficient and reliable deployment is critical for realizing the true value of machine learning models, and bridging this gap between development and production has become a pivotal concern in the machine learning community. FastAPI, a modern and fast web framework for building APIs with Python, has gained substantial popularity for its speed, ease of use, and asynchronous capabilities. This paper focused on leveraging FastAPI for deploying machine learning models, addressing the opportunities it offers for integration, scalability, and performance in a production setting. In this work, we explored the seamless integration of machine learning models into FastAPI applications, enabling real-time predictions and showing how the approach can scale to a more diverse range of use cases. We discussed the intricacies of integrating popular machine learning frameworks with FastAPI, ensuring smooth interactions between data processing, model inference, and API responses. The study examined FastAPI's capabilities, features, and best practices for moving models into production environments, and delved into its potential as a robust and efficient solution for deploying machine learning systems, handling real-time predictions, managing input/output data, and ensuring optimal performance and reliability.
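A minimal sketch of the serving pattern the abstract describes is shown below. The endpoint path, request field names, and the choice to train a small RandomForest classifier at startup are illustrative assumptions for the Iris case, not the paper's exact implementation, which may load a persisted model instead.

# Minimal sketch: serving an Iris classifier with FastAPI.
# Endpoint name, field names, and in-process training are illustrative assumptions.
from fastapi import FastAPI
from pydantic import BaseModel
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

app = FastAPI(title="Iris Classification Service")

# Train a small model at import time; a production deployment would typically
# load a pre-trained, persisted model here instead.
iris = load_iris()
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(iris.data, iris.target)

class IrisFeatures(BaseModel):
    # Input schema validated by FastAPI/Pydantic before inference.
    sepal_length: float
    sepal_width: float
    petal_length: float
    petal_width: float

@app.post("/predict")
def predict(features: IrisFeatures):
    # Feature order must match the training order:
    # sepal length, sepal width, petal length, petal width.
    row = [[features.sepal_length, features.sepal_width,
            features.petal_length, features.petal_width]]
    pred = model.predict(row)[0]
    return {"class_index": int(pred), "class_name": str(iris.target_names[pred])}

Run with, for example, uvicorn main:app and POST JSON such as {"sepal_length": 5.1, "sepal_width": 3.5, "petal_length": 1.4, "petal_width": 0.2} to /predict; FastAPI handles request validation, model inference, and the JSON response in one request cycle.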

Keywords
Machine Learning, Classification, Web Service, FastAPI
