csapi project (9) - MinIO integration

kyeongjun-dev 2024. 3. 3. 00:57
As written in https://not-to-be-reset.tistory.com/343, the model download link has expired and the code no longer works, so I am posting this again.
I plan to scale out from a single instance all the way to a Kubernetes cluster.

Development environment

As of 2024. 3. 2, working in WSL on Windows

GitHub repository

https://github.com/kyeongjun-dev/csapi

Branch: https://github.com/kyeongjun-dev/csapi/tree/minio

The code in this post may differ from the code on GitHub, so if you run into errors, be sure to check the GitHub repository.


Adding MinIO

Tying the django container and the worker container to the same volume does not really separate the services, so I add MinIO, which is compatible with AWS S3. (Available on the https://github.com/kyeongjun-dev/csapi/tree/minio branch.)

 

Modifying the Dockerfiles

Modify celery/Dockerfile and django/Dockerfile - in each pair below, top: before, bottom: after

celery/Dockerfile

FROM python:3.7.16-slim
ENV PYTHONUNBUFFERED 1
WORKDIR /csapi
RUN apt-get update && apt-get install -y libgl1-mesa-dev libglib2.0-0
COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt --no-cache-dir
COPY . .
---
FROM python:3.7.16-slim
ENV PYTHONUNBUFFERED 1
WORKDIR /csapi
RUN apt-get update && apt-get install -y libgl1-mesa-dev libglib2.0-0
COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt --no-cache-dir
RUN mkdir /images
COPY . .

django/Dockerfile

FROM python:3.7.16-slim
ENV PYTHONUNBUFFERED 1
WORKDIR /csapi
COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt --no-cache-dir
COPY . .
---
FROM python:3.7.16-slim
ENV PYTHONUNBUFFERED 1
WORKDIR /csapi
COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt --no-cache-dir
RUN mkdir /images
COPY . .

 

Adding packages to install

Add boto3, which is needed to access MinIO, to celery/requirements.txt and django/requirements.txt (top: before, bottom: after)

celery/requirements.txt

opencv-python==4.5.1.48
tensorflow==2.11.0
tensorflow-io==0.24.0
importlib-metadata==4.8.3
celery==5.1.2
---
opencv-python==4.5.1.48
tensorflow==2.11.0
tensorflow-io==0.24.0
importlib-metadata==4.8.3
celery==5.1.2
boto3==1.33.13

django/requirements.txt

Django==3.2.20
importlib-metadata==4.8.3
celery==5.1.2
---
Django==3.2.20
importlib-metadata==4.8.3
celery==5.1.2
boto3==1.33.13
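
MinIO speaks the S3 API, so the standard boto3 S3 client works as long as it is pointed at the MinIO endpoint with signature version s3v4. A quick connectivity check from the host might look like the following - a minimal sketch, assuming MinIO is running locally on port 9000 with the root credentials from docker-compose.yml:

# Connectivity check against a local MinIO
# (assumed: localhost:9000, root credentials minioroot/minioroot from docker-compose.yml)
import boto3

client = boto3.client('s3',
                      endpoint_url='http://localhost:9000',
                      aws_access_key_id='minioroot',
                      aws_secret_access_key='minioroot',
                      config=boto3.session.Config(signature_version='s3v4'))

# prints an empty list on a fresh MinIO instance
print([bucket['Name'] for bucket in client.list_buckets()['Buckets']])

The same client construction pattern is reused in the Django and Celery code below.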

 

Modifying docker-compose.yml

Only the volume used by MinIO is added, and the existing image volume is removed

version: "3"
services:
  django:
    build:
      context: ./django
    command: python manage.py runserver 0.0.0.0:8080
    ports:
      - "8080:8080"
    volumes:
      - image_volume:/csapi/images
    environment:
      - BROKER_ID=user
      - BROKER_PASSWORD=password
      - BROKER_URL=rabbit
      - BROKER_PORT=5672

  worker:
    build:
      context: ./celery
    command: celery -A csapi.tasks worker --loglevel=info --max-tasks-per-child 1 -c 1
    volumes:
      - image_volume:/csapi/images
    environment:
      - CELERY_BROKER_URL=pyamqp://user:password@rabbit:5672//
      - CELERY_RESULT_BACKEND=rpc://

  rabbit:
    image: rabbitmq:3-management
    expose:
      - "5672"
    ports:
      - "15672:15672"
    environment:
      - RABBITMQ_DEFAULT_USER=user
      - RABBITMQ_DEFAULT_PASS=password

volumes:
  image_volume:
---
version: "3"
services:
  django:
    deploy:
      replicas: 0
    build:
      context: ./django
    command: python manage.py runserver 0.0.0.0:8080
    ports:
      - "8080:8080"
    environment:
      - BROKER_ID=user
      - BROKER_PASSWORD=password
      - BROKER_URL=rabbit
      - BROKER_PORT=5672
      - AWS_ACCESS_KEY_ID=
      - AWS_SECRET_ACCESS_KEY=
      - MINIO_ENDPOINT=http://minio:9000

  worker:
    deploy:
      replicas: 0
    build:
      context: ./celery
    command: celery -A csapi.tasks worker --loglevel=info --max-tasks-per-child 1 -c 1
    environment:
      - CELERY_BROKER_URL=pyamqp://user:password@rabbit:5672//
      - CELERY_RESULT_BACKEND=rpc://
      - AWS_ACCESS_KEY_ID=
      - AWS_SECRET_ACCESS_KEY=
      - MINIO_ENDPOINT=http://minio:9000

  rabbit:
    deploy:
      replicas: 0
    image: rabbitmq:3-management
    expose:
      - "5672"
    ports:
      - "15672:15672"
    environment:
      - RABBITMQ_DEFAULT_USER=user
      - RABBITMQ_DEFAULT_PASS=password

  minio:
    deploy:
      replicas: 1
    image: minio/minio:latest
    command: server /data --console-address ":9001"
    volumes:
      - minio_volume:/data
    environment:
      - MINIO_ROOT_USER=minioroot
      - MINIO_ROOT_PASSWORD=minioroot
    ports:
      - "9000:9000"
      - "9001:9001"

volumes:
  minio_volume:
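
AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are deliberately left empty for now: boto3 reads these two variables straight from the environment as part of its default credential chain, so once the MinIO access key is created later, only the compose file has to change, not the application code. A minimal sketch of that behavior (the key values are placeholders):

import os
import boto3

# boto3 picks up AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY from the
# environment automatically; only the endpoint is passed explicitly.
os.environ['AWS_ACCESS_KEY_ID'] = 'placeholder-access-key'
os.environ['AWS_SECRET_ACCESS_KEY'] = 'placeholder-secret-key'

client = boto3.client('s3',
                      endpoint_url=os.getenv('MINIO_ENDPOINT', 'http://minio:9000'),
                      config=boto3.session.Config(signature_version='s3v4'))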

 

Modifying the Django code

In django/config/settings.py, add the MinIO endpoint, read from an environment variable

CELERY_BROKER_URL = f'amqp://{BROKER_ID}:{BROKER_PASSWORD}@{BROKER_URL}:{BROKER_PORT}//'
CELERY_RESULT_BACKEND = 'rpc://'
CELERY_TIMEZONE = 'Asia/Seoul'
---
CELERY_BROKER_URL = f'amqp://{BROKER_ID}:{BROKER_PASSWORD}@{BROKER_URL}:{BROKER_PORT}//'
CELERY_RESULT_BACKEND = 'rpc://'
CELERY_TIMEZONE = 'Asia/Seoul'

MINIO_ENDPOINT = os.getenv('MINIO_ENDPOINT')

 

Modify django/csapi/views.py to use MinIO

from django.shortcuts import render
from django.http import HttpResponse
from celery import shared_task
from .tasks import api

from django.conf import settings

import uuid
# Create your views here.

def index(request):
    return render(request, 'index.html')

def run_api(request):
    if request.method == 'GET':
        return render(request, 'index.html')
    elif request.method == 'POST' and request.FILES['image']:
        uploaded_file = request.FILES['image']
        file_name = uploaded_file.name
        file_size = uploaded_file.size
        file_content_type = uploaded_file.content_type

        save_destination = str(settings.BASE_DIR) + '/images/'
        print(save_destination)
        input_image_name = str(uuid.uuid4()) + '.' + file_content_type.split('/')[1]
        output_image_name = str(uuid.uuid4()) + '.' + file_content_type.split('/')[1]
        print(input_image_name)
        with open(save_destination + input_image_name, 'wb') as destination:
            for chunk in uploaded_file.chunks():
                destination.write(chunk)

        print(file_name, file_size, file_content_type)
        api.delay(save_destination + input_image_name, save_destination + output_image_name)
        print(output_image_name)
        return HttpResponse(input_image_name + '\n' + output_image_name)

def show_image(request, image_name):
    context = {
        'image_name': image_name,
    }
    return render(request, 'show_image.html', context)

def serve_image(request, image_name):
    image_dir = str(settings.BASE_DIR) + '/images/' + image_name
    try:
        with open(image_dir, 'rb') as image_file:
            return HttpResponse(image_file.read(), content_type='image/png')
    except FileNotFoundError:
        return HttpResponse('Image Not Found')
---
from django.shortcuts import render
from django.http import HttpResponse
from celery import shared_task
from .tasks import api
import boto3
from django.conf import settings

import uuid
import os
# Create your views here.

def index(request):
    return render(request, 'index.html')

def run_api(request):
    if request.method == 'GET':
        return render(request, 'index.html')
    elif request.method == 'POST' and request.FILES['image']:
        uploaded_file = request.FILES['image']
        # file_name = uploaded_file.name
        # file_size = uploaded_file.size
        file_content_type = uploaded_file.content_type

        save_destination = '/images/'
        input_image_name = str(uuid.uuid4()) + '.' + file_content_type.split('/')[1]
        output_image_name = str(uuid.uuid4()) + '.' + file_content_type.split('/')[1]
        with open(save_destination + input_image_name, 'wb') as destination:
            for chunk in uploaded_file.chunks():
                destination.write(chunk)

        # create a MinIO client
        minio_client = boto3.client('s3',
                                    endpoint_url=str(settings.MINIO_ENDPOINT),
                                    config=boto3.session.Config(signature_version='s3v4'))
        # set the target bucket and object name for the upload
        bucket_name = 'images'
        object_name = input_image_name

        with open(save_destination + input_image_name, 'rb') as file_data:
            minio_client.put_object(Bucket=bucket_name, Key=object_name, Body=file_data)

        api.delay(input_image_name, output_image_name)
        print(output_image_name)
        return HttpResponse(input_image_name + '\n' + output_image_name)

def show_image(request, image_name):
    context = {
        'image_name': image_name,
    }
    return render(request, 'show_image.html', context)

def serve_image(request, image_name):
    image_dir = '/images/' + image_name

    # check whether the image already exists locally
    if os.path.isfile(image_dir):
        print(image_name, "exists locally")
        pass
    else:
        print(image_name, "not found locally; fetching from MinIO")
        # create a MinIO client
        minio_client = boto3.client('s3',
                                    endpoint_url=str(settings.MINIO_ENDPOINT),
                                    config=boto3.session.Config(signature_version='s3v4'))
        response = minio_client.get_object(Bucket='images', Key=image_name)
        with open(image_dir, 'wb') as file:
            for chunk in response['Body'].iter_chunks():
                file.write(chunk)
    try:
        with open(image_dir, 'rb') as image_file:
            return HttpResponse(image_file.read(), content_type='image/png')
    except FileNotFoundError:
        return HttpResponse('Image Not Found')
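
As a side note, serve_image above copies the object from MinIO to local disk and then streams it through Django. An alternative worth considering is handing the browser a presigned URL generated with boto3, which MinIO supports. A sketch, assuming the same 'images' bucket (serve_image_presigned is a hypothetical view, not part of the repository):

# Hypothetical alternative to serve_image: redirect to a presigned URL
# instead of copying the object to local disk first.
from django.shortcuts import redirect
from django.conf import settings
import boto3

def serve_image_presigned(request, image_name):
    minio_client = boto3.client('s3',
                                endpoint_url=str(settings.MINIO_ENDPOINT),
                                config=boto3.session.Config(signature_version='s3v4'))
    url = minio_client.generate_presigned_url(
        'get_object',
        Params={'Bucket': 'images', 'Key': image_name},
        ExpiresIn=3600)  # link valid for one hour
    return redirect(url)

Note that the URL is signed against MINIO_ENDPOINT (http://minio:9000 inside the compose network), so the browser can only follow it if that address is reachable from outside, which is one reason the proxying approach above is used here.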

 

Modifying the Celery code

Modify celery/csapi/tasks.py to use MinIO

import cv2
import numpy as np
from tensorflow.keras.models import load_model
import tensorflow as tf

class fashion_tools(object):
    def __init__(self,imageid,model,version=1.1):
        self.imageid = imageid
        self.model   = model
        self.version = version
        
    def get_dress(self,stack=False):
        """limited to top wear and full body dresses (wild and studio working)"""
        """takes input rgb----> return PNG"""
        name =  self.imageid
        file = cv2.imread(name)
        file = tf.image.resize_with_pad(file,target_height=512,target_width=512)
        rgb  = file.numpy()
        file = np.expand_dims(file,axis=0)/ 255.
        seq = self.model.predict(file)
        seq = seq[3][0,:,:,0]
        seq = np.expand_dims(seq,axis=-1)
        c1x = rgb*seq
        c2x = rgb*(1-seq)
        cfx = c1x+c2x
        dummy = np.ones((rgb.shape[0],rgb.shape[1],1))
        rgbx = np.concatenate((rgb,dummy*255),axis=-1)
        rgbs = np.concatenate((cfx,seq*255.),axis=-1)
        if stack:
            stacked = np.hstack((rgbx,rgbs))
            return stacked
        else:
            return rgbs

    def get_patch(self):
        return None

from celery import Celery, current_task

app = Celery('tasks')

import uuid

@app.task
def api(input_image_name, output_image_name):
    saved = load_model("./model.h5")
    api = fashion_tools(input_image_name, saved)
    image_ = api.get_dress(False)
    file_type = input_image_name.split('.')[1]
    cv2.imwrite(output_image_name, image_)
    #return current_task.request.id
    return input_image_name + output_image_name
---
import cv2
import numpy as np
from tensorflow.keras.models import load_model
import tensorflow as tf
import boto3
import os

class fashion_tools(object):
    def __init__(self,imageid,model,version=1.1):
        self.imageid = imageid
        self.model   = model
        self.version = version

    def get_dress(self,stack=False):
        """limited to top wear and full body dresses (wild and studio working)"""
        """takes input rgb----> return PNG"""
        name =  self.imageid
        file = cv2.imread(name)
        file = tf.image.resize_with_pad(file,target_height=512,target_width=512)
        rgb  = file.numpy()
        file = np.expand_dims(file,axis=0)/ 255.
        seq = self.model.predict(file)
        seq = seq[3][0,:,:,0]
        seq = np.expand_dims(seq,axis=-1)
        c1x = rgb*seq
        c2x = rgb*(1-seq)
        cfx = c1x+c2x
        dummy = np.ones((rgb.shape[0],rgb.shape[1],1))
        rgbx = np.concatenate((rgb,dummy*255),axis=-1)
        rgbs = np.concatenate((cfx,seq*255.),axis=-1)
        if stack:
            stacked = np.hstack((rgbx,rgbs))
            return stacked
        else:
            return rgbs

    def get_patch(self):
        return None

from celery import Celery, current_task

app = Celery('tasks')

import uuid

@app.task
def api(input_image_name, output_image_name):
    MINIO_ENDPOINT = os.getenv('MINIO_ENDPOINT')
    saved = load_model("./model.h5")
    # create a MinIO client
    minio_client = boto3.client('s3',
                                endpoint_url=MINIO_ENDPOINT,
                                config=boto3.session.Config(signature_version='s3v4'))
    # set the bucket and the object name to fetch
    bucket_name = 'images'
    object_name = input_image_name

    response = minio_client.get_object(Bucket=bucket_name, Key=object_name)

    with open("/images/" + input_image_name, 'wb') as file:
        for chunk in response['Body'].iter_chunks():
            file.write(chunk)

    api = fashion_tools("/images/" + input_image_name, saved)
    image_ = api.get_dress(False)
    file_type = input_image_name.split('.')[1]
    cv2.imwrite("/images/" + output_image_name, image_)

    # save the processed image to MinIO
    with open("/images/" + output_image_name, 'rb') as file_data:
        minio_client.put_object(Bucket='images', Key=output_image_name, Body=file_data)
    #return current_task.request.id
    return input_image_name + output_image_name
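
boto3 also ships download_file and upload_file helpers that take care of the chunked reads and writes done manually above. Inside api(), the two transfer blocks could be reduced to the following sketch (same minio_client, bucket, and /images directory as above):

    # download the input image; boto3 handles the chunked write internally
    minio_client.download_file(bucket_name, input_image_name, '/images/' + input_image_name)
    # ... run fashion_tools and cv2.imwrite as above ...
    # upload the processed image back to the bucket
    minio_client.upload_file('/images/' + output_image_name, bucket_name, output_image_name)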

 

How to run

In docker-compose.yml, set replicas to 1 only for minio (replicas 0 for the other containers), then run docker-compose up

Open localhost:9001 and log in with the account credentials set in the minio container's environment variables in docker-compose.yml

 

Create an 'images' bucket.
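
If you prefer, the bucket can also be created from code with the root credentials instead of the web console - a sketch, assuming MinIO is reachable at localhost:9000:

# Optional: create the 'images' bucket programmatically
# (assumed: localhost:9000 and the root credentials from docker-compose.yml)
import boto3

client = boto3.client('s3',
                      endpoint_url='http://localhost:9000',
                      aws_access_key_id='minioroot',
                      aws_secret_access_key='minioroot',
                      config=boto3.session.Config(signature_version='s3v4'))
client.create_bucket(Bucket='images')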

 

Create an access key from the Access Keys menu.

 

Enter the generated access key in docker-compose.yml, change every replicas value to 1, and run docker-compose up again - below is an example docker-compose.yml with the access key filled in.

version: "3"
services:
  django:
    deploy:
      replicas: 1
    build:
      context: ./django
    command: python manage.py runserver 0.0.0.0:8080
    ports:
      - "8080:8080"
    environment:
      - BROKER_ID=user
      - BROKER_PASSWORD=password
      - BROKER_URL=rabbit
      - BROKER_PORT=5672
      - AWS_ACCESS_KEY_ID=r73CHi4MtGErG4ggSm7D
      - AWS_SECRET_ACCESS_KEY=6sEmueaJ18oWRnCtRcCW2Z5QGZX6tgNIgRFXqgFJ
      - MINIO_ENDPOINT=http://minio:9000

  worker:
    deploy:
      replicas: 1
    build:
      context: ./celery
    command: celery -A csapi.tasks worker --loglevel=info --max-tasks-per-child 1 -c 1
    environment:
      - CELERY_BROKER_URL=pyamqp://user:password@rabbit:5672//
      - CELERY_RESULT_BACKEND=rpc://
      - AWS_ACCESS_KEY_ID=r73CHi4MtGErG4ggSm7D
      - AWS_SECRET_ACCESS_KEY=6sEmueaJ18oWRnCtRcCW2Z5QGZX6tgNIgRFXqgFJ
      - MINIO_ENDPOINT=http://minio:9000

  rabbit:
    deploy:
      replicas: 1
    image: rabbitmq:3-management
    expose:
      - "5672"
    ports:
      - "15672:15672"
    environment:
      - RABBITMQ_DEFAULT_USER=user
      - RABBITMQ_DEFAULT_PASS=password

  minio:
    deploy:
      replicas: 1
    image: minio/minio:latest
    command: server /data --console-address ":9001"
    volumes:
      - minio_volume:/data
    environment:
      - MINIO_ROOT_USER=minioroot
      - MINIO_ROOT_PASSWORD=minioroot
    ports:
      - "9000:9000"
      - "9001:9001"

volumes:
  minio_volume:
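
Once everything is running and an image has been uploaded through the web UI, you can confirm that the input and output images landed in the bucket by listing it from the host with the generated access key - a sketch using the example key above (replace with your own values):

# List the objects in the 'images' bucket from the host
# (uses the example access key shown above)
import boto3

client = boto3.client('s3',
                      endpoint_url='http://localhost:9000',
                      aws_access_key_id='r73CHi4MtGErG4ggSm7D',
                      aws_secret_access_key='6sEmueaJ18oWRnCtRcCW2Z5QGZX6tgNIgRFXqgFJ',
                      config=boto3.session.Config(signature_version='s3v4'))
for obj in client.list_objects_v2(Bucket='images').get('Contents', []):
    print(obj['Key'], obj['Size'])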