
⚠️ deprecated ⚠️ / FastCampus (All-in-One)

ch19 🤖 Using Create ML and Code Review

🥕 캐럿맨 2021. 7. 19. 15:39

✅ In this final session, let's look at how to build a Create ML model and then review the code together.

How to build a Create ML model

 

First, open Create ML from Xcode (Xcode ▸ Open Developer Tool ▸ Create ML).

 

Clicking it opens a new project window. I had already created a project under my model sources, but a brand-new project is created in exactly the same way.

 

Next, add your data to the Training Data section.

Files

Put the pets-100 folder into Training Data and the pets-testing folder into Testing Data.

pets-prctice is for later, after training has finished.

 

โ—๏ธpets-100 ํด๋”๋ฅผ ๋„ฃ์–ด์•ผ์ง€ ํ•˜์œ„์˜ dog๋งŒ ๋„ฃ์œผ๋ฉด ์—๋Ÿฌ๊ฐ€ ๋ฐœ์ƒํ•˜๋‹ˆ ์ฃผ์˜ํ•˜๋„๋ก ํ•œ๋‹ค.

 

After adding the data, start training.

 

Just hit Train! It's that simple.

 

✅ Parameter (Maximum Iterations): this is how many times training iterates over the data. Google, for example, has so much data that a small value is fine. A startup, on the other hand, has a limited amount of data, so it trains repeatedly to squeeze the best possible result out of that limited data.

โ—๏ธ์ฃผ์˜ํ•  ์‚ฌํ•ญ์œผ๋กœ๋Š” ์˜ค๋ฒ„ํ”ผํŒ…์ด๋ผ๊ณ  ์‹คํ—˜ ํ™˜๊ฒฝ์—์„œ ๊ฐ™์€ ๋ฐ์ดํ„ฐ๋ฅผ ๋„ˆ๋ฌด ๋งŽ์ด ๋ฐ˜๋ณต์‹œํ‚ค๋ฉด ์‹ค์ œ ๋ฐ์ดํ„ฐ๊ฐ€ ๋“ค์–ด์™”์„ ์‹œ, ์˜ฌ๋ฐ”๋ฅด๊ฒŒ ํŒ์ •ํ•˜์ง€ ๋ชปํ•˜๋Š” ์‹ค์ˆ˜๋ฅผ ๋ฒ”ํ•˜๋Š”๋ฐ, ์˜คํžˆ๋ ค ๋ฐ˜๋ณต์„ ๋งŽ์ด ์‹œ์ผฐ์Œ์—๋„ ์ •ํ™•๋„๊ฐ€ ๋‚ฎ์•„์ง€๊ฒŒ ๋œ๋‹ค. ์šฐ๋ฆฌ๋Š” ์ด๊ฒƒ์„ ์˜ค๋ฒ„ํ”ผํŒ…์ด๋ผ๊ณ  ๋ถ€๋ฅธ๋‹ค.

 

✅ Augmentations

Augmenting the data adds noise, rotation, and other variations to the images during training, and this can push accuracy a bit higher. A programmatic sketch covering both of these options follows below.
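For reference, the same training run can also be done with the CreateML framework in a macOS playground instead of the Create ML app. This is only a minimal sketch under a few assumptions: the paths are placeholders, the option values are arbitrary, and the exact parameter labels of MLImageClassifier.ModelParameters have shifted between macOS versions.

import CreateML
import Foundation

do {
    // Placeholder paths: point these at the pets-100 and pets-testing folders.
    let trainingDir = URL(fileURLWithPath: "/path/to/pets-100")
    let testingDir  = URL(fileURLWithPath: "/path/to/pets-testing")

    // Maximum Iterations and Augmentations correspond to the options in the Create ML UI.
    let parameters = MLImageClassifier.ModelParameters(
        maxIterations: 25,
        augmentationOptions: [.rotation, .noise]
    )

    // One subfolder per class label, exactly like the folder layout shown earlier.
    let classifier = try MLImageClassifier(
        trainingData: .labeledDirectories(at: trainingDir),
        parameters: parameters
    )

    // Evaluate on the held-out testing set, then save the trained model.
    let metrics = classifier.evaluation(on: .labeledDirectories(at: testingDir))
    print(metrics)
    try classifier.write(to: URL(fileURLWithPath: "/path/to/DogCatClassifier.mlmodel"))
} catch {
    print("Training failed: \(error)")
}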

 

Training and results

 

The results come out like this. Every case shown here reports 100%, but in practice it may not be a true 100%.

Precision: of the items the model predicted as true, what percentage are actually true?

Recall: of the items that are actually true, what percentage did the model correctly find?

 

The Wikipedia article on precision and recall explains this really well!!

In the end, the point is that both numbers being high is what you want. A small numeric example follows.
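To make the two definitions concrete, here is a tiny sketch with hypothetical counts (the numbers are made up purely for illustration):

// Hypothetical confusion-matrix counts for the "dog" class.
let truePositives  = 40.0   // predicted dog, actually a dog
let falsePositives = 10.0   // predicted dog, actually a cat
let falseNegatives =  5.0   // actual dogs the model missed

let precision = truePositives / (truePositives + falsePositives)   // 0.8
let recall    = truePositives / (truePositives + falseNegatives)   // ≈0.89
print("precision: \(precision), recall: \(recall)")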

Build several model files, test each of them, and use the one with the highest values.

 

ROC curve (concept): you can also find this with a quick Google search; the larger the area under the curve, the higher the accuracy.

 

Because what Core ML provides has its limits, AI engineers can also use other tools.

Turi Create: a tool Apple provides through its official documentation.

TensorFlow: models can be converted for use with Swift.

PyTorch: it supports iOS on its own, which is also a good option.

 

✅ Code review!

Apple's official documentation

✅ This is the sample project Apple provides, and we'll use it for this exercise.

Think of DogCatClassifier as the place where our own trained .mlmodel file gets dropped in.

 

✅ Shall we take a look at ImageClassificationViewController?

/*
See LICENSE folder for this sample’s licensing information.

Abstract:
View controller for selecting images and applying Vision + Core ML processing.
*/

import UIKit
import CoreML
import Vision
import ImageIO

class ImageClassificationViewController: UIViewController {
    // MARK: - IBOutlets
    
    @IBOutlet weak var imageView: UIImageView!
    @IBOutlet weak var cameraButton: UIBarButtonItem!
    @IBOutlet weak var classificationLabel: UILabel!
    
    // MARK: - Image Classification
    
    /// - Tag: MLModelSetup
    lazy var classificationRequest: VNCoreMLRequest = {
        do {
            /*
             Use the Swift class `MobileNet` Core ML generates from the model.
             To use a different Core ML classifier model, add it to the project
             and replace `MobileNet` with that model's generated Swift class.
             */
            let model = try VNCoreMLModel(for: DogCatClassifier().model) // Wrap our Core ML model in a Vision Core ML model
            
            let request = VNCoreMLRequest(model: model, completionHandler: { [weak self] request, error in
                self?.processClassifications(for: request, error: error) // Ask the model to classify the loaded image.
                // Once the request has been performed, this completion handler processes the result; this is the Core ML model setup.
            })
            request.imageCropAndScaleOption = .centerCrop // The input image's aspect ratio can be larger or smaller than what the model expects, so a crop/scale condition is chosen when the model is built. There are three main options:
            // centerCrop; scaleFit, which fits the whole image inside the input; and scaleFill, which stretches it to fill the input.
            // There is no single right answer: use the same condition the ML model was trained with. centerCrop is reportedly the most common.
            return request
        } catch {
            fatalError("Failed to load Vision ML model: \(error)")
        }
    }()
    
    /// - Tag: PerformRequests
    func updateClassifications(for image: UIImage) {
        classificationLabel.text = "Classifying..."
        
        let orientation = CGImagePropertyOrientation(image.imageOrientation)
        guard let ciImage = CIImage(image: image) else { fatalError("Unable to create \(CIImage.self) from \(image).") }
        
        DispatchQueue.global(qos: .userInitiated).async {
            let handler = VNImageRequestHandler(ciImage: ciImage, orientation: orientation)
            do {
                try handler.perform([self.classificationRequest])
            } catch {
                /*
                 This handler catches general image processing errors. The `classificationRequest`'s
                 completion handler `processClassifications(_:error:)` catches errors specific
                 to processing that request.
                 */
                print("Failed to perform classification.\n\(error.localizedDescription)")
            }
        }
    }
    
    /// Updates the UI with the results of the classification.
    /// - Tag: ProcessClassifications
    // This is called when the request completes: picking a photo from the gallery, analyzing it, and producing the text for the UI all happens here.
    // Keep in mind that this runs once an image has been selected. The key piece is the Vision framework's request handler, which gets created and then actually performs the request.
    func processClassifications(for request: VNRequest, error: Error?) {
        DispatchQueue.main.async {
            guard let results = request.results else {
                self.classificationLabel.text = "Unable to classify image.\n\(error!.localizedDescription)"
                return
            }
            // The `results` will always be `VNClassificationObservation`s, as specified by the Core ML model in this project.
            let classifications = results as! [VNClassificationObservation]
        
            // Below is where the actual results are handled.
            // Our model has only 2 classes, but MobileNet has as many as 1,000.
            // We take the per-class results from the .mlmodel and turn them into description strings.
            if classifications.isEmpty {
                self.classificationLabel.text = "Nothing recognized."
            } else {
                // Display top classifications ranked by confidence in the UI.
                let topClassifications = classifications.prefix(2)
                let descriptions = topClassifications.map { classification in
                    // Formats the classification for display; e.g. "(0.37) cliff, drop, drop-off".
                    return String(format: "  (%.2f) %@", classification.confidence, classification.identifier) // Each result carries the class identifier and its confidence; here we format them into a display string.
                }
                self.classificationLabel.text = "Classification:\n" + descriptions.joined(separator: "\n")
            }
        }
    }
    
    // MARK: - Photo Actions
    
    @IBAction func takePicture() {
        // Show options for the source picker only if the camera is available.
        guard UIImagePickerController.isSourceTypeAvailable(.camera) else {
            presentPhotoPicker(sourceType: .photoLibrary)
            return
        }
        
        let photoSourcePicker = UIAlertController()
        let takePhoto = UIAlertAction(title: "Take Photo", style: .default) { [unowned self] _ in
            self.presentPhotoPicker(sourceType: .camera)
        }
        let choosePhoto = UIAlertAction(title: "Choose Photo", style: .default) { [unowned self] _ in
            self.presentPhotoPicker(sourceType: .photoLibrary)
        }
        
        photoSourcePicker.addAction(takePhoto)
        photoSourcePicker.addAction(choosePhoto)
        photoSourcePicker.addAction(UIAlertAction(title: "Cancel", style: .cancel, handler: nil))
        
        present(photoSourcePicker, animated: true)
    }
    
    func presentPhotoPicker(sourceType: UIImagePickerController.SourceType) {
        let picker = UIImagePickerController()
        picker.delegate = self
        picker.sourceType = sourceType
        present(picker, animated: true)
    }
}

extension ImageClassificationViewController: UIImagePickerControllerDelegate, UINavigationControllerDelegate {
    // MARK: - Handling Image Picker Selection

    func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        picker.dismiss(animated: true)
        
        // We always expect `imagePickerController(:didFinishPickingMediaWithInfo:)` to supply the original image.
        let image = info[.originalImage] as! UIImage
        imageView.image = image
        updateClassifications(for: image)
    }
}

I've written it all out in code and added comments, so read through them!!
