diff --git a/mediapipe/tasks/ios/vision/object_detector/sources/MPPObjectDetector.h b/mediapipe/tasks/ios/vision/object_detector/sources/MPPObjectDetector.h
index 4ac8cc773..82404849b 100644
--- a/mediapipe/tasks/ios/vision/object_detector/sources/MPPObjectDetector.h
+++ b/mediapipe/tasks/ios/vision/object_detector/sources/MPPObjectDetector.h
@@ -95,6 +95,15 @@ NS_SWIFT_NAME(ObjectDetector)
  * interest. Rotation will be applied according to the `orientation` property of the provided
  * `MPPImage`. Only use this method when the `MPPObjectDetector` is created with
  * `MPPRunningModeImage`.
+ *
+ * This method supports object detection on RGBA images. If your `MPPImage` has a source type of
+ * `MPPImageSourceTypePixelBuffer` or `MPPImageSourceTypeSampleBuffer`, the underlying pixel buffer
+ * must have one of the following pixel format types:
+ * 1. kCVPixelFormatType_32BGRA
+ * 2. kCVPixelFormatType_32RGBA
+ *
+ * If your `MPPImage` has a source type of `MPPImageSourceTypeImage`, ensure that the color space
+ * is RGB with an Alpha channel.
  *
  * @param image The `MPPImage` on which object detection is to be performed.
  * @param error An optional error parameter populated when there is an error in performing object
@@ -114,6 +123,15 @@ NS_SWIFT_NAME(ObjectDetector)
  * image as region of interest. Rotation will be applied according to the `orientation` property of
  * the provided `MPPImage`. Only use this method when the `MPPObjectDetector` is created with
  * `MPPRunningModeVideo`.
+ *
+ * This method supports object detection on RGBA images. If your `MPPImage` has a source type of
+ * `MPPImageSourceTypePixelBuffer` or `MPPImageSourceTypeSampleBuffer`, the underlying pixel buffer
+ * must have one of the following pixel format types:
+ * 1. kCVPixelFormatType_32BGRA
+ * 2. kCVPixelFormatType_32RGBA
+ *
+ * If your `MPPImage` has a source type of `MPPImageSourceTypeImage`, ensure that the color space
+ * is RGB with an Alpha channel.
  *
  * @param image The `MPPImage` on which object detection is to be performed.
  * @param timestampInMilliseconds The video frame's timestamp (in milliseconds). The input
@@ -136,6 +154,7 @@ NS_SWIFT_NAME(ObjectDetector)
  * image as region of interest. Rotation will be applied according to the `orientation` property of
  * the provided `MPPImage`. Only use this method when the `MPPObjectDetector` is created with
  * `MPPRunningModeLiveStream`.
+ *
  * The object which needs to be continuously notified of the available results of object
  * detection must confirm to `MPPObjectDetectorLiveStreamDelegate` protocol and implement the
  * `objectDetector:didFinishDetectionWithResult:timestampInMilliseconds:error:` delegate method.
@@ -143,6 +162,19 @@ NS_SWIFT_NAME(ObjectDetector)
  * It's required to provide a timestamp (in milliseconds) to indicate when the input image is sent
  * to the object detector. The input timestamps must be monotonically increasing.
  *
+ * This method supports object detection on RGBA images. If your `MPPImage` has a source type of
+ * `MPPImageSourceTypePixelBuffer` or `MPPImageSourceTypeSampleBuffer`, the underlying pixel buffer
+ * must have one of the following pixel format types:
+ * 1. kCVPixelFormatType_32BGRA
+ * 2. kCVPixelFormatType_32RGBA
+ *
+ * If the input `MPPImage` has a source type of `MPPImageSourceTypeImage`, ensure that the color
+ * space is RGB with an Alpha channel.
+ *
+ * If this method is used for detecting objects in live camera frames using `AVFoundation`, ensure
+ * that you request `AVCaptureVideoDataOutput` to output frames in `kCMPixelFormat_32BGRA` using
+ * its `videoSettings` property.
+ *
  * @param image A live stream image data of type `MPPImage` on which object detection is to be
  * performed.
  * @param timestampInMilliseconds The timestamp (in milliseconds) which indicates when the input