Merge pull request #4775 from priankakariatyml:ios-doc-updates-part2

PiperOrigin-RevId: 564429285
Copybara-Service 2023-09-11 10:31:13 -07:00
commit cdf199cf96
15 changed files with 202 additions and 235 deletions

View File

@@ -44,15 +44,15 @@ NS_SWIFT_NAME(ResultCategory)
 @property(nonatomic, readonly, nullable) NSString *displayName;
 
 /**
- * Initializes a new `Category` with the given index, score, category name and display name.
+ * Initializes a new `ResultCategory` with the given index, score, category name and display name.
  *
  * @param index The index of the label in the corresponding label file.
  * @param score The probability score of this label category.
  * @param categoryName The label of this category object.
  * @param displayName The display name of the label.
  *
- * @return An instance of `Category` initialized with the given index, score, category name and
- * display name.
+ * @return An instance of `ResultCategory` initialized with the given index, score, category name
+ * and display name.
  */
 - (instancetype)initWithIndex:(NSInteger)index
                         score:(float)score
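Usage sketch for the renamed initializer (assumes the `MediaPipeTasksVision` Swift module; the label values are hypothetical):

```swift
import MediaPipeTasksVision

// Build a ResultCategory as the updated doc comment describes. The Swift
// initializer is derived from initWithIndex:score:categoryName:displayName:.
let category = ResultCategory(
  index: 0,
  score: 0.97,
  categoryName: "Open_Palm",  // hypothetical label
  displayName: "Open Palm")   // hypothetical display name
print(category.categoryName ?? "<none>", category.score)
```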

View File

@@ -80,10 +80,9 @@ NS_SWIFT_NAME(FaceDetector)
                                    error:(NSError **)error NS_DESIGNATED_INITIALIZER;
 
 /**
- * Performs face detection on the provided MPPImage using the whole image as region of
+ * Performs face detection on the provided `MPImage` using the whole image as region of
  * interest. Rotation will be applied according to the `orientation` property of the provided
- * `MPImage`. Only use this method when the `MPPFaceDetector` is created with running mode
- * `.image`.
+ * `MPImage`. Only use this method when the `FaceDetector` is created with running mode `.image`.
  *
  * This method supports classification of RGBA images. If your `MPImage` has a source type of
  * `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must have one of the
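Image-mode usage sketch for the method documented above (assumes the `MediaPipeTasksVision` module, `FaceDetectorOptions`, and the `detect(image:)` Swift name this PR spells out for the sibling tasks; the model path is caller-supplied):

```swift
import MediaPipeTasksVision
import UIKit

func detectFaces(in uiImage: UIImage, modelPath: String) throws -> FaceDetectorResult {
  let options = FaceDetectorOptions()
  options.baseOptions.modelAssetPath = modelPath  // hypothetical bundled model
  options.runningMode = .image  // this overload is only valid in image mode

  let detector = try FaceDetector(options: options)
  // Rotation is taken from the image's `orientation` property, per the doc comment.
  let image = try MPImage(uiImage: uiImage)
  return try detector.detect(image: image)
}
```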

View File

@@ -44,7 +44,7 @@ NS_SWIFT_NAME(FaceLandmarker)
  * Creates a new instance of `FaceLandmarker` from the given `FaceLandmarkerOptions`.
  *
  * @param options The options of type `FaceLandmarkerOptions` to use for configuring the
- * `MPPFaceLandmarker`.
+ * `FaceLandmarker`.
  *
  * @return A new instance of `FaceLandmarker` with the given options. `nil` if there is an error
  * in initializing the face landmaker.
@@ -53,11 +53,11 @@ NS_SWIFT_NAME(FaceLandmarker)
                                    error:(NSError **)error NS_DESIGNATED_INITIALIZER;
 
 /**
- * Performs face landmark detection on the provided MPPImage using the whole image as region of
+ * Performs face landmark detection on the provided `MPImage` using the whole image as region of
  * interest. Rotation will be applied according to the `orientation` property of the provided
  * `MPImage`. Only use this method when the `FaceLandmarker` is created with `.image`.
  *
- * This method supports RGBA images. If your `MPPImage` has a source type of `.pixelBuffer` or
+ * This method supports RGBA images. If your `MPImage` has a source type of `.pixelBuffer` or
  * `.sampleBuffer`, the underlying pixel buffer must have one of the following pixel format
  * types:
  * 1. kCVPixelFormatType_32BGRA
@@ -68,8 +68,8 @@ NS_SWIFT_NAME(FaceLandmarker)
  *
  * @param image The `MPImage` on which face landmark detection is to be performed.
  *
- * @return An `MPPFaceLandmarkerResult` that contains a list of landmarks. `nil` if there is an
- * error in initializing the face landmaker.
+ * @return An `FaceLandmarkerResult` that contains a list of landmarks. `nil` if there is an error
+ * in initializing the face landmaker.
  */
 - (nullable MPPFaceLandmarkerResult *)detectInImage:(MPPImage *)image
                                               error:(NSError **)error NS_SWIFT_NAME(detect(image:));
@@ -77,8 +77,8 @@ NS_SWIFT_NAME(FaceLandmarker)
 /**
  * Performs face landmark detection on the provided video frame of type `MPImage` using the whole
  * image as region of interest. Rotation will be applied according to the `orientation` property of
- * the provided `MPImage`. Only use this method when the `MPPFaceLandmarker` is created with
- * running mode `.video`.
+ * the provided `MPImage`. Only use this method when the `FaceLandmarker` is created with running
+ * mode `.video`.
  *
  * This method supports RGBA images. If your `MPImage` has a source type of `.pixelBuffer` or
  * `.sampleBuffer`, the underlying pixel buffer must have one of the following pixel format types:
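Video-mode sketch for the overload documented above (assumes `MediaPipeTasksVision` and the `detect(videoFrame:timestampInMilliseconds:)` Swift name, which this PR shows explicitly for the sibling `HandLandmarker`):

```swift
import MediaPipeTasksVision

// Timestamps must be monotonically increasing across successive frames.
func landmarks(in frame: MPImage, atMilliseconds timestamp: Int,
               using landmarker: FaceLandmarker) throws -> FaceLandmarkerResult {
  try landmarker.detect(videoFrame: frame, timestampInMilliseconds: timestamp)
}
```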

View File

@@ -31,51 +31,52 @@ NS_SWIFT_NAME(GestureRecognizer)
 @interface MPPGestureRecognizer : NSObject
 
 /**
- * Creates a new instance of `MPPGestureRecognizer` from an absolute path to a TensorFlow Lite model
- * file stored locally on the device and the default `MPPGestureRecognizerOptions`.
+ * Creates a new instance of `GestureRecognizer` from an absolute path to a TensorFlow Lite model
+ * file stored locally on the device and the default `GestureRecognizerOptions`.
  *
  * @param modelPath An absolute path to a TensorFlow Lite model file stored locally on the device.
  * @param error An optional error parameter populated when there is an error in initializing the
  * gesture recognizer.
  *
- * @return A new instance of `MPPGestureRecognizer` with the given model path. `nil` if there is an
+ * @return A new instance of `GestureRecognizer` with the given model path. `nil` if there is an
  * error in initializing the gesture recognizer.
  */
 - (nullable instancetype)initWithModelPath:(NSString *)modelPath error:(NSError **)error;
 
 /**
- * Creates a new instance of `MPPGestureRecognizer` from the given `MPPGestureRecognizerOptions`.
+ * Creates a new instance of `GestureRecognizer` from the given `GestureRecognizerOptions`.
  *
- * @param options The options of type `MPPGestureRecognizerOptions` to use for configuring the
- * `MPPGestureRecognizer`.
+ * @param options The options of type `GestureRecognizerOptions` to use for configuring the
+ * `GestureRecognizer`.
  * @param error An optional error parameter populated when there is an error in initializing the
  * gesture recognizer.
  *
- * @return A new instance of `MPPGestureRecognizer` with the given options. `nil` if there is an
- * error in initializing the gesture recognizer.
+ * @return A new instance of `GestureRecognizer` with the given options. `nil` if there is an error
+ * in initializing the gesture recognizer.
  */
 - (nullable instancetype)initWithOptions:(MPPGestureRecognizerOptions *)options
                                    error:(NSError **)error NS_DESIGNATED_INITIALIZER;
 
 /**
- * Performs gesture recognition on the provided MPPImage using the whole image as region of
+ * Performs gesture recognition on the provided `MPImage` using the whole image as region of
  * interest. Rotation will be applied according to the `orientation` property of the provided
- * `MPPImage`. Only use this method when the `MPPGestureRecognizer` is created with
- * `MPPRunningModeImage`.
- * This method supports gesture recognition of RGBA images. If your `MPPImage` has a source type of
- * `MPPImageSourceTypePixelBuffer` or `MPPImageSourceTypeSampleBuffer`, the underlying pixel buffer
- * must have one of the following pixel format types:
+ * `MPImage`. Only use this method when the `GestureRecognizer` is created with running mode,
+ * `.image`.
+ *
+ * This method supports gesture recognition of RGBA images. If your `MPImage` has a source type of
+ * `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must have one of the following
+ * pixel format types:
  * 1. kCVPixelFormatType_32BGRA
  * 2. kCVPixelFormatType_32RGBA
  *
- * If your `MPPImage` has a source type of `MPPImageSourceTypeImage` ensure that the color space is
- * RGB with an Alpha channel.
+ * If your `MPImage` has a source type of `.image` ensure that the color space is RGB with an Alpha
+ * channel.
  *
- * @param image The `MPPImage` on which gesture recognition is to be performed.
+ * @param image The `MPImage` on which gesture recognition is to be performed.
  * @param error An optional error parameter populated when there is an error in performing gesture
  * recognition on the input image.
  *
- * @return An `MPPGestureRecognizerResult` object that contains the hand gesture recognition
+ * @return An `GestureRecognizerResult` object that contains the hand gesture recognition
  * results.
  */
 - (nullable MPPGestureRecognizerResult *)recognizeImage:(MPPImage *)image
@@ -83,30 +84,30 @@ NS_SWIFT_NAME(GestureRecognizer)
                                                   NS_SWIFT_NAME(recognize(image:));
 
 /**
- * Performs gesture recognition on the provided video frame of type `MPPImage` using the whole
- * image as region of interest. Rotation will be applied according to the `orientation` property of
- * the provided `MPPImage`. Only use this method when the `MPPGestureRecognizer` is created with
- * `MPPRunningModeVideo`.
+ * Performs gesture recognition on the provided video frame of type `MPImage` using the whole image
+ * as region of interest. Rotation will be applied according to the `orientation` property of the
+ * provided `MPImage`. Only use this method when the `GestureRecognizer` is created with running
+ * mode, `.video`.
  *
  * It's required to provide the video frame's timestamp (in milliseconds). The input timestamps must
  * be monotonically increasing.
  *
- * This method supports gesture recognition of RGBA images. If your `MPPImage` has a source type of
- * `MPPImageSourceTypePixelBuffer` or `MPPImageSourceTypeSampleBuffer`, the underlying pixel buffer
- * must have one of the following pixel format types:
+ * This method supports gesture recognition of RGBA images. If your `MPImage` has a source type of
+ * `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must have one of the following
+ * pixel format types:
  * 1. kCVPixelFormatType_32BGRA
  * 2. kCVPixelFormatType_32RGBA
  *
- * If your `MPPImage` has a source type of `MPPImageSourceTypeImage` ensure that the color space is
- * RGB with an Alpha channel.
+ * If your `MPImage` has a source type of `.image` ensure that the color space is RGB with an Alpha
+ * channel.
  *
- * @param image The `MPPImage` on which gesture recognition is to be performed.
+ * @param image The `MPImage` on which gesture recognition is to be performed.
  * @param timestampInMilliseconds The video frame's timestamp (in milliseconds). The input
  * timestamps must be monotonically increasing.
  * @param error An optional error parameter populated when there is an error in performing gesture
  * recognition on the input video frame.
  *
- * @return An `MPPGestureRecognizerResult` object that contains the hand gesture recognition
+ * @return An `GestureRecognizerResult` object that contains the hand gesture recognition
  * results.
  */
 - (nullable MPPGestureRecognizerResult *)recognizeVideoFrame:(MPPImage *)image
@@ -115,33 +116,33 @@ NS_SWIFT_NAME(GestureRecognizer)
                                    NS_SWIFT_NAME(recognize(videoFrame:timestampInMilliseconds:));
 
 /**
- * Sends live stream image data of type `MPPImage` to perform gesture recognition using the whole
+ * Sends live stream image data of type `MPImage` to perform gesture recognition using the whole
  * image as region of interest. Rotation will be applied according to the `orientation` property of
- * the provided `MPPImage`. Only use this method when the `MPPGestureRecognizer` is created with
- * `MPPRunningModeLiveStream`.
+ * the provided `MPImage`. Only use this method when the `GestureRecognizer` is created with running
+ * mode, `.liveStream`.
  *
  * The object which needs to be continuously notified of the available results of gesture
- * recognition must confirm to `MPPGestureRecognizerLiveStreamDelegate` protocol and implement the
- * `gestureRecognizer:didFinishRecognitionWithResult:timestampInMilliseconds:error:`
+ * recognition must confirm to `GestureRecognizerLiveStreamDelegate` protocol and implement the
+ * `gestureRecognizer(_:didFinishRecognitionWithResult:timestampInMilliseconds:error:)`
  * delegate method.
  *
  * It's required to provide a timestamp (in milliseconds) to indicate when the input image is sent
  * to the gesture recognizer. The input timestamps must be monotonically increasing.
  *
- * This method supports gesture recognition of RGBA images. If your `MPPImage` has a source type of
- * `MPPImageSourceTypePixelBuffer` or `MPPImageSourceTypeSampleBuffer`, the underlying pixel buffer
- * must have one of the following pixel format types:
+ * This method supports gesture recognition of RGBA images. If your `MPImage` has a source type of
+ * `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must have one of the following
+ * pixel format types:
  * 1. kCVPixelFormatType_32BGRA
  * 2. kCVPixelFormatType_32RGBA
  *
- * If the input `MPPImage` has a source type of `MPPImageSourceTypeImage` ensure that the color
- * space is RGB with an Alpha channel.
+ * If the input `MPImage` has a source type of `.image` ensure that the color space is RGB with an
+ * Alpha channel.
  *
 * If this method is used for performing gesture recognition on live camera frames using
  * `AVFoundation`, ensure that you request `AVCaptureVideoDataOutput` to output frames in
  * `kCMPixelFormat_32RGBA` using its `videoSettings` property.
  *
- * @param image A live stream image data of type `MPPImage` on which gesture recognition is to be
+ * @param image A live stream image data of type `MPImage` on which gesture recognition is to be
  * performed.
  * @param timestampInMilliseconds The timestamp (in milliseconds) which indicates when the input
  * image is sent to the gesture recognizer. The input timestamps must be monotonically increasing.
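Live-stream sketch for the method documented above; the Swift name of the async send method is not shown in this hunk, so `recognizeAsync(image:timestampInMilliseconds:)` is an assumption by analogy with the other `NS_SWIFT_NAME` annotations:

```swift
import MediaPipeTasksVision

// Results arrive via gestureRecognizerLiveStreamDelegate, not as a return value.
func sendFrame(_ image: MPImage, atMilliseconds timestamp: Int,
               to recognizer: GestureRecognizer) {
  do {
    try recognizer.recognizeAsync(image: image, timestampInMilliseconds: timestamp)
  } catch {
    print("recognizeAsync failed:", error)
  }
}
```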

View File

@@ -24,12 +24,12 @@ NS_ASSUME_NONNULL_BEGIN
 @class MPPGestureRecognizer;
 
 /**
- * This protocol defines an interface for the delegates of `MPPGestureRecognizer` object to receive
+ * This protocol defines an interface for the delegates of `GestureRecognizer` object to receive
  * results of performing asynchronous gesture recognition on images (i.e, when `runningMode` =
- * `MPPRunningModeLiveStream`).
+ * `.liveStream`).
  *
- * The delegate of `MPPGestureRecognizer` must adopt `MPPGestureRecognizerLiveStreamDelegate`
- * protocol. The methods in this protocol are optional.
+ * The delegate of `GestureRecognizer` must adopt `GestureRecognizerLiveStreamDelegate` protocol.
+ * The methods in this protocol are optional.
  */
 NS_SWIFT_NAME(GestureRecognizerLiveStreamDelegate)
 @protocol MPPGestureRecognizerLiveStreamDelegate <NSObject>
@@ -37,15 +37,15 @@ NS_SWIFT_NAME(GestureRecognizerLiveStreamDelegate)
 @optional
 
 /**
- * This method notifies a delegate that the results of asynchronous gesture recognition of
- * an image submitted to the `MPPGestureRecognizer` is available.
+ * This method notifies a delegate that the results of asynchronous gesture recognition of an image
+ * submitted to the `GestureRecognizer` is available.
  *
- * This method is called on a private serial dispatch queue created by the `MPPGestureRecognizer`
- * for performing the asynchronous delegates calls.
+ * This method is called on a private serial dispatch queue created by the `GestureRecognizer` for
+ * performing the asynchronous delegates calls.
  *
- * @param gestureRecognizer The gesture recognizer which performed the gesture recognition.
- * This is useful to test equality when there are multiple instances of `MPPGestureRecognizer`.
- * @param result The `MPPGestureRecognizerResult` object that contains a list of detections, each
+ * @param gestureRecognizer The gesture recognizer which performed the gesture recognition. This is
+ * useful to test equality when there are multiple instances of `GestureRecognizer`.
+ * @param result The `GestureRecognizerResult` object that contains a list of detections, each
  * detection has a bounding box that is expressed in the unrotated input frame of reference
  * coordinates system, i.e. in `[0,image_width) x [0,image_height)`, which are the dimensions of the
  * underlying image data.
@@ -62,26 +62,25 @@ NS_SWIFT_NAME(GestureRecognizerLiveStreamDelegate)
     NS_SWIFT_NAME(gestureRecognizer(_:didFinishGestureRecognition:timestampInMilliseconds:error:));
 @end
 
-/** Options for setting up a `MPPGestureRecognizer`. */
+/** Options for setting up a `GestureRecognizer`. */
 NS_SWIFT_NAME(GestureRecognizerOptions)
 @interface MPPGestureRecognizerOptions : MPPTaskOptions <NSCopying>
 
 /**
- * Running mode of the gesture recognizer task. Defaults to `MPPRunningModeImage`.
- * `MPPGestureRecognizer` can be created with one of the following running modes:
- * 1. `MPPRunningModeImage`: The mode for performing gesture recognition on single image inputs.
- * 2. `MPPRunningModeVideo`: The mode for performing gesture recognition on the decoded frames of a
- * video.
- * 3. `MPPRunningModeLiveStream`: The mode for performing gesture recognition on a live stream of
- * input data, such as from the camera.
+ * Running mode of the gesture recognizer task. Defaults to `.video`.
+ * `GestureRecognizer` can be created with one of the following running modes:
+ * 1. `image`: The mode for performing gesture recognition on single image inputs.
+ * 2. `video`: The mode for performing gesture recognition on the decoded frames of a video.
+ * 3. `liveStream`: The mode for performing gesture recognition on a live stream of input data,
+ * such as from the camera.
  */
 @property(nonatomic) MPPRunningMode runningMode;
 
 /**
- * An object that confirms to `MPPGestureRecognizerLiveStreamDelegate` protocol. This object must
- * implement `gestureRecognizer:didFinishRecognitionWithResult:timestampInMilliseconds:error:` to
+ * An object that confirms to `GestureRecognizerLiveStreamDelegate` protocol. This object must
+ * implement `gestureRecognizer(_:didFinishRecognitionWithResult:timestampInMilliseconds:error:)` to
  * receive the results of performing asynchronous gesture recognition on images (i.e, when
- * `runningMode` = `MPPRunningModeLiveStream`).
+ * `runningMode` = `.liveStream`).
  */
 @property(nonatomic, weak, nullable) id<MPPGestureRecognizerLiveStreamDelegate>
     gestureRecognizerLiveStreamDelegate;
@@ -99,18 +98,18 @@ NS_SWIFT_NAME(GestureRecognizerOptions)
 @property(nonatomic) float minTrackingConfidence;
 
 /**
- * Sets the optional `MPPClassifierOptions` controlling the canned gestures classifier, such as
- * score threshold, allow list and deny list of gestures. The categories for canned gesture
- * classifiers are: ["None", "Closed_Fist", "Open_Palm", "Pointing_Up", "Thumb_Down", "Thumb_Up",
- * "Victory", "ILoveYou"].
+ * Sets the optional `ClassifierOptions` controlling the canned gestures classifier, such as score
+ * threshold, allow list and deny list of gestures. The categories for canned gesture classifiers
+ * are: ["None", "Closed_Fist", "Open_Palm", "Pointing_Up", "Thumb_Down", "Thumb_Up", "Victory",
+ * "ILoveYou"].
  *
  * TODO: Note this option is subject to change, after scoring merging calculator is implemented.
  */
 @property(nonatomic, copy, nullable) MPPClassifierOptions *cannedGesturesClassifierOptions;
 
 /**
- * Sets the optional {@link ClassifierOptions} controlling the custom gestures classifier, such as
- * score threshold, allow list and deny list of gestures.
+ * Sets the optional `ClassifierOptions` controlling the custom gestures classifier, such as score
+ * threshold, allow list and deny list of gestures.
  *
  * TODO: Note this option is subject to change, after scoring merging calculator is implemented.
  */
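Configuration sketch tying the renamed symbols together (assumes `MediaPipeTasksVision`; the delegate signature follows the `NS_SWIFT_NAME` shown above, and the model path is hypothetical):

```swift
import MediaPipeTasksVision

final class GestureHandler: NSObject, GestureRecognizerLiveStreamDelegate {
  func gestureRecognizer(_ gestureRecognizer: GestureRecognizer,
                         didFinishGestureRecognition result: GestureRecognizerResult?,
                         timestampInMilliseconds: Int,
                         error: Error?) {
    // Called on a private serial dispatch queue created by the recognizer.
  }
}

func makeLiveStreamRecognizer(delegate: GestureHandler) throws -> GestureRecognizer {
  let options = GestureRecognizerOptions()
  options.baseOptions.modelAssetPath = "gesture_recognizer.task"  // hypothetical path
  options.runningMode = .liveStream
  options.gestureRecognizerLiveStreamDelegate = delegate  // held weakly
  return try GestureRecognizer(options: options)
}
```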

View File

@@ -20,7 +20,7 @@
 NS_ASSUME_NONNULL_BEGIN
 
-/** Represents the gesture recognition results generated by MPPGestureRecognizer. */
+/** Represents the gesture recognition results generated by `GestureRecognizer`. */
 NS_SWIFT_NAME(GestureRecognizerResult)
 @interface MPPGestureRecognizerResult : MPPTaskResult
 
@@ -41,7 +41,7 @@ NS_SWIFT_NAME(GestureRecognizerResult)
 @property(nonatomic, readonly) NSArray<NSArray<MPPCategory *> *> *gestures;
 
 /**
- * Initializes a new `MPPGestureRecognizerResult` with the given landmarks, world landmarks,
+ * Initializes a new `GestureRecognizerResult` with the given landmarks, world landmarks,
  * handedness, gestures and timestamp (in milliseconds).
  *
  * @param landmarks The hand landmarks of detected hands.
@@ -50,7 +50,7 @@ NS_SWIFT_NAME(GestureRecognizerResult)
  * @param handedness The recognized hand gestures of detected hands.
  * @param timestampInMilliseconds The timestamp for this result.
  *
- * @return An instance of `MPPGestureRecognizerResult` initialized with the given landmarks, world
+ * @return An instance of `GestureRecognizerResult` initialized with the given landmarks, world
  * landmarks, handedness and gestures.
  *
  */
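Result-reading sketch for the property documented above (`gestures` is one ranked array of `ResultCategory` per detected hand):

```swift
import MediaPipeTasksVision

// Returns the top-ranked gesture label for each detected hand.
func topGestureNames(in result: GestureRecognizerResult) -> [String] {
  result.gestures.compactMap { $0.first?.categoryName }
}
```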

View File

@@ -71,9 +71,3 @@ objc_library(
         "//mediapipe/tasks/ios/vision/hand_landmarker/utils:MPPHandLandmarkerResultHelpers",
     ],
 )
-
-objc_library(
-    name = "MPPHandLandmark",
-    hdrs = ["sources/MPPHandLandmark.h"],
-    module_name = "MPPHandLandmark",
-)

View File

@@ -1,65 +0,0 @@
-// Copyright 2023 The MediaPipe Authors.
-//
-// Licensed under the Apache License, Version 2.0 (the "License");
-// you may not use this file except in compliance with the License.
-// You may obtain a copy of the License at
-//
-//      http://www.apache.org/licenses/LICENSE-2.0
-//
-// Unless required by applicable law or agreed to in writing, software
-// distributed under the License is distributed on an "AS IS" BASIS,
-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-// See the License for the specific language governing permissions and
-// limitations under the License.
-
-#import <Foundation/Foundation.h>
-
-NS_ASSUME_NONNULL_BEGIN
-
-/**
- * The enum containing the 21 hand landmarks.
- */
-typedef NS_ENUM(NSUInteger, MPPHandLandmark) {
-  MPPHandLandmarkWrist,
-  MPPHandLandmarkThumbCMC,
-  MPPHandLandmarkThumbMCP,
-  MPPHandLandmarkThumbIP,
-  MPPHandLandmarkIndexFingerMCP,
-  MPPHandLandmarkIndexFingerPIP,
-  MPPHandLandmarkIndexFingerDIP,
-  MPPHandLandmarkIndexFingerTIP,
-  MPPHandLandmarkMiddleFingerMCP,
-  MPPHandLandmarkMiddleFingerPIP,
-  MPPHandLandmarkMiddleFingerDIP,
-  MPPHandLandmarkMiddleFingerTIP,
-  MPPHandLandmarkRingFingerMCP,
-  MPPHandLandmarkRingFingerPIP,
-  MPPHandLandmarkRingFingerDIP,
-  MPPHandLandmarkRingFingerTIP,
-  MPPHandLandmarkPinkyMCP,
-  MPPHandLandmarkPinkyPIP,
-  MPPHandLandmarkPinkyDIP,
-  MPPHandLandmarkPinkyTIP,
-} NS_SWIFT_NAME(HandLandmark);
-
-NS_ASSUME_NONNULL_END

View File

@@ -21,6 +21,52 @@
 NS_ASSUME_NONNULL_BEGIN
 
+/**
+ * The enum containing the 21 hand landmarks.
+ */
+typedef NS_ENUM(NSUInteger, MPPHandLandmark) {
+  MPPHandLandmarkWrist,
+  MPPHandLandmarkThumbCMC,
+  MPPHandLandmarkThumbMCP,
+  MPPHandLandmarkThumbIP,
+  MPPHandLandmarkIndexFingerMCP,
+  MPPHandLandmarkIndexFingerPIP,
+  MPPHandLandmarkIndexFingerDIP,
+  MPPHandLandmarkIndexFingerTIP,
+  MPPHandLandmarkMiddleFingerMCP,
+  MPPHandLandmarkMiddleFingerPIP,
+  MPPHandLandmarkMiddleFingerDIP,
+  MPPHandLandmarkMiddleFingerTIP,
+  MPPHandLandmarkRingFingerMCP,
+  MPPHandLandmarkRingFingerPIP,
+  MPPHandLandmarkRingFingerDIP,
+  MPPHandLandmarkRingFingerTIP,
+  MPPHandLandmarkPinkyMCP,
+  MPPHandLandmarkPinkyPIP,
+  MPPHandLandmarkPinkyDIP,
+  MPPHandLandmarkPinkyTIP,
+} NS_SWIFT_NAME(HandLandmark);
+
 /**
  * @brief Performs hand landmarks detection on images.
  *
@@ -48,82 +94,81 @@ NS_SWIFT_NAME(HandLandmarker)
 @property(class, nonatomic, readonly) NSArray<MPPConnection *> *handConnections;
 
 /**
- * Creates a new instance of `MPPHandLandmarker` from an absolute path to a model asset bundle
- * stored locally on the device and the default `MPPHandLandmarkerOptions`.
+ * Creates a new instance of `HandLandmarker` from an absolute path to a model asset bundle stored
+ * locally on the device and the default `HandLandmarkerOptions`.
  *
  * @param modelPath An absolute path to a model asset bundle stored locally on the device.
  * @param error An optional error parameter populated when there is an error in initializing the
  * hand landmarker.
  *
- * @return A new instance of `MPPHandLandmarker` with the given model path. `nil` if there is an
- * error in initializing the hand landmarker.
+ * @return A new instance of `HandLandmarker` with the given model path. `nil` if there is an error
+ * in initializing the hand landmarker.
  */
 - (nullable instancetype)initWithModelPath:(NSString *)modelPath error:(NSError **)error;
 
 /**
- * Creates a new instance of `MPPHandLandmarker` from the given `MPPHandLandmarkerOptions`.
+ * Creates a new instance of `HandLandmarker` from the given `HandLandmarkerOptions`.
  *
- * @param options The options of type `MPPHandLandmarkerOptions` to use for configuring the
- * `MPPHandLandmarker`.
+ * @param options The options of type `HandLandmarkerOptions` to use for configuring the
+ * `HandLandmarker`.
  * @param error An optional error parameter populated when there is an error in initializing the
  * hand landmarker.
  *
- * @return A new instance of `MPPHandLandmarker` with the given options. `nil` if there is an
- * error in initializing the hand landmarker.
+ * @return A new instance of `HandLandmarker` with the given options. `nil` if there is an error in
+ * initializing the hand landmarker.
  */
 - (nullable instancetype)initWithOptions:(MPPHandLandmarkerOptions *)options
                                    error:(NSError **)error NS_DESIGNATED_INITIALIZER;
 
 /**
- * Performs hand landmarks detection on the provided `MPPImage` using the whole image as region of
+ * Performs hand landmarks detection on the provided `MPImage` using the whole image as region of
  * interest. Rotation will be applied according to the `orientation` property of the provided
- * `MPPImage`. Only use this method when the `MPPHandLandmarker` is created with
- * `MPPRunningModeImage`.
+ * `MPImage`. Only use this method when the `HandLandmarker` is created with running mode, `.image`.
  *
- * This method supports performing hand landmarks detection on RGBA images. If your `MPPImage` has a
- * source type of `MPPImageSourceTypePixelBuffer` or `MPPImageSourceTypeSampleBuffer`, the
- * underlying pixel buffer must have one of the following pixel format types:
+ * This method supports performing hand landmarks detection on RGBA images. If your `MPImage` has a
+ * source type of `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must have one of
+ * the following pixel format types:
  * 1. kCVPixelFormatType_32BGRA
  * 2. kCVPixelFormatType_32RGBA
  *
- * If your `MPPImage` has a source type of `MPPImageSourceTypeImage` ensure that the color space is
- * RGB with an Alpha channel.
+ * If your `MPImage` has a source type of `.image` ensure that the color space is RGB with an Alpha
+ * channel.
  *
- * @param image The `MPPImage` on which hand landmarks detection is to be performed.
+ * @param image The `MPImage` on which hand landmarks detection is to be performed.
  * @param error An optional error parameter populated when there is an error in performing hand
 * landmarks detection on the input image.
  *
- * @return An `MPPHandLandmarkerResult` object that contains the hand hand landmarks detection
+ * @return An `HandLandmarkerResult` object that contains the hand hand landmarks detection
  * results.
  */
 - (nullable MPPHandLandmarkerResult *)detectInImage:(MPPImage *)image
                                               error:(NSError **)error NS_SWIFT_NAME(detect(image:));
 
 /**
- * Performs hand landmarks detection on the provided video frame of type `MPPImage` using the whole
+ * Performs hand landmarks detection on the provided video frame of type `MPImage` using the whole
  * image as region of interest. Rotation will be applied according to the `orientation` property of
- * the provided `MPPImage`. Only use this method when the `MPPHandLandmarker` is created with
- * `MPPRunningModeVideo`.
+ * the provided `MPImage`. Only use this method when the `HandLandmarker` is created with running
+ * mode, `.video`.
  *
  * It's required to provide the video frame's timestamp (in milliseconds). The input timestamps must
  * be monotonically increasing.
  *
- * This method supports performing hand landmarks detection on RGBA images. If your `MPPImage` has a
- * source type of `MPPImageSourceTypePixelBuffer` or `MPPImageSourceTypeSampleBuffer`, the
- * underlying pixel buffer must have one of the following pixel format types:
+ * This method supports performing hand landmarks detection on RGBA images. If your `MPImage` has a
+ * source type of `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must have one of
+ * the following pixel format types:
  * 1. kCVPixelFormatType_32BGRA
  * 2. kCVPixelFormatType_32RGBA
  *
- * If your `MPPImage` has a source type of `MPPImageSourceTypeImage` ensure that the color space is
- * RGB with an Alpha channel.
+ * If your `MPImage` has a source type of `.image` ensure that the color space is RGB with an Alpha
+ * channel.
  *
- * @param image The `MPPImage` on which hand landmarks detection is to be performed.
+ * @param image The `MPImage` on which hand landmarks detection is to be performed.
  * @param timestampInMilliseconds The video frame's timestamp (in milliseconds). The input
  * timestamps must be monotonically increasing.
  * @param error An optional error parameter populated when there is an error in performing hand
  * landmarks detection on the input video frame.
  *
- * @return An `MPPHandLandmarkerResult` object that contains the hand hand landmarks detection
+ * @return An `HandLandmarkerResult` object that contains the hand hand landmarks detection
  * results.
  */
 - (nullable MPPHandLandmarkerResult *)detectInVideoFrame:(MPPImage *)image
@@ -132,33 +177,32 @@ NS_SWIFT_NAME(HandLandmarker)
                                    NS_SWIFT_NAME(detect(videoFrame:timestampInMilliseconds:));
 
 /**
- * Sends live stream image data of type `MPPImage` to perform hand landmarks detection using the
+ * Sends live stream image data of type `MPImage` to perform hand landmarks detection using the
  * whole image as region of interest. Rotation will be applied according to the `orientation`
- * property of the provided `MPPImage`. Only use this method when the `MPPHandLandmarker` is created
- * with `MPPRunningModeLiveStream`.
+ * property of the provided `MPImage`. Only use this method when the `HandLandmarker` is created
+ * with running mode, `.liveStream`.
  *
  * The object which needs to be continuously notified of the available results of hand landmarks
- * detection must confirm to `MPPHandLandmarkerLiveStreamDelegate` protocol and implement the
- * `handLandmarker:didFinishDetectionWithResult:timestampInMilliseconds:error:`
- * delegate method.
+ * detection must confirm to `HandLandmarkerLiveStreamDelegate` protocol and implement the
+ * `handLandmarker(_:didFinishDetectionWithResult:timestampInMilliseconds:error:)` delegate method.
  *
  * It's required to provide a timestamp (in milliseconds) to indicate when the input image is sent
  * to the hand landmarker. The input timestamps must be monotonically increasing.
  *
- * This method supports performing hand landmarks detection on RGBA images. If your `MPPImage` has a
- * source type of `MPPImageSourceTypePixelBuffer` or `MPPImageSourceTypeSampleBuffer`, the
- * underlying pixel buffer must have one of the following pixel format types:
+ * This method supports performing hand landmarks detection on RGBA images. If your `MPImage` has a
+ * source type of `.pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must have one of
+ * the following pixel format types:
  * 1. kCVPixelFormatType_32BGRA
  * 2. kCVPixelFormatType_32RGBA
  *
- * If the input `MPPImage` has a source type of `MPPImageSourceTypeImage` ensure that the color
- * space is RGB with an Alpha channel.
+ * If the input `MPImage` has a source type of `.image` ensure that the color space is RGB with an
+ * Alpha channel.
  *
  * If this method is used for performing hand landmarks detection on live camera frames using
  * `AVFoundation`, ensure that you request `AVCaptureVideoDataOutput` to output frames in
  * `kCMPixelFormat_32RGBA` using its `videoSettings` property.
  *
- * @param image A live stream image data of type `MPPImage` on which hand landmarks detection is to
+ * @param image A live stream image data of type `MPImage` on which hand landmarks detection is to
  * be performed.
  * @param timestampInMilliseconds The timestamp (in milliseconds) which indicates when the input
  * image is sent to the hand landmarker. The input timestamps must be monotonically increasing.
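Sketch combining the relocated `HandLandmark` enum with `detect(image:)` above (assumes `MediaPipeTasksVision`; indexing a hand's landmark array by the enum's raw value follows the declaration order shown in this diff):

```swift
import CoreGraphics
import MediaPipeTasksVision

// Returns the normalized wrist position of the first detected hand, if any.
func wristPosition(in image: MPImage, using landmarker: HandLandmarker) throws -> CGPoint? {
  let result = try landmarker.detect(image: image)
  guard let hand = result.landmarks.first else { return nil }
  let wrist = hand[Int(HandLandmark.wrist.rawValue)]
  return CGPoint(x: CGFloat(wrist.x), y: CGFloat(wrist.y))
}
```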

View File

@@ -23,11 +23,11 @@ NS_ASSUME_NONNULL_BEGIN
 @class MPPHandLandmarker;
 
 /**
- * This protocol defines an interface for the delegates of `MPPHandLandmarker` object to receive
- * results of performing asynchronous hand landmark detection on images (i.e, when `runningMode` =
- * `MPPRunningModeLiveStream`).
+ * This protocol defines an interface for the delegates of `HandLandmarker` object to receive
+ * results of performing asynchronous hand landmark detection on images (i.e, when
+ * `runningMode` = `.liveStream`).
  *
- * The delegate of `MPPHandLandmarker` must adopt `MPPHandLandmarkerLiveStreamDelegate` protocol.
+ * The delegate of `HandLandmarker` must adopt `HandLandmarkerLiveStreamDelegate` protocol.
  * The methods in this protocol are optional.
  */
 NS_SWIFT_NAME(HandLandmarkerLiveStreamDelegate)
@@ -37,14 +37,14 @@ NS_SWIFT_NAME(HandLandmarkerLiveStreamDelegate)
 
 /**
  * This method notifies a delegate that the results of asynchronous hand landmark detection of an
- * image submitted to the `MPPHandLandmarker` is available.
+ * image submitted to the `HandLandmarker` is available.
  *
- * This method is called on a private serial dispatch queue created by the `MPPHandLandmarker`
- * for performing the asynchronous delegates calls.
+ * This method is called on a private serial dispatch queue created by the `HandLandmarker` for
+ * performing the asynchronous delegates calls.
  *
  * @param handLandmarker The hand landmarker which performed the hand landmarking.
- * This is useful to test equality when there are multiple instances of `MPPHandLandmarker`.
- * @param result The `MPPHandLandmarkerResult` object that contains a list of detections, each
+ * This is useful to test equality when there are multiple instances of `HandLandmarker`.
+ * @param result The `HandLandmarkerResult` object that contains a list of detections, each
  * detection has a bounding box that is expressed in the unrotated input frame of reference
  * coordinates system, i.e. in `[0,image_width) x [0,image_height)`, which are the dimensions of the
  * underlying image data.
@@ -60,32 +60,30 @@ NS_SWIFT_NAME(HandLandmarkerLiveStreamDelegate)
     NS_SWIFT_NAME(handLandmarker(_:didFinishDetection:timestampInMilliseconds:error:));
 @end
 
-/** Options for setting up a `MPPHandLandmarker`. */
+/** Options for setting up a `HandLandmarker`. */
 NS_SWIFT_NAME(HandLandmarkerOptions)
 @interface MPPHandLandmarkerOptions : MPPTaskOptions <NSCopying>
 
 /**
- * Running mode of the hand landmarker task. Defaults to `MPPRunningModeImage`.
- * `MPPHandLandmarker` can be created with one of the following running modes:
- * 1. `MPPRunningModeImage`: The mode for performing hand landmark detection on single image
- * inputs.
- * 2. `MPPRunningModeVideo`: The mode for performing hand landmark detection on the decoded frames
- * of a video.
- * 3. `MPPRunningModeLiveStream`: The mode for performing hand landmark detection on a live stream
- * of input data, such as from the camera.
+ * Running mode of the hand landmarker task. Defaults to `.image`.
+ * `HandLandmarker` can be created with one of the following running modes:
+ * 1. `image`: The mode for performing hand landmark detection on single image inputs.
+ * 2. `video`: The mode for performing hand landmark detection on the decoded frames of a video.
+ * 3. `liveStream`: The mode for performing hand landmark detection on a live stream of input data,
+ * such as from the camera.
  */
 @property(nonatomic) MPPRunningMode runningMode;
 
 /**
- * An object that confirms to `MPPHandLandmarkerLiveStreamDelegate` protocol. This object must
+ * An object that confirms to `HandLandmarkerLiveStreamDelegate` protocol. This object must
  * implement `handLandmarker:didFinishDetectionWithResult:timestampInMilliseconds:error:` to
  * receive the results of performing asynchronous hand landmark detection on images (i.e, when
- * `runningMode` = `MPPRunningModeLiveStream`).
+ * `runningMode` = `.liveStream`).
  */
 @property(nonatomic, weak, nullable) id<MPPHandLandmarkerLiveStreamDelegate>
     handLandmarkerLiveStreamDelegate;
 
-/** The maximum number of hands that can be detected by the `MPPHandLandmarker`. */
+/** The maximum number of hands that can be detected by the `HandLandmarker`. */
 @property(nonatomic) NSInteger numHands;
 
 /** The minimum confidence score for the hand detection to be considered successful. */
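Options sketch using the renamed defaults documented above (assumes `MediaPipeTasksVision`; the model path is hypothetical):

```swift
import MediaPipeTasksVision

func makeHandLandmarker() throws -> HandLandmarker {
  let options = HandLandmarkerOptions()
  options.baseOptions.modelAssetPath = "hand_landmarker.task"  // hypothetical path
  options.runningMode = .image  // the documented default
  options.numHands = 2          // detect up to two hands
  return try HandLandmarker(options: options)
}
```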

View File

@@ -20,7 +20,7 @@
 NS_ASSUME_NONNULL_BEGIN
 
-/** Represents the hand landmarker results generated by MPPHandLandmarker. */
+/** Represents the hand landmarker results generated by `HandLandmarker`. */
 NS_SWIFT_NAME(HandLandmarkerResult)
 @interface MPPHandLandmarkerResult : MPPTaskResult
 
@@ -34,15 +34,15 @@ NS_SWIFT_NAME(HandLandmarkerResult)
 @property(nonatomic, readonly) NSArray<NSArray<MPPCategory *> *> *handedness;
 
 /**
- * Initializes a new `MPPHandLandmarkerResult` with the given landmarks, world landmarks,
- * handedness and timestamp (in milliseconds).
+ * Initializes a new `HandLandmarkerResult` with the given landmarks, world landmarks, handedness
+ * and timestamp (in milliseconds).
  *
  * @param landmarks The hand landmarks of detected hands.
  * @param worldLandmarks The hand landmarks in world coordniates of detected hands.
  * @param handedness The handedness of detected hands.
  * @param timestampInMilliseconds The timestamp for this result.
  *
- * @return An instance of `MPPGHandLandmarkerResult` initialized with the given landmarks, world
+ * @return An instance of `HandLandmarkerResult` initialized with the given landmarks, world
  * landmarks, handedness and timestamp (in milliseconds).
  *
  */

View File

@@ -76,7 +76,7 @@ NS_SWIFT_NAME(ImageClassifier)
                                    error:(NSError **)error NS_DESIGNATED_INITIALIZER;
 
 /**
- * Performs image classification on the provided MPPImage using the whole image as region of
+ * Performs image classification on the provided `MPImage` using the whole image as region of
  * interest. Rotation will be applied according to the `orientation` property of the provided
  * `MPImage`. Only use this method when the `ImageClassifier` is created with running mode,
  * `.image`.
@@ -90,7 +90,7 @@ NS_SWIFT_NAME(ImageClassifier)
  * If your `MPImage` has a source type of `.image` ensure that the color space is RGB with an Alpha
  * channel.
  *
- * @param image The `MPPImage` on which image classification is to be performed.
+ * @param image The `MPImage` on which image classification is to be performed.
  *
  * @return An `ImageClassifierResult` object that contains a list of image classifications.
  */
@@ -101,7 +101,7 @@ NS_SWIFT_NAME(ImageClassifier)
 /**
  * Performs image classification on the provided `MPImage` cropped to the specified region of
  * interest. Rotation will be applied on the cropped image according to the `orientation` property
- * of the provided `MPImage`. Only use this method when the `MPPImageClassifier` is created with
+ * of the provided `MPImage`. Only use this method when the `ImageClassifier` is created with
  * running mode, `.image`.
  *
  * This method supports classification of RGBA images. If your `MPImage` has a source type of
@@ -127,7 +127,7 @@ NS_SWIFT_NAME(ImageClassifier)
 /**
  * Performs image classification on the provided video frame of type `MPImage` using the whole
  * image as region of interest. Rotation will be applied according to the `orientation` property of
- * the provided `MPImage`. Only use this method when the `MPPImageClassifier` is created with
+ * the provided `MPImage`. Only use this method when the `ImageClassifier` is created with
  * running mode `.video`.
  *
  * It's required to provide the video frame's timestamp (in milliseconds). The input timestamps must
@@ -142,7 +142,7 @@ NS_SWIFT_NAME(ImageClassifier)
  * If your `MPImage` has a source type of `.image` ensure that the color space is RGB with an Alpha
  * channel.
  *
- * @param image The `MPPImage` on which image classification is to be performed.
+ * @param image The `MPImage` on which image classification is to be performed.
  * @param timestampInMilliseconds The video frame's timestamp (in milliseconds). The input
  * timestamps must be monotonically increasing.
  *
@@ -188,8 +188,8 @@ NS_SWIFT_NAME(ImageClassifier)
 /**
  * Sends live stream image data of type `MPImage` to perform image classification using the whole
  * image as region of interest. Rotation will be applied according to the `orientation` property of
- * the provided `MPImage`. Only use this method when the `ImageClassifier` is created with
- * `MPPRunningModeLiveStream`.
+ * the provided `MPImage`. Only use this method when the `ImageClassifier` is created with running
+ * mode `.liveStream`.
  *
  * The object which needs to be continuously notified of the available results of image
  * classification must confirm to `ImageClassifierLiveStreamDelegate` protocol and implement the
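Region-of-interest sketch for the overload mentioned above; `classify(image:regionOfInterest:)` is assumed as the Swift name by analogy with the other `NS_SWIFT_NAME` annotations, and the crop rectangle is hypothetical (check the header for the expected coordinate space):

```swift
import MediaPipeTasksVision
import UIKit

func classifyCenter(of uiImage: UIImage,
                    using classifier: ImageClassifier) throws -> ImageClassifierResult {
  let image = try MPImage(uiImage: uiImage)
  // Hypothetical crop covering the central quarter of the image.
  let roi = CGRect(x: uiImage.size.width / 4, y: uiImage.size.height / 4,
                   width: uiImage.size.width / 2, height: uiImage.size.height / 2)
  return try classifier.classify(image: image, regionOfInterest: roi)
}
```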

View File

@@ -24,8 +24,7 @@ NS_ASSUME_NONNULL_BEGIN
 /**
  * This protocol defines an interface for the delegates of `ImageClassifier` object to receive
- * results of asynchronous classification of images (i.e, when `runningMode =
- * .liveStream`).
+ * results of asynchronous classification of images (i.e, when `runningMode` = `.liveStream`).
  *
  * The delegate of `ImageClassifier` must adopt `ImageClassifierLiveStreamDelegate` protocol.
  * The methods in this protocol are optional.

View File

@@ -118,8 +118,7 @@ NS_SWIFT_NAME(ObjectDetector)
 /**
  * Performs object detection on the provided video frame of type `MPImage` using the whole
  * image as region of interest. Rotation will be applied according to the `orientation` property of
- * the provided `MPImage`. Only use this method when the `MPPObjectDetector` is created with
- * `.video`.
+ * the provided `MPImage`. Only use this method when the `ObjectDetector` is created with `.video`.
  *
  * This method supports detecting objects in of RGBA images. If your `MPImage` has a source type of
  * .pixelBuffer` or `.sampleBuffer`, the underlying pixel buffer must have one of the following
@@ -170,7 +169,7 @@ NS_SWIFT_NAME(ObjectDetector)
  * that you request `AVCaptureVideoDataOutput` to output frames in `kCMPixelFormat_32RGBA` using its
  * `videoSettings` property.
  *
- * @param image A live stream image data of type `MPPImage` on which object detection is to be
+ * @param image A live stream image data of type `MPImage` on which object detection is to be
  * performed.
  * @param timestampInMilliseconds The timestamp (in milliseconds) which indicates when the input
  * image is sent to the object detector. The input timestamps must be monotonically increasing.
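Camera-path sketch for the live-stream flow described above; `detectAsync(image:timestampInMilliseconds:)` and `MPImage(sampleBuffer:)` are assumed names, by analogy with the `NS_SWIFT_NAME` annotations elsewhere in this PR:

```swift
import AVFoundation
import MediaPipeTasksVision

final class CameraFeed: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
  private let detector: ObjectDetector

  init(detector: ObjectDetector) { self.detector = detector }

  func captureOutput(_ output: AVCaptureOutput,
                     didOutput sampleBuffer: CMSampleBuffer,
                     from connection: AVCaptureConnection) {
    // Derive a monotonically increasing millisecond timestamp from the frame's PTS.
    let seconds = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
    guard let image = try? MPImage(sampleBuffer: sampleBuffer) else { return }
    // Results are delivered to the detector's live-stream delegate.
    try? detector.detectAsync(image: image, timestampInMilliseconds: Int(seconds * 1000))
  }
}
```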

View File

@@ -79,8 +79,7 @@ NS_SWIFT_NAME(ObjectDetectorOptions)
  * An object that confirms to `ObjectDetectorLiveStreamDelegate` protocol. This object must
  * implement `objectDetector(_:didFinishDetectionWithResult:timestampInMilliseconds:error:)` to
  * receive the results of performing asynchronous object detection on images (i.e, when
- * `runningMode` =
- * `.liveStream`).
+ * `runningMode` = `.liveStream`).
  */
 @property(nonatomic, weak, nullable) id<MPPObjectDetectorLiveStreamDelegate>
     objectDetectorLiveStreamDelegate;
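Wiring sketch for the delegate property documented above (assumes `MediaPipeTasksVision`; any conformer to `ObjectDetectorLiveStreamDelegate` will do):

```swift
import MediaPipeTasksVision

// The options hold the delegate weakly, per the property declaration above.
func configureLiveStream(_ options: ObjectDetectorOptions,
                         delegate: ObjectDetectorLiveStreamDelegate) {
  options.runningMode = .liveStream
  options.objectDetectorLiveStreamDelegate = delegate
}
```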