Removed roi methods from MPPGestureRecognizer

parent dc7c018b39
commit e905a9fe39
@@ -83,35 +83,6 @@ NS_SWIFT_NAME(GestureRecognizer)
                                                   error:(NSError **)error
    NS_SWIFT_NAME(recognize(image:));

/**
 * Performs gesture recognition on the provided `MPPImage` cropped to the specified region of
 * interest. Rotation will be applied on the cropped image according to the `orientation` property
 * of the provided `MPPImage`. Only use this method when the `MPPGestureRecognizer` is created with
 * `MPPRunningModeImage`.
 *
 * This method supports gesture recognition of RGBA images. If your `MPPImage` has a source type of
 * `MPPImageSourceTypePixelBuffer` or `MPPImageSourceTypeSampleBuffer`, the underlying pixel buffer
 * must have one of the following pixel format types:
 * 1. kCVPixelFormatType_32BGRA
 * 2. kCVPixelFormatType_32RGBA
 *
 * If your `MPPImage` has a source type of `MPPImageSourceTypeImage` ensure that the color space is
 * RGB with an Alpha channel.
 *
 * @param image The `MPPImage` on which gesture recognition is to be performed.
 * @param roi A `CGRect` specifying the region of interest within the given `MPPImage`, on which
 * gesture recognition should be performed.
 * @param error An optional error parameter populated when there is an error in performing gesture
 * recognition on the input image.
 *
 * @return An `MPPGestureRecognizerResult` object that contains the hand gesture recognition
 * results.
 */
- (nullable MPPGestureRecognizerResult *)recognizeImage:(MPPImage *)image
                                        regionOfInterest:(CGRect)roi
                                                   error:(NSError **)error
    NS_SWIFT_NAME(recognize(image:regionOfInterest:));
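For reference, a minimal sketch of how a caller could have used this now-removed image-mode ROI method. Only the `recognizeImage:regionOfInterest:error:` selector comes from the header above; the framework import path, the `initWithUIImage:error:` initializer, the `gestures` property used for logging, and the normalized ROI convention are assumptions for illustration. The whole-image variant `recognize(image:)` shown in the context lines above is untouched by this commit.

```objc
#import <UIKit/UIKit.h>
// Framework import path is an assumption; adjust to how MediaPipeTasksVision is integrated.
#import <MediaPipeTasksVision/MediaPipeTasksVision.h>

// Hypothetical helper: runs ROI-restricted recognition on a single image.
// Requires a recognizer created with MPPRunningModeImage.
static void RecognizeGesturesInRegion(MPPGestureRecognizer *recognizer, UIImage *uiImage) {
  NSError *error = nil;

  // Wrap the UIImage; assumes the MPPImage UIImage initializer.
  MPPImage *image = [[MPPImage alloc] initWithUIImage:uiImage error:&error];
  if (!image) {
    NSLog(@"Failed to create MPPImage: %@", error);
    return;
  }

  // Region of interest covering the center of the image (coordinate convention assumed).
  CGRect roi = CGRectMake(0.25, 0.25, 0.5, 0.5);

  // The removed synchronous ROI API.
  MPPGestureRecognizerResult *result = [recognizer recognizeImage:image
                                                  regionOfInterest:roi
                                                             error:&error];
  if (!result) {
    NSLog(@"Gesture recognition failed: %@", error);
    return;
  }
  NSLog(@"Recognized gestures: %@", result.gestures);
}
```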

/**
 * Performs gesture recognition on the provided video frame of type `MPPImage` using the whole
 * image as region of interest. Rotation will be applied according to the `orientation` property of

@@ -144,42 +115,6 @@ NS_SWIFT_NAME(GestureRecognizer)
                                                       error:(NSError **)error
    NS_SWIFT_NAME(recognize(videoFrame:timestampInMilliseconds:));

/**
 * Performs gesture recognition on the provided video frame of type `MPPImage` cropped to the
 * specified region of interest. Rotation will be applied according to the `orientation` property of
 * the provided `MPPImage`. Only use this method when the `MPPGestureRecognizer` is created with
 * `MPPRunningModeVideo`.
 *
 * It's required to provide the video frame's timestamp (in milliseconds). The input timestamps must
 * be monotonically increasing.
 *
 * This method supports gesture recognition of RGBA images. If your `MPPImage` has a source type of
 * `MPPImageSourceTypePixelBuffer` or `MPPImageSourceTypeSampleBuffer`, the underlying pixel buffer
 * must have one of the following pixel format types:
 * 1. kCVPixelFormatType_32BGRA
 * 2. kCVPixelFormatType_32RGBA
 *
 * If your `MPPImage` has a source type of `MPPImageSourceTypeImage` ensure that the color space is
 * RGB with an Alpha channel.
 *
 * @param image The video frame of type `MPPImage` on which gesture recognition is to be
 * performed.
 * @param timestampInMilliseconds The video frame's timestamp (in milliseconds). The input
 * timestamps must be monotonically increasing.
 * @param roi A `CGRect` specifying the region of interest within the video frame of type
 * `MPPImage`, on which gesture recognition should be performed.
 * @param error An optional error parameter populated when there is an error in performing gesture
 * recognition on the input video frame.
 *
 * @return An `MPPGestureRecognizerResult` object that contains the hand gesture recognition
 * results.
 */
- (nullable MPPGestureRecognizerResult *)recognizeVideoFrame:(MPPImage *)image
                                     timestampInMilliseconds:(NSInteger)timestampInMilliseconds
                                            regionOfInterest:(CGRect)roi
                                                       error:(NSError **)error
    NS_SWIFT_NAME(recognize(videoFrame:timestampInMilliseconds:regionOfInterest:));
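A similar hedged sketch for the removed video-mode ROI variant. Only the `recognizeVideoFrame:timestampInMilliseconds:regionOfInterest:error:` selector is taken from the header above; the frame array, the millisecond spacing, and the ROI values are placeholders.

```objc
#import <Foundation/Foundation.h>
// Framework import path is an assumption; adjust to your MediaPipeTasksVision setup.
#import <MediaPipeTasksVision/MediaPipeTasksVision.h>

// Hypothetical driver for the removed video-mode ROI API; assumes a recognizer
// created with MPPRunningModeVideo and frames already wrapped as MPPImage.
static void RecognizeGesturesInVideo(MPPGestureRecognizer *recognizer,
                                     NSArray<MPPImage *> *frames,
                                     NSInteger frameIntervalMs) {
  // Restrict recognition to the left half of each frame (coordinate convention assumed).
  CGRect roi = CGRectMake(0.0, 0.0, 0.5, 1.0);

  for (NSUInteger i = 0; i < frames.count; i++) {
    NSError *error = nil;
    // Timestamps must be monotonically increasing, per the doc comment above.
    NSInteger timestampMs = (NSInteger)i * frameIntervalMs;

    MPPGestureRecognizerResult *result =
        [recognizer recognizeVideoFrame:frames[i]
                timestampInMilliseconds:timestampMs
                       regionOfInterest:roi
                                  error:&error];
    if (!result) {
      NSLog(@"Frame %lu failed: %@", (unsigned long)i, error);
      continue;
    }
    NSLog(@"Frame %lu gestures: %@", (unsigned long)i, result.gestures);
  }
}
```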

/**
 * Sends live stream image data of type `MPPImage` to perform gesture recognition using the whole
 * image as region of interest. Rotation will be applied according to the `orientation` property of

@@ -221,50 +156,6 @@ NS_SWIFT_NAME(GestureRecognizer)
                      error:(NSError **)error
    NS_SWIFT_NAME(recognizeAsync(image:timestampInMilliseconds:));

/**
 * Sends live stream image data of type `MPPImage` to perform gesture recognition, cropped to the
 * specified region of interest. Rotation will be applied according to the `orientation` property
 * of the provided `MPPImage`. Only use this method when the `MPPGestureRecognizer` is created with
 * `MPPRunningModeLiveStream`.
 *
 * The object which needs to be continuously notified of the available results of gesture
 * recognition must conform to the `MPPGestureRecognizerLiveStreamDelegate` protocol and implement
 * the `gestureRecognizer:didFinishRecognitionWithResult:timestampInMilliseconds:error:` delegate
 * method.
 *
 * It's required to provide a timestamp (in milliseconds) to indicate when the input image is sent
 * to the gesture recognizer. The input timestamps must be monotonically increasing.
 *
 * This method supports gesture recognition of RGBA images. If your `MPPImage` has a source type of
 * `MPPImageSourceTypePixelBuffer` or `MPPImageSourceTypeSampleBuffer`, the underlying pixel buffer
 * must have one of the following pixel format types:
 * 1. kCVPixelFormatType_32BGRA
 * 2. kCVPixelFormatType_32RGBA
 *
 * If the input `MPPImage` has a source type of `MPPImageSourceTypeImage` ensure that the color
 * space is RGB with an Alpha channel.
 *
 * If this method is used for performing gesture recognition on live camera frames using
 * `AVFoundation`, ensure that you request `AVCaptureVideoDataOutput` to output frames in
 * `kCMPixelFormat_32RGBA` using its `videoSettings` property.
 *
 * @param image A live stream image of type `MPPImage` on which gesture recognition is to be
 * performed.
 * @param timestampInMilliseconds The timestamp (in milliseconds) which indicates when the input
 * image is sent to the gesture recognizer. The input timestamps must be monotonically increasing.
 * @param roi A `CGRect` specifying the region of interest within the given live stream image data
 * of type `MPPImage`, on which gesture recognition should be performed.
 * @param error An optional error parameter populated when there is an error in performing gesture
 * recognition on the input live stream image data.
 *
 * @return `YES` if the image was sent to the task successfully, otherwise `NO`.
 */
- (BOOL)recognizeAsyncImage:(MPPImage *)image
    timestampInMilliseconds:(NSInteger)timestampInMilliseconds
           regionOfInterest:(CGRect)roi
                      error:(NSError **)error
    NS_SWIFT_NAME(recognizeAsync(image:timestampInMilliseconds:regionOfInterest:));
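Finally, a hedged sketch for the removed live-stream ROI variant. The delegate selector and the `recognizeAsyncImage:timestampInMilliseconds:regionOfInterest:error:` signature come from the header above; the class name, the capture plumbing that produces `image` and `timestampMs`, and the ROI values are assumptions.

```objc
#import <Foundation/Foundation.h>
// Framework import path is an assumption; adjust to your MediaPipeTasksVision setup.
#import <MediaPipeTasksVision/MediaPipeTasksVision.h>

// Hypothetical camera wrapper using the removed async ROI API; assumes a recognizer
// created with MPPRunningModeLiveStream and this object set as its live stream delegate.
@interface GestureCameraHandler : NSObject <MPPGestureRecognizerLiveStreamDelegate>
@property(nonatomic, strong) MPPGestureRecognizer *recognizer;
@end

@implementation GestureCameraHandler

// Delegate callback named in the doc comment above; results arrive here asynchronously.
- (void)gestureRecognizer:(MPPGestureRecognizer *)gestureRecognizer
    didFinishRecognitionWithResult:(nullable MPPGestureRecognizerResult *)result
           timestampInMilliseconds:(NSInteger)timestampInMilliseconds
                             error:(nullable NSError *)error {
  if (error) {
    NSLog(@"Recognition at %ld ms failed: %@", (long)timestampInMilliseconds, error);
    return;
  }
  NSLog(@"Gestures at %ld ms: %@", (long)timestampInMilliseconds, result.gestures);
}

// Called once per camera frame; `image` wraps the captured sample buffer and
// `timestampMs` is derived from its presentation time (both assumed to exist upstream).
- (void)processFrame:(MPPImage *)image timestampMs:(NSInteger)timestampMs {
  NSError *error = nil;
  CGRect roi = CGRectMake(0.25, 0.25, 0.5, 0.5);  // coordinate convention assumed

  // The removed API: enqueue the frame for recognition restricted to `roi`.
  BOOL sent = [self.recognizer recognizeAsyncImage:image
                           timestampInMilliseconds:timestampMs
                                  regionOfInterest:roi
                                             error:&error];
  if (!sent) {
    NSLog(@"Failed to enqueue frame at %ld ms: %@", (long)timestampMs, error);
  }
}

@end
```

Per the doc comment above, the `AVCaptureVideoDataOutput` feeding such a handler would be configured through its `videoSettings` property to emit `kCMPixelFormat_32RGBA` frames.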

- (instancetype)init NS_UNAVAILABLE;

+ (instancetype)new NS_UNAVAILABLE;